
 Monitoring Memory Usage


TonyTheDBA
Posting Yak Master

121 Posts

Posted - 2006-04-25 : 09:41:02
Hi all,

Following on from some code that was posted here some time ago, I am now monitoring my servers' (2000) CPU and I/O utilisation. This came about because a supplier stated: "As you have three instances of the database running on one server, you need three times the recommended memory or you will run into performance problems." The supplier's recommendation is 4GB of RAM for one instance, which means we would need to fit 12GB. Personally, I suspect they are saying this to avoid having to sort out poorly written code.

We currently run two instances side by side with no apparent problems; the server is certainly not CPU or disk bound. I would like to monitor the memory situation to see whether the 'performance problems' are present. I know I could use Perfmon to check it in real time (and probably Profiler too), but I would really like to log it to a database so that I can do trend analysis on it and see what sort of patterns emerge.

Are there any hidden sp_ or xp_ procedures that would allow me to do this?
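(A minimal sketch of the sort of logging I have in mind, assuming SQL 2000's master.dbo.sysperfinfo view - which exposes the same numbers Perfmon shows - and a hypothetical MemoryLog table; DBCC MEMORYSTATUS is the other undocumented option, but its output is harder to capture:)
[code]
-- Hypothetical table to accumulate samples for trend analysis
CREATE TABLE dbo.MemoryLog (
    SampleTime   datetime      NOT NULL DEFAULT (GETDATE()),
    ObjectName   nvarchar(128) NOT NULL,
    CounterName  nvarchar(128) NOT NULL,
    CounterValue int           NOT NULL
)
GO

-- Take one sample; schedule this as a SQL Agent job to build the trend.
-- The LIKE on object_name covers both default and named instances.
INSERT INTO dbo.MemoryLog (ObjectName, CounterName, CounterValue)
SELECT RTRIM(object_name), RTRIM(counter_name), cntr_value
FROM master.dbo.sysperfinfo
WHERE object_name LIKE '%Memory Manager%'
  AND counter_name LIKE '%Server Memory%'
[/code]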



--
Regards
Tony The DBA

jen
Master Smack Fu Yak Hacker

4110 Posts

Posted - 2006-04-25 : 09:45:46
Profiler can save the data to a table.
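(And if you run the trace server-side to a file instead, you can pull it into a table afterwards - a sketch, where the file path and table name are just placeholders:)
[code]
-- Load a server-side trace file into a table for querying
SELECT *
INTO dbo.ProfilerTrace
FROM ::fn_trace_gettable('c:\traces\memory.trc', DEFAULT)
[/code]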

--------------------
keeping it simple...

MichaelP
Jedi Yak

2489 Posts

Posted - 2006-04-25 : 10:40:09
Perfmon can log to a file, but of course you take a slight performance penalty for doing so.
In Perfmon, look at "Performance Logs and Alerts", and specifically "Counter Logs". That lets you log counters to a file and then open the file later in Perfmon so you can see the Min, Max, and Avg for each of your counters.

Michael

<Yoda>Use the Search page you must. Find the answer you will. Cursors, path to the Dark Side they are. Avoid them, you must. Use Order By NewID() to get a random record you will.</Yoda>

Opinions expressed in this post are not necessarily those of TeleVox Software, inc. All information is provided "AS IS" with no warranties and confers no rights.

Kristen
Test

22859 Posts

Posted - 2006-04-25 : 10:47:16
"The suppliers recomendation is 4Gb of Ram"

Are you running SQL Enterprise Version - 'coz SQL aint going to use 4GB otherwise, and if that's the case clearly the supplier knows diddly-squat - which will save you the bother of checking their code quality!
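(A quick way to check what you are actually running - SERVERPROPERTY and the 'awe enabled' option both exist in SQL 2000, and on 32-bit Enterprise it is AWE that has to be on before SQL will use memory much beyond the normal 2GB limit:)
[code]
-- Which edition and build is this instance?
SELECT SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductVersion') AS Version

-- 'awe enabled' is an advanced option; it (plus /PAE in boot.ini)
-- is what lets 32-bit Enterprise Edition address big memory
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'awe enabled'
[/code]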

Kristen

derrickleggett
Pointy Haired Yak DBA

4184 Posts

Posted - 2006-04-26 : 01:04:27
We usually send our suppliers a box of poisoned chocolates after this kind of conversation. Would you like some, by any chance?

MeanOldDBA
derrickleggett@hotmail.com

When life gives you a lemon, fire the DBA.

TonyTheDBA
Posting Yak Master

121 Posts

Posted - 2006-04-26 : 03:40:11
Sorry, but there's a rant coming on here; I have a serious need to vent my spleen!!

Yes, we are using Enterprise Edition, as we also want failover clustering and mirroring... We are also pushing the supplier to move to SQL 2005 for the same reasons. The supplier just doesn't understand the SQL Server paradigm. They have a suite of applications running on a mainframe that we, as customers, have told them needs to be moved to another environment, as the leasing costs are killing us. So they have spent the last 10 years coming up with the replacement application... and all they have done is move the mainframe application to a SQL Server n-tier environment... We have TP during the day and batch during the night!! This is a 'web application', written in Java, that requires 512MB on the client! Each web server will only support 25 users max, and we have around 150 users! Another client has 200 users.

The database design is truly awful: every time the batch runs they create a complete set of tables for the results. However, to their credit, they have supplied an archiving tool that moves the older tables to an archive.

Their tech support people told me that I should shrink the log files as they are 'too big' at 3GB! I pointed out that when I do this they just grow again, whereas if I leave them set at a sensible size they occupy contiguous space on the disk, which means fewer head movements and therefore faster log operations - hence better performance from their crap software!
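(For the record, what I actually do is size the log once and then just watch it - a sketch, with placeholder database and file names:)
[code]
-- Size the log once, sensibly, instead of letting shrink/autogrow
-- cycles fragment it (MyDb / MyDb_Log are placeholder names)
ALTER DATABASE MyDb
MODIFY FILE (NAME = MyDb_Log, SIZE = 3GB)

-- ...and keep an eye on how much of it is actually in use
DBCC SQLPERF(LOGSPACE)
[/code]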

Our problem is that this software delivers legislative requirements, which are extremely complex, and there are very few suppliers in the marketplace. Because those suppliers have us over a barrel, they can dictate what they want! Nobody wants to take the risk of developing their own (and we are not really allowed to - legislative!), or of using a small company that really knows what it is doing, so we are left with mediocrity.

During this conversation the technical consultant got their DBA to talk to us and explain why they needed 12GB. Apparently, "SQL Server loads all the data into the cache first and then uses that; if the memory cache is too small it has to swap it to disk, which leads to poor performance"... Well, duh! It does if you let the crap development team write it like that!

The supplier's name starts with A and rhymes with sh*te.

I really miss the days of hand-crafting code to fit into 512MB of memory. We had big systems with hundreds of users and a couple of thousand transactions per minute running on SQL Server on a dual-processor 386 box with 1GB of RAM (state of the art back then ;) ). Developers today are so divorced from what really goes on in the boxes by the operating system that they get away with writing such crap code, knowing that all they have to do is throw bigger/more hardware at it. Bring back DOS 3.3, I say!!

The views expressed in this post are truly my own, and not those of my employer.


--
Regards
Tony The DBA

Kristen
Test

22859 Posts

Posted - 2006-04-26 : 15:57:59
"Sorry But Rant Comming on here ..."

Disappointing to see you holding back mate!

I think it's pretty normal, when an application is first migrated to SQL Server (or probably any other SQL database), that it just mimics whatever it came from - in our case we mimicked ISAM ...

... we stuck the following code behind our Next Record button:
[code]
SELECT *
FROM MyTable
WHERE PKCol1 > @MyPKCol1
   OR (PKCol1 = @MyPKCol1 AND PKCol2 > @MyPKCol2)
   OR (PKCol1 = @MyPKCol1 AND PKCol2 = @MyPKCol2 AND ...)
   ...
ORDER BY PKCol1, PKCol2, ...
[/code]
then we retrieved the first row from the result and closed the resultset!

We had probably 500,000 rows in the table. In fairness SQL Server handled this pretty well (a second or two), whereas Oracle took a minute or more to give the result.
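(These days, of course, you would let the server do the work with TOP 1 - a sketch against the same hypothetical table and keys as above:)
[code]
-- Fetch only the next row, rather than selecting the whole tail
-- of the table and throwing away everything after the first row
SELECT TOP 1 *
FROM MyTable
WHERE PKCol1 > @MyPKCol1
   OR (PKCol1 = @MyPKCol1 AND PKCol2 > @MyPKCol2)
ORDER BY PKCol1, PKCol2
[/code]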

We also put an Index on the PK, as well as declaring the PK itself, and put indexes on Boolean columns - 'coz they worked really well in ISAM.

Hmmm ... "bit crap really" if I stop to think about it!

Now we have a perfectly tuned SQL Server application. Pah! <fx:RubsHandsWithGlee!>

"I really miss the days of hand crafting code to fit it into 512Mb of memory ..."

Well Boo Hoo Sucks to that! What happened to 4K RAM boards, eh????

"Developers today are so divorced from what really goes on in the boxes by the operating system that they get away with writing such crap code knowing that all they have to do is throw bigger/more hardware at it"

Seconded. "Bolt together a couple of DDE thingies, OLE objects, COM objects, ActiveX controls" - what a crap idea that is - great for submarining all the problems to a level where the developers have no chance of discovering why the thing is misbehaving, and then Company A just quietly goes bust ...

Kristen
   
