
 SQL Optimization Advice


sqlec
Starting Member

4 Posts

Posted - 2006-04-29 : 11:24:17
Hey All,
I started a new job a while back, and one of the requests from management was to speed up the public search.
Initial research showed that some of the larger queries were taking as much as 30 seconds to run (including page load time), while only returning 2,500 results. Not very impressive, if you ask me, considering that the main DB server is a quad Xeon with 4 gigs of RAM.
Not to mention that CPU usage was at a minimum of 85% for at least 15 hours a day, with spikes to 100% lasting 2-4 seconds.

I have done all that I can think of to improve performance: reindexed the tables, and run Profiler traces through the Index Tuning Wizard.

The CPU usage is now down to an average of 45% for the same duration of time and minimal spikes.
Page load times are way down; I have found nothing in excess of 10 seconds at peak times with more than 1000 active users on.
Average page load times are now under 5 seconds.
I have also optimized the ASP side of things as best I can.

My question is: what else can I do to improve performance? I know I can squeeze more out of this, and I am hoping to get some of the smaller page loads down to under 2 seconds.

Could it just be that we need more/better hardware? I have proposed this but didn't get the favorable attention that I think it deserved.

Keep in mind that most of the queries are inline/embedded. The first reason is that this is how I came into it, and most of the other developers are totally against stored procedures for some reason. The other reason is that most of the larger queries *MIGHT* require up to 15 params, and I have yet to find a way to create a stored procedure that is quicker than the inline/embedded SQL that I rewrote. Don't get me wrong, I prefer stored procedures and have attempted a rewrite using SPs, but I just couldn't match the speed for some reason...

I am considering proposing some type of caching system to management. Any ideas? Thoughts?

Any advice would be appreciated.

Kristen
Test

22859 Posts

Posted - 2006-04-29 : 11:39:06
"My question is, what else can I do to speed performance"

Improve the application code ...

"Could it just be that we need more/better hardware?"

I doubt it ... I would still tend to "improve the application code" ...

However, I am assuming that you have SQL Server on a box on its own, separate disk channels for Data and Logs (and preferably additional disk channels for O/S and Backup files), and that the Log/Backup channels are Mirrored and the Data is RAID10 - and there is no stinking RAID5 anywhere! If that ain't the case then your hardware is wrongly specified and I'd appreciate an invitation to the nice, upcoming, Public Execution you will be holding shortly!

"Keep in mind that most of the queries are inline/embedded"

There's most probably the lion's share of your problem... comes back to "improve the application code" ...

Post the code of one of your "average worst case" queries and see what we can propose for you to improve it - which might be a better way of doing your dynamic SQL (which is not necessarily "Bad", but at the very least it does mean that [compared to Stored Procedures] your security model has to allow SELECT permission [at the very least!] on all the tables, which is presumably a risk that management have decided is acceptable?)

Please post some CREATE TABLE statements, a few INSERTs of sample (can be "fake"!) data, the query to be improved, and an indication of what the output is intended to be (if that's not immediately obvious)

Kristen

Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2006-04-29 : 17:57:48
I agree with Kristen that the application performance problems should be attacked by looking at the queries. The problem may not be as bad as it seems.

My experience is that only a small percentage of the activity takes up most of the processor power. This is one reason why stored procedures are good. You can run a Profiler trace with output into a table, use the TextData column to identify the procedures being run, and then find the sum of the CPU usage for each procedure to see which is using the most CPU time. We had a system with the kind of performance problems you described, and were able to reduce CPU usage 80% by optimizing just 4 stored procedures.

You should still be able to use Profiler data to identify the queries that are really killing your system. You may be surprised what is causing the problem. A small, fast query that is run 6 times per second may use more total CPU time than those long running queries.
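As a sketch of the kind of aggregation I mean (I'm assuming here that the trace was saved to a table called dbo.MyTrace with the standard TextData/CPU/Reads/Duration columns - adjust names to suit):

```sql
-- Total up trace activity per statement text, highest CPU first.
-- Grouping on the first 100 characters of TextData is crude (different
-- parameter values can still split one query into several groups), but it
-- is usually enough to show which statements dominate.
SELECT TOP 20
    CONVERT(nvarchar(100), TextData) AS QueryText,
    COUNT(*)                         AS Executions,
    SUM(CPU)                         AS TotalCPU,
    SUM(Reads)                       AS TotalReads,
    SUM(Duration)                    AS TotalDuration
FROM dbo.MyTrace
WHERE TextData IS NOT NULL
GROUP BY CONVERT(nvarchar(100), TextData)
ORDER BY TotalCPU DESC
```

The big total-CPU offenders are the ones worth rewriting first, whatever their individual durations look like.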


Turning to the hardware/software side:
What version and edition of SQL Server are you using? How big is the database?

I would look at Performance Monitor statistics to see what is causing the bottlenecks.

Look at Performance Monitor Processor queue lengths to see if the processors are keeping up with the workload. If the processor queue lengths stay above 0 for long periods of time, your application is overloading the CPU.

Look at Performance Monitor IO statistics on your disks, especially disk queue lengths to see if the disks are keeping up with the workload. If the disk queue lengths stay above 0 for long periods of time, your application is overloading the disk system.

Look at Performance Monitor SQL Server statistics on Page Reads and Page Writes per Second to see how much SQL Server is going to disk. If SQL Server is reading and writing a lot of pages per second, you are running Enterprise Edition, and the server can accept more memory, you should be able to improve performance by increasing the RAM on the server.

Max out the memory if you can, it is fairly cheap compared to other hardware upgrades. More memory will also help to remove a disk bottleneck, if you have one, and should cut down processor usage, because disk IO is CPU expensive. The ideal is that there is no disk read activity and SQL Server is using the CPU only for lookups of pages already in memory and to write data to disk.





CODO ERGO SUM

Kristen
Test

22859 Posts

Posted - 2006-04-30 : 05:24:12
"Max out the memory if you can, it is fairly cheap compared to other hardware upgrades"

Yup, that's one hardware upgrade I would definitely want, independently of any application optimisation. Even if you don't need the RAM right now, it will help on the occasions when you do - and chances are you do need it right now!

Unless you have Enterprise Edition you are limited to 2GB for SQL Server (but a bit more than that in the box - say 3~4GB - would give SQL Server its full 2GB and let the O/S have some too!)

Kristen

sqlec
Starting Member

4 Posts

Posted - 2006-04-30 : 07:26:21
Thanks for the replies, everybody. Here is a sample of a trace that I did the other day.

SELECT TOP 25 CPU, Duration, Reads, Writes
FROM favoritestrace
WHERE (EventClass = 10)
ORDER BY CPU DESC, Duration DESC, Reads, Writes

CPU Duration Reads Writes
6782 1920 27382 11
6750 1910 27016 11
5812 5906 2555 0
4594 4623 2173 2
3093 1186 28891 15
3047 3360 5705 0
2954 800 26553 10
2782 1060 26734 15
2359 2516 4660 0
2327 610 13334 4
2297 593 13328 5
2282 610 13350 5
2281 593 13342 6
2250 580 13335 4
2235 936 5829 3
2219 640 13310 6
1922 2000 4477 2
1657 1000 13109 6
1625 843 5116 0
1610 560 5110 0
1594 860 13135 1
1469 1470 2031 2
1469 546 4381 0
1422 420 2243 2
1390 1423 17334 4


We are running a quad Xeon box with 4 gigs of RAM, Enterprise Edition SQL Server. Using Profiler I have removed/optimized ALL the major queries that were causing any issues. For instance, the previous developer had too many indices on a smaller table, and typical queries were taking 2 seconds to run for 2 records and 8 or so columns; I ran the Index Tuning Wizard against this table and now queries against it take less than a second, which is what I expect.

Also, the memory typically stays put at around 2 gigs ALL day no matter what. How do I optimize my queries to run in memory vs. use CPU? Is that possible? Is that what sproc caching is all about? If so then I definitely need to move my main query to at least a partial sproc... Problem is, the query can have up to 15 params and is built on the fly, along with different "order by's" so the only way that I could get it to work in a sproc is to build it dynamically in the sproc and execution times actually diminished.

Like I said before, I have optimized everything as much as I can already using Profiler etc., and have reduced execution times from as high as 30 seconds down to around 5 seconds. AND CPU usage is also down in a big way.

Kristen
Test

22859 Posts

Posted - 2006-04-30 : 08:22:55
"Problem is, the query can have up to 15 params and is built on the fly, along with different "order by's" so the only way that I could get it to work in a sproc is to build it dynamically in the sproc and execution times actually diminished"

There are two or three ways to tackle that - but as I requested above, I'd prefer to see some of your code; otherwise I'm making suggestions blind, and may well offer a different solution once I can actually see the "problem".

1) Use sp_ExecuteSQL for your query, and parametrise it, so that the general dynamic SQL you use can be cached

2) Build the WHERE clauses like this:

WHERE (@Param1 IS NULL OR MyColumn1 = @Param1)
AND (@Param2 IS NULL OR MyColumn2 like @Param2 + '%')
...
ORDER BY CASE WHEN @MySortOrder = 1 THEN MyColumn1 ELSE NULL END,
CASE WHEN @MySortOrder = 2 THEN MyColumn2 ELSE NULL END,
...

The second approach is not always more efficient, depends on indexes and all sorts of things.

A third approach is to make an initial cut of the data, storing the PKs in a temporary table, based on whatever criteria are available for a given instance - you might even EXEC a subsidiary SProc to do that, depending on which parameters your main SProc receives values in (this is to avoid the execution path of the main SProc being so complex that the cached query plan is not representative of normal operation). Then JOIN that temporary table to all the necessary tables, with a full WHERE clause as above; by this point it is only operating on a much smaller dataset, so the efficiency of indexes etc. becomes less important.
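A rough sketch of that third approach (every object and parameter name here is invented, just to show the shape):

```sql
-- Inside a stored procedure taking e.g. @ZipCode, @MaxPrice, @Keyword.
-- First cut: use the most selective criteria to collect candidate PKs.
CREATE TABLE #Candidate (ID int NOT NULL PRIMARY KEY)

INSERT INTO #Candidate (ID)
SELECT ID
FROM dbo.MyBigTable
WHERE (@ZipCode IS NULL OR ZipCode = @ZipCode)
  AND (@MaxPrice IS NULL OR Price <= @MaxPrice)

-- Second pass: the remaining, less selective criteria now only run
-- against the (hopefully small) candidate set.
SELECT T.*
FROM #Candidate AS C
JOIN dbo.MyBigTable AS T
  ON T.ID = C.ID
WHERE (@Keyword IS NULL OR T.Description LIKE '%' + @Keyword + '%')
ORDER BY T.Price DESC

DROP TABLE #Candidate
```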

So it depends on your problem!
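To illustrate option 1 (names invented): build only the clauses you actually need, but keep the values as parameters, so each *shape* of the statement gets a cached plan that is reused whenever that shape comes round again:

```sql
DECLARE @sql nvarchar(4000)
SET @sql = N'SELECT ID, ZipCode, Price FROM dbo.Listing WHERE 1 = 1'
IF @ZipCode IS NOT NULL SET @sql = @sql + N' AND ZipCode = @ZipCode'
IF @MaxPrice IS NOT NULL SET @sql = @sql + N' AND Price <= @MaxPrice'
SET @sql = @sql + N' ORDER BY Price DESC'

-- Parameters that a given shape doesn't reference may still be passed;
-- only the ones the statement uses matter.
EXEC sp_ExecuteSQL @sql,
    N'@ZipCode varchar(10), @MaxPrice money',
    @ZipCode = @ZipCode,
    @MaxPrice = @MaxPrice
```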

Simple approach: try one in Query Analyzer. Dig out the code that you want to improve and surround it with some timing code (see below), then replace the dynamic query with an SProc call and see how that compares. If the SProc is "generic" and executes with the same number of logical operations it will perform as well, BUT will have the query plan cached too, plus all the other benefits of SProcs.

-- Comment in the SHOWPLAN to see the Query Plan,
-- OR the STATISTICS to see the logical stats
-- SET SHOWPLAN_TEXT ON
GO
-- SET STATISTICS IO ON; SET STATISTICS TIME ON

-- ... put query here - e.g.:

SELECT * FROM Northwind.dbo.Products

SET STATISTICS IO OFF; SET STATISTICS TIME OFF
GO
SET SHOWPLAN_TEXT OFF
GO

It's the LOGICAL counts in the STATISTICS output you want, not the Physical ones - work on getting them as low as you can.

See http://www.sql-server-performance.com/statistics_io_time.asp for more details

Kristen

sqlec
Starting Member

4 Posts

Posted - 2006-04-30 : 08:53:02
Thanks Kristen,
Unfortunately I cannot post any code (legal reasons).
Let's just say that the main query is a zipcode proximity search.
I already tried your first two suggestions: the 1st was much slower, and the 2nd I couldn't get to work, as the ORDER BYs are something like "ORDER BY ID DESC, ZipCode DESC, Price DESC" - can't run those with a SELECT CASE. I am gonna try your third approach.
Thanks again for your help.

PSamsig
Constraint Violating Yak Guru

384 Posts

Posted - 2006-04-30 : 14:26:40
quote:
Originally posted by Kristen

1) Use sp_ExecuteSQL for your query, and parametrise it, so that the general dynamic SQL you use can be cached

And make sure references to tables and views are prefixed with 'dbo.' for the same reason, as stated by Kristen in another thread (which may be obvious to you, but wasn't to me).

--
This one's tricky. You have to use calculus and imaginary numbers for this. You know, eleventeen, thirty-twelve and all those.

Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2006-04-30 : 14:39:43
quote:
Originally posted by sqlec
...Also the memory typically stays put at around 2 gigs ALL day no matter what. How do I optimize my queries to run in memory?...



SQL Server uses as much memory as it needs, and doesn't give it up unless the OS is running short of memory.

Since it is already using 2 GB of memory, it can probably use more.

Unless you configure SQL Server to use more memory, it will only use 2 GB. Probably the first thing you should do is set the /3GB switch in boot.ini to allow SQL Server to use up to 3 GB of memory. That should give you some help right away.

After that, you could add more memory to increase the size of the available cache. You will have to configure SQL Server to use AWE memory to take advantage of it. No reason to get too picky about the amount - just add as much as you can, because the cost is a lot lower than any other hardware upgrade you can do, and SQL Server loves lots of memory.
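For reference, the AWE settings look something like this (the exact max server memory value depends on how much RAM you fit and how much you leave for the O/S - the 6144 below is just an example):

```sql
-- 'awe enabled' is an advanced option, needs a restart to take effect,
-- and the SQL Server service account must hold "Lock Pages in Memory".
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'awe enabled', 1
RECONFIGURE
-- With AWE on, SQL Server no longer releases memory dynamically,
-- so cap it explicitly (value in MB):
EXEC sp_configure 'max server memory', 6144
RECONFIGURE
```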

Did you look at the performance monitor statistics that I mentioned? What was the result?





CODO ERGO SUM

Kristen
Test

22859 Posts

Posted - 2006-05-01 : 05:29:45
"Unfortunatley I cannot post any code (legal reasons)"

So just change the object names and goof up the data. I can't be any help trying to guess what your code does ... and I won't be prepared to spend the time trying, either ...

"1st suggestion was much slower"

Using sp_ExecuteSQL is slower? That's not possible - at the very least it would be the same speed - so I'm struggling to believe that.

"Cant run those with a select case"

I can't see that your example prevents that ...
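For instance, taking the very ORDER BY you quoted, something along these lines (with @SortOrder as an assumed parameter) sorts on several keys, each descending, and can fall through to a different key set for other values:

```sql
ORDER BY
    CASE WHEN @SortOrder = 1 THEN ID      END DESC,
    CASE WHEN @SortOrder = 1 THEN ZipCode END DESC,
    CASE WHEN @SortOrder = 1 THEN Price   END DESC,
    CASE WHEN @SortOrder = 2 THEN Price   END ASC
```

Each CASE yields NULL when its @SortOrder doesn't match, so those keys become constants and drop out of the sort.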

Having said that, I very much doubt my third example is the most likely candidate for your situation; but as I said, without a worked example I'm just guessing what your problem actually is.

Kristen

MichaelP
Jedi Yak

2489 Posts

Posted - 2006-05-01 : 05:51:15
SQLEC,
You've described the CPU and RAM of your server, but not the disk subsystem yet. Can you tell us what your disk configuration is?
Please tell us the type of drives, the RAID configuration, and what data is on which array.

I agree with what Michael and Kristen mention about the memory. You could probably use more RAM at this point. If the CPU utilization has come down as much as it has, you've probably now got a bottleneck at the disks. Adding RAM will help solve that problem, but depending on your current disk config you may need more / faster disks.

Thanks!

Michael

<Yoda>Use the Search page you must. Find the answer you will. Cursors, path to the Dark Side they are. Avoid them, you must. Use Order By NewID() to get a random record you will.</Yoda>

Opinions expressed in this post are not necessarily those of TeleVox Software, inc. All information is provided "AS IS" with no warranties and confers no rights.

Arnold Fribble
Yak-finder General

1961 Posts

Posted - 2006-05-01 : 05:51:25
quote:
Originally posted by sqlec
Lets just say that the main query is a zipcode proximity search.



It'll be the spatial part of the join that's the problem, then.

sqlec
Starting Member

4 Posts

Posted - 2006-05-01 : 09:42:53
Thanks for the replies, everybody. However, I now know that the main issue is that the company is running SQL Server Standard Edition, which is why the server is only using 2 of the 4 gigs that we have in it. I have emailed management requesting that we upgrade to SQL Server 2000 Enterprise, which should immediately improve performance. Thanks again everybody. Always the small things!

Kristen
Test

22859 Posts

Posted - 2006-05-01 : 10:05:06
"requesting that we upgrade to SQL Server 2000 Enterprise, which should immediately improve performance"

I doubt that alone will make a big difference, unless you really need a shed-load more RAM. My guess is that you are only going to be able to get up to 3GB with your current hardware. And if your SQL database is for a public-facing web site then the upgrade will be serious money! (I forget what, exactly, but probably something like $20,000 for your 4-CPU box. You may not be able to get the Enterprise version of SQL 2000 any more, and whilst a licence for SQL 2005 would allow you to run SQL version 2000, you would have to already own the Enterprise media, or be able to borrow some! Hopefully I'm being overly pessimistic on those points - it's definitely not my strong suit - but it would be worth checking that I've got all that wrong!)

Kristen

MichaelP
Jedi Yak

2489 Posts

Posted - 2006-05-01 : 10:14:06
I agree with Kristen that going from 2GB to 3GB isn't going to help much (remember, you need to keep about 1GB for the OS). If you were able to go to 6-8GB of RAM, that would make a pretty huge change.

List price on SQL Server EE is like $8k, with a per-processor license at $20k per processor. Granted, you can find the licenses MUCH cheaper than that, but it's still going to be $40-50k in software to upgrade to SQL Server EE.

It may be cheaper to add more/faster disks, unfortunately.

Michael


Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2006-05-01 : 10:33:32
It might actually be cheaper to upgrade to new hardware running 64-bit SQL 2005 with twin dual-core processors, because that only counts as two processors for licensing purposes, and the 64-bit Standard Edition of 2005 is not limited in the amount of memory it can use. You can buy a lot of hardware for the $40K or more that the Enterprise Edition upgrade would cost.

I really don't think the original poster has done enough work yet to identify what the real bottleneck is. It's a bad mistake to launch into this kind of major upgrade without knowing the true situation. Nothing like spending $40K and then finding out you have exactly the same performance to make management skeptical of your next great idea. Or to make them decide the money they are spending on you would be better spent on hardware upgrades.



CODO ERGO SUM

MichaelP
Jedi Yak

2489 Posts

Posted - 2006-05-01 : 10:42:38
Mr. Jones,
Well said sir.

We still have not discussed the disk configuration on this server. If it's a Quad Xeon with just a pair of SCSI disks in a RAID 1, then he needs more disks.

Michael


Kristen
Test

22859 Posts

Posted - 2006-05-02 : 03:53:36
"We still have not discussed the disk configuration on this server"

Hehehe ... or the coding of any specific slow-running dynamic SQL embedded queries ...

Kristen

PSamsig
Constraint Violating Yak Guru

384 Posts

Posted - 2006-05-02 : 05:20:46
How is the SQL Server configured with regard to use of the CPUs?


Arnold Fribble
Yak-finder General

1961 Posts

Posted - 2006-05-02 : 06:12:27
I still reckon there's a bit of code like this:

SELECT *
FROM (
Vendor AS V
INNER JOIN ZipLocation AS VLoc
ON V.zip = VLoc.zip
)
INNER JOIN (
Customer AS C
INNER JOIN ZipLocation AS CLoc
ON C.zip = CLoc.zip
)
ON dbo.Distance(VLoc.lat, VLoc.long, CLoc.lat, CLoc.long) < 500
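If so, the usual first aid (a standard trick, not anything posted in this thread - the names, parameters, and distance factors here are all assumptions) is to pre-filter with a lat/long bounding box that indexes can seek on, and only run the expensive distance UDF on the survivors:

```sql
-- One degree of latitude is roughly 69 miles; the longitude divisor really
-- wants scaling by COS(latitude), so 53 is only a crude mid-US guess.
DECLARE @LatRange float, @LongRange float
SET @LatRange  = @MaxMiles / 69.0
SET @LongRange = @MaxMiles / 53.0

SELECT V.*
FROM dbo.Vendor AS V
INNER JOIN dbo.ZipLocation AS VLoc
    ON V.zip = VLoc.zip
WHERE VLoc.lat  BETWEEN @Lat  - @LatRange  AND @Lat  + @LatRange
  AND VLoc.long BETWEEN @Long - @LongRange AND @Long + @LongRange
  -- exact distance check only on the rows left after the range seek:
  AND dbo.Distance(VLoc.lat, VLoc.long, @Lat, @Long) < @MaxMiles
```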
