shamking
Starting Member, 7 Posts
Posted - 2005-01-13 : 06:14:02
Referring to the item http://www.sqlteam.com/item.asp?ItemID=11499, I have been experiencing some performance issues. Here are the queries:

    SET STATISTICS IO ON
    SET STATISTICS TIME ON

    DECLARE @list varchar(100)
    DECLARE @startDate datetime
    DECLARE @endDate datetime

    SELECT @list = '3586,3587,3588,3589,3590'
    SELECT @startDate = '08/01/04'
    SELECT @endDate = '08/10/04'

    SELECT rid
    FROM tk_mu_data m
    JOIN dbo.CsvToInt(@list) v
        ON v.IntValue = m.vehicle_rid
        AND m.data_date BETWEEN @startDate AND @endDate
        AND (alarm_rid = 1 OR alarm_rid = 2)

Table '#0A19E0AE'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0.
Table 'tk_mu_data'. Scan count 5, logical reads 2139968, physical reads 0, read-ahead reads 0.

SQL Server Execution Times: CPU time = 4609 ms, elapsed time = 1477 ms.

    SELECT rid
    FROM tk_mu_data m
    WHERE vehicle_rid IN (3586,3587,3588,3589,3590)
        AND m.data_date BETWEEN @startDate AND @endDate
        AND (alarm_rid = 1 OR alarm_rid = 2)

Table 'tk_mu_data'. Scan count 28, logical reads 4452, physical reads 0, read-ahead reads 0.

SQL Server Execution Times: CPU time = 2250 ms, elapsed time = 1377 ms.

I have noticed that the logical reads on the 2nd query are much lower, and as a result the CPU time is much lower. Can anyone explain to me how I can achieve the performance of the 2nd query using dynamic SQL? I have tried every option from forcing indexes to temp tables to dynamic EXEC statements. Some did reduce the logical reads but increased the scan count, and the CPU time also increased dramatically.

Thanks in advance
Paddy
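(The actual dbo.CsvToInt is a T-SQL table function from the linked article; as an illustration only, this Python sketch shows the contract it implements: turning a comma-separated string parameter into a set of integers that can be joined against.)

```python
def csv_to_int(csv_list: str) -> list:
    """Split a CSV string such as '3586,3587' into a list of ints,
    skipping any empty pieces (e.g. from a trailing comma)."""
    return [int(piece) for piece in csv_list.split(',') if piece.strip()]

print(csv_to_int('3586,3587,3588,3589,3590'))  # [3586, 3587, 3588, 3589, 3590]
```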
tkizer
Almighty SQL Goddess, 38200 Posts
Posted - 2005-01-13 : 16:38:24
In the comment section of the article that you are referring to there is a faster solution. Take a look at Joe Celko's and Page47's solutions:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=20675

The part that you would need from theirs is this:

    JOIN
    (
        select nullif(substring(@sep + @csv + @sep, n,
                   charindex(@sep, @sep + @csv + @sep, n) - n), '') as element
        from dbo.numbers
        where n <= datalength(@sep + @csv + @sep)
            and n - datalength(@sep) > 0
            and substring(@sep + @csv + @sep,
                    n - datalength(@sep), datalength(@sep)) = @sep
            and charindex(@sep, @sep + @csv + @sep, n) - n > 0
    ) csv
    ON csv.element = p.ProductID

So you'll have this:

    DECLARE @Sep char(1)
    SET @Sep = ','

Then your SELECT is joined to their solution, changing the second part of the join condition (p.ProductID becomes your column). You'll need a Numbers table for this solution. This table has one column, n, which is an int and holds the values 1-8000.

Tara
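Since the T-SQL above can't run outside SQL Server, here is a hedged Python sketch that mirrors the same substring/charindex logic: the Numbers table supplies every character position n, and a position qualifies as the start of an element only if it sits right after a separator and the next separator is further ahead. (Function and variable names here are my own, not from the original.)

```python
def numbers_split(csv: str, sep: str = ',') -> list:
    """Mirror of the derived-table split: pad the string with separators,
    then test every 1-based position n the way the T-SQL WHERE clause does."""
    padded = sep + csv + sep          # @sep + @csv + @sep
    out = []
    # dbo.numbers supplies n = 1..8000; here we loop over the positions we need
    for n in range(1, len(padded) + 1):
        start = n - 1                 # 0-based index of 1-based position n
        if n - len(sep) < 1:          # n - datalength(@sep) > 0
            continue
        # substring(padded, n - datalength(@sep), datalength(@sep)) = @sep:
        # the characters just before position n must be a separator
        if padded[start - len(sep):start] != sep:
            continue
        # charindex(@sep, padded, n): next separator at or after position n
        nxt = padded.find(sep, start)
        if nxt <= start:              # empty element -> nullif(..., '') drops it
            continue
        out.append(padded[start:nxt])
    return out

print(numbers_split('3586,3587,3588'))  # ['3586', '3587', '3588']
```

Each element is found by a direct positional test rather than a loop over the string in application code, which is why the set-based T-SQL version only needs the Numbers table and no cursor.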
 |
shamking
Starting Member, 7 Posts
Posted - 2005-02-24 : 11:28:43
I found my solution here:

http://www.sommarskog.se/arrays-in-sql.html

I used the 'Fixed-Length Array Elements' solution. It is very fast compared to the above solution or anything else I tried.

Paddy
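For readers who don't follow the link: the fixed-length approach packs every value into a slot of constant width, so each element's position is known arithmetically and no separator scanning is needed. This Python sketch (my own illustration, not Sommarskog's T-SQL, and the 8-character width is an assumption) shows the idea:

```python
def pack_fixed(values, width: int = 8) -> str:
    """Each int is right-aligned in a fixed-width slot; no separators needed."""
    return ''.join(str(v).rjust(width) for v in values)

def unpack_fixed(packed: str, width: int = 8) -> list:
    """Element k starts at offset k*width -- in T-SQL this is
    substring(@list, (n-1)*@width + 1, @width) over a Numbers table."""
    return [int(packed[i:i + width]) for i in range(0, len(packed), width)]

ids = [3586, 3587, 3588, 3589, 3590]
print(unpack_fixed(pack_fixed(ids)))  # [3586, 3587, 3588, 3589, 3590]
```

Because every slot boundary is computed, not searched for, the server does one cheap substring per element, which is consistent with the speedup reported above.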
 |