
Transact-SQL (2000) Forum
Topic: Dynamic Sql - Performance

shamking
Starting Member

7 Posts

Posted - 2005-01-13 : 06:14:02
Referring to this article:

http://www.sqlteam.com/item.asp?ItemID=11499

I have been experiencing some performance issues.

Here are the queries:

SET STATISTICS IO ON
SET STATISTICS TIME ON

Declare @list varchar(100)
Declare @startDate datetime
Declare @endDate datetime

select @list = '3586,3587,3588,3589,3590'
select @startDate = '08/01/04'
select @endDate = '08/10/04'


select rid from tk_mu_data m
JOIN dbo.CsvToInt(@list) v
on v.IntValue = m.vehicle_rid
and m.data_date between @startDate and @endDate
and ( alarm_rid = 1 or alarm_rid = 2 )

Table '#0A19E0AE'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0.
Table 'tk_mu_data'. Scan count 5, logical reads 2139968, physical reads 0, read-ahead reads 0.

SQL Server Execution Times:
CPU time = 4609 ms, elapsed time = 1477 ms.


select rid from tk_mu_data m
where vehicle_rid in (3586,3587,3588,3589,3590)
and m.data_date between @startDate and @endDate
and ( alarm_rid = 1 or alarm_rid = 2 )

Table 'tk_mu_data'. Scan count 28, logical reads 4452, physical reads 0, read-ahead reads 0.

SQL Server Execution Times:
CPU time = 2250 ms, elapsed time = 1377 ms.


I have noticed the logical reads on the 2nd query are much lower, and as a result the CPU time is much lower.

Can anyone explain how I can achieve the performance of the 2nd query using dynamic SQL?
I have tried every option: forcing indexes, temp tables, dynamic EXEC statements. Some did reduce the logical reads but increased the scan count, and the CPU time also increased dramatically.
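For reference, the "dynamic exec" variant mentioned above is sketched below. It inlines the list into the statement text so the optimizer sees literal values, as in the 2nd query, while the dates stay parameterized. This is an illustrative sketch only, reusing the table and column names from the queries above; it is not the exact statement that was tried.

```sql
-- Sketch of the dynamic EXEC approach: the CSV list is concatenated
-- into the SQL text, the dates are passed as real parameters.
DECLARE @list varchar(100), @sql nvarchar(4000)
DECLARE @startDate datetime, @endDate datetime

SELECT @list = '3586,3587,3588,3589,3590'
SELECT @startDate = '08/01/04'
SELECT @endDate = '08/10/04'

SELECT @sql = N'select rid from tk_mu_data m
where vehicle_rid in (' + @list + N')
and m.data_date between @start and @end
and ( alarm_rid = 1 or alarm_rid = 2 )'

EXEC sp_executesql @sql,
     N'@start datetime, @end datetime',
     @start = @startDate, @end = @endDate
```

Note that concatenating user-supplied text into @sql opens the door to SQL injection, so the list should come from a trusted source or be validated first.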

Thanks in advance
Paddy

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2005-01-13 : 16:38:24
In the comments section of the article you're referring to, there is a faster solution. Take a look at Joe Celko's and Page47's solutions:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=20675

The part that you would need from theirs is this:



JOIN
(
    select nullif(substring(@sep + @csv + @sep, n,
                  charindex(@sep, @sep + @csv + @sep, n) - n), '') as element
    from dbo.numbers
    where n <= datalength(@sep + @csv + @sep)
      and n - datalength(@sep) > 0
      and substring(@sep + @csv + @sep, n - datalength(@sep), datalength(@sep)) = @sep
      and charindex(@sep, @sep + @csv + @sep, n) - n > 0
) csv
ON CSV.element = p.ProductID



So you'll have this:

DECLARE @Sep char(1)

SET @Sep = ','

Then write your SELECT joined to their solution, changing the second part of the join condition (p.ProductID becomes your column). You'll need a Numbers table for this solution: it has a single int column, n, with values from 1 through 8000.
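Putting those pieces together with the query from the first post, the result might look something like the sketch below. The tk_mu_data table and column names are taken from that post; the dbo.Numbers table (int column n, values 1 through 8000) is assumed to exist as described above.

```sql
-- One-time setup: a Numbers table with values 1 through 8000
-- (skip if such a table already exists)
CREATE TABLE dbo.Numbers (n int NOT NULL PRIMARY KEY)

DECLARE @csv varchar(100), @sep char(1)
DECLARE @startDate datetime, @endDate datetime

SET @csv = '3586,3587,3588,3589,3590'
SET @sep = ','
SET @startDate = '08/01/04'
SET @endDate = '08/10/04'

SELECT m.rid
FROM tk_mu_data m
JOIN
(
    SELECT nullif(substring(@sep + @csv + @sep, n,
                  charindex(@sep, @sep + @csv + @sep, n) - n), '') AS element
    FROM dbo.Numbers
    WHERE n <= datalength(@sep + @csv + @sep)
      AND n - datalength(@sep) > 0
      AND substring(@sep + @csv + @sep, n - datalength(@sep), datalength(@sep)) = @sep
      AND charindex(@sep, @sep + @csv + @sep, n) - n > 0
) csv
  ON csv.element = m.vehicle_rid       -- p.ProductID becomes your column
 AND m.data_date BETWEEN @startDate AND @endDate
 AND (m.alarm_rid = 1 OR m.alarm_rid = 2)
```

Because the split happens inline against a real indexed table rather than through a multi-statement table function, the optimizer has a better chance of choosing an index seek on tk_mu_data.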

Tara

shamking
Starting Member

7 Posts

Posted - 2005-02-24 : 11:28:43
I found my solution here:

http://www.sommarskog.se/arrays-in-sql.html

I used the 'Fixed-Length Array Elements' solution. It is very fast compared to the above solution or anything else I tried.
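For readers who don't follow the link: the fixed-length technique works by having the caller pad every element of the list to the same width, so each element can be extracted by position alone, with no charindex searching. A minimal sketch of the idea is below, assuming the same dbo.Numbers table and the tk_mu_data names from the first post; see Sommarskog's article for the full, production-ready version.

```sql
-- The list is built with every element padded to a fixed width
-- (here 4 characters), typically by the calling application.
DECLARE @list varchar(8000)
DECLARE @startDate datetime, @endDate datetime

SELECT @list = '35863587358835893590'   -- five 4-character elements
SELECT @startDate = '08/01/04'
SELECT @endDate = '08/10/04'

SELECT m.rid
FROM tk_mu_data m
JOIN
(
    -- Element i occupies positions (i-1)*4 + 1 through i*4
    SELECT convert(int, substring(@list, (n - 1) * 4 + 1, 4)) AS IntValue
    FROM dbo.Numbers
    WHERE n <= datalength(@list) / 4
) v
  ON v.IntValue = m.vehicle_rid
WHERE m.data_date BETWEEN @startDate AND @endDate
  AND (m.alarm_rid = 1 OR m.alarm_rid = 2)
```

The fixed width (4 here) is an assumption for this example; it must match whatever width the caller pads to.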

Paddy
   
