| Author |
Topic |
|
Lopaka
Starting Member
48 Posts |
Posted - 2010-03-15 : 15:33:24
|
| I have been trying to write an IN statement using a variable.

DECLARE @IDS VARCHAR(255)
SET @IDS = '''A''' + ',' + '''B''' + ',' + '''C'''

SELECT *
FROM TABLE
WHERE FIELD IN (% + @IDS + %)

Can anybody tell me what I am doing wrong?

Thanks
Robert R. Barnes |
|
|
Lamprey
Master Smack Fu Yak Hacker
4614 Posts |
Posted - 2010-03-15 : 15:46:37
|
You can't do it like that unless you use dynamic SQL. You can take your delimited string and parse it with a function that returns a table. Or, you can switch things around and use a LIKE clause:

DECLARE @IDS VARCHAR(255)
SET @IDS = 'A,B,C'

DECLARE @Table TABLE (Field VARCHAR(10))
INSERT @Table
SELECT 'A'
UNION ALL SELECT 'Z'
UNION ALL SELECT 'C'
UNION ALL SELECT 'X'

SELECT *
FROM @Table
WHERE ',' + @IDs + ',' LIKE '%,' + Field + ',%' |
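
Since the dynamic SQL route is mentioned above but not shown, here is a minimal sketch of it. The table and column names (MyTable, Field) are placeholders, and the usual caveat applies: concatenating values into a SQL string invites injection, so only do this with trusted input.

```sql
-- Minimal dynamic-SQL sketch of the route mentioned above.
-- MyTable/Field are placeholder names; @IDS must already contain
-- quoted values, e.g. 'A','B','C'.
DECLARE @IDS VARCHAR(255)
SET @IDS = '''A''' + ',' + '''B''' + ',' + '''C'''

DECLARE @sql NVARCHAR(MAX)
SET @sql = N'SELECT * FROM MyTable WHERE Field IN (' + @IDS + N')'

-- sp_executesql runs the built-up string as its own batch
EXEC sp_executesql @sql
```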
 |
|
|
khtan
In (Som, Ni, Yak)
17689 Posts |
|
|
Lumbago
Norsk Yak Master
3271 Posts |
Posted - 2010-03-16 : 04:01:10
|
Wow Lamprey...I have seen a bunch of these posts before, but all of them suggest using a function to get things into a table variable. I have never ever seen your solution with LIKE, but i LIKE it! The performance is probably better using a function, but I like the simplicity...a keeper!

- Lumbago
If the facts don't fit the theory, change the facts. - Albert Einstein |
 |
|
|
Kristen
Test
22859 Posts |
Posted - 2010-03-16 : 04:52:34
|
It will not use indexes either ... so usually only useful for quick-&-dirty jobs. Has the interesting side effect that it correctly handles embedded "," in FIELD - which a Function doesn't ... I'll get my coat! |
 |
|
|
michael.appleton
Posting Yak Master
160 Posts |
Posted - 2010-03-16 : 06:26:25
|
I think the problem is that you should be creating the variable as a table, like this:

DECLARE @IdsTable TABLE (Id varchar(1)) --variable
INSERT INTO @IdsTable
SELECT 'A'
UNION ALL SELECT 'B'
UNION ALL SELECT 'C'

But if you have no choice and the variable is already given to you in that form, I would suggest transforming it into a table using this function:

--Creates an 'InLine' Table Valued Function (TVF)
--Splits delimited strings up into different rows.
--Can be used with 'CROSS APPLY'
CREATE FUNCTION dbo.SplitDelimitedString
(
    @Delimiter varchar(5),
    @List varchar(8000)
)
RETURNS @TableOfValues TABLE
(
    RowID smallint IDENTITY(1,1),
    Value varchar(50)
)
AS
BEGIN
    DECLARE @LenString int
    DECLARE @LenDelimeter int = len(@Delimiter)

    WHILE len(@List) > 0
    BEGIN
        SET @LenString = CASE charindex(@Delimiter, @List)
                             WHEN 0 THEN len(@List)
                             ELSE (charindex(@Delimiter, @List) - 1)
                         END

        INSERT INTO @TableOfValues
        SELECT substring(@List, 1, @LenString)

        SET @List = CASE (len(@List) - @LenString)
                        WHEN 0 THEN ''
                        ELSE right(@List, len(@List) - @LenString - @LenDelimeter)
                    END
    END

    RETURN
END

Used like this:

DECLARE @IDS VARCHAR(255)
SET @IDS = '''A''' + ',' + '''B''' + ',' + '''C'''
SET @IDS = REPLACE(@IDS, '''', '') --get rid of the extra characters that are not needed

DECLARE @IdsTable TABLE (Id varchar(1))
INSERT INTO @IdsTable
SELECT Value
FROM dbo.SplitDelimitedString(',', @IDS)

Using either method you can now run the query in an ordinary set-based manner:

DECLARE @Table TABLE (Field VARCHAR(10))
INSERT @Table
SELECT 'A'
UNION ALL SELECT 'Z'
UNION ALL SELECT 'C'
UNION ALL SELECT 'X'

SELECT Field
FROM @Table T
JOIN @IdsTable I ON I.Id = T.Field

Or if you really want to use IN:

DECLARE @Table TABLE (Field VARCHAR(10))
INSERT @Table
SELECT 'A'
UNION ALL SELECT 'Z'
UNION ALL SELECT 'C'
UNION ALL SELECT 'X'

SELECT *
FROM @Table T
WHERE T.Field IN (SELECT Id FROM @IdsTable) |
 |
|
|
Kristen
Test
22859 Posts |
Posted - 2010-03-16 : 06:59:56
|
The Split Function is a good approach, but there are more efficient methods (i.e. set-based methods) than "looping" to do the split. If there are only a few items in the delimited list it won't make much difference, but for long lists (hundreds of items, or more) the performance difference becomes much more significant.

So ... my suggestion is that if you are going to use a Split Function, put some time into finding the best-performing function; then you can forget about it thereafter, as it will scale well.

More info in this thread:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=50648
but it's three pages long, and the "nuggets" are a bit buried ... |
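
As an illustration only (a generic pattern, not one of the specific functions benchmarked in the linked thread), a set-based split can be done with a numbers/tally table instead of a loop. This sketch assumes a single-character delimiter and borrows sys.all_objects as a cheap row source:

```sql
-- Set-based split sketch: generate character positions with ROW_NUMBER(),
-- keep only positions that start an item, and SUBSTRING out each item.
DECLARE @IDS VARCHAR(255)
SET @IDS = 'A,B,C'

;WITH Numbers AS
(
    SELECT TOP (LEN(@IDS))
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects
)
SELECT SUBSTRING(@IDS, n, CHARINDEX(',', @IDS + ',', n) - n) AS Value
FROM Numbers
WHERE n = 1 OR SUBSTRING(@IDS, n - 1, 1) = ','
```

The whole list is split in one statement, so the cost grows with the length of the string rather than with a per-item loop iteration.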
 |
|
|
michael.appleton
Posting Yak Master
160 Posts |
Posted - 2010-03-16 : 07:05:01
|
| I'm open to updating my split function! Any links? |
 |
|
|
Lumbago
Norsk Yak Master
3271 Posts |
|
|
michael.appleton
Posting Yak Master
160 Posts |
Posted - 2010-03-16 : 07:45:07
|
Ah, an XML solution. I've seen the XML functions in SQL Server used for some other interesting string manipulations. I've made this into a general function that does the equivalent of my old one:

--Splits delimited strings up into different rows.
--Can be used with 'CROSS APPLY'
ALTER FUNCTION [dbo].[SplitDelimitedString]
(
    @Delimiter varchar(10),
    @List varchar(max)
)
RETURNS @TableOfValues TABLE
(
    RowID smallint IDENTITY(1,1),
    Value varchar(255)
)
AS
BEGIN
    DECLARE @XML XML = cast('<a>' + REPLACE(@List, @Delimiter, '</a><a>') + '</a>' as XML)

    INSERT INTO @TableOfValues (Value)
    SELECT nref.value('.', 'varchar(255)')
    FROM @XML.nodes('/a') AS R(nref)

    RETURN
END

Thanks for that! |
 |
|
|
Kristen
Test
22859 Posts |
Posted - 2010-03-16 : 09:58:54
|
quote: Originally posted by michael.appleton I'm open to updating my split function! Any links?
There is the link in my earlier message. That wibbles on about several different techniques, and there is a comparison of the various methods discussed (at this point in the thread: http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=50648&whichpage=2#305425). Times (on the servers used back then) varied from 57ms to 171ms (excluding the SPLIT function I was using originally, which took 2,000ms).

It's an insightful read if you want to consider the various ins-and-outs, but separate from that I'm afraid I don't have a "best-performer" Split function that has been recently peer-reviewed against the latest SQL versions.

In my tests the XML "trick" was very slow for large lists: my notes say that my tests on SQL 2008, on a modern, fast server, with a list of 1,372 items, ranged from 0ms to a max of about 55ms (across all the tests listed in the link above), but the XML version took 1,778ms on average.
 |
|
|
Lamprey
Master Smack Fu Yak Hacker
4614 Posts |
Posted - 2010-03-16 : 11:10:44
|
| I didn't see this link posted, but it has many different methods of splitting and their performance numbers:
http://www.sommarskog.se/arrays-in-sql.html
The performance stuff might be a little buried in there, but there is an Excel spreadsheet that has the breakdown. |
 |
|
|
Lamprey
Master Smack Fu Yak Hacker
4614 Posts |
Posted - 2010-03-16 : 11:18:14
|
quote: Originally posted by Lumbago Wow Lamprey...I have seen a bunch of these posts before but all of them suggests using a function to get things in to a table variable. I have never ever seen your solution with LIKE but i LIKE it! The performance is probably better using a function but I like the simplicity...a keeper!- LumbagoIf the facts don't fit the theory, change the facts. Albert Einstein
Thanks! :) As Kristen points out, it'll not use indexes. But for something simple or a quick one-off it can be handy. Additionally, it's always nice to have a bigger bag-o-tricks. :) |
 |
|
|
Kristen
Test
22859 Posts |
Posted - 2010-03-16 : 11:35:44
|
| One passing comment. I have separate functions for splitting (string) data to return Char and Int values. The reason being that when you then JOIN the results from the SPLIT function it avoids a further cast, and that seems to make a difference to performance in some instances (and there was grief when SQL 2000 SP4 was introduced, giving poor query plans where an implicit CAST was used in the JOIN, so I now avoid implicit CAST altogether). |
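
One way that suggestion could look (a sketch only; dbo.SplitDelimitedString is the varchar splitter posted earlier in the thread, and the Int variant's name is made up here):

```sql
-- Hypothetical int-typed split wrapper: cast once inside the function
-- so a later JOIN against an INT key needs no implicit CAST.
CREATE FUNCTION dbo.SplitDelimitedStringInt
(
    @Delimiter varchar(10),
    @List varchar(8000)
)
RETURNS @TableOfValues TABLE
(
    RowID smallint IDENTITY(1,1),
    Value int
)
AS
BEGIN
    INSERT INTO @TableOfValues (Value)
    SELECT CAST(Value AS int)
    FROM dbo.SplitDelimitedString(@Delimiter, @List)

    RETURN
END
```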
 |
|
|
Lamprey
Master Smack Fu Yak Hacker
4614 Posts |
Posted - 2010-03-16 : 12:20:55
|
| One more comment. If you are going to join the result of a Split/Parse function to a table of any size (say 1 million or more rows), double-check your performance. I have not done much with arrays in 2008, but in previous versions the optimizer, it seems, had no clue what the function was returning, so queries performed very poorly. Just populating a table variable from the function and using that table variable to join to the larger table was orders of magnitude faster than joining to the function itself. Just an FYI. |
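
A sketch of that materialize-first pattern (dbo.BigTable and its Id column are placeholder names; the splitter is the one posted earlier in the thread):

```sql
-- Populate a table variable from the split function first, then join
-- that to the large table, instead of joining to the TVF directly.
DECLARE @Ids TABLE (Id int PRIMARY KEY)

INSERT INTO @Ids (Id)
SELECT CAST(Value AS int)
FROM dbo.SplitDelimitedString(',', '1,2,3')

SELECT bt.*
FROM dbo.BigTable AS bt          -- placeholder large table
JOIN @Ids AS i ON i.Id = bt.Id   -- optimizer now has stats-friendly input
```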
 |
|
|
|