SQL Server Forums
 Article: Implementing a Dynamic WHERE Clause

Page47
Flowing Fount of Yak Knowledge

USA
2878 Posts

Posted - 10/04/2002 :  08:09:15
Sam,

Take a look at the execution plans of

declare @lname varchar(50)

select *
from dbo.authors
where au_lname = coalesce(@lname,au_lname)

select *
from dbo.authors
where @lname is null or
au_lname = @lname

 
You should see that they both use the same index scan and filter, and the COALESCE version shows only a 0.0001 improvement in subtree cost, a very minimal difference. I usually use the COALESCE because it is easier to type and easier to comment out for debugging. No other reason.

As to the question on dynamic SQL vs. the COALESCE...
The best way to know which is better is simply to write it both ways and give it a try. There are far too many factors involved to make the decision without trying (IMHO). My experience has been that a dynamic SQL solution only performs better when there are many conditions in the WHERE clause. If there are just one or two conditions in the WHERE clause I will likely use the COALESCE solution. If there are 4 or 10 I would likely try both. If there are 50, I would go with dynamic SQL.
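For reference, a minimal sketch of the dynamic SQL alternative I have in mind (building on the dbo.authors example above, and using sp_executesql so the statement stays parameterized; this is just an illustration):

declare @lname varchar(50)
declare @sql nvarchar(1000)

-- append a condition only for the parameters that were actually supplied
set @sql = N'select * from dbo.authors where 1 = 1'
if @lname is not null
    set @sql = @sql + N' and au_lname = @lname'

exec sp_executesql @sql, N'@lname varchar(50)', @lname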

Jay White
{0}

skjoldet
Starting Member

1 Posts

Posted - 12/24/2002 :  04:51:38
Great article. What if I want to control the expression operators, for example:

Cus_City = COALESCE(@Cus_City,Cus_City)
Cus_City <> COALESCE(@Cus_City,Cus_City)

How can I send either = or <> to the SP and implement it in my SP?

/Lars


Robwhittaker
Yak Posting Veteran

United Kingdom
85 Posts

Posted - 12/24/2002 :  05:12:33
I have a query for the article writer.

I believe, although correct me if I'm wrong, being newish to this SQL lark, that if you do something like

select blah
from table
where col1 = col1

you won't return the rows where col1 is NULL.

If you take this into your COALESCE WHERE clause and the user wants to search on col2, but col1 has some NULL values, won't you end up dropping those rows from your result set?

Does this make sense?

Rob



verronep
Starting Member

USA
15 Posts

Posted - 12/24/2002 :  09:15:35
Very interesting article, but the way I usually do these types of queries is like so:

SELECT Cus_Name,
Cus_City,
Cus_Country
FROM @Customers
WHERE (Cus_Name = @Cus_Name OR @Cus_Name IS NULL) AND
(Cus_City = @Cus_City OR @Cus_City IS NULL) AND
(Cus_Country = @Cus_Country OR @Cus_Country IS NULL)

The thing to be careful about when using the COALESCE function as in the article is when the table being searched contains NULL values. The COALESCE form will not return such rows.

For example, try allowing NULL values into the Customers table in the article, and insert the following, then rerun the query.
INSERT @Customers VALUES ('Acme', 'Paris', NULL)

You'll see that the new row is not returned, though it should be.
The problem lies in how SQL Server handles equality of NULLs.
IF NULL = NULL SELECT 1 ELSE SELECT 0 returns 0: SQL Server doesn't treat NULL as equal to anything, not even NULL. The method above avoids this problem.
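A quick repro sketch along those lines (column types are assumed here, since the article's table definition isn't repeated in this thread):

DECLARE @Customers TABLE (Cus_Name varchar(50), Cus_City varchar(50), Cus_Country varchar(50) NULL)
INSERT @Customers VALUES ('Acme', 'Paris', NULL)

DECLARE @Cus_Name varchar(50), @Cus_City varchar(50), @Cus_Country varchar(50)
SET @Cus_Name = 'Acme'   -- search on name only; the other parameters stay NULL

-- COALESCE form: the row is NOT returned, because Cus_Country = NULL is never true
SELECT Cus_Name, Cus_City, Cus_Country FROM @Customers
WHERE Cus_Name = COALESCE(@Cus_Name, Cus_Name)
AND Cus_City = COALESCE(@Cus_City, Cus_City)
AND Cus_Country = COALESCE(@Cus_Country, Cus_Country)

-- @Param IS NULL form: the row IS returned
SELECT Cus_Name, Cus_City, Cus_Country FROM @Customers
WHERE (Cus_Name = @Cus_Name OR @Cus_Name IS NULL)
AND (Cus_City = @Cus_City OR @Cus_City IS NULL)
AND (Cus_Country = @Cus_Country OR @Cus_Country IS NULL)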

I did some testing on a fairly large table to see if I could find any difference in speed or performance. (The table had 4.5 million rows; no NULL values are allowed in the table.)

Interestingly enough, the @Param IS NULL approach and the COALESCE approach used the same query plan shape, but different indexes. (Not sure why, to be honest.)

The @Param IS NULL approach took 38% of the time, while COALESCE took 62%, when I ran the selects as a batch.
Forcing both queries to use the @Param IS NULL approach's index only made the disparity greater: 12% for the @Param IS NULL approach and 88% for COALESCE.
Forcing both queries to use the COALESCE approach's index evened things up a bit, with the @Param IS NULL approach taking 49% to COALESCE's 51%. Not much difference there...

Interesting stuff.
Paul

"I have not failed. I have just found 10,000 ways that won't work."
- Thomas Alva Edison (1847-1931)

verronep
Starting Member

USA
15 Posts

Posted - 12/24/2002 :  09:49:26
SQL Server just loves to confuse me...

In my previous post, I tested the following query:
SELECT Cus_Name,
Cus_City,
Cus_Country
FROM @Customers
WHERE (Cus_Name = @Cus_Name OR @Cus_Name IS NULL) AND
(Cus_City = @Cus_City OR @Cus_City IS NULL) AND
(Cus_Country = @Cus_Country OR @Cus_Country IS NULL)

I then realized it seemed to make more sense for the @Param IS NULL check to come first (on the premise that if the first condition of an OR is true, the second half wouldn't need to be evaluated).
So I ran this query side by side to see, and was a bit surprised by the results.
SELECT Cus_Name,
Cus_City,
Cus_Country
FROM @Customers
WHERE (@Cus_Name IS NULL OR Cus_Name = @Cus_Name) AND
(@Cus_City IS NULL OR Cus_City = @Cus_City) AND
(@Cus_Country IS NULL OR Cus_Country = @Cus_Country)

The first query actually runs 1.6 times faster, and the execution plans are completely different, as are the indexes used.
For only the second time ever I saw a query with parallelism in the execution plan. (It's the first query that uses parallelism.)

I don't understand why there is such a vast difference in execution plans and indexes at this point. If anyone could enlighten me, it'd be much appreciated.

(Just a note: I used the same 4.5-million-row table as in the first test. It's not a customer table; I just used the customer table to illustrate the syntax of my queries.)

Paul

"I have not failed. I have just found 10,000 ways that won't work."
- Thomas Alva Edison (1847-1931)

Edited by - verronep on 12/24/2002 09:52:23

gully
Starting Member

France
2 Posts

Posted - 10/28/2003 :  10:54:50
I have tried COALESCE and the IS NULL approach on 200,000 records.
EXEC(@sql) is still faster...


Sir code a lot

jkincaid
Starting Member

1 Posts

Posted - 11/22/2003 :  23:49:46
Is it possible to pass the operator (+, -, >=, LIKE) to use in a COALESCE SP?


quote:
Originally posted by skjoldet

Great article. What if I want to control the expression operators, for example:

Cus_City = COALESCE(@Cus_City,Cus_City)
Cus_City <> COALESCE(@Cus_City,Cus_City)

How can I send either = or <> to the SP and implement it in my SP?

/Lars




zelk
Starting Member

Sweden
6 Posts

Posted - 02/04/2004 :  12:28:09
I check the parameter against NULL and get good performance and flexibility. No problem with LIKE values either:

WHERE
(@title IS NULL OR title LIKE @title)
AND
(@min_release_date IS NULL OR release_date >= @min_release_date)
AND
(@document_id IS NULL or document_id = @document_id)

Great performance and flexibility! All indexes are used as expected.

joseph_hurley
Starting Member

4 Posts

Posted - 06/04/2004 :  14:01:38
To allow NULL values when using COALESCE you could use:
WHERE ISNULL(Table.Column,0) = COALESCE (@Variable, ISNULL(Table.Column,0))

Much easier to read than breaking it out into a CASE statement. I haven't checked the performance difference, but it does work. This is neat too:

WHERE (Table.Column IN(SELECT Column FROM dbo.ParseFunction(@Variable,','))
OR (ISNULL(Table.Column,0) =
CASE
WHEN @Variable IS NULL THEN ISNULL(Table.Column,0)
END
))

dbo.ParseFunction is a function that splits a string into multiple rows using the delimiter passed in.

This allows you to check against multiple values input in CSV format.
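I haven't included my actual dbo.ParseFunction, but a minimal sketch of that kind of split function (a table-valued function that walks the string with CHARINDEX; names and sizes are just for illustration) might look like:

CREATE FUNCTION dbo.ParseFunction (@list varchar(8000), @delimiter char(1))
RETURNS @result TABLE ([Column] varchar(8000))
AS
BEGIN
    DECLARE @pos int
    WHILE LEN(@list) > 0
    BEGIN
        SET @pos = CHARINDEX(@delimiter, @list)
        IF @pos = 0
        BEGIN
            -- no delimiter left, so the remainder is the last value
            INSERT @result SELECT @list
            SET @list = ''
        END
        ELSE
        BEGIN
            INSERT @result SELECT LEFT(@list, @pos - 1)
            SET @list = SUBSTRING(@list, @pos + 1, LEN(@list))
        END
    END
    RETURN
END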

gnguyen
Starting Member

1 Posts

Posted - 08/15/2005 :  21:00:21
Thanks guys,

This article is excellent. It has really helped me solve a problem that has been "bugging" me for the last 3-4 days.

X amount of brains is by far better than 1/2 a brain

love
Starting Member

2 Posts

Posted - 11/03/2005 :  10:03:56
It's a good article and good info when there is only one table involved.

How about when there is a join present? How would you build the dynamic SQL?

For example:

Table1 = Customers
Table2 = Interests
Table3 = Customers_Interests

Table3 holds the customers' interests.

Let's say the Customers table has data: C1, C2, C3, C4
the Interests table has data: I1, I2, I3, I4
and Customers_Interests has data:
C1, I1
C1, I3
C1, I4
C2, I2
C2, I4
C3, I3

Now let's say on my web page I represent interests as checkboxes, and I select interests I1, I3, and I4.

So the query will be: give me the list of customers with interests I1, I3, and I4.

How would you run such a dynamic query?
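I imagine something along these lines might do it (using a CSV split function like the dbo.ParseFunction mentioned above, with column names just guessed), but I am not sure:

DECLARE @Interests varchar(1000)
SET @Interests = 'I1,I3,I4'   -- the interests checked on the page

SELECT ci.Cus_ID
FROM Customers_Interests ci
INNER JOIN dbo.ParseFunction(@Interests, ',') p ON p.[Column] = ci.Interest_ID
GROUP BY ci.Cus_ID
-- keep only customers who have every one of the selected interests
HAVING COUNT(DISTINCT ci.Interest_ID) = (SELECT COUNT(*) FROM dbo.ParseFunction(@Interests, ','))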

coolerbob
Aged Yak Warrior

United Kingdom
841 Posts

Posted - 11/28/2005 :  09:25:52
I agree with the comments here about using COALESCE for searches. It caused an index scan instead of a seek when passing in NULL for any of the params. So I've gone back to checking (inside the SP) which param values have been passed in and then running the relevant SELECT statement. That way I get my seeks back (and no dynamic SQL required) and I'm a happy man :-)
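A minimal sketch of that branching pattern (table and column names are made up for illustration):

CREATE PROCEDURE dbo.SearchCustomers
    @Cus_Name varchar(50) = NULL,
    @Cus_City varchar(50) = NULL
AS
-- each branch is a plain, sargable SELECT, so the optimizer can use an index seek for it
IF @Cus_Name IS NOT NULL AND @Cus_City IS NOT NULL
    SELECT Cus_Name, Cus_City, Cus_Country FROM Customers
    WHERE Cus_Name = @Cus_Name AND Cus_City = @Cus_City
ELSE IF @Cus_Name IS NOT NULL
    SELECT Cus_Name, Cus_City, Cus_Country FROM Customers WHERE Cus_Name = @Cus_Name
ELSE IF @Cus_City IS NOT NULL
    SELECT Cus_Name, Cus_City, Cus_Country FROM Customers WHERE Cus_City = @Cus_City
ELSE
    SELECT Cus_Name, Cus_City, Cus_Country FROM Customers

The trade-off is one SELECT (and one plan) per combination of parameters, so the number of branches grows quickly as optional parameters are added.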

caractacus
Starting Member

2 Posts

Posted - 05/24/2006 :  12:12:22
An alternative that solved my problem and produced good query plans is as follows:

CREATE PROCEDURE SampleProc
@a uniqueidentifier,
@b uniqueidentifier,
@c uniqueidentifier
AS

-- convert NULL parameters to 'empty' values
-- NULLs cannot be correctly compared using this pattern
SET @a = ISNULL(@a, 0x00)
SET @b = ISNULL(@b, 0x00)
SET @c = ISNULL(@c, 0x00)

SELECT a.col, b.col, c.col
FROM a INNER JOIN b INNER JOIN c   -- join conditions omitted for brevity
WHERE
CASE WHEN @a = 0x00
THEN @a
ELSE a
END = @a
AND
CASE WHEN @b = 0x00
THEN @b
ELSE b
END = @b
AND
CASE WHEN @c = 0x00
THEN @c
ELSE c
END = @c

What's the thinking? Why do this?

The idea is that the CASE statements are evaluated by the query optimizer. When a GUID is 0x00, the filtered column does not participate in the query _AT ALL_.

The problem with using COALESCE (and ISNULL) is that the filtered column always participates in the query, which has an impact on the query optimizer. This particular technique excludes the non-participating columns and allows an optimal query plan to be selected.

When all the GUIDs are zero, the WHERE clause evaluates to

WHERE 0x00 = 0x00 AND 0x00 = 0x00 AND 0x00 = 0x00

These terms are optimized out of the query.

James Caradoc-Davies
caractacus

Dhananjay3
Starting Member

1 Posts

Posted - 06/22/2006 :  07:13:50
That's a neat trick with COALESCE! And the comments were just as informative as the article. Thanks, guys!

net_prog
Starting Member

1 Posts

Posted - 12/26/2006 :  19:05:54
I agree that a static query is more efficient than a dynamic one, but I suppose it is better to use a dynamic one when each search parameter adds an INNER JOINed table to the query. If you have five parameters, a static query has to join all five tables regardless of the parameter values, and that can waste resources if those five tables have tons of records to be joined. With a dynamic query you can skip the joins entirely when there are no search values, and add the INNER JOINs one by one according to the values supplied.
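A rough sketch of that pattern (the table, column, and parameter names here are invented just for illustration):

DECLARE @OrderDate datetime, @Interest_ID varchar(10)
DECLARE @sql nvarchar(4000)
SET @Interest_ID = 'I3'   -- only this search value was supplied

SET @sql = N'SELECT DISTINCT c.Cus_Name FROM Customers c'

-- add each INNER JOIN only when its search value is present
IF @OrderDate IS NOT NULL
    SET @sql = @sql + N' INNER JOIN Orders o ON o.Cus_ID = c.Cus_ID AND o.OrderDate = @OrderDate'
IF @Interest_ID IS NOT NULL
    SET @sql = @sql + N' INNER JOIN Customers_Interests ci ON ci.Cus_ID = c.Cus_ID AND ci.Interest_ID = @Interest_ID'

EXEC sp_executesql @sql,
    N'@OrderDate datetime, @Interest_ID varchar(10)',
    @OrderDate, @Interest_ID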

sandesh
Starting Member

1 Posts

Posted - 02/25/2009 :  02:51:50
It is of great use

Thank you very much


Sandesh
http://sanshark.com