
 clustered index scan optimization


winman
Starting Member

26 Posts

Posted - 2013-04-24 : 09:17:31
The query below shows a clustered index scan accounting for 95% of the cost in the execution plan. The only index is on the primary key. Is there any way to reduce the cost?



SELECT [Reminder].* FROM [Reminder].dbo.[Reminder]
WHERE ( Charindex('nvt',CAST([Reminder].dbo.[Reminder].[AllotedTo] AS VARCHAR(MAX)))>0
AND CAST([Reminder].dbo.[Reminder].[Date] as smalldatetime) <= '23-Aug-2012'
AND (Charindex('~!',CAST([Reminder].dbo.[Reminder].[F6Row] AS VARCHAR(MAX)))=0 or [Reminder].dbo.[Reminder].[F6Row] Is NULL ) )
OR
( Charindex('nvt',CAST([Reminder].dbo.[Reminder].[DiscussWith] AS VARCHAR(MAX)))>0
AND (CAST([Reminder].dbo.[Reminder].[FollowUpDate] as smalldatetime) Is Null OR CAST([Reminder].dbo.[Reminder].[FollowUpDate] as smalldatetime) = '' )
AND CAST([Reminder].dbo.[Reminder].[Date] as smalldatetime) <= '23-Aug-2012'
AND (Charindex('~!',CAST([Reminder].dbo.[Reminder].[F6Row] AS VARCHAR(MAX)))=0 or [Reminder].dbo.[Reminder].[F6Row] Is NULL ) )
OR
( Charindex('nvt',CAST([Reminder].dbo.[Reminder].[DiscussWith] AS VARCHAR(MAX)))>0
AND CAST([Reminder].dbo.[Reminder].[FollowUpDate] as smalldatetime) <= '23-Aug-2012'
AND (Charindex('~!',CAST([Reminder].dbo.[Reminder].[F6Row] AS VARCHAR(MAX)))=0 or [Reminder].dbo.[Reminder].[F6Row] Is NULL ) )

ORDER BY cast([Reminder].[Date] as smalldatetime) Desc


bandi
Master Smack Fu Yak Hacker

2242 Posts

Posted - 2013-04-24 : 09:26:59
The factors you need to consider are:
1) What is the data type of your date columns (FollowUpDate, ...)? I think you are using VARCHAR types; why not use DATETIME/DATE instead?
2) If you use scalar functions on columns in the WHERE clause, the engine won't consider the indexes.

I'm not sure whether it applies here, but you could also try a COLUMNSTORE index and let us know the performance:
http://blog.sqlauthority.com/2011/10/29/sql-server-fundamentals-of-columnstore-index/
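To illustrate point 2, here is a minimal sketch using the table from the question (the second form assumes [Date] has been converted to a real date/time type, as discussed later in the thread):

```sql
-- Non-sargable: wrapping the column in CAST forces a full scan,
-- because the optimizer cannot seek an index on the converted value
SELECT * FROM dbo.Reminder
WHERE CAST([Date] AS smalldatetime) <= '2012-08-23';

-- Sargable: comparing the bare column lets the optimizer use an
-- index seek (assuming [Date] is an actual date/datetime column)
SELECT * FROM dbo.Reminder
WHERE [Date] <= '2012-08-23';
```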

--
Chandu

MIK_2008
Master Smack Fu Yak Hacker

1054 Posts

Posted - 2013-04-24 : 10:10:37
Additionally, I would like to know: which of the columns used in the WHERE conditions is the primary key?

Cheers
MIK

winman
Starting Member

26 Posts

Posted - 2013-04-25 : 00:54:11
The table is given below. Autonumber is the primary key, but it is not used in the query, and there are no other unique columns that could serve as a primary key.

CREATE TABLE [dbo].[Reminder](
[Autonumber] [int] NOT NULL,
[Date] [nvarchar](50) NULL,
[Works] [nvarchar](max) NULL,
[Designation] [nvarchar](225) NULL,
[AllotedTo] [nvarchar](225) NULL,
[By] [nvarchar](225) NULL,
[Remind Whom] [nvarchar](225) NULL,
[Done_By] [nvarchar](225) NULL,
[Last_date] [nvarchar](50) NULL,
[Dept] [nvarchar](225) NULL,
[Periodicity] [nvarchar](225) NULL,
[Renewal] [nvarchar](50) NULL,
[Completed_On] [nvarchar](50) NULL,
[Remarks] [nvarchar](max) NULL,
[Priority] [nvarchar](225) NULL,
[Instructor] [nvarchar](225) NULL,
[Time] [nvarchar](50) NULL,
[Original_Work_Date] [nvarchar](50) NULL,
[Original_Last_Date] [nvarchar](50) NULL,
[Oldautonumber] [int] NULL,
[FollowUpDate] [smalldatetime] NULL,
[Updated] [nvarchar](50) NULL,
[RelatedData] [nvarchar](max) NULL,
[pro_remarks] [nvarchar](max) NULL,
[F6Row] [nvarchar](max) NULL,
[F6GivenBy] [nvarchar](max) NULL,
[F8Row] [nvarchar](max) NULL,
[F8GivenBy] [nvarchar](max) NULL,
[ListCol] [nvarchar](max) NULL
) ON [PRIMARY]

GO

winman
Starting Member

26 Posts

Posted - 2013-04-25 : 06:16:50
I also wanted to know which properties in the execution plan I should look at if my goal is to reduce network load and server load.

ScottPletcher
Aged Yak Warrior

550 Posts

Posted - 2013-04-25 : 12:43:16
First, [Date] should be changed to an actual date/datetime type, not nvarchar.

My best guess based on the limited info available is that Date should also be the clustered index key, but that will only be useful after it's changed to an actual date/datetime.
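A sketch of that change (the index name is hypothetical; this assumes every existing [Date] value parses as a valid date, so validate first, and note that if a clustered primary key already exists on Autonumber it would have to be dropped or rebuilt as nonclustered before clustering on [Date]):

```sql
-- Convert the column to a real date/time type
-- (this fails if any stored value does not parse as smalldatetime)
ALTER TABLE dbo.Reminder
ALTER COLUMN [Date] smalldatetime NULL;

-- Cluster on [Date] so the range predicate Date <= '...' becomes a seek
CREATE CLUSTERED INDEX CIX_Reminder_Date
    ON dbo.Reminder ([Date]);
```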

winman
Starting Member

26 Posts

Posted - 2013-04-25 : 23:56:21
Thanks to all for your replies. Can anyone tell me which cost property in the execution plan I should concentrate on reducing if my aim is to reduce network load and server load?

bandi
Master Smack Fu Yak Hacker

2242 Posts

Posted - 2013-04-26 : 00:24:16
There are many conditions in the WHERE clause, so you have to concentrate on creating indexes on those columns.
1) Follow ScottPletcher's suggestion for the columns Date and Last_date.
2) Why are you casting FollowUpDate to smalldatetime? It is already smalldatetime.
3) Change the condition Charindex('~!',CAST([Reminder].dbo.[Reminder].[F6Row] AS VARCHAR(MAX)))=0 to
[Reminder].dbo.[Reminder].[F6Row] NOT LIKE '%~!%'
(note that CHARINDEX(...) = 0 tests for the absence of the substring, so the LIKE must be negated).
Do step 3 for each Charindex(...) condition.
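Applied to the first branch of the WHERE clause, the rewrite might look roughly like this (a sketch; it assumes [Date] has already been converted to a date/time type per the earlier suggestion, and uses NOT LIKE because CHARINDEX(...) = 0 tests for the absence of the marker):

```sql
SELECT r.*
FROM dbo.Reminder AS r
WHERE r.[AllotedTo] LIKE '%nvt%'       -- was CHARINDEX('nvt', ...) > 0
  AND r.[Date] <= '2012-08-23'         -- assumes [Date] is now a date/time type
  AND (r.[F6Row] NOT LIKE '%~!%'       -- was CHARINDEX('~!', ...) = 0
       OR r.[F6Row] IS NULL);
```

Note that a LIKE with a leading wildcard is still non-sargable; this removes the CAST overhead on every row but does not, by itself, eliminate the scan.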

--
Chandu

winman
Starting Member

26 Posts

Posted - 2013-04-26 : 01:20:51
quote:
Originally posted by bandi

There are many conditions in the WHERE clause, so you have to concentrate on creating indexes on those columns.
1) Follow ScottPletcher's suggestion for the columns Date and Last_date.
2) Why are you casting FollowUpDate to smalldatetime? It is already smalldatetime.
3) Change the condition Charindex('~!',CAST([Reminder].dbo.[Reminder].[F6Row] AS VARCHAR(MAX)))=0 to
[Reminder].dbo.[Reminder].[F6Row] NOT LIKE '%~!%'
Do step 3 for each Charindex(...) condition.


--
Chandu



Thanks.
I tried the above steps and saw some reduction in cost, such as the subtree cost, but I can't tell which of the cost properties in the execution plan (the ones listed below) I should focus on reducing to serve my purpose of reducing network load and server load:

•Estimated I/O Cost
•Estimated CPU Cost
•Estimated Operator Cost
•Estimated Subtree Cost
•Estimated Number of Rows
•Estimated Row Size

Before, I had: clustered index scan 95%, filter 4%, sort 1%, and query cost (relative to batch) 100%.
Now I have: clustered index scan 96%, filter 4%, and query cost (relative to batch) 100%.

bandi
Master Smack Fu Yak Hacker

2242 Posts

Posted - 2013-04-26 : 02:24:50
Have you created indexes on the specific columns, with the date columns, F6Row, and so on as included columns?

Read "Tuning Performance" in the following link
https://www.simple-talk.com/sql/performance/sql-server-performance-crib-sheet/
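As a sketch of that kind of index (hypothetical name; it assumes [Date] has been converted to a date/time type, since an nvarchar "date" sorts as text; nvarchar(max) columns such as F6Row cannot be index keys but can be included, and DiscussWith, which appears in the query but not in the posted DDL, would belong in the INCLUDE list as well):

```sql
-- Key on the range-filtered columns; INCLUDE the string columns so the
-- residual predicates can be checked on the index before any row lookup
CREATE NONCLUSTERED INDEX IX_Reminder_Date_FollowUp
    ON dbo.Reminder ([Date], [FollowUpDate])
    INCLUDE ([AllotedTo], [F6Row]);
```

Because the query does SELECT *, this index cannot fully cover it, but it can narrow the rows that need a lookup.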

--
Chandu

winman
Starting Member

26 Posts

Posted - 2013-04-26 : 05:55:16
quote:
Originally posted by bandi

Have you created indexes on the specific columns, with the date columns, F6Row, and so on as included columns?

Read "Tuning Performance" in the following link
https://www.simple-talk.com/sql/performance/sql-server-performance-crib-sheet/

--
Chandu



I have created a clustered index on Date and FollowUpDate.
The link you suggested covers tuning performance by configuring the server, but I need suggestions for reducing load by optimizing the cost of the query. So which property's cost reduction should I concentrate on?

Lamprey
Master Smack Fu Yak Hacker

4614 Posts

Posted - 2013-04-26 : 12:29:22
I have no issue with trying to eke out some performance to help with the factors you mention (network load, etc.). But the basic problem is that your predicates are not sargable. My guess is that indexing isn't going to help (much), but it's hard to say without more information. How large is the table? How much of the data falls within that date predicate ('23-Aug-2012')?

If that date predicate filters out the vast majority of the data then, as Scott mentioned, you might be able to do some smarter indexing (a clustered index or a filtered index). But that is only part of the equation. If this is a common query, you may want to create a derived column (or a different table or view to support this query) where you can persist a check for the other non-sargable predicates (e.g., [F6Row] LIKE '%~!%').

Again, without knowing anything about your system, other options might be available: table structure changes, better data validation, and so on. Additionally, I don't know anything about your query load, so does it make sense to tune for just this query, or is this just a small example?

Given the sargability of your predicates, you are probably stuck with a poorly performing query unless you do something "drastic."
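One way to persist that check, as a sketch (hypothetical column and index names; a CASE over LIKE with a constant pattern is deterministic, so the computed column can be PERSISTED and indexed; a NULL F6Row falls through to 0, which matches the query's "no marker OR NULL" test; keying on [Date] additionally assumes it has been converted to a date/time type):

```sql
-- Persist the expensive string check once per row
ALTER TABLE dbo.Reminder
ADD HasF6Marker AS CAST(CASE WHEN [F6Row] LIKE '%~!%'
                             THEN 1 ELSE 0 END AS bit) PERSISTED;

-- Index it alongside the date so the query can seek to
-- HasF6Marker = 0 rows in date order
CREATE NONCLUSTERED INDEX IX_Reminder_F6Marker_Date
    ON dbo.Reminder (HasF6Marker, [Date]);
```

The query's F6Row predicate would then reduce to `HasF6Marker = 0`, which is sargable.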
   
