 All Forums
 SQL Server 2005 Forums
 Transact-SQL (2005)
 Order by is taking more time to get the result

vaibhavktiwari83
Aged Yak Warrior

843 Posts

Posted - 2010-08-13 : 10:58:39
Hello everyone,

I have a table with more than 4 crore rows.
When I execute the query

SELECT TOP 10 * FROM Pth_PatchDownload WITH (NOLOCK)

the result comes back in less than a second.
But when I execute the query

SELECT TOP 10 * FROM Pth_PatchDownload WITH (NOLOCK)
ORDER BY DC_Received_Date

it takes more than 15 minutes.

I can't change database objects (such as indexes)
as I don't have permission on the live server.
Is there anything I can do to get the result faster?
Please suggest.

Vaibhav T

To walk FAST walk ALONE
To walk FAR walk TOGETHER

webfred
Master Smack Fu Yak Hacker

8781 Posts

Posted - 2010-08-13 : 11:41:19
Try using a WHERE clause if you have an idea which DC_Received_Date values can be excluded.


No, you're never too old to Yak'n'Roll if you're too young to die.

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2010-08-14 : 02:17:55
Can I ask the reason for using NOLOCK? Do you know the effects of using it?

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


vaibhavktiwari83
Aged Yak Warrior

843 Posts

Posted - 2010-08-14 : 03:13:53
Thanks webfred for your reply.
You mean something like:

SELECT TOP 10 * FROM Pth_PatchDownload WITH (NOLOCK)
WHERE DC_Received_Date < '01-Jan-2009'
ORDER BY DC_Received_Date

I tried that, but it still takes just as long.
I suspect SQL Server may be processing the ORDER BY DC_Received_Date
before the WHERE clause, but I am not sure.

quote:
Originally posted by webfred

Try using a WHERE clause if you have an idea which DC_Received_Date values can be excluded.



Vaibhav T

To walk FAST walk ALONE
To walk FAR walk TOGETHER

vaibhavktiwari83
Aged Yak Warrior

843 Posts

Posted - 2010-08-14 : 03:17:46
Hi Visakh,

As far as I know, WITH (NOLOCK) is used
to include uncommitted rows in the result set as well.

That means if a transaction is in progress on the table,
the query will still return output rather than block.

Moreover, I am using WITH (NOLOCK) here
because our web-based query analyzer requires it
to execute a query; otherwise the query will not be parsed.

quote:
Originally posted by visakh16

Can I ask the reason for using NOLOCK? Do you know the effects of using it?





Vaibhav T

To walk FAST walk ALONE
To walk FAR walk TOGETHER

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2010-08-14 : 04:29:59
OK. But do you know that it can also include rows that may later be rolled back?

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


vaibhavktiwari83
Aged Yak Warrior

843 Posts

Posted - 2010-08-14 : 04:35:20
quote:
Originally posted by visakh16

OK. But do you know that it can also include rows that may later be rolled back?

Yes, I do.
Actually, we are concerned with monitoring the process, not the data,
so that is not a problem.
Thanks

Vaibhav T

To walk FAST walk ALONE
To walk FAR walk TOGETHER

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2010-08-14 : 04:36:51
OK, then I'm fine with it.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/


namman
Constraint Violating Yak Guru

285 Posts

Posted - 2010-08-15 : 18:50:49


SELECT TOP 10 * FROM Pth_PatchDownload WITH(NOLOCK)
ORDER BY DC_Received_Date


1 - So, I assume you want to select the top 10 rows (with ALL columns) from ALL rows in the table, ordered by DC_Received_Date.


I can't change database objects (such as indexes)
as I don't have permission on the live server.


2 - So, I assume there is no index on DC_Received_Date, and you cannot create one.

Given 1 and 2, I think you have no way to improve performance for the query:

SELECT TOP 10 * FROM Pth_PatchDownload WITH (NOLOCK)
ORDER BY DC_Received_Date

The optimizer has no choice but to scan the table, sort it by DC_Received_Date, and then return the top 10 rows. That is a lot of work for the engine.

Your first query is fast because, even though it also scans the table, the engine can stop as soon as it has found 10 rows. However, those top 10 rows are meaningless: they are "top 10" based on nothing. More importantly, the same query may return different rows on different executions.

You could ask the DBA to redesign the database.
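One way to confirm this scan-and-sort behaviour yourself is to compare the I/O statistics of the two queries (table and column names taken from the thread; the exact read counts will depend on your data):

```sql
-- SET STATISTICS IO reports logical reads per query; the ordered query
-- should show a scan of the whole table feeding a Sort operator.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT TOP 10 * FROM Pth_PatchDownload WITH (NOLOCK);   -- stops after 10 rows

SELECT TOP 10 * FROM Pth_PatchDownload WITH (NOLOCK)
ORDER BY DC_Received_Date;                              -- full scan + sort

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

The Messages tab should show a handful of logical reads for the first query and reads in the order of the whole table for the second.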

vaibhavktiwari83
Aged Yak Warrior

843 Posts

Posted - 2010-08-16 : 03:07:35
quote:
Originally posted by namman

[...]

You could ask the DBA to redesign the database.


Thanks for the reply.

Vaibhav T

To walk FAST walk ALONE
To walk FAR walk TOGETHER

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2010-08-16 : 03:19:17
I don't know how many rows "4 crore rows" is, but I once needed to create some reports based on date ranges on a table with more than 500 million rows. The date column was not indexed and creating the index was considered too expensive, so I ended up creating a metadata table that held the first ID (which was a clustered primary key) and the date column for every day. Then I could use the metadata table to select the date range I needed, and join the metadata to the actual data table on the primary key column. It actually worked quite well!
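A rough sketch of this metadata-table idea, adapted to the thread's table; the table name PatchDownload_Days, the key column Id, and the population query are all assumptions, not Lumbago's actual code:

```sql
-- One row per calendar day, holding the lowest clustered key for that day.
CREATE TABLE PatchDownload_Days
(
    DayDate DATETIME NOT NULL PRIMARY KEY,
    FirstId INT      NOT NULL
);

-- Populate once (and maintain daily). The DATEADD/DATEDIFF pair strips the
-- time portion -- the usual SQL Server 2005 idiom, since there is no DATE type.
INSERT INTO PatchDownload_Days (DayDate, FirstId)
SELECT DATEADD(DAY, DATEDIFF(DAY, 0, DC_Received_Date), 0),
       MIN(Id)
FROM   Pth_PatchDownload
GROUP BY DATEADD(DAY, DATEDIFF(DAY, 0, DC_Received_Date), 0);

-- A date-range query can now seek on the clustered key instead of
-- scanning and sorting the whole table:
SELECT p.*
FROM   Pth_PatchDownload AS p
WHERE  p.Id >= (SELECT FirstId FROM PatchDownload_Days WHERE DayDate = '20100101')
  AND  p.Id <  (SELECT FirstId FROM PatchDownload_Days WHERE DayDate = '20100201');
```

This only gives correct results if rows arrive in roughly date order, which is why the approach is keyed to the clustered primary key.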

- Lumbago

My blog (yes, I have a blog now! just not that much content yet)
-> www.thefirstsql.com

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2010-08-16 : 03:26:40
...but you will of course need access to the live server to do something like this. Just thought I'd share an idea that worked well in a somewhat similar scenario.

- Lumbago

My blog (yes, I have a blog now! just not that much content yet)
-> www.thefirstsql.com

SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2010-08-16 : 03:43:25
"lakh" is 100,000
"crore" is 10,000,000

See http://en.wikipedia.org/wiki/South_Asian_numbering_system



N 56°04'39.26"
E 12°55'05.63"

vaibhavktiwari83
Aged Yak Warrior

843 Posts

Posted - 2010-08-16 : 04:30:34
quote:
Originally posted by Lumbago

...but you will of course need access to the live server to do something like this. Just thought I'd share an idea that worked well in a somewhat similar scenario.



Yeah, thank you very much.
I will try to apply this if my database is ever modified,
or if any new development comes up for the same scenario.


Vaibhav T

To walk FAST walk ALONE
To walk FAR walk TOGETHER

SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2010-08-16 : 04:30:51
quote:
Originally posted by vaibhavktiwari83

I have a table with more than 4 crore rows.


It's because the first query takes the first 10 records it finds, in no particular order.
The second query needs to sort all 40 million records to find the 10 records you want. An index would help tremendously. Ask your DBA to add an index to support your query.
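If the DBA agrees, the supporting index could look something like this sketch (the index name is an assumption; only the key column comes from the thread):

```sql
-- A nonclustered index on the sort column lets the engine read the first
-- 10 keys in order and do only 10 row lookups, instead of sorting 40M rows.
CREATE NONCLUSTERED INDEX IX_Pth_PatchDownload_DC_Received_Date
    ON Pth_PatchDownload (DC_Received_Date);
```

Because the query is SELECT TOP 10 *, the engine still has to look up the full rows, but with TOP 10 that is only ten lookups.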




N 56°04'39.26"
E 12°55'05.63"

vaibhavktiwari83
Aged Yak Warrior

843 Posts

Posted - 2010-08-16 : 04:47:55
quote:
Originally posted by Peso

It's because the first query takes the first 10 records it finds, in no particular order.
The second query needs to sort all 40 million records to find the 10 records you want. An index would help tremendously. Ask your DBA to add an index to support your query.



Yes, Peso, you are correct, but this query is not part of the business logic; it is for troubleshooting issues,
so we can't ask for an index to be added.
Anyway, thanks for the suggestion.

Vaibhav T

To walk FAST walk ALONE
To walk FAR walk TOGETHER

SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2010-08-16 : 05:02:19
quote:
Originally posted by vaibhavktiwari83

Yes, Peso, you are correct, but this query is not part of the business logic; it is for troubleshooting issues.


That's the way ANY relational database works. There is no automatic sorting in tables, not even with a clustered index.

To get the data you want, according to your business rules, you have to work within the database's rules.



N 56°04'39.26"
E 12°55'05.63"
   
