Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Dinesh
Starting Member
10 Posts
Posted - 2004-08-02 : 10:26:59
I have a table with millions of rows. I'm using a simple T-SQL DELETE statement with some criteria to delete rows, but it takes too much time. Is there any better way to delete the rows within a minute, other than truncating the table and repopulating it with the records to keep?

Thanks in advance,
Dinesh
JasonGoff
Posting Yak Master
158 Posts
Posted - 2004-08-02 : 10:28:07
How many rows are you trying to delete? Are you using a WHERE clause on your DELETE statement?
JasonGoff
Posting Yak Master
158 Posts
Posted - 2004-08-02 : 11:23:20
Post your SQL statement here; it will give us some idea of what you are doing.
spirit1
Cybernetic Yak Master
11752 Posts
Posted - 2004-08-02 : 12:05:54
Maybe try disabling some indexes...

Go with the flow & have fun! Else fight the flow :)
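In practice, "disabling" an index on SQL Server of that era means dropping it before the mass delete and recreating it afterwards, so the engine does not have to maintain it for every deleted row. A rough sketch of the pattern (the table and index names here are invented for illustration):

    -- Hypothetical names; adjust to your own table and criteria.
    DROP INDEX BigTable.IX_BigTable_LogDate

    DELETE FROM BigTable
    WHERE LogDate < '20040101'

    -- Recreate the index once the delete is done.
    CREATE INDEX IX_BigTable_LogDate ON BigTable (LogDate)

Note that any index the DELETE's own WHERE clause relies on should be kept, or the delete will have to scan the whole table.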
kselvia
Aged Yak Warrior
526 Posts
Posted - 2004-08-02 : 17:00:42
You could also create a SQL Agent job called "Delete My Rows" that is enabled but not scheduled. In it, add:

    SET ROWCOUNT 1000
    WHILE 1 = 1
    BEGIN
        DELETE FROM MyTable WHERE whatever...
        IF @@ROWCOUNT = 0 BREAK
    END

That will create a job that deletes small chunks of data without locking large parts of the table. Then you can start that job from your code or procedure with

    exec msdb..sp_start_job @job_name = 'Delete My Rows'

and return to the front end immediately. The job will run on in the background.

(Code not tested for syntax, but it should be close.)

--Ken
"Knowledge is a process of piling up facts; wisdom lies in their simplification."
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2004-08-02 : 17:09:16
Deleting tons of data is going to take time no matter how efficient your query is. The alternative, though, is to delete the rows in a manner that doesn't affect other users. Here is what we do:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=37901

See my solution about 3/4 of the way down the thread. You don't have to do it over multiple days, though; you can do it all in one day by removing TOP 100000 from the temp table creation query. The point is that you should do it in small batches (my code does it in batches of 5000). Doing it this way allows other users to access the table and also puts less of a strain on the server.

Tara
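The batching approach Tara describes can be sketched roughly as follows. This is an illustration of the pattern, not the exact code from the linked thread; the table name, key column, and criteria are invented:

    -- Collect the keys of the rows to delete. The linked thread uses TOP 100000
    -- here to spread the work over several runs; leave it out to do it all at once.
    SELECT RowID
    INTO #RowsToDelete
    FROM BigTable
    WHERE LogDate < '20040101'

    -- Delete in batches of 5000 so locks stay short-lived and other users get through.
    SET ROWCOUNT 5000
    WHILE 1 = 1
    BEGIN
        DELETE b
        FROM BigTable b
        JOIN #RowsToDelete d ON b.RowID = d.RowID
        IF @@ROWCOUNT = 0 BREAK
    END
    SET ROWCOUNT 0

    DROP TABLE #RowsToDelete

Because each DELETE in the loop runs as its own transaction, the transaction log stays small and blocking is limited to one batch at a time.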
Dinesh
Starting Member
10 Posts
Posted - 2004-08-04 : 03:14:36
quote: Originally posted by tduggan

Deleting tons of data is going to take time no matter how efficient your query is. The alternative, though, is to delete the rows in a manner that doesn't affect other users. Here is what we do:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=37901

See my solution about 3/4 of the way down the thread. You don't have to do it over multiple days, though; you can do it all in one day by removing TOP 100000 from the temp table creation query. The point is that you should do it in small batches (my code does it in batches of 5000). Doing it this way allows other users to access the table and also puts less of a strain on the server.

Tara
The above suggestion is quite helpful; I've implemented a solution similar to Tara's.

Thanks a lot,
Dinesh