| Author | Topic |
Skydolphin
Starting Member
43 Posts |
Posted - 2009-10-19 : 17:49:03
|
| Hi all,

I am trying to write a stored procedure that can be run every 6 months to clean up a table in the database. Basically I want it to delete records that are old or have a certain status. Something like this:

Delete from mytable where myDate between @beginDateTime and @endDateTime Or myStatus = 7

That's the easy part. The problem is that thousands of records are created in the table every day, and there could be close to several hundred thousand records to delete. This causes the transaction log to fill up and the query fails. What I would like to do is add a loop with a counter to the stored procedure: after, say, 1000 records it does a commit, then keeps going, so it isn't trying to process all these deletes as one transaction. However, I'm not sure this will work, and I'm also not sure exactly how to write such a thing. I am envisioning something like this, but I could be way off base:

set @counter = 0
while @counter < 1000
begin transaction
    set @counter = @counter + 1
    Delete from mytable where myDate between @beginDateTime and @endDateTime Or myStatus = 7
    if @counter = 1000
    begin
        commit
        set @counter = 0
    end
end

Any ideas or suggestions on how to solve this problem would be greatly appreciated.

(BTW, there are several of these databases located in different regions, and at some point every day these records get written into a central database. They are kept around for 6 months and then deleted. They also get backed up every night.)

Thanks,
Rhonda |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
|
|
jimf
Master Smack Fu Yak Hacker
2875 Posts |
Posted - 2009-10-19 : 18:12:15
|
| You may want to break it up into sections between @beginDateTime and @endDateTime. Since you want to do it every 6 months, maybe something like:

SET @endDate = DATEADD(month, 1, @beginDateTime)
WHILE @endDate < @endDateTime
BEGIN
    BEGIN TRANSACTION
    DELETE FROM mytable WHERE myDate BETWEEN @beginDateTime AND @endDate
    COMMIT TRANSACTION
    SET @beginDateTime = @endDate
    SET @endDate = DATEADD(month, 1, @endDate)
END

DELETE FROM myTable WHERE myStatus = 7

JimF

Everyday I learn something that somebody else already knew |
 |
|
|
Skydolphin
Starting Member
43 Posts |
Posted - 2009-10-19 : 18:32:36
|
| Tara, I'm not 100% sure what you mean. Do you mean like this?

set @counter = 0
while @counter < 1000
begin transaction
    set @counter = @counter + 1
    Delete top @counter from mytable where myDate between @beginDateTime and @endDateTime Or myStatus = 7
    if @counter = 1000
    begin
        commit
        set @counter = 0
    end
end

Tx,
Rhonda |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2009-10-19 : 18:37:20
|
Here's a purge template I created a while back to pass out to developers:

CREATE PROC dbo.isp_Table1_Purge
(@purgeDate datetime, @batch int, @purgeSuccess int OUTPUT, @totalRowsPurged int OUTPUT)
AS

SET NOCOUNT ON

DECLARE @error int, @rc int

SELECT @purgeSuccess = 1, @totalRowsPurged = 0, @rc = 1

WHILE @rc <> 0
BEGIN
    BEGIN TRAN

    DELETE TOP (@batch)
    FROM Table1
    WHERE CreateDate < @purgeDate

    SELECT @rc = @@ROWCOUNT, @totalRowsPurged = @totalRowsPurged + @rc, @error = @@ERROR

    IF @error <> 0
        GOTO EXIT_PROC

    COMMIT TRAN
END

RETURN

EXIT_PROC:
ROLLBACK TRAN
SET @purgeSuccess = 0
RETURN

I now instead use TRY/CATCH to handle deadlocks while purging, but I don't have a template ready to post here.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog
"Let's begin with the premise that everything you've done up until this point is wrong." |
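[Editor's note: Tara mentions a TRY/CATCH variant without posting it. A minimal sketch of what such a loop might look like follows; this is not her actual template, and it assumes the same @batch/@purgeDate parameters and Table1 schema as the procedure above. Error 1205 (deadlock victim) is treated as retryable, so the loop simply re-runs the batch.

```sql
-- Hypothetical TRY/CATCH version of the batched purge (assumed names).
DECLARE @rc int
SET @rc = 1

WHILE @rc <> 0
BEGIN
    BEGIN TRY
        BEGIN TRAN

        DELETE TOP (@batch)
        FROM Table1
        WHERE CreateDate < @purgeDate

        SET @rc = @@ROWCOUNT
        COMMIT TRAN
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRAN
        IF ERROR_NUMBER() <> 1205  -- 1205 = chosen as deadlock victim; retry those
        BEGIN
            SET @rc = 0            -- any other error: stop looping and report
            RAISERROR('Purge failed', 16, 1)
        END
    END CATCH
END
```

Keeping each batch in its own short transaction is what bounds the log growth; the CATCH block only exists so a deadlock kills one batch rather than the whole purge.]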
 |
|
|
Skydolphin
Starting Member
43 Posts |
Posted - 2009-10-19 : 19:16:11
|
| Wow. This database is definitely a mess. It just took 27 minutes to delete 100 records. I think it needs to be indexed on the date. How efficient is it to try to index a table that already has a bazillion records in it?

Rhonda |
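[Editor's note: for Rhonda's question about indexing an already-huge table, a sketch of the kind of index that would support the date-range deletes. Table and column names are assumed from her earlier query; the ONLINE option is an Enterprise Edition feature, which is presumably why the edition was asked about.

```sql
-- One-time cost: on a ~150M row table this build will take a while and
-- needs sort space, but without it every batched DELETE scans the table.
CREATE NONCLUSTERED INDEX IX_mytable_myDate
    ON dbo.mytable (myDate);

-- Enterprise Edition only: build without taking long blocking locks.
-- CREATE NONCLUSTERED INDEX IX_mytable_myDate
--     ON dbo.mytable (myDate) WITH (ONLINE = ON);
```
]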
 |
|
|
Skydolphin
Starting Member
43 Posts |
Posted - 2009-10-19 : 19:40:23
|
| It has roughly 150,337,364 records. New records get created every minute. It is currently SQL 2005.

Tx,
Rhonda |
 |
|
|
LoztInSpace
Aged Yak Warrior
940 Posts |
Posted - 2009-10-20 : 02:05:31
|
| If you want "old" records, why don't you run the query every day with a smaller date range, i.e. delete the ones that expired yesterday?

The other option is just to extend the transaction log so it's big enough for the jobs you need it to handle. What are the implications of not having a transactional delete? Will it matter if one query returns 1000 fewer rows than the same query run before? If it does, then you need the delete in a transaction, and looked at that way it's really just a configuration thing. If it doesn't matter, then just do it more frequently as I suggest and save yourself some headaches. |
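[Editor's note: a sketch of the daily job step LoztInSpace is describing, with table/column names assumed from the original post. Run daily, each execution only has one day's worth of newly expired rows to remove, so no batching or log tuning is needed.

```sql
-- Hypothetical daily retention job: delete only rows that have crossed
-- the 6-month boundary, keeping each run small.
DECLARE @cutoff datetime
SET @cutoff = DATEADD(month, -6, GETDATE())

DELETE FROM dbo.mytable
WHERE myDate < @cutoff
   OR myStatus = 7
```
]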
 |
|
|
Skydolphin
Starting Member
43 Posts |
Posted - 2009-10-20 : 13:04:14
|
| This is an R&D project that I inherited, and it was not very well architected from the beginning. I am not a DBA, so I am just learning all this stuff. As for deleting smaller sets, that is my plan. In fact, there is already a SQL Agent job in place to do just that, but it had been failing for over 3 months before anyone even noticed. Before I can fix any of that, though, I have to clean up the existing mess first. Unfortunately, extending the size of the transaction log is not an option; I have already extended it as much as I can based on the disk space available. I realize this is not an optimal situation, but it is what I have to work with.

Rhonda |
 |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2009-10-20 : 13:29:10
|
| Still waiting on your edition level.

I wouldn't recommend extending your transaction log to accommodate the larger transaction. The longer/bigger your transaction is, the longer it is holding locks and blocking users. Speaking of locking/blocking, you should look into the READ_COMMITTED_SNAPSHOT isolation level, as that is now the recommended level for OLTP systems even though the default is READ_COMMITTED.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog
"Let's begin with the premise that everything you've done up until this point is wrong." |
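[Editor's note: enabling the READ_COMMITTED_SNAPSHOT option Tara mentions is a one-line database setting; the database name below is a placeholder. Note that the ALTER only completes when no other connections are active in the database, and row versions are kept in tempdb, which adds some tempdb load.

```sql
-- Readers then see the last committed version of a row instead of
-- blocking behind writers, which helps during large purges.
ALTER DATABASE MyDatabase
SET READ_COMMITTED_SNAPSHOT ON;
```
]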
 |
|
|
|