 All Forums
 SQL Server 2008 Forums
 Transact-SQL (2008)
 Transactions and tracking with in while loop

naveenjv
Starting Member

9 Posts

Posted - 2015-04-15 : 04:31:39
Hi Experts,

I am running the following query, which deletes around 800 million rows from the table.

I would like to introduce transactions within this code. On success, the entire deletion process should be committed, and the number of rows deleted, the table name, and a success status should be inserted into the log table.

If there is a failure, the transaction should be rolled back, and the table name and error message should be inserted into the same log table.

Select 1;
While @@ROWCOUNT > 0
Begin
    DELETE Top(100000) FROM [dbo].[Table1]
    FROM [dbo].[Table2]
    INNER JOIN [dbo].[Table3] ON [Table2].[PracticeID] = [Table3].[PracticeID]
    INNER JOIN [dbo].[Table1] ON [Table3].[InputDevicePracticeID] = [Table1].[InputDevicePracticeID]
    WHERE [Table2].[PracticeID] = 55;
End
Please let me know how to achieve this.

Thanks,

Naveen

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2015-04-15 : 11:52:10
If you put a transaction around the entire thing, you are likely going to have a transaction log problem as it'll be one large transaction that'll need to be stored while it's running or until the next log backup. I would not advise doing that. There's a reason why you are doing it in batches, so keep it as is.

Now having said that, you could use a TRY/CATCH.
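
A minimal sketch of that approach, keeping the batching as-is (so each batch commits on its own) and wrapping the loop in TRY/CATCH. The log table dbo.DeletionLog and its columns are assumed, since the schema wasn't posted:

```sql
-- Sketch only: batched delete with TRY/CATCH and logging.
-- dbo.DeletionLog (TableName, RowsDeleted, Status, ErrorMessage, LogDate) is a
-- hypothetical log table - adjust to your actual schema.
DECLARE @TotalRows BIGINT = 0, @BatchRows INT = 1;

BEGIN TRY
    WHILE @BatchRows > 0
    BEGIN
        DELETE TOP (100000) T1
        FROM [dbo].[Table1] AS T1
        INNER JOIN [dbo].[Table3] AS T3 ON T3.[InputDevicePracticeID] = T1.[InputDevicePracticeID]
        INNER JOIN [dbo].[Table2] AS T2 ON T2.[PracticeID] = T3.[PracticeID]
        WHERE T2.[PracticeID] = 55;

        SET @BatchRows = @@ROWCOUNT;            -- capture immediately, before anything else runs
        SET @TotalRows = @TotalRows + @BatchRows;
    END

    INSERT INTO dbo.DeletionLog (TableName, RowsDeleted, Status, LogDate)
    VALUES ('dbo.Table1', @TotalRows, 'Success', GETDATE());
END TRY
BEGIN CATCH
    -- Only the failing batch is lost; earlier batches have already committed.
    INSERT INTO dbo.DeletionLog (TableName, RowsDeleted, Status, ErrorMessage, LogDate)
    VALUES ('dbo.Table1', @TotalRows, 'Failed', ERROR_MESSAGE(), GETDATE());
END CATCH;
```

Note that without an outer transaction you cannot roll back the whole delete, which is exactly the trade-off described above.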

Tara Kizer
SQL Server MVP since 2007
http://weblogs.sqlteam.com/tarad/

Kristen
Test

22859 Posts

Posted - 2015-04-17 : 07:59:23
Would the "table" be more available to other users if deleted (with a transaction) in batches (perhaps with WAITFOR)?

or is the delete just going to escalate to a Table Lock?

Perhaps deleting in batches would be less strain on the server (again, with WAITFOR) such that, even as a single transaction, it would mean the server was more responsive (i.e. for anyone NOT blocked on this table).

Otherwise seems better to just delete in a single operation.

Either way, I would NOT code it as per the O/P's example, as I have found that to be very inefficient. We get the PKeys into a temporary table (with an IDENTITY) first and then JOIN to that, using a range of the IDENTITY to select the "next 100,000" rows.

We also use WAITFOR to provide a short gap between each loop batch

We would not test @@ROWCOUNT in this way; we would use a @Variable. There is far too great a risk that someone adds code which produces an intermediate @@ROWCOUNT value - such as a statement added to assist with debugging.

We would also use a parameter for the 100,000

(and in practice we would adjust the parameter dynamically if we found that a batch took more/less time than expected, so that performance adapts to whatever other work the server is doing)
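
A rough sketch of that pattern (key column and data types assumed from the original query; the WAITFOR delay and batch size are illustrative):

```sql
-- Sketch: capture PKeys first, then delete by IDENTITY range with a pause between batches.
DECLARE @BatchSize INT    = 100000,     -- parameterised batch size
        @From      BIGINT = 1,
        @To        BIGINT, @MaxID BIGINT, @Rows INT;

-- 1) Capture the PKeys to delete, numbered by an IDENTITY
CREATE TABLE #DeleteKeys (ID BIGINT IDENTITY(1,1) PRIMARY KEY, PKey BIGINT NOT NULL);

INSERT INTO #DeleteKeys (PKey)
SELECT T1.[InputDevicePracticeID]
FROM [dbo].[Table2] AS T2
INNER JOIN [dbo].[Table3] AS T3 ON T2.[PracticeID] = T3.[PracticeID]
INNER JOIN [dbo].[Table1] AS T1 ON T3.[InputDevicePracticeID] = T1.[InputDevicePracticeID]
WHERE T2.[PracticeID] = 55;

SELECT @MaxID = MAX(ID) FROM #DeleteKeys;

-- 2) Delete in ranges of the IDENTITY, pausing between batches
WHILE @From <= @MaxID
BEGIN
    SET @To = @From + @BatchSize - 1;

    DELETE T1
    FROM [dbo].[Table1] AS T1
    INNER JOIN #DeleteKeys AS K ON K.PKey = T1.[InputDevicePracticeID]
    WHERE K.ID BETWEEN @From AND @To;

    SET @Rows = @@ROWCOUNT;             -- into a @Variable straight away

    SET @From = @To + 1;
    WAITFOR DELAY '00:00:02';           -- short gap so other users get a look-in
END
```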

Kristen
Test

22859 Posts

Posted - 2015-04-17 : 08:02:09
It also raises the question:

Is this a one-off delete, or something that will be scheduled to run regularly (every day, once a month/year, etc.)?

For a one-off it may be better to copy the retained rows to a new table, and then drop/rename the tables (and deal with any foreign keys), rather than attempting to delete the rows.
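
A rough sketch of that copy-and-rename approach. All names are illustrative, and foreign keys, indexes, triggers, constraints and permissions would all need to be re-created on the new table:

```sql
-- Sketch: keep the survivors instead of deleting 800 million rows.
-- Assumes InputDevicePracticeID is NOT NULL (NOT IN misbehaves with NULLs).
SELECT *
INTO [dbo].[Table1_Keep]
FROM [dbo].[Table1]
WHERE [InputDevicePracticeID] NOT IN
      (SELECT T3.[InputDevicePracticeID]
       FROM [dbo].[Table2] AS T2
       INNER JOIN [dbo].[Table3] AS T3 ON T2.[PracticeID] = T3.[PracticeID]
       WHERE T2.[PracticeID] = 55);

BEGIN TRANSACTION;
    DROP TABLE [dbo].[Table1];
    EXEC sp_rename 'dbo.Table1_Keep', 'Table1';
COMMIT;
```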