
 Backup and delete data overhead


alejo46
Posting Yak Master

157 Posts

Posted - 2012-08-08 : 17:42:46

Good evening, I need your advice.
Before deleting some data from a huge table (not partitioned by date) I run a script that first creates a staging table. For instance, I've got a table called tablexx with a column named date_start that holds data from 1998 to 2012.
If I want to delete the data from 1998 to 2001, I run this script. The first thing it always does is create a staging table (with SELECT INTO) called tablexx_1998_2000, hosted in the default filegroup PRIMARY. After creating that table, it exports the data from it to a flat file as a backup. Third and last, the script deletes the data from the original table (tablexx).

I don't have the code for this script right now, but in my humble opinion this script is likely inefficient, isn't it?
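From memory it does roughly the following (table and column names as above; the exact date range, file path and server name are only placeholders):

-- 1) copy the rows to be removed into a staging table on the PRIMARY filegroup
SELECT *
INTO   dbo.tablexx_1998_2000
FROM   dbo.tablexx
WHERE  date_start >= '19980101'
  AND  date_start <  '20020101';

-- 2) export the staging table to a flat file (run from the command line)
-- bcp MyDb.dbo.tablexx_1998_2000 out D:\backup\tablexx_1998_2000.dat -n -T -S MyServer

-- 3) delete the same rows from the original table
DELETE FROM dbo.tablexx
WHERE  date_start >= '19980101'
  AND  date_start <  '20020101';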

(Wouldn't it be better to extract the data directly from tablexx to a flat file for the backup, instead of creating a staging table that could take so much time? Besides, the extra copy could cause the transaction log to fill up and the PRIMARY filegroup to run out of space.)
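I mean something like this, exporting straight from the source table and skipping the staging copy entirely (the path and server name are again placeholders); the DELETE step would stay the same:

REM export the rows directly from tablexx (run from the command line)
bcp "SELECT * FROM MyDb.dbo.tablexx WHERE date_start >= '19980101' AND date_start < '20020101'" queryout D:\backup\tablexx_1998_2001.dat -n -T -S MyServer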

I'd appreciate your help, thanks in advance.

jackv
Master Smack Fu Yak Hacker

2179 Posts

Posted - 2012-08-09 : 01:52:39
It depends on the situation. In a data warehouse archiving situation you might create a permanent table into which you extract the data, in effect a local backup, and keep that table available for a few days in case of troubleshooting etc.
The scenario you've described is slightly inefficient. What about just doing a BULK export? There are some other performance considerations such as table locking, triggers and, as you've mentioned, the transaction log if the database is in FULL recovery.
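On the transaction log point: a single DELETE of four years of data is one big transaction, so the log has to hold all of it. A common way around that is to delete in batches so the log space can be reused between log backups, along these lines (the batch size is just an example):

DECLARE @rows INT;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    -- remove the old rows a batch at a time to keep each transaction small
    DELETE TOP (10000)
    FROM dbo.tablexx
    WHERE date_start >= '19980101'
      AND date_start <  '20020101';

    SET @rows = @@ROWCOUNT;
END

-- in FULL recovery, keep taking log backups while this runs so the log can truncate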

Jack Vamvas
--------------------
http://www.sqlserver-dba.com

alejo46
Posting Yak Master

157 Posts

Posted - 2012-08-09 : 18:55:15
Thanks a lot. What did you mean when you said "What about just doing a BULK export"?

Did you mean using bcp or BULK INSERT? If so, which is better for performance?

Thanks in advance

jackv
Master Smack Fu Yak Hacker

2179 Posts

Posted - 2012-08-10 : 11:50:08
The performance of these will depend on how you use them.
Consider factors such as:
a) recovery model
b) batch sizes
c) I/O performance on the target
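To be clear about the two tools (the database name, file path, server name and batch size below are just examples): bcp is a command-line utility that can both export to and import from a flat file, while BULK INSERT is a T-SQL statement that only loads a file into a table, so the export step needs bcp (or something like SSIS). The batch-related options look like this:

REM export in native format, then reload (if ever needed) committing every 50000 rows
bcp MyDb.dbo.tablexx_1998_2000 out D:\backup\tablexx_1998_2000.dat -n -T -S MyServer
bcp MyDb.dbo.tablexx_1998_2000 in D:\backup\tablexx_1998_2000.dat -n -T -S MyServer -b 50000

-- the same load done with BULK INSERT (runs inside SQL Server; the file must be reachable from the server)
BULK INSERT dbo.tablexx_1998_2000
FROM 'D:\backup\tablexx_1998_2000.dat'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 50000, TABLOCK);
-- with SIMPLE or BULK_LOGGED recovery and TABLOCK the load can be minimally logged (factor a)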

Jack Vamvas
--------------------
http://www.sqlserver-dba.com