kowani1
Starting Member
42 Posts
Posted - 2006-11-27 : 00:04:37
Hi All,

I have a Staging database, which is a temporary area where data is stored while porting it over from an Oracle 9i environment. Since my local drive is getting flogged around the midnight mark when the overnight job runs, I want to know if I can turn off the transaction log and only have the data available to be copied over to another SQL Server 2000 database called Repository.

As it is, this job takes over 8 hours to run, and I hope to bring it down to 5 hours or less.

Thanks All.

Jungle DBA
harsh_athalye
Master Smack Fu Yak Hacker
5581 Posts
Posted - 2006-11-27 : 01:03:43
Which method are you using to copy the data?

For BCP, you can set your database recovery model to Bulk-Logged or Simple and specify the TABLOCK hint on the target table. You can also set the SELECT INTO/BULKCOPY option for the database.

Harsh Athalye
India.
"Nothing is Impossible"
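For reference, the settings above look roughly like this in T-SQL. This is only a sketch: the database name Staging, the table SourceData, and the file path are placeholders, so adjust them to your environment.

```sql
-- Switch the staging database to minimally-logged bulk operations
ALTER DATABASE Staging SET RECOVERY BULK_LOGGED;

-- SQL Server 2000 legacy flag for the same behaviour
EXEC sp_dboption 'Staging', 'select into/bulkcopy', 'true';

-- Bulk load with a table lock so the insert qualifies for minimal logging
-- (table and file names are hypothetical)
BULK INSERT Staging.dbo.SourceData
FROM 'D:\exports\source_data.bcp'
WITH (TABLOCK);
```

Note that under Bulk-Logged or Simple recovery you give up point-in-time restore for that database, which is usually acceptable for a disposable staging area.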
Kristen
Test
22859 Posts
Posted - 2006-11-27 : 04:25:50
I expect the database can be set to Recovery model = SIMPLE too, if not already. (You won't need to recover to a point in time, right?)

If you import millions of rows in one go they will all get logged. Doing them in smaller batches will mean that as each batch completes the TLog can be reused - provided you don't have an explicit transaction covering the whole job!

Kristen
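A batched copy along these lines could look like the sketch below, using `SET ROWCOUNT` (the usual SQL Server 2000 idiom for limiting DML). The `Source`/`Target` tables and the `id`/`payload` columns are invented for illustration; note there is deliberately no transaction wrapping the loop, so the log can truncate between batches under Simple recovery.

```sql
-- Copy in batches of 100,000 rows; each INSERT commits on its own,
-- so the transaction log can be reused between batches
SET ROWCOUNT 100000;

WHILE 1 = 1
BEGIN
    INSERT INTO Repository.dbo.Target (id, payload)
    SELECT s.id, s.payload
    FROM Staging.dbo.Source AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM Repository.dbo.Target AS t
                      WHERE t.id = s.id);

    IF @@ROWCOUNT = 0 BREAK;   -- nothing left to copy
END

SET ROWCOUNT 0;  -- restore the default (no row limit)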
kowani1
Starting Member
42 Posts
Posted - 2006-11-27 : 19:58:50
Thanks guys.

Yeah, seeing as millions of records are being copied over by DTS packages overnight, I was hoping there was a way to momentarily disable the logging - hence reducing the overall time it takes to copy each night, because I suspect my server is getting hammered.

Jungle DBA
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2006-11-27 : 23:15:09
quote: Originally posted by Kristen

I expect the database can be set to Recovery model = SIMPLE too, if not already. (You won't need to recover to a point in time, right?) If you import millions of rows in one go they will all get logged. Doing them in smaller batches will mean that as each batch completes the TLog can be reused - provided you don't have an explicit transaction covering the whole job!

Kristen
Exactly. You can set the batch size with the -b flag in bcp, or with BATCHSIZE if using BULK INSERT - I usually set it to something like 100k rows. I find it's necessary to do this when dealing with very large datasets, say over 100m rows, otherwise the tlog really bloats. Not sure how to set the batch size in DTS, but there's likely a way.

SqlSpec: a fast and comprehensive data dictionary generator for SQL Server 2000/2005, Analysis Server 2005, Access 97/2000/XP/2003 http://www.elsasoft.org
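The two batch-size settings mentioned above look like this. Server name, file path, and table name are placeholders, not values from this thread:

```sql
-- BULK INSERT: commit every 100,000 rows so the log can be reused
BULK INSERT Staging.dbo.SourceData
FROM 'D:\exports\source_data.bcp'
WITH (BATCHSIZE = 100000, TABLOCK);

-- The bcp equivalent from the command line (run outside SQL):
--   bcp Staging.dbo.SourceData in D:\exports\source_data.bcp
--       -b 100000 -n -S myserver -T
```

With a batch size set, each batch is committed as its own transaction, so a failure partway through only rolls back the current batch rather than the whole load.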