sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-28 : 16:01:32
We have a SQL Server 2000 database where every night we load data from MAS90 SOA files into the DB tables using DTS packages. The steps run in this sequence:

1. Drop tables
2. Create tables
3. Load the data from the MAS90 SOA files
4. Create indexes

The first three steps ran fine, but when the create-indexes step started, the transaction log, which was about 10 MB, grew to 46.5 GB. While the create-indexes step was running, I received the following alert:

Alert: CPU Load on servername-E:\ Database 5cd02b4f is now 36 %

So I am thinking that the create-indexes step, which is effectively reindexing the tables, is causing this. Please let me know how to handle the transaction log size issue. Thanks.
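For reference, the four steps above might look roughly like this in T-SQL; the table, column, and index names are hypothetical stand-ins, not taken from the actual job:

```sql
-- Hypothetical sketch of the nightly reload sequence (one table per SOA file)
-- 1. Drop table
IF OBJECT_ID('dbo.SalesOrders') IS NOT NULL
    DROP TABLE dbo.SalesOrders
GO
-- 2. Create table
CREATE TABLE dbo.SalesOrders (
    OrderNo   varchar(10) NOT NULL,
    OrderDate datetime    NOT NULL,
    Amount    money       NOT NULL
)
GO
-- 3. The DTS datapump step bulk-loads the SOA file here.
-- 4. Create indexes: fully logged, which is what drives the log growth
CREATE CLUSTERED INDEX IX_SalesOrders_OrderNo
    ON dbo.SalesOrders (OrderNo)
GO
```

Building a clustered index after the load rewrites the whole table in sorted order, so that step alone can log roughly a table's worth of pages.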
anonymous1
Posting Yak Master
185 Posts |
Posted - 2007-11-28 : 16:10:31
Are you using the full recovery model? How often do you run log backups? Are any scheduled during the period this job is running?
sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-28 : 16:19:49
Yes, it's the full recovery model. The log is backed up once a day at 6:30 PM, so no, none are scheduled while the job runs.
Zoroaster
Aged Yak Warrior
702 Posts |
Posted - 2007-11-28 : 16:25:16
If you have enough drive space to accommodate the growth, you could just do a log backup as the final step in the job to clear the logical log space; the log would then reuse the same space each night without having to regrow. If you don't have the space, you might be able to build some log backups in between the steps of the job, if they can be separated out like that; this will keep the log trim.

Future guru in the making.
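The final-step log backup suggested above is a one-liner; the database name and backup path here are hypothetical:

```sql
-- Hypothetical final job step: back up the log after the nightly load.
-- Under the FULL recovery model this truncates the logical log, so the
-- same physical log space is reused on the next run instead of regrowing.
BACKUP LOG MAS90DB
    TO DISK = 'E:\Backups\MAS90DB_log.trn'
    WITH INIT  -- overwrite yesterday's file
```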
sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-28 : 16:33:38
I don't think I have enough drive space; I currently have only 18 GB left.
anonymous1
Posting Yak Master
185 Posts |
Posted - 2007-11-28 : 16:43:03
You can include log backups in some or all of steps 2, 3, and 4. This prolongs the time to complete, but reduces the overall space usage. Depending on your tolerance for data loss and your hours of operation, you could also change the recovery model before and after the process, but generally this is not done.
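A sketch of the recovery-model switch mentioned above (database name and path are hypothetical); note that leaving FULL breaks the log-backup chain, so a full backup afterwards is advisable:

```sql
-- Before the nightly load: stop full logging of the bulk work
ALTER DATABASE MAS90DB SET RECOVERY SIMPLE

-- ... DTS load and index builds run here ...

-- After the load: switch back and restart the backup chain
ALTER DATABASE MAS90DB SET RECOVERY FULL
BACKUP DATABASE MAS90DB
    TO DISK = 'E:\Backups\MAS90DB_full.bak'
```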
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts |
Posted - 2007-11-28 : 16:54:21
You should schedule log backups to run much more often; every 15 minutes is a good place to start. You can also modify the datapump steps of your DTS packages to commit the data in smaller batches; the default is to do the datapump step in a single transaction. Setting the batch size to about 50,000 rows is a good starting point.

CODO ERGO SUM
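A log-backup step that a SQL Server Agent job could run on a 15-minute schedule might look like this; the database name and path are hypothetical, and the timestamp in the file name keeps each backup distinct:

```sql
-- Hypothetical Agent job step, scheduled every 15 minutes
DECLARE @file varchar(200)
SET @file = 'E:\Backups\MAS90DB_log_'
          + CONVERT(varchar(8), GETDATE(), 112) + '_'                      -- yyyymmdd
          + REPLACE(CONVERT(varchar(8), GETDATE(), 108), ':', '') + '.trn' -- hhmmss
BACKUP LOG MAS90DB TO DISK = @file
```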
sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-28 : 17:10:15
I am not sure I need log backups every 15 minutes, because once the data gets loaded into the SQL database it doesn't change until the DTS job runs again the following night. When the job runs at night it drops/creates the tables and then loads the data for the following day; that's when we have new data. So I am just taking a database backup at 4 PM and a transaction log backup at 6:30 PM once every day. Please correct me if I should do something different.

Currently, in the Transform Data Task properties I have the following options checked:
1. Use fast load
2. Always commit final batch
3. Check constraints
4. Insert batch size: 1000

Thanks for all the responses.
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2007-11-28 : 17:22:52
If the data doesn't change until the DTS job runs, and you can recreate all of this data by rerunning the DTS job, then you should be using the SIMPLE recovery model instead.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
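Making that change and verifying it takes two statements; the database name here is a hypothetical stand-in:

```sql
ALTER DATABASE MAS90DB SET RECOVERY SIMPLE
-- Returns FULL, SIMPLE, or BULK_LOGGED
SELECT DATABASEPROPERTYEX('MAS90DB', 'Recovery') AS RecoveryModel
```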
sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-28 : 22:39:27
quote: Originally posted by tkizer If the data doesn't change until the DTS job runs and you can recreate all of this data by rerunning the DTS job, then you should be using SIMPLE recovery model instead.Tara KizerMicrosoft MVP for Windows Server System - SQL Serverhttp://weblogs.sqlteam.com/tarad/
Thanks Tara, I shall change it to SIMPLE tonight and will check the transaction log tomorrow morning once the DTS job completes. Also, since it's SIMPLE, I shall take just the database backup once a day.
sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-29 : 14:46:04
The SQL DTS job completed and the log (LDF file) size was 13.2 GB; it grew from 10 MB to 13.2 GB with the SIMPLE recovery model. When I had the FULL recovery model, the log grew to almost 41 GB, so with SIMPLE there was still an increase, but it wasn't much comparatively. Is the growth of the .LDF file due to the create-indexes step in the SQL job? Thx.
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2007-11-29 : 14:48:08
Even with the SIMPLE recovery model, the transaction log is used; you cannot avoid this. What happens with SIMPLE, though, is that after the transaction completes, it is cleared from the tlog (unlike BULK_LOGGED or FULL, which wait until a tlog backup). So if you want the system to use even less tlog space, you can perform the load in smaller batches (if the load is what's causing it). You'll need to do some research to see which step is using the tlog file the most.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
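One simple way to find the hungry step is to check log usage between job steps and compare the numbers; these are standard SQL Server 2000 commands (database name hypothetical):

```sql
-- Percentage of each database's log currently in use
DBCC SQLPERF(LOGSPACE)

-- Physical size of each file for one database (size is in 8 KB pages)
USE MAS90DB
SELECT name, size, filename FROM sysfiles
```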
sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-29 : 15:22:37
quote: Originally posted by tkizer Even with SIMPLE recovery model, the transaction log is used. You can not avoid this. What happens with SIMPLE though is that after the transaction completes, it is cleared from the tlog (unlike BULK or FULL which waits until a tlog backup). So if you want the system to use even less tlog space, you can perform the load in smaller batches (that's if the load is causing it). You'll need to do some research to see which step is using the tlog file the most.Tara KizerMicrosoft MVP for Windows Server System - SQL Serverhttp://weblogs.sqlteam.com/tarad/
Thanks for the quick response. By "the transaction completes," do you mean committed transactions?
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2007-11-29 : 15:28:53
Committed or rolled back.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
sqlserverdeveloper
Posting Yak Master
243 Posts |
Posted - 2007-11-29 : 15:40:44
quote: Originally posted by tkizer Committed or rolled back.Tara KizerMicrosoft MVP for Windows Server System - SQL Serverhttp://weblogs.sqlteam.com/tarad/
Thanks!
 |