joebickley
Starting Member
4 Posts

Posted - 2006-01-11 : 06:32:41
Hi,

I learnt all the stuff about transaction logs an age ago and I seem to have forgotten it all!

I have a database which holds around 1.5 million rows in 3 tables and is used purely as a data warehouse; it is in simple recovery mode. When I run my SSIS (DTS) package to drop all the data in the tables and import it all again, the log file still wants vast amounts of space to let the package run. Having monitored the package, the majority of this space is taken when three "delete from tablex" statements run to clear down the tables.

Any ideas of a way round this? A 1.5 GB log file for a DB that has no data edits seems a tad odd!

Joe

PS I'm starting with an empty log each time.
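For illustration, a minimal sketch of the clear-down step described above (the table names are hypothetical). Even under the simple recovery model, every row removed by a DELETE is individually logged while the statement runs, which is what drives the log growth:

-- Hypothetical clear-down step: each deleted row gets its own
-- transaction log record, even under the SIMPLE recovery model.
DELETE FROM dbo.FactSales;
DELETE FROM dbo.FactOrders;
DELETE FROM dbo.FactInventory;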
khtan
In (Som, Ni, Yak)
17689 Posts

Posted - 2006-01-11 : 07:08:45
Use TRUNCATE TABLE <table name>.

From Books Online:

quote: TRUNCATE TABLE is functionally identical to DELETE statement with no WHERE clause: both remove all rows in the table. But TRUNCATE TABLE is faster and uses fewer system and transaction log resources than DELETE. The DELETE statement removes rows one at a time and records an entry in the transaction log for each deleted row. TRUNCATE TABLE removes the data by deallocating the data pages used to store the table's data, and only the page deallocations are recorded in the transaction log.
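As a rough sketch of that suggestion, reusing the placeholder table names from the earlier example, the clear-down step could become:

-- Minimally logged: only the page deallocations are recorded in the log.
-- Note: TRUNCATE TABLE cannot be used on a table referenced by a
-- FOREIGN KEY constraint; fall back to DELETE for those tables.
TRUNCATE TABLE dbo.FactSales;
TRUNCATE TABLE dbo.FactOrders;
TRUNCATE TABLE dbo.FactInventory;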
-----------------
'KH'
Only two things are infinite, the universe and human stupidity, and I'm not sure about the former.