Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Author | Topic
pjstreiff
Starting Member
12 Posts
Posted - 2008-10-30 : 14:40:06
I'm looking for a T-SQL code snippet that demonstrates how to use TRANCOUNT in a WHILE loop that commits a transaction every X number of records. The idea is to break up an insert of, say, 3 million rows into batches of 100,000 rows each, in order to improve performance and reduce logging. Can anyone help me out?

Thanks,
-pjstreiff
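[A minimal sketch of the batched-commit pattern being asked about, with hypothetical table and column names (Source, Stage, keycol, col1..col3 stand in for the real ones). Each pass inserts one batch inside its own explicit transaction; @@TRANCOUNT rises to 1 at BEGIN TRANSACTION and falls back to 0 at COMMIT, and under the SIMPLE recovery model the log can truncate between commits:]

```sql
-- Sketch only: copy rows from Source to Stage in batches of 100,000,
-- committing after each batch. Assumes keycol is unique and Stage is
-- populated only by this loop.
DECLARE @batch_size INT;
DECLARE @rows INT;
SET @batch_size = 100000;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;                  -- @@TRANCOUNT is now 1

    SET ROWCOUNT @batch_size;           -- limit the INSERT to one batch
    INSERT INTO Stage (keycol, col1, col2, col3)
    SELECT s.keycol, s.col1, s.col2, s.col3
    FROM Source s
    WHERE NOT EXISTS (SELECT * FROM Stage t WHERE t.keycol = s.keycol);

    SET @rows = @@ROWCOUNT;             -- 0 when nothing is left to copy
    SET ROWCOUNT 0;                     -- clear the limit

    COMMIT TRANSACTION;                 -- @@TRANCOUNT drops back to 0
END
```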
hanbingl
Aged Yak Warrior
652 Posts
Posted - 2008-10-30 : 14:53:14
From a flat file, or another table?
pjstreiff
Starting Member
12 Posts
Posted - 2008-10-30 : 14:58:12
From another table.

Thanks,
-pjstreiff
hanbingl
Aged Yak Warrior
652 Posts
Posted - 2008-10-30 : 15:20:41
Consider the Import/Export Wizard; the things you've mentioned are already built in.
Lamprey
Master Smack Fu Yak Hacker
4614 Posts
Posted - 2008-10-30 : 15:25:43
There is no way that I know of to make use of TRANCOUNT in this way. The basic idea is to loop, inserting one batch at a time, beginning and committing a transaction on each pass until there are no more records left to process. You can control your batch size with SET ROWCOUNT, or by using some ID: select the TOP X rows and use that ID as the starting point for the next batch. That all depends on your data.
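[The "TOP X starting from the last ID" approach described above might look like the sketch below. Names are assumed (Source, Stage, keycol), and it presumes keycol is a unique, indexed, positive ascending key and that Stage is filled only by this loop. TOP with a variable requires SQL Server 2005 or later:]

```sql
DECLARE @batch_size INT, @last_key INT, @rows INT;
SET @batch_size = 100000;
SET @last_key = 0;        -- assumes keycol > 0 for all rows
SET @rows = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO Stage (keycol, col1, col2, col3)
    SELECT TOP (@batch_size) keycol, col1, col2, col3
    FROM Source
    WHERE keycol > @last_key
    ORDER BY keycol;                    -- walk the key in ascending order

    SET @rows = @@ROWCOUNT;
    IF @rows > 0
        SELECT @last_key = MAX(keycol)  -- watermark for the next batch
        FROM Stage;

    COMMIT TRANSACTION;
END
```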
pjstreiff
Starting Member
12 Posts
Posted - 2008-10-30 : 16:03:21
Lamprey: You are correct. I meant to say ROWCOUNT rather than TRANCOUNT.

I found the following bit of code I may be able to adapt for my solution:

DECLARE @batch_size INT;
DECLARE @batch_nbr INT;
SET @batch_size = 10000;
SET @batch_nbr = 1;

WHILE EXISTS (SELECT keycol FROM Source
              EXCEPT
              SELECT keycol FROM Stage)
BEGIN
    WITH SourceCTE (keycol, col1, col2, col3, rk) AS
    (
        SELECT keycol, col1, col2, col3,
               ROW_NUMBER() OVER (ORDER BY keycol)
        FROM Source
    )
    INSERT INTO Stage (keycol, col1, col2, col3)
    SELECT keycol, col1, col2, col3
    FROM SourceCTE
    WHERE rk > (@batch_nbr - 1) * @batch_size
      AND rk <= @batch_nbr * @batch_size;

    SET @batch_nbr = @batch_nbr + 1;
END

-pjstreiff
jimf
Master Smack Fu Yak Hacker
2875 Posts
Posted - 2008-10-30 : 20:08:00
If there are many records, a permanent table may be better than a #table; you'd just have to truncate it, or drop and re-create it.

DECLARE @start INT, @stop INT
SET @start = 1
SET @stop = 100000

CREATE TABLE #t1
(
    id int IDENTITY(1,1),
    keycol datatype,
    col1 datatype,
    col2 datatype,
    col3 datatype
)
CREATE CLUSTERED INDEX idx1 ON #t1(id)

INSERT INTO #t1 (keycol, col1, col2, col3)  -- name the columns; id is an identity
SELECT keycol, col1, col2, col3
FROM Source

WHILE EXISTS (SELECT * FROM #t1 WHERE id >= @start AND id <= @stop)
BEGIN
    INSERT INTO Stage
    SELECT keycol, col1, col2, col3
    FROM #t1
    WHERE id >= @start AND id <= @stop

    SET @start = @start + 100000
    SET @stop  = @stop + 100000   -- keep the window width constant
END

Jim