
 Transaction Space and Bulk Insertion of Data


jonathans
Starting Member

40 Posts

Posted - 2009-10-19 : 08:10:07
Hey guys

OK, I'm in a bit of a pickle and can't seem to think my way out of this problem.

Essentially I've got to import and process 500,000 rows of data per table, with multiple tables per data set.

Now, due to company red tape, I've been limited to a 2 GB transaction log file, which obviously isn't much.

I've set the logging to simple (as I don't really need it anyway), but when I do the bulk importing of data, in some areas it fills up the log file and kills the query. [Msg 9002, Level 17, State 4, Procedure %procname%, Line 53]

The SQL admins told me to do the insertion in batches, which I'm already doing, but it's just a lot of data that has to be cross-normalized and so forth.

So, my question: how do I fix this? And what else can I do?

WARNING: Running on cold coffee!

Ifor
Aged Yak Warrior

700 Posts

Posted - 2009-10-19 : 09:00:48
You could try doing the initial import onto your PC. (SQL Express would probably do.)
The data could then be processed and copied in batches to the main server.
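A rough sketch of that last step (server, database, and table names here are made up, not from your setup): pull the locally processed rows across via a linked server in keyed ranges, so each remote insert stays small:

```sql
-- MAINSRV is a hypothetical linked server pointing at the main box.
DECLARE @RangeStart int, @RangeEnd int;
SET @RangeStart = 1;
SET @RangeEnd = 10000;

-- Copy one key range at a time; each statement is its own
-- transaction on the remote side, keeping its log usage small.
INSERT INTO MAINSRV.ProdDb.dbo.TargetTable (Id, Col1, Col2)
SELECT Id, Col1, Col2
FROM dbo.ProcessedTable
WHERE Id BETWEEN @RangeStart AND @RangeEnd;
```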

jonathans
Starting Member

40 Posts

Posted - 2009-10-19 : 09:06:21
Already done this; there is just too much data.

I'm thinking of maybe having an SP call multiple sub-SPs to do the insertion of data, but I don't think it would help, as the calling SP would still be within the transaction scope.

PS. This whole import and normalization is a daily process, so it needs to be sustainable.


YellowBug
Aged Yak Warrior

616 Posts

Posted - 2009-10-19 : 09:57:51
quote:
I've set the logging to simple (as I don't really need it anyway)

This refers to the RECOVERY model of the database. Setting it to SIMPLE means that the transaction log is not usable for point-in-time recovery, so it affects your backup/recovery plan. Transactions will still be logged in the transaction log file, as you have seen.
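For reference, switching the recovery model looks something like this (the database name is a placeholder):

```sql
-- Hypothetical database name. Under SIMPLE recovery the log can be
-- truncated at each checkpoint, but a single long-running transaction
-- will still grow the log until it commits.
ALTER DATABASE MyImportDb SET RECOVERY SIMPLE;

-- After a batch commits, a manual checkpoint encourages log truncation.
CHECKPOINT;
```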

Are you using BCP / BULK INSERT / SSIS?
How are you batching the inserts? Have a look at this: http://msdn.microsoft.com/en-us/library/ms188267(SQL.90).aspx
Are you using staging tables?
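If the bulk load step itself is filling the log, BATCHSIZE splits the load into separate transactions so the log can clear between them. A sketch (file path and table name are made up):

```sql
-- Each 50,000-row batch commits as its own transaction, so under
-- SIMPLE recovery the log space can be reused between batches.
BULK INSERT dbo.StagingTable
FROM 'C:\import\data.dat'
WITH (BATCHSIZE = 50000, TABLOCK);
```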

jonathans
Starting Member

40 Posts

Posted - 2009-10-19 : 10:04:42
1. Getting the data in is done via BCP (this all works, no problems).
2. Processing of the data is done via SPs per section, and makes use of staging tables (this mostly works, but occasionally crashes with a transaction-log-full message).




YellowBug
Aged Yak Warrior

616 Posts

Posted - 2009-10-19 : 10:19:55
How are you batching in the SP? WHILE loop? TOP?
What is the batch size? Have you tried smaller batches?
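For example, one log-friendly pattern (table and column names here are illustrative, not from your schema) moves rows out of staging in fixed-size chunks, each in its own transaction:

```sql
-- Move staging rows to the target 10,000 at a time. Each DELETE
-- commits separately, so the log never holds more than one batch.
WHILE 1 = 1
BEGIN
    DELETE TOP (10000)
    FROM dbo.StagingTable
    OUTPUT deleted.Col1, deleted.Col2
    INTO dbo.TargetTable (Col1, Col2);

    IF @@ROWCOUNT = 0 BREAK;  -- staging table is empty

    CHECKPOINT;  -- allow log truncation under SIMPLE recovery
END
```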

jonathans
Starting Member

40 Posts

Posted - 2009-10-19 : 10:45:25
The SP does batches by doing something like this...

DECLARE @InsertedIds TABLE (Id int, LinkKey int);

INSERT INTO [Table1]
( [Col1], [Col2]... )
OUTPUT Inserted.Id, Inserted.LinkKey
INTO @InsertedIds (Id, LinkKey)
SELECT MyValue1, MyValue2
FROM [SourceTable]
WHERE [MyValue3] = @MyValue3

INSERT INTO [Table2]
( [Col1], [Col2]... )
SELECT i.Id, s.MyValue3
FROM @InsertedIds AS i
INNER JOIN [SourceTable] AS s ON i.LinkKey = s.UniqueImportedKey

Something like that, so there are no WHILEs or CURSORs.

But it seems the problem is I had a table with too many indexes (3 indexes), and this was filling the transaction log up, so I've got to rethink how I do that little bit.
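If the indexes turn out to be the culprit, one option (index and table names here are made up) is to disable the nonclustered indexes for the load and rebuild them afterwards. Note the rebuild is itself logged, so it's worth testing whether this actually comes out ahead:

```sql
-- Disable a nonclustered index before the load so the inserts
-- don't generate log records for index maintenance.
ALTER INDEX IX_Table1_Col2 ON dbo.Table1 DISABLE;

-- ... run the batched inserts here ...

-- Rebuild makes the index usable again (this step is logged).
ALTER INDEX IX_Table1_Col2 ON dbo.Table1 REBUILD;
```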


YellowBug
Aged Yak Warrior

616 Posts

Posted - 2009-10-19 : 11:13:57
Yes, that could be a problem. What is the max batch size, in this method?
I'm guessing it varies per @MyValue3.

jonathans
Starting Member

40 Posts

Posted - 2009-10-20 : 04:06:03
It varies on a lot of different things; sometimes every row needs an additional 2 rows, other times 20 rows.

Depends on the type of data (based on business-requirement rules).

   
