Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
zicosql
Starting Member
5 Posts

Posted - 2008-01-24 : 17:43:34
Is there an ideal, quick way to bulk insert a huge amount of data into an already large table? The table I have has about 350 columns and currently 1 million rows, but this will grow to potentially 50 million rows.

I do all my processing in temp tables, which is quick, but when I have to insert between 50,000 and 150,000 rows at a go it takes 10-30 minutes. From what I can see, this amounts to about 100-300 MB of data.

When I analyse sys.sysprocesses and sp_lock I can see the query go into a suspended state and then back to runnable, and there are no other processes accessing this table. Performance is key, and I was wondering what techniques I could use to bulk insert this data. Would indexing help? Or would it be best to insert 1000 rows at a time? All I'm looking for is pointers or suggestions on how to improve this. Thanks.
spirit1
Cybernetic Yak Master
11752 Posts
zicosql
Starting Member
5 Posts

Posted - 2008-01-24 : 19:39:46
Thanks, however I'm trying to insert data from a temp table, not a flat file, into the main table. Is there an equivalent of the BULK INSERT syntax for doing this?
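For reference, BULK INSERT itself only reads from data files, so a table-to-table copy is ordinarily written as a plain INSERT ... SELECT. A minimal sketch (the table and column names here are hypothetical):

```sql
-- Copy staged rows from the temp table into the main table.
-- The TABLOCK hint requests a table-level lock, which under some
-- conditions (recovery model, target structure) allows SQL Server
-- to minimally log the insert.
INSERT INTO dbo.MainTable WITH (TABLOCK) (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM #Staging;
```

Whether the insert is actually minimally logged depends on the database recovery model and the state of the target table, so this is a sketch of the syntax rather than a performance guarantee.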
Lamprey
Master Smack Fu Yak Hacker
4614 Posts

Posted - 2008-01-24 : 19:56:56
You might try doing it in batches. You can use a loop to insert some number of rows (say 1,000, 5,000, 10,000, or 1,000,000) at a time. Try different batch sizes to see what works best for your system load and transaction log.
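The batching idea above can be sketched as a loop. The names and the 10,000-row batch size are illustrative, and this variant assumes the staged rows can be deleted as they are copied and that the main table has no enabled triggers or foreign keys (a restriction of OUTPUT ... INTO):

```sql
DECLARE @BatchSize int = 10000;
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    -- Move one batch: delete staged rows and route the deleted
    -- rows into the main table, so each iteration picks up
    -- where the previous one left off.
    DELETE TOP (@BatchSize) src
    OUTPUT deleted.Col1, deleted.Col2, deleted.Col3
        INTO dbo.MainTable (Col1, Col2, Col3)
    FROM #Staging AS src;

    SET @Rows = @@ROWCOUNT;  -- 0 once the temp table is empty
END;
```

Because each iteration runs in its own implicit transaction, the transaction log only has to hold one batch at a time instead of the whole 150,000-row insert.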