 All Forums
 SQL Server 2000 Forums
 Import/Export (DTS) and Replication (2000)
 insert 20 million rows into 170 million row table

Author  Topic 

dgaylor
Yak Posting Veteran

54 Posts

Posted - 2003-03-28 : 00:11:08
What is the best way to insert 20 million new rows from one table into another table that already contains 170 million rows? The large table has a clustered index on an ID field which is a char(12), and the table is 88 gigs right now. Any suggestions would help. Thanks.

byrmol
Shed Building SQL Farmer

1591 Posts

Posted - 2003-03-28 : 00:51:33
BCP OUT the 20 million and then BCP IN..

Switches...
Use native format (-n)
Increase the packet size to the maximum, 65535 (-a)
Use the TABLOCK hint (-h "TABLOCK")
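
For example, the out/in pair might look like this (server, database, table, and file names here are placeholders, -T assumes a trusted connection, and -b is an optional extra that commits in batches so a failure doesn't roll back the whole load):

```
bcp MyDB.dbo.NewRows out newrows.dat -n -a 65535 -S MYSERVER -T
bcp MyDB.dbo.BigTable in newrows.dat -n -a 65535 -h "TABLOCK" -b 50000 -S MYSERVER -T
```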




DavidM

"SQL-3 is an abomination.."

dgaylor
Yak Posting Veteran

54 Posts

Posted - 2003-03-28 : 01:28:59
Thanks for the information. I am a bit new and am not exactly sure of how to do this. Can you offer an example? Thanks.


byrmol
Shed Building SQL Farmer

1591 Posts

Posted - 2003-03-28 : 02:01:13
Go to "Books Online" (BOL)

It has plenty of examples...

Perhaps you should practice on some junk data first....

DavidM

"SQL-3 is an abomination.."

SamC
White Water Yakist

3467 Posts

Posted - 2003-03-28 : 08:26:04

Why don't you use an INSERT / SELECT for this job?
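
A minimal sketch of that approach (the table and column names are made up; the real column lists depend on your schema):

```sql
-- Copy all rows from the source table into the big table in one statement.
INSERT INTO dbo.BigTable (ID, Col1, Col2)
SELECT ID, Col1, Col2
FROM dbo.NewRows
```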

Sam


tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2003-03-28 : 13:04:20
Bcp is a command-line utility for moving data around. I would not suggest using bcp, though, if you are just moving data between tables on the same server. Bcp is usually used when you need to move data between two or more boxes. As SamC mentioned, I would suggest using an INSERT INTO/SELECT statement for this.

Bcp can also do it, but you would be bcping the data out to a file and then bcping it back in from the file, and there is no reason to involve files in this scenario.

You might also want to consider using DTS to move the data. Simply right-click your database in Enterprise Manager, select Import Data, and follow the wizard screens to get your data into the correct table.

Tara

byrmol
Shed Building SQL Farmer

1591 Posts

Posted - 2003-03-28 : 17:37:01
A 20 million row INSERT/SELECT will be about 100 times slower than bcp. The killer is the transaction log, regardless of the recovery model of the database.

I would classify 20 million rows as a BULK COPY operation....
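
If you do stay with INSERT/SELECT, one way to keep the log from exploding is to commit in batches. A sketch using SET ROWCOUNT, the SQL Server 2000 idiom for capping rows per statement (table and column names are made up; the NOT EXISTS check, keyed on the clustered ID, makes the loop restartable):

```sql
-- Move rows in 100,000-row batches, each in its own transaction,
-- so the log can be backed up / truncated between batches.
SET ROWCOUNT 100000
WHILE 1 = 1
BEGIN
    INSERT INTO dbo.BigTable (ID, Col1, Col2)
    SELECT s.ID, s.Col1, s.Col2
    FROM dbo.NewRows s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.BigTable b WHERE b.ID = s.ID)

    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0
```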


DavidM

"SQL-3 is an abomination.."

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2003-03-28 : 17:51:07
quote:

A 20 million row INSERT/SELECT will be about 100 times slower than bcp. The killer is the transaction log, regardless of the recovery model of the database.

I would classify 20 million rows as a BULK COPY operation....


Very true, byrmol. Since dgaylor isn't familiar with bcp, though, I would suggest using DTS instead, since bcp can seem tricky at first glance.

Tara