Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

 importing data from multiple txt files


blessedame
Starting Member

9 Posts

Posted - 2002-06-03 : 14:08:41
I am trying to import 90 different txt files into one SQL 2000 table. Does anyone know how I can do this? At the moment I am using DTS, but I can only do one import at a time. I don't really want to do that 90 times!

Thanks for your help.



Page47
Master Smack Fu Yak Hacker

2878 Posts

Posted - 2002-06-03 : 14:21:00
quote:

Parallel Data Load Using bcp
Microsoft® SQL Server™ allows data to be bulk copied into a single table from multiple clients in parallel using the bcp utility or BULK INSERT statement. This can improve the performance of data load operations. To bulk copy data into SQL Server in parallel:

Set the database option select into/bulkcopy to true using sp_dboption.
Specify the TABLOCK hint. For more information, see Controlling the Locking Behavior.
Ensure that the table does not have any indexes.

--------------------------------------------------------------------------------

Note Any application based on the DB-Library client library supplied with SQL Server version 6.5 or earlier, including the bcp utility, is not able to participate in parallel data loads into SQL Server. Only applications using the ODBC or SQL OLE DB-based APIs can perform parallel data loads into a single table.


--------------------------------------------------------------------------------

After data has been bulk copied into a single table from multiple clients, any nonclustered indexes that need to be created can also be created in parallel by simply creating each nonclustered index from a different client concurrently.


--------------------------------------------------------------------------------

Note Any clustered index on the table should be created first from a single client before creating the nonclustered indexes.

--SQL Server 7 - Books Online




blessedame
Starting Member

9 Posts

Posted - 2002-06-03 : 14:46:44
Thanks a bunch page47.

I still have a problem though, this is what I have

BULK INSERT customer
FROM 'c:\tests\data*.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '|\n'
)

It won't let me use the wildcard, but I need to so I can run the query once for all 90 files.

Any help would be appreciated.

Thanks





robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-06-03 : 15:33:40
DECLARE @sql varchar(8000)
SELECT @sql = ''

-- Capture the directory listing into a temp table via xp_cmdshell
CREATE TABLE #files (filename varchar(255) NULL)
INSERT INTO #files EXEC master..xp_cmdshell 'dir /b c:\tests\data*.txt'
DELETE FROM #files WHERE filename IS NULL

-- Build one BULK INSERT statement per file
SELECT @sql = @sql + 'BULK INSERT customer FROM ''c:\tests\' + filename + ''' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''|\n'')' + char(13) + char(10)
FROM #files

--EXEC(@sql)  --uncomment this line to execute it directly, or
--PRINT @sql  --copy and paste the output into Query Analyzer and run it

DROP TABLE #files


I don't want to sound like a grumpy old bastard, but how hard is it to get a directory list, paste it into Notepad, then paste some SQL code around each file name, then paste the whole thing into Query Analyzer and run it?
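If you'd rather script that wrap-each-filename step than do it in Notepad, it can be sketched in a few lines of POSIX shell. This is just an illustration of the same idea: the `customer` table and terminators come from this thread, the directory path is a placeholder, and `gen_bulk_inserts` is a name made up here.

```shell
# Print one BULK INSERT statement per data file, ready to paste into
# Query Analyzer. The directory passed as $1 stands in for c:\tests.
gen_bulk_inserts() {
  for f in "$1"/data*.txt; do
    [ -e "$f" ] || continue  # skip the literal pattern when nothing matches
    printf "BULK INSERT customer FROM '%s' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '|\\\\n')\n" "$f"
  done
}

# Example: gen_bulk_inserts "c:/tests"
```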


Page47
Master Smack Fu Yak Hacker

2878 Posts

Posted - 2002-06-03 : 15:39:27
What grumpy failed to recommend is a good way to set off all 90 bcp's at the same time . . .

For that you're gonna need to write some custom multi-threading code in a 4GL du jour....
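Short of custom multi-threading code, a plain shell can fan processes out with `&` and `wait`. A sketch of a generator in that spirit, not anyone's actual method: the bcp switches shown (`-c`, `-t`, `-r`, `-S`, `-T`, `-h "TABLOCK"`) are real bcp options, but the server and database names are placeholders, and `gen_parallel_bcp` is invented here.

```shell
# Emit one backgrounded bcp command per data file, ending with 'wait' so a
# shell running the output blocks until every load finishes.
gen_parallel_bcp() {
  for f in "$1"/data*.txt; do
    [ -e "$f" ] || continue  # skip the literal pattern when nothing matches
    printf '%s\n' "bcp mydb..customer in \"$f\" -c -t, -r\"|\n\" -Smyserver -T -h\"TABLOCK\" &"
  done
  printf 'wait\n'
}

# Example: review first, then pipe to a shell to run:
#   gen_parallel_bcp "c:/tests" | sh
```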


robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-06-03 : 15:41:26
I can't believe that 90 parallel imports will actually run, and I certainly don't think that it'll run faster than doing the 90 files sequentially.


Page47
Master Smack Fu Yak Hacker

2878 Posts

Posted - 2002-06-03 : 15:48:50
I agree with Rob on that....

I would bet you could get a bit more performance running more than one at a time, though. It'll probably be very hardware/load dependent. Maybe take Rob's script and split it into 3 x 30-file connections . . .
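That split-into-batches idea can also be approximated with `xargs -P`, which caps how many loads run concurrently and starts the next file as soon as a slot frees up. A sketch only: `-P` is a common GNU/BSD extension rather than strict POSIX, `load_three_at_a_time` is a made-up name, and the `echo` stands in for the real bcp/BULK INSERT call.

```shell
# Run at most 3 loads at a time across all the data files in a directory.
load_three_at_a_time() {
  printf '%s\n' "$1"/data*.txt |
    xargs -n 1 -P 3 sh -c 'echo "loading $0"'  # replace echo with the real load command
}

# Example: load_three_at_a_time "c:/tests"
```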


blessedame
Starting Member

9 Posts

Posted - 2002-06-03 : 16:10:30
cool, thanks

--- and I know you are not a "grumpy old bastard".

:-)



dataphile
Yak Posting Veteran

71 Posts

Posted - 2002-06-04 : 03:45:15
I would copy the files into one large file in DOS and import that single file:

copy file1.txt + file2.txt ... file90.txt bigfile.txt

(Save this to a batch file if you are going to do it often. The batch file, with a .cmd extension, could be executed from within DTS.)

If there are headers in the files, they are easy enough to get rid of afterwards.
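On a Unix-flavored shell the same concatenation is a single `cat`. A minimal sketch with placeholder paths and an invented `merge_files` name; like the DOS copy, it assumes each file already ends with the row terminator.

```shell
# Concatenate every data*.txt under a directory into one big file,
# the POSIX equivalent of the DOS copy command above.
merge_files() {
  cat "$1"/data*.txt > "$2"
}

# Example: merge_files "c:/tests" "c:/tests/bigfile.txt"
```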
