 Importing Large Flat Files


AskSQLTeam

Posted - 2001-10-05 : 20:34:27
Bob writes "Hello,

I have an issue importing very LARGE flat files into a SQL Server 7.0 database. One file is over 1.3 million lines (records). I am in an ASP environment using VBScript as my server language. I have tried this several ways and have been successful with smaller sample files, but as soon as I try the live (have I mentioned very big?) file, it runs and runs and eventually times out without finishing. Right now I read one line into an array in VBScript, pass the array into a stored procedure, take the array elements I need (I don't use all the fields in the file; only certain fields apply to my situation), and make the necessary insert/update. Then I drop out of the stored procedure, back into VBScript, grab the next line, pass it into the stored procedure, and so on.

This line-by-line method is proving MUCH too slow. I am considering the BULK INSERT command within a stored procedure, but because I don't need all the fields in the flat file, I would probably have to load the entire file into a temp table and then loop through all the records in that table (with a cursor or WHILE loop), taking only the info I need and inserting it into the 'live' table(s). Is this going to improve performance at all? It still seems like I could make this more efficient.
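The staging-table approach described above can be sketched as follows. This is a minimal illustration only: the table, column, and file names are placeholders, not anything from Bob's environment. The key point is that once the file is in a staging table, the needed columns can be moved with a single set-based INSERT ... SELECT, so no cursor or WHILE loop is required.

```sql
-- Hypothetical sketch: names below are placeholders.
-- 1. Load the whole flat file into a staging table in one bulk operation.
CREATE TABLE StagingImport (
    Field1 varchar(50),
    Field2 varchar(50),
    Field3 varchar(50)   -- one column per field in the flat file
)

BULK INSERT StagingImport
FROM 'C:\data\bigfile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- 2. Copy only the columns that matter in one set-based statement,
--    instead of looping row by row with a cursor.
INSERT INTO LiveTable (ColA, ColB)
SELECT Field1, Field3
FROM StagingImport

DROP TABLE StagingImport
```

If the staging table is undesirable, a bcp-style format file can also tell BULK INSERT to skip unwanted fields during the load itself, though writing one takes some care.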

I hope that I was clear enough for you to understand the basics of what I am trying to get across.

Any suggestions would be greatly appreciated!"
   
