
 DTS, stored procedure, text file import ... intranet environment


AskSQLTeam
Ask SQLTeam Question

0 Posts

Posted - 2002-04-03 : 10:02:13
Kyle writes "My task is to import records from a pipe-delimited text file to a SQL Server table. The text file will typically contain between 50,000 and 75,000 records. I need to design a process to do this on a daily basis, efficiently, using ASP. Unfortunately, scheduling a DTS job is out of the question at this point.

I have been able to use DTS via ASP to import a small test file. It's not all that fast, and I fear it may be slower on the real files. I'm not too sure how to implement a stored procedure solution (I am also going to test ADO using array values and AddNew), but I'd like to hear the expert opinion on this problem, and how I'd implement your recommended solution.

Again, I need to do this in a browser (don't need to display the records), so response time is critical. I don't expect miracle times, but I'd like it to be as fast as possible.

Thanks for taking the time to read, and hopefully answer this question.

PS. If they'd just give me the Informix OLE driver so I could use a query to access the original table containing the data, I'm sure that it would be faster than processing the text file ... but they're being stubborn for the time being. :-)

Thanks for your help guys."

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-04-03 : 10:21:25
BULK INSERT and bcp are excellent alternatives for you, since you're dealing with text files. You can set the column delimiter to a pipe with either method very easily:

BULK INSERT myDatabase..myTable FROM '\\uncserver\path\filename.txt'
WITH (FIELDTERMINATOR='|')


or

bcp myDatabase..myTable in "\\uncserver\path\filename.txt" -c -t"|" -Sserver -Uuser -Ppassword

(The pipe needs to be quoted, otherwise the command shell treats it as a redirect.)

bcp is a command-line utility, so you would need to call it using xp_cmdshell in a T-SQL script or stored procedure. The problem you might encounter is the text file's location: it has to be either on a drive local to the SQL Server, or on a UNC path the SQL Server can reach (if the latter, the SQL Server service account must have network permissions).
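As a rough sketch of that xp_cmdshell wrapper (the procedure name, server, login, and file path are all placeholders; adjust for your setup):

```sql
-- Sketch only: wrapping the bcp call in a stored procedure via xp_cmdshell.
-- usp_ImportTextFile, the server name, login, and path are placeholders.
CREATE PROCEDURE usp_ImportTextFile
AS
DECLARE @cmd varchar(500), @rc int
SET @cmd = 'bcp myDatabase..myTable in "\\uncserver\path\filename.txt" -c -t"|" -Sserver -Uuser -Ppassword'
EXEC @rc = master..xp_cmdshell @cmd
-- xp_cmdshell returns 0 if the command succeeded, nonzero if it failed
RETURN @rc
GO
```

From ASP you could then run the procedure through ADO and check the return value to decide whether the load worked.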

If you are doing this from an ASP page, you may first have to upload the text file to the web or SQL Server in order for this to work correctly, but that's the hardest part.

Books Online has more information on bcp, BULK INSERT, and xp_cmdshell.
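For reference, BULK INSERT accepts a few more WITH options if the file layout calls for them; a sketch with placeholder names:

```sql
-- Sketch: BULK INSERT with a few more of the common WITH options.
-- The table name and path are placeholders, and FIRSTROW only
-- applies if the file actually has a header line.
BULK INSERT myDatabase..myTable FROM '\\uncserver\path\filename.txt'
WITH (
    FIELDTERMINATOR = '|',   -- pipe-delimited columns
    ROWTERMINATOR   = '\n',  -- one record per line
    FIRSTROW        = 2,     -- skip a header row
    MAXERRORS       = 0,     -- tolerate no bad rows; fail on the first error
    TABLOCK                  -- take a table lock for faster loading
)
```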


Kyle
Starting Member

7 Posts

Posted - 2002-04-03 : 10:35:51
Thanks ... I'll be uploading the file to the SQL Server. The BULK INSERT looks like it'll do the trick. You say I do that within a stored procedure? I'll do the required reading ...

Should I be concerned about error handling with this? What if it fails for any reason?

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-04-03 : 10:41:56
You can tell BULK INSERT how many errors to tolerate before the load fails, anywhere from stopping on the first error up to ignoring quite a few (the MAXERRORS setting). Most of the time, when I (and many other SQL Teamers) anticipate some data cleanup, I'll put the data into a "holding" or "staging" table. This staging table has the same structure as the destination table, but all of the constraints are taken out, all columns allow NULLs, there are no indexes, etc. Then I perform some data cleanup or validation on the staging table, and when I'm done, simply do:

INSERT INTO destinationTable SELECT * FROM stagingTable
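As a rough sketch of that staging pattern (table and column names are invented for illustration):

```sql
-- Sketch of the staging-table pattern; all names are invented.
TRUNCATE TABLE stagingTable          -- clear out the previous load

BULK INSERT stagingTable FROM '\\uncserver\path\filename.txt'
WITH (FIELDTERMINATOR = '|')

-- Example cleanup step: trim stray whitespace on a text column
UPDATE stagingTable SET custName = LTRIM(RTRIM(custName))

-- When the data looks good, move it across in one set-based insert
INSERT INTO destinationTable SELECT * FROM stagingTable
```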

What kind of errors do you anticipate?


Kyle
Starting Member

7 Posts

Posted - 2002-04-03 : 11:28:07
Well, I'm not sure ... there really should be no errors because all the data should be prevalidated, but just in case ... I'd want to make sure that an error code is returned that I can trap in the ASP page. Again, speed is my concern ... I don't want the page to time out.


robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-04-03 : 11:37:59
Unless you're dealing with 30,000+ rows and a LOT of errors in the text file, you should fit pretty comfortably inside a 60 second ASP timeout. Once the file is uploaded, there's no additional web server overhead, and the SQL Server routines for BULK INSERT are blazing fast; 500,000 rows/min is not uncommon. If you do import the data into a staging table, and you DO NOT use cursors to process the table, you'll get similar fast performance.

If you need to catch errors, you're better off using the staging table approach, and using SELECT with a WHERE clause that will return invalid rows/data. That way, you can still import the valid data and leave the exceptions for later review.
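A sketch of that split, with made-up column names and validation rules:

```sql
-- Sketch: move only the rows that pass validation; the rest stay
-- in the staging table for review. Columns and checks are made up.
INSERT INTO destinationTable
SELECT * FROM stagingTable
WHERE custID IS NOT NULL AND ISNUMERIC(amount) = 1

-- Whatever is left behind is the exception list for later review
SELECT * FROM stagingTable
WHERE custID IS NULL OR ISNUMERIC(amount) = 0
```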

Edited by - robvolk on 04/03/2002 11:39:33

Kyle
Starting Member

7 Posts

Posted - 2002-04-03 : 14:13:57
Thanks ... it'll be 50,000 to 75,000 records per import, around 40 fields, and about 800 bytes to 1K per record. More than 10 seconds is kinda long though ... I wonder if there's a way to execute the job in the background and return control to the ASP page. I have yet to test it with the real volume of records, but once I do I'll have a real idea of what the response will be like.

Thanks for your help !!


MichaelP
Jedi Yak

2489 Posts

Posted - 2002-04-03 : 20:24:08
The only way that I know of to execute this and then immediately return is to add a job that runs "now" and then return. The only problem with this is that there's no way for your ASP page to trap any errors.

Look up sp_add_job , sp_add_jobstep, and sp_add_jobschedule in BOL.
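A minimal sketch of that run-now job approach (the job name and the procedure it calls are placeholders; sp_start_job kicks the job off immediately instead of waiting for a schedule — see BOL for the full parameter lists):

```sql
-- Sketch: create a one-shot job, point it at this server, and start it.
-- 'ImportTextFile' and usp_ImportTextFile are placeholder names.
EXEC msdb..sp_add_job @job_name = 'ImportTextFile',
     @delete_level = 1               -- delete the job when it succeeds
EXEC msdb..sp_add_jobstep @job_name = 'ImportTextFile',
     @step_name = 'Run import',
     @command = 'EXEC myDatabase..usp_ImportTextFile'
EXEC msdb..sp_add_jobserver @job_name = 'ImportTextFile'
EXEC msdb..sp_start_job @job_name = 'ImportTextFile'
```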

Michael

   
