
Another DTS Q: Skipping bad input rows?


aiken
Aged Yak Warrior

525 Posts

Posted - 2002-02-27 : 13:59:09
This is an extension of my other recent post: http://www.sqlteam.com/Forums/topic.asp?TOPIC_ID=13352

The IRC nick DB is a huge file (a million lines or so), much of which has data that users have entered when they register a nick. As such, it's largely unvalidated.

The new problem I'm having is a line where some user has entered practically a book for their "description". Where most people's full name is "John Doe" or even "Not gonna tell you," this one is about 200 words long. Apparently the IRC server doesn't care.

Fortunately, this is on an "F"-type row, which I'm perfectly happy to skip altogether, since I'm only looking at "n" rows (and hopefully "e" rows, if someone can help me out in my other post).

Anyway, my problem is that DTS is parsing this row and dying because it has too many columns, before my VBScript can throw the row out because Col001 isn't "n". The exact error is "Too many columns found in the current row; non-whitespace data was found after the last defined column's data."

Is there any way I can catch this exception and tell DTS to ignore it and skip that row, since my code would do that anyway? I don't want it to corrupt the entire batch, and I really don't want to set the batch size to 1.

The bad row happens after about 200,000 rows, so DTS doesn't get that far when first checking the format of the file. I can't do anything to move the row forward in the file; the next time the file is spit out by the IRC server, it'll go back where it was (or move around unpredictably).
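One workaround is to pre-filter the file before DTS ever sees it, so the parser never touches the "F" rows at all. A minimal sketch in Python; the tab delimiter and first-column type flag are assumptions about the file layout, so adjust to match the actual dump format:

```python
def filter_rows(in_path, out_path, keep_types=("n", "e"), delim="\t"):
    """Copy only rows whose type flag is in keep_types; drop the rest.

    Assumes delimited input with the row-type flag in the first column.
    Returns (kept, skipped) counts.
    """
    kept = skipped = 0
    with open(in_path, "r", encoding="utf-8", errors="replace") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            row_type = line.split(delim, 1)[0]
            if row_type in keep_types:
                dst.write(line)
                kept += 1
            else:
                skipped += 1  # "F" rows and anything else never reach DTS
    return kept, skipped
```

The filtered output can then be used as the DTS source file, so the malformed "F" row is gone before the column check runs.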

Thanks
-b

chadmat
The Chadinator

1974 Posts

Posted - 2002-02-27 : 16:10:43
What version/SP are you using?

See:
http://support.microsoft.com/default.aspx?scid=kb;en-us;Q300181&SD=MSKB&

It is not exactly the same, but may give some insight.

-Chad


aiken
Aged Yak Warrior

525 Posts

Posted - 2002-02-27 : 17:52:17
Thanks for the link; I'm on SQL 2000 SP1, but I'm not experiencing the problem the article describes. It seems to be saying that some version of SQL Server got stuck on a single bad row, retried it MaxErrorCount times, and then failed the task.

My problem is that there are a few bad rows, and they are causing batch loads to fail. I could reduce the batch size to 1 row and set MaxErrorCount to 10000, and I'd be fine... except the job would take two weeks to finish. What I really need is for the DTS package to skip the row in the input file when it's bad, rather than throwing an error.
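Since the failure is specifically "too many columns", a pre-pass outside DTS could drop any line whose field count exceeds the defined layout, leaving the batches clean. Again just a sketch; the delimiter and expected column count are assumptions and must match the column layout defined in the DTS package:

```python
def drop_overwide_rows(in_path, out_path, max_cols, delim="\t"):
    """Write only lines with at most max_cols fields; return count dropped.

    max_cols should equal the number of columns defined in the DTS
    source definition, so anything the parser would reject is removed.
    """
    dropped = 0
    with open(in_path, "r", encoding="utf-8", errors="replace") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            if line.rstrip("\n").count(delim) + 1 <= max_cols:
                dst.write(line)
            else:
                dropped += 1  # would have raised "Too many columns" in DTS
    return dropped
```

This keeps the batch size large (no need for MaxErrorCount tricks) because the rows that would abort a batch never make it into the input file.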

Thanks!
-b
