
 Bulk Insert load issue


leafonthewind
Starting Member

2 Posts

Posted - 2013-07-08 : 12:52:44
Hi, I've just started working with SQL Server and am having an issue with BULK INSERT. When I try to load data from a file into a table I created, some of the records fail with a conversion error on one column. The error I'm receiving is:

Msg 4863, Level 16, State 1, Line 3
Bulk load data conversion error (truncation) for row 9295, column 214 (NR_COL_6_VAL_NR).

The code I'm using is:
USE [TEST]
GO

TRUNCATE TABLE ACCT_ACCT

BULK INSERT ACCT_ACCT_View
FROM 'file_name.txt'
WITH (FIELDTERMINATOR = '|',
      ROWTERMINATOR = '\n',
      FIRSTROW = 2,
      ERRORFILE = 'file_name.err.txt')

UPDATE STATISTICS ACCT_ACCT
GO

The table definition for the column that's giving me errors is:
[NR_COL_6_VAL_NR] [decimal](18, 8) NULL,

And here's an example of the part of the faulty records that's giving the error:

3179815000.48

I've spent hours looking at this and can't find the problem. There are no delimiter errors, and no matter what data type I change the column to, I get the same error. Any help would be greatly appreciated! Also, sorry if I have anything formatted incorrectly, I'm entirely new here :P If you need any more info, feel free to let me know. Thanks in advance.

James K
Master Smack Fu Yak Hacker

3873 Posts

Posted - 2013-07-08 : 14:12:36
Although you have ruled out delimiter errors, this error is most often caused by exactly that. I have seen two kinds:

- Field terminator errors, which happen very often with comma-delimited files when a comma appears in the data itself. That offsets the columns, so a value that is not numeric lands in a column where it should not be.

- Row terminator errors, which happen when BULK INSERT is expecting \r, \n, or \r\n and the file does not exactly match what it expects.

In your case, I suspect the latter, because you don't usually see pipes as part of the data.
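One quick way to rule out a line-ending mismatch is to retry the load with an explicit hexadecimal row terminator. This is a sketch only, reusing the file name from the original post; '0x0a' tells BULK INSERT to split rows on a bare LF, which is what files saved with Unix-style line endings use:

```sql
-- Sketch: retry with an explicit LF terminator in case the file uses
-- Unix line endings. Use '0x0d0a' instead if the file uses CR+LF.
BULK INSERT ACCT_ACCT_View
FROM 'file_name.txt'
WITH (FIELDTERMINATOR = '|',
      ROWTERMINATOR = '0x0a',
      FIRSTROW = 2,
      ERRORFILE = 'file_name.err.txt');
```

If the error row number changes (or the error disappears), the row terminator was the problem.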

Assuming that it is not a delimiter problem, another thing you could try to do (and this is what I always do) is to import the data into a staging table that has all VARCHAR or NVARCHAR columns. That should succeed with no conversion errors. Then, use a script to move the data to your destination table. My experience has been that it is easier to debug and catch errors when you do it this way.
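To sketch what that looks like (table and column names here are illustrative, based on the one column definition shown in the thread; your real table has many more columns), you would load everything as text first and then hunt for the rows that refuse to convert. Note that TRY_CONVERT requires SQL Server 2012 or later:

```sql
-- Staging table: every column is plain varchar, so the load cannot
-- fail on a conversion error.
CREATE TABLE ACCT_ACCT_Staging (
    NR_COL_6_VAL_NR varchar(512) NULL
    -- ... one varchar(512) column per field in the file
);

BULK INSERT ACCT_ACCT_Staging
FROM 'file_name.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Find the rows that would fail the conversion to the target type:
-- TRY_CONVERT returns NULL instead of raising an error.
SELECT NR_COL_6_VAL_NR
FROM ACCT_ACCT_Staging
WHERE NR_COL_6_VAL_NR IS NOT NULL
  AND TRY_CONVERT(decimal(18, 8), NR_COL_6_VAL_NR) IS NULL;
```

Once the bad values are identified and fixed (or filtered out), a plain INSERT ... SELECT with CONVERT moves the data into the real table.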

Yet another possibility is to use SSIS, which can process good data to the actual destination table and send bad data to a different destination.

leafonthewind
Starting Member

2 Posts

Posted - 2013-07-08 : 14:36:14
For your first suggestion, by staging table do you mean a table with identical column names but with all the data types changed to varchar? Also, the only rows that get this error contain decimal numbers with 9 or more characters, which most programs convert to scientific notation (when I open the data file in Excel, for example, it automatically converts them). Does this have anything to do with it?
And thanks for your help!

James K
Master Smack Fu Yak Hacker

3873 Posts

Posted - 2013-07-08 : 17:03:21
Yes, creating an intermediate table with identical column names and every column typed as varchar(512) or something similar is exactly what I meant. You would then import into that intermediate table.

In the absence of a staging table, each time there is an issue with the data you are going to be at a loss to figure out exactly what went wrong. With a staging table, you will almost always get the data into the table. Once it is there, if the move to the final target table fails, it is easier to examine the data and fix any problems.

As to your question about whether the numeric data type is causing the problem, I don't think it is. If you are unable to use a staging table, you might consider using a format file to specify the exact nature of the expected input data: http://msdn.microsoft.com/en-us/library/ms178129.aspx
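To give a feel for what a format file contains, here is a non-XML format file for a hypothetical two-column pipe-delimited file (column names and widths are made up for the sketch; your real file has many more columns). The first line is the format-file version (11.0 corresponds to SQL Server 2012), the second is the number of fields, and each remaining line gives the field number, host type, prefix length, maximum length, terminator, target column number, target column name, and collation:

```
11.0
2
1  SQLCHAR  0  512  "|"     1  ACCT_ID            ""
2  SQLCHAR  0  512  "\r\n"  2  NR_COL_6_VAL_NR    ""
```

You would then point the load at it with WITH (FORMATFILE = 'file_name.fmt'), and BULK INSERT will report a clearer error when a field does not match its declared shape.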
   
