Jim77
Constraint Violating Yak Guru
440 Posts
Posted - 2007-04-20 : 06:45:24
Sorry to harp on about the same issue, but up until yesterday my bulk insert was working fine and today it is coming back with errors. I tried to explicitly put in field and row delimiters (in red), but to no avail, and am still getting the same error messages. This is my code:

select @path = '\\servername\d$\Data\Ir6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.001'
select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
select @task = @task + "WITH (FIELDTERMINATOR = ',',ROWTERMINATOR='char(13) + char(10)')"
exec(@task)

and am getting this error message:

quote:
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
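For comparison, a hedged sketch of how the terminators are often written in dynamic SQL: single quotes only (so it does not depend on the QUOTED_IDENTIFIER setting) and the documented '\n' escape rather than the literal text 'char(13) + char(10)'. And if the staging table takes the whole line into a single wide char column, as the create table #sta later in the thread suggests, only the row terminator matters.

-- Hedged sketch (not the original code): single-quoted dynamic SQL,
-- with '\n' as the row terminator ('\n' is read as the CR/LF pair for bulk import)
select @path = '\\servername\d$\Data\Ir6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.001'
select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
             + ' WITH (ROWTERMINATOR = ''\n'')'
exec(@task)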
dinakar
Master Smack Fu Yak Hacker
2507 Posts
Posted - 2007-04-20 : 12:11:40
Can you post a couple of lines of the text file you have?

************************
Life is short. Enjoy it.
************************
http://weblogs.sqlteam.com/dinakar/
 |
Jim77
Constraint Violating Yak Guru
440 Posts
Posted - 2007-04-20 : 12:40:41
Yeah, sure. This is basically the format of the text file:

v1.1,REC,3029440000286,000018.0000 ,000323.82 ,000227.25 ,OR003530,756,Updated ,148380,083540,000028.0000
v1.1,REC,5011449004752,000012.0000 ,000125.88 ,000096.28 ,OR003531,756,Updated ,148380,083555,000013.0000
v1.1,SHT,5011449004752,000012.0000-,000125.88-,000096.28-,OR003531,756,Updated ,148380,083555,000001.0000
v1.1,REC,5010314101015,000012.0000 ,000185.88 ,000152.80 ,OR003531,756,Updated ,148380,083556,000025.0000
v1.1,REC,5000281005416,000006.0000 ,000140.94 ,000104.35 ,OR003531,756,Updated ,148380,083556,000006.0000
v1.1,SHT,5000281005416,000006.0000-,000140.94-,000104.35-,OR003531,756,Updated ,148380,083556,000000.0000
v1.1,REC,0087000006935,000006.0000 ,000074.94 ,000058.38 ,OR003531,756,Updated ,148380,083556,000012.0000
v1.1,REC,5010103916196,000006.0000 ,000019.74 ,000016.31 ,OR003531,756,Updated ,148380,083556,000007.0000
And this is my code, which was running through a DTS package quite fine up until today. When I parse the query now I get a "Deferred prepare could not be completed" message, and when I try creating a stored procedure and running an EXEC SP SQL task instead I get the "Bulk Insert fails" error message as above:

SET QUOTED_IDENTIFIER ON
GO
SET ANSI_NULLS ON
GO

DECLARE @pstore as varchar(1000)
DECLARE @path as varchar(1000)
DECLARE @task as varchar(1000)
DECLARE @storeno as int

DECLARE storecursor CURSOR FOR
    Select storenum
    From retailchoisestore -- store number count is 43 required
    WHERE storenum <> 300

OPEN storecursor
FETCH NEXT FROM storecursor INTO @storeno

WHILE @@fetch_status = 0
BEGIN
    create table #sta ([adj] [char] (127))

    -- run through each store's 31 files
    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.001'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.002'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.003'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.004'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.005'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.006'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.007'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.008'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)

    -- delete all rows that are in common
    DELETE FROM #sta
    WHERE substring(adj,1,2) <> 'v1'

    delete #sta
    from #sta t, StockAdjustments s
    where s.store = @storeno
      and s.type collate database_default = substring(t.adj,6,3)
      and s.ean collate database_default = substring(t.adj,10,13)
      and s.tdate = cast(dateadd(dd,(cast(substring(t.adj,93,6) as int) - 146766), '31/10/2002') as smalldatetime)
      and s.ttime = substring(t.adj collate database_default,100,6)

    -- check if there is anything in #sta
    select * from #sta
    if @@rowcount <> 0
    begin
        insert into StockAdjustments
        Select cast(@storeno as varchar(5)) as Store,
            substring(adj,6,3) Type,
            substring(adj,10,13) EAN,
            Qty = case -- Check if value is minus
                when substring(adj,35,1) = '-' then cast(substring(adj,24,6) as int) * -1
                else cast(substring(adj,24,6) as int)
            end,
            Retail = case -- Check if value is minus
                when substring(adj,46,1) = '-' then cast(substring(adj,37,9) as money) * -1
                else cast(substring(adj,37,9) as money)
            end,
            Cost = case -- Check if value is minus
                when substring(adj,57,1) = '-' then cast(substring(adj,48,9) as money) * -1
                else cast(substring(adj,48,9) as money)
            end,
            substring(adj,59,8) A,
            substring(adj,68,3) B,
            substring(adj,72,20) C,
            cast(dateadd(dd,(cast(substring(adj,93,6) as int) - 146766), '31/10/2002') as smalldatetime) tDate,
            substring(adj,100,6) tTime,
            null
        from #sta
    end

    DROP TABLE #sta

    FETCH NEXT FROM storecursor INTO @storeno
END

CLOSE storecursor
DEALLOCATE storecursor
GO
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO
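As an aside, a hedged sketch (not part of the posted code): the eight near-identical loads differ only in the file extension, so a small counter loop could build .001 through .008 and keep the BULK INSERT in one place, reusing the @path, @task and @storeno variables already declared above.

-- Hedged sketch: drive the per-file loads with a counter instead of
-- repeating the same three statements eight times
DECLARE @i int
SET @i = 1
WHILE @i <= 8
BEGIN
    select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000'
                 + cast(@storeno as char(3))
                 + '\Dayadjst\dayadjst.' + right('00' + cast(@i as varchar(3)), 3)  -- '001' .. '008'
    select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
    Exec(@task)
    SET @i = @i + 1
END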
 |
dinakar
Master Smack Fu Yak Hacker
2507 Posts
Posted - 2007-04-20 : 12:46:42
I tried your BULK INSERT command and it worked fine for me. What I feel could be an issue is that you are using the same variables over and over again. You need to initialize them / set them to NULL before you use them again. Perhaps the values are getting concatenated from previous assignments?

************************
Life is short. Enjoy it.
************************
http://weblogs.sqlteam.com/dinakar/
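For example, a minimal sketch of that idea, reusing the variable names from the posted code: clear both variables before building the next command so nothing can carry over from the previous file.

-- Sketch: reset the work variables before each file
SET @path = NULL
SET @task = NULL
select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.002'
select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
Exec(@task)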
 |
Jim77
Constraint Violating Yak Guru
440 Posts
Posted - 2007-04-21 : 11:18:46
Cheers for looking at it, Dinakar. Nulling the variables doesn't work either. I have been battling to find the problem for a while now, and it only sometimes raises its ugly head, which obviously means one of the files being imported is causing the problem. When I try to parse it in an Execute SQL task in DTS it gives me a 'Deferred prepare could not be completed. Statement(s) could not be prepared.' error message. I ignore the message and run it through DTS, and it looks like it is running correctly until about two minutes into the process, when it gives me the following error:

quote:
The statement has been terminated.
OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.

although the whole import process into the StockAdjustments table seems to be working... I have then tried making the variables and the fields in the StockAdjustments and #sta tables larger, and also tried creating a stored procedure and using the EXEC command in DTS, but I end up getting the same error message:

quote:
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows returned 0x80004005: The provider did not give any information about the error.].
The statement has been terminated.

Will let you know my solution should I happen to stumble across it. Thanks again.
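Since the failure only shows up on some runs, one way to narrow down the offending file is to print each path just before it is loaded; the last path printed before the 4866 message is the file that failed. A minimal sketch against the posted code:

-- Sketch: announce each file before loading it so the failing one shows up in the output
select @path = '\\jy2003smem012\d$\Data\Iris6\Data\HOSTIN\000' + cast(@storeno as char(3)) + '\Dayadjst\dayadjst.001'
PRINT 'Loading ' + @path
select @task = 'BULK INSERT [#sta] from ''' + @path + ''''
Exec(@task)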
 |