Lester Vincent
Starting Member
22 Posts
Posted - 2006-01-08 : 01:29:44
Hi people. I hope you can help me overcome this challenge with a 'dynamic' BULK INSERT stored procedure. The procedure parses OK but aborts when I try to run it in Query Analyser. Details, including the stored proc, an extract from the text file and the (very unhelpful) error message, follow. As my programme has to import 60-100 of these from emailed Word docs each day, I'm in a fix until this can be solved. Any/all help is greatly appreciated.

EXTRACT FROM IMPORT SOURCE FILE, CONVERTED TO A TEXT FILE FROM AN MS WORD DOCUMENT.

Notes:
(1) VBA code was used to create a numbered first column, for line numbers; all the text of each line is in the second column.
(2) Text length in the second column can vary from NULL to 255 ANSI characters.
(3) Apart from this, each source file follows the same kind of format. The field separator is the TAB character.

87	Access=BY APPOINTMENT ONLY
88	SiteInstructions=
89	Location=FRONT
90	WorkInstructions=
91	PropertyStatus=OCCUPIED
92	OCHProperty=N
93	WorkContainsDismods=N
94	Programme=NA
95	RoofType=CTE - CONCRETE TILES
96	HotWaterSystem=ESE - ELECTRIC STORAGE EXTERNAL
97	StoveType=E - ELECTRIC
98	BuildingMaterial=B - BRICK
99	TypeOfWindows=T - TIMBER
100	RoomHeaterType=NA
101	DateFirstTenanted=26-DEC-1994
102	Visit=NA
103	WorkProgram=REP
104	Field4=NA
105	Field5=NA
106	Field6=NA
107	Field7=NA
108	Field8=NA
109	Field9=NA
110	Field10=NA
111	JobLocValids=AIRL BALC BATH BED1 BED2 BED3 BED4 BED5 BED6 BED7 BED8 CAEXT CAINT CALDY CARP COMR DIN DWEL ENS ENT FAM FEE FRONT GAR GARBC HALL KIT LDY LGE LHS LIFT METER REAR RHS ROOF SHED STOR STW UNKN WC
112	REPDOOR ~GNC~ 1~FRONT~Inspect and Repair door~Front door - jamb broken 2 months ago - Difficulty latching door
113

-----------------------------------------------------------

PURPOSE OF EXERCISE: to import the Word document into a SQL Server table and, from there, create rows in several SQLS tables.

STRATEGY:
1. Create the subject text file, with line numbers and tab separators.
2. Using an ADO Command on the client workstation, create a parameter holding the source-file name and path, and call a stored proc that builds a dynamic BULK INSERT string.
3. The stored proc parses OK, but nothing happens to import the data.

HERE IS THE 'DYNAMIC' STORED PROCEDURE:

-----------------------------------------

CREATE PROCEDURE proc_TransferToTable_Server
    (@DataFile varchar(255) = '', @Return int = NULL OUTPUT)
/*
Uses the @DataFile parameter as the data source for a BULK INSERT command,
which pulls data into the "tbl_OrdImports_Temp" table. This table is emptied
before each Order import. The table has only two fields, LineNo and Text,
and the field separator is TAB.
*/
AS
DECLARE @Query varchar(255)
SET @Query = 'BULK INSERT dbo.tbl_OrdImports_Temp FROM ' + CHAR(39) + @DataFile + CHAR(39)
SET @Query = @Query + ' WITH (FIELDTERMINATOR=' + CHAR(39) + '\t' + CHAR(39) + ','
SET @Query = @Query + ' ROWTERMINATOR=' + CHAR(39) + '\n' + CHAR(39) + ','
SET @Query = @Query + ' CODEPAGE=' + CHAR(39) + 'RAW' + CHAR(39) + ')'
PRINT @Query  -- Prints the full text of @Query
EXEC (@Query)
RETURN @Return
GO

-----------------------------------------

WHEN TESTED IN QUERY ANALYSER, THE SPROC IS ABORTED, WITH THE FOLLOWING RESPONSE:

(Printed text of the BULK INSERT statement):

"BULK INSERT dbo.tbl_OrdImports_Temp FROM 'C:\NewOrdersForTesting\4186584_1-104870880_Numbered.TXT' WITH (FIELDTERMINATOR='\t', ROWTERMINATOR='\n')"

(Response):

"Server: Msg 4832, Level 16, State 1, Line 1
[Microsoft][ODBC SQL Server Driver][SQL Server]Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
Server: Msg 7399, Level 16, State 1, Line 1
[Microsoft][ODBC SQL Server Driver][SQL Server]OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
The statement has been terminated.
@RETURN_VALUE = 0"

I have tried BULK INSERT without any arguments, ie using only the defaults. I then added the FIELDTERMINATOR and ROWTERMINATOR arguments, then the CODEPAGE argument, using the values 'ACP' and 'RAW'. I have also tried adding the KEEPNULLS argument. None of the above arguments makes a difference to the result.

I have been using the same file format in an MS Access app, importing with the "TransferText" method, without any trouble for the past 2 years. Obviously, SQLS found something it doesn't like in there.

Regards (looking forward to getting your good advice),
Lester Vincent
Sydney
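One thing worth noting about a dynamic string built this way: if @DataFile ever contains a single quote, the concatenated statement breaks (or can be abused). A minimal sketch of the same statement with the quotes escaped - the table and file names are the ones from this thread, and doubling single quotes is standard T-SQL escaping:

```sql
-- Sketch: build the BULK INSERT string with any embedded single quote
-- in @DataFile doubled, so a stray quote in the path cannot break
-- (or hijack) the dynamic statement.
DECLARE @DataFile varchar(255), @Query varchar(500)
SET @DataFile = 'C:\NewOrdersForTesting\4186584_1-104870880_Numbered.TXT'

SET @Query = 'BULK INSERT dbo.tbl_OrdImports_Temp FROM '''
           + REPLACE(@DataFile, '''', '''''') + ''''
           + ' WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'')'

PRINT @Query   -- inspect the statement before running it
EXEC (@Query)
```

Declaring @Query as varchar(500) rather than varchar(255) also leaves headroom for longer file paths, which the original declaration would silently truncate.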
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts
Posted - 2006-01-08 : 02:41:46
The following code worked OK for me. You should check the data file that you are trying to import to make sure there is no bad data, especially extra tabs or end-of-line characters.

print '*** Show contents of C:\Temp\bulk_insert.txt ***'
exec master.dbo.xp_cmdshell 'type C:\Temp\bulk_insert.txt'
go
create table TEST_BULK_INSERT
(
LINE_NUM int null,
LINE_DATA varchar(50) null
)
go
print '*** Do Bulk insert ***'
bulk insert dbo.TEST_BULK_INSERT
from 'C:\Temp\bulk_insert.txt'
WITH ( FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n' )

print '*** Show Bulk insert Results ***'
select * from TEST_BULK_INSERT
go
drop table TEST_BULK_INSERT

------------------------------ Results ------------------------------

*** Show contents of C:\Temp\bulk_insert.txt ***
output
-----------------------------------------------
87	Access=BY APPOINTMENT ONLY
88	SiteInstructions=
89	Location=FRONT
90	WorkInstructions=
91	PropertyStatus=OCCUPIED
92	OCHProperty=N
93	WorkContainsDismods=N
94	Programme=NA
95	RoofType=CTE - CONCRETE TILES
96	HotWaterSystem=ESE - ELECTRIC STORAGE EXTERNAL
NULL

(11 row(s) affected)

*** Do Bulk insert ***

(10 row(s) affected)

*** Show Bulk insert Results ***
LINE_NUM    LINE_DATA
----------- --------------------------------------------------
87          Access=BY APPOINTMENT ONLY
88          SiteInstructions=
89          Location=FRONT
90          WorkInstructions=
91          PropertyStatus=OCCUPIED
92          OCHProperty=N
93          WorkContainsDismods=N
94          Programme=NA
95          RoofType=CTE - CONCRETE TILES
96          HotWaterSystem=ESE - ELECTRIC STORAGE EXTERNAL

(10 row(s) affected)

CODO ERGO SUM
Lester Vincent
Starting Member
22 Posts
Posted - 2006-01-08 : 06:30:20
Hi people. Too early to expect a reply yet, but just to advise that I have progressed a little on this matter. The problem seemed to be that the table I was importing into had a third column not receiving data (a timestamp column, which was unnecessary, so I deleted it). Since deleting the column, I have been able to import the text file by executing (ie via "DEBUG") the subject stored procedure from within Query Analyser. (It works there, without the need for any of the "WITH" arguments previously tried.)

However, the stored procedure has to be executed from an ADODB.Command in the client workstation's user-interface programme (which happens to be in Access 2002), but it won't work from there at all, and gives no reason why it won't. I have checked the ADODB.Command's ADODB.Parameter values and all the code associated with this. The same code calls a separate stored proc to delete all rows from the SQLS table. That works a treat!! But 2 code lines later, the BULK INSERT command does not. It seems there's another brick wall to knock over.

Any clues to this, people? I know BULK INSERT requires server permissions (eg the sysadmin or bulkadmin server roles) but I seem to have these already. Otherwise, why will it work in Query Analyser and not via the ADODB.Commands which successfully run other stored procs? I do hope you can help me with this one.

Lester Vincent
Sydney
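For anyone who hits the same EOF error but cannot delete the extra column: a non-XML bcp format file lets BULK INSERT load a two-field file into a table with more columns, by simply leaving the extra column unmapped. A hedged sketch for the two-column layout in this thread - the .fmt path and field widths are illustrative, and the layout shown is the SQL Server 2000 (8.0) non-XML format:

```sql
/* TwoColumn.fmt - a non-XML format file (version 8.0 for SQL Server 2000).
   Each line maps one host-file field (order, type, prefix length, max
   length, terminator) to a server column (order, name, collation).
   A table column not referenced here (eg an unneeded timestamp column)
   is simply not loaded:

8.0
2
1   SQLCHAR   0   12    "\t"     1   LineNo   ""
2   SQLCHAR   0   255   "\r\n"   2   Text     ""
*/

BULK INSERT dbo.tbl_OrdImports_Temp
FROM 'C:\NewOrdersForTesting\4186584_1-104870880_Numbered.TXT'
WITH (FORMATFILE = 'C:\NewOrdersForTesting\TwoColumn.fmt')  -- illustrative path
```

With a format file in play, the FIELDTERMINATOR and ROWTERMINATOR options are not needed, since the terminators live in the .fmt file itself.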
Lester Vincent
Starting Member
22 Posts
Posted - 2006-01-08 : 06:41:23
Thanks for your reply, Michael. I got it just after posting my second note on this matter. As you can see, I'm able to run the import within Query Analyser's DEBUG option, but there seems to be a barrier to it running from the Access UI via ADODB.Command, even though other similar Commands execute stored procs on the same SQLS database quite OK.

I notice you have used a temp table to import into. Maybe that would behave differently. That could work, as long as I stack a lot of other related code into the same stored proc - something I was trying to avoid if possible. Thanks for taking the time. If you have any other clues, I'm all ears!!!

Lester Vincent
Sydney
nr
SQLTeam MVY
12543 Posts
Posted - 2006-01-08 : 09:07:43
Try it with a single-line, single-row file and build up from there. The error you are getting is due to reaching the end of file when the bcp was still expecting data - usually due to an incorrect row or field terminator, or a bad line at the end of the file.

Have a look at
http://www.nigelrivett.net/SQLTsql/ImportTextFiles.html
http://www.nigelrivett.net/SQLTsql/BCP_quoted_CSV_Format_file.html
http://www.nigelrivett.net/SQLTsql/s_ProcessAllFilesInDir.html

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
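The build-up-from-one-row approach can be scripted entirely from Query Analyser. A sketch, assuming xp_cmdshell is available; the C:\Temp path is illustrative:

```sql
-- Build a one-row, two-column test file, then try to load it.
-- NOTE: the character between "87" and "Access=..." below must be a
-- literal TAB typed into the echo command.
exec master.dbo.xp_cmdshell 'echo 87	Access=BY APPOINTMENT ONLY> C:\Temp\one_row.txt'

create table dbo.ONE_ROW_TEST (LINE_NUM int null, LINE_DATA varchar(255) null)

bulk insert dbo.ONE_ROW_TEST
from 'C:\Temp\one_row.txt'
with (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

-- If this single row loads cleanly, append lines from the real file a few
-- at a time until the EOF error reappears; the last lines added are the
-- ones with the bad terminator or extra tab.
select * from dbo.ONE_ROW_TEST
drop table dbo.ONE_ROW_TEST
```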
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts
Posted - 2006-01-08 : 12:58:22
quote:
Originally posted by Lester Vincent

Thanks for your reply, Michael. I got it just after posting my second note on this matter. As you can see, I'm able to run the import within Query Analyser's DEBUG option, but there seems to be a barrier to it running from the Access UI via ADODB.Command, even though other similar Commands execute stored procs on the same SQLS database quite OK. I notice you have used a temp table to import into. Maybe that would behave differently. That could work, as long as I stack a lot of other related code into the same stored proc - something I was trying to avoid if possible. Thanks for taking the time. If you have any other clues, I'm all ears!!! Lester Vincent, Sydney

I wasn't using a temp table, just a permanent table that was created in the test script.

Are you sure your application is supplying the correct path to the file? The path supplied must be relative to the SQL Server, not the workstation where it is being run. Supplying the name as 'C:\NewOrdersForTesting\4186584_1-104870880_Numbered.TXT' tells SQL Server that the file is in directory C:\NewOrdersForTesting\ on the SQL Server, not on the user's workstation. The path you supply would have to be something like:
\\mypcname\myshare\text_file_name.txt

CODO ERGO SUM
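To make the path point concrete: when the text file lives on the workstation rather than the server, the BULK INSERT statement has to name a share that the SQL Server service account can read. A sketch - the \\LESTERPC\Orders share name is made up for illustration:

```sql
-- The file path is resolved by the SQL Server service, not by the client,
-- so a workstation-local file must be exposed through a UNC share, and the
-- SQL Server service account needs read permission on that share.
BULK INSERT dbo.tbl_OrdImports_Temp
FROM '\\LESTERPC\Orders\4186584_1-104870880_Numbered.TXT'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
```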
Lester Vincent
Starting Member
22 Posts
Posted - 2006-01-08 : 16:00:28
Thanks Michael. My mistake about the temp file - I realised as soon as I had posted my last reply that "Temp" referred to your folder.

The path has to be correct, as I am currently doing this on the one machine (a laptop), so client and server are one and the same. Your point is very valid, though difficult to test until I can do so connected to a network.

My client receives approximately 100 of these Word .docs per day, as maintenance requests for govt houses. They arrive as email attachments and are programmatically saved to a folder on the server machine's C:\ drive. Currently, my Access app pulls the data into tables and does a lot of auto-allocation work on these tables. The volume has become so great that the network traffic has slowed the processes, and the app must be upsized urgently to accommodate this. BULK INSERT has looked like being the way forward for initially pulling in the data from each Word .doc. It is obvious that to do 100 of these in a batch, plus other processes, one cannot do it by manually opening DTS or Query Analyser for each Word .doc.

Getting over this initial hurdle of importing the text data into a table is the main problem. Once imported, the other processes should fall into place for us. But you can see how important it is to this 'live' project.

Again, thanks for your help, Michael.

Lester Vincent
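Since the files all land in one folder on the server, the daily batch can be driven from T-SQL rather than one Query Analyser session per file. A hedged sketch of the usual SQL Server 2000 pattern - it assumes xp_cmdshell is permitted, reuses the folder name from this thread, and nr's s_ProcessAllFilesInDir procedure (linked above) does the same job more thoroughly:

```sql
-- List the folder into a work table, then BULK INSERT each file in turn.
CREATE TABLE #Files (FileName varchar(255) NULL)

INSERT #Files (FileName)
EXEC master.dbo.xp_cmdshell 'dir /b C:\NewOrdersForTesting\*.txt'

DECLARE @File varchar(255), @Sql varchar(500)

DECLARE cFiles CURSOR FOR
    SELECT FileName FROM #Files WHERE FileName LIKE '%.txt'  -- skips NULLs and dir messages
OPEN cFiles
FETCH NEXT FROM cFiles INTO @File
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Sql = 'BULK INSERT dbo.tbl_OrdImports_Temp FROM '
             + '''C:\NewOrdersForTesting\' + @File + ''''
             + ' WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'')'
    EXEC (@Sql)
    -- ... per-file processing of tbl_OrdImports_Temp would go here ...
    FETCH NEXT FROM cFiles INTO @File
END
CLOSE cFiles
DEALLOCATE cFiles
DROP TABLE #Files
```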
Lester Vincent
Starting Member
22 Posts
Posted - 2006-01-08 : 17:12:57
Thanks for your reply, Nigel. I have had a little success since my initial notice. As you can see, the import works, but only if manually processed, using DEBUG in Query Analyser, one file at a time. The problem is that it will not respond to an ADO Command with the file name (and path) as parameter, even though other SPs are activated by a similar ADO Command format. I have looked at items on your website and will study them more today.

Regards,
Lester Vincent
Sydney
Lester Vincent
Starting Member
22 Posts
Posted - 2006-01-09 : 05:45:18
Thanks Nigel and Michael for your helpful suggestions. I now have BULK INSERT working a treat, executed from an ADO Command object in the MS Access client app. The solution was in making changes to the ADO Command object. If anyone needs to know the code that worked, please let me know!

Michael's point, that the path to the file must be written from the server's perspective, is going to be important when running the app across a network, so I will store that path in a Registry string value on each workstation.

Again, thanks guys. Much appreciated.

Lester Vincent
Sydney
bagavathy
Starting Member
1 Post
Posted - 2006-01-11 : 02:15:56
Hi,

We are facing the same problem with ADO.NET. We were able to execute the Bulk Insert stored procedure through Query Analyzer, but not through ADO.NET. We are getting the following error:

"Could not bulk insert because file 'c:\testdata.csv' could not be opened. Operating system error code 123 (The filename, directory name, or volume label syntax is incorrect.)."

What did you modify in the ADO Command object to make your code work? Moreover, do we need to set any permission on the DB side? We would appreciate it if you reply ASAP.

Regards,
Bagavathy.

quote:
Originally posted by Lester Vincent

Thanks Nigel and Michael for your helpful suggestions. I now have BULK INSERT working a treat, executed from an ADO Command object in the MS Access client app. The solution was in making changes to the ADO Command object. If anyone needs to know the code that worked, please let me know! Michael's point, that the path to the file must be written from the server's perspective, is going to be important when running the app across a network, so I will store that path in a Registry string value on each workstation. Again, thanks guys. Much appreciated. Lester Vincent, Sydney
Lester Vincent
Starting Member
22 Posts
Posted - 2006-01-11 : 07:27:31
Hello Bagavathy. Firstly, on permissions: BULK INSERT requires the operator to have the "sysadmin" or "bulkadmin" server role.

Secondly, I have copied both the stored procedure and the ADO code that calls it, as follows.

1. Here is the stored procedure which is used to invoke BULK INSERT. The text file has only two columns - "LineNo" (yes, I made the mistake of using a reserved word as a column heading, but that didn't stop BULK INSERT from running) and "Text":

CREATE PROCEDURE proc_TransferToTable_Server
    (@DataFile varchar(255) = '')
/*
Uses the @DataFile parameter as the data source for a BULK INSERT command,
which pulls data into the "tbl_OrdImports_Temp" table. This table is emptied
before each Order import. The table has only two fields, LineNo and Text,
and the field separator is TAB.
*/
AS
-- First delete any rows existing in the table:
TRUNCATE TABLE tbl_OrdImports_Temp

DECLARE @Query varchar(255), @TableCounter varchar(255)
SET @Query = 'BULK INSERT dbo.tbl_OrdImports_Temp FROM ' + CHAR(39) + @DataFile + CHAR(39)
PRINT @Query  -- Prints the full text of @Query

-- Execute the Bulk Insert:
EXEC (@Query)
IF @@ERROR = 0
    RETURN -1
ELSE
    RETURN 0
GO

2. This is the VBA function, within MS Access 2002, with the ADODB.Command which successfully runs the stored procedure:

Public Function TransferToTable_Server(strTextFile As String) As Boolean
'Code takes arg value, which includes name and path of source file, for import into
'SQL Server table "tbl_OrdImports_Temp". This function is called from "GetNewDOCOrders()".
'The stored procedure (a) clears rows from the table, then (b) imports the rows of data
'from the line-numbered text file passed as argument. The SQL Server process used is
'"BULK INSERT".
On Error GoTo Error_Handler
Dim cmd As ADODB.Command, prm As ADODB.Parameter, bTruth As Boolean
10  bTruth = False
12  If OpenConnection() Then  'Check that global connection object is open, and if not, re-open it:
14      Set cmd = New ADODB.Command
16      With cmd
18          .ActiveConnection = gcnn  'Global Connection object, opened when user session begins, and stays open for use.
20          .CommandType = adCmdStoredProc
22          .CommandText = "proc_TransferToTable_Server"  'Uses BULK INSERT in stored proc to import text file data
            'Refresh parameters collection so we can use parameters:
23          .Parameters.Refresh
            'Set value of parameter, using name of parameter as defined in the stored procedure:
24          .Parameters("@DataFile") = Trim(strTextFile)
32          .Execute
34      End With
36      If cmd.Parameters("@return_value") = -1 Then
38          bTruth = True
39      End If
40      TransferToTable_Server = bTruth
42      Set cmd = Nothing
44  Else
46      MsgBox "Could not open connection to SQL Server database." & vbCrLf & _
            "Please refer this problem to your System Administrator.", _
            vbExclamation, "Connection to Database Failed"
48      TransferToTable_Server = bTruth
50      GoTo ExitHere
52  End If
54  GoTo ExitHere
ExitHere:
56  If Not (cmd Is Nothing) Then
58      Set cmd = Nothing
60  End If
62  Exit Function
Error_Handler:
    MsgBox Err.Description & ", at Line: " & Erl, vbInformation, "Error Transferring Order Data to Server"
    Resume ExitHere
End Function

Comments: At first, I tried creating and appending parameters, but this method would not work. I then tried the above, which uses cmd.Parameters.Refresh, then cmd.Parameters("@DataFile") = [value]. According to Litwin, Getz and Gilbert's "Access 2000 Developer's Handbook", Volume 2, Chapter 6 (a very detailed coverage of the Command object), the "Append" method is better, as it makes only one trip to the server, whereas the "Refresh" method takes two - one to refresh and the other to assign parameter values. In my case, I tried both. The "Refresh" method works; the "Append" method does not.

I hope this helps in your case. If you can get hold of the text I referred to, you will find it very helpful. Its ISBN is 0-7821-2372-4.

Regards (let us know how you get on),
Lester Vincent
Sydney

PS: Nigel Rivett's point regarding the path to the file is a good one. In my case, the text file is in a folder on the server, so "C:\" (etc) would probably still be valid for me. What about your case?
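On the permissions question: whether the login the application actually connects with holds the roles Lester mentions can be checked from that same connection. A small sketch using the standard IS_SRVROLEMEMBER function:

```sql
-- Returns 1 if the current login is in the role, 0 if not,
-- NULL if the role name is invalid.
SELECT IS_SRVROLEMEMBER('bulkadmin') AS IsBulkAdmin,
       IS_SRVROLEMEMBER('sysadmin')  AS IsSysAdmin
```

Running this through the application's own ADO/ADO.NET connection, rather than Query Analyzer, shows whether the "works in QA but not from the app" difference is simply two different logins with different roles.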