x2012x
Starting Member
4 Posts |
Posted - 2009-10-01 : 15:14:55
Environment: Microsoft SQL Server 2005 - 9.00.1399.06 (Intel X86)

We use a stored procedure, run via a SQL Server Agent job, to bulk insert flat files using a *.fmt format file. Every few runs, the bulk insert fails with the error below:

"Cannot bulk load. Invalid destination table column number for source column 1 in the format file "C:\FormatFile.fmt". [SQLSTATE 42000] (Error 4828). The step failed.,00:00:00,16,4828,,,,0"

A re-run of the same flat file and format file results in a successful bulk insert.

I have run the following tests to try to force reproduction of the behavior in our lab, but have been unsuccessful.

Test Scenarios:
============
1. Modified permissions on C:\FormatFile.fmt to prevent read access. An error is thrown during the bulk insert attempt, but not the one seen in production.

Error from test:
~~~~~~~~~~
Cannot bulk load because the file "C:\FormatFile.fmt" could not be opened. Operating system error code 5(Access is denied.). [SQLSTATE 42000] (Error 4861). The step failed.

2. Renamed the destination table tbBulkLoadData to tbBulkLoadData01 and ran the bulk insert job. An error is thrown during the bulk insert attempt, but not the one seen in production.

Error from test:
~~~~~~~~~~
Cannot find the object "tbBulkLoadData" because it does not exist or you do not have permissions. [SQLSTATE 42S02] (Error 4701). The step failed.

3. Put together a Python script to set a file lock on C:\FormatFile.fmt and ran the bulk insert job. An error is thrown during the bulk insert attempt, but not the one seen in production.

Error from test:
~~~~~~~~~~
Cannot bulk load because the file "C:\FormatFile.fmt" could not be opened. Operating system error code 32(The process cannot access the file because it is being used by another process.). [SQLSTATE 42000] (Error 4861). The step failed.

4. Set exclusive locks on the tbBulkLoadData table without committing the transaction, then ran the bulk insert job. The job just waits until the transaction holding the locks is committed.
============

Does anyone have any additional ideas? I'm certainly open to suggestions. I have posted this on MSDN as well but have yet to receive a reply. Thanks in advance for any input.
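For reference, test scenario 3 can be reproduced with a short Python script along these lines. This is a sketch, not the poster's actual script: the function name is invented, and the locking call differs by platform (`msvcrt.locking` on Windows, `fcntl.flock` elsewhere).

```python
import sys
import time

def hold_exclusive_lock(path, seconds=30):
    """Open `path` and hold an exclusive lock for `seconds`,
    simulating another process blocking the format file."""
    with open(path, "r+b") as f:
        if sys.platform == "win32":
            import msvcrt
            # Lock the first byte; LK_NBLCK fails fast if already locked.
            msvcrt.locking(f.fileno(), msvcrt.LK_NBLCK, 1)
            try:
                time.sleep(seconds)  # run the bulk insert job now
            finally:
                f.seek(0)
                msvcrt.locking(f.fileno(), msvcrt.LK_UNLCK, 1)
        else:
            import fcntl
            fcntl.flock(f, fcntl.LOCK_EX)
            try:
                time.sleep(seconds)  # run the bulk insert job now
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)
```

Running `hold_exclusive_lock(r"C:\FormatFile.fmt")` in one window while the Agent job fires should produce the OS error 32 result from scenario 3.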
mivey4
Yak Posting Veteran
66 Posts |
Posted - 2009-10-01 : 22:03:28
I've seen this error before, and all I can tell you is that it usually isn't the format file that's your issue. It's the flat file that you're attempting to perform the bulk insertion on: when the layout of the file data doesn't conform to the format file definition, you'll receive that error.

On other, rarer occasions the insertion process can fail outright, but I've only seen this when the file is excessively large.

Experimenting with the datatype definitions within the format file, and with the datatype of the first field of your table (the column cited in your error), would be a good place to start.

Did you build the format file from the command line using the table that you're performing the BULK INSERT against?
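For anyone following along, a SQL Server 2005 non-XML format file has this shape (an illustrative two-column example, not the poster's actual file; the column names and lengths are invented). The sixth field on each row is the destination table column number, which is what error 4828 is complaining about:

```text
9.0
2
1  SQLCHAR  0  50  ","     1  FirstName  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  50  "\r\n"  2  LastName   SQL_Latin1_General_CP1_CI_AS
```

The first line is the version (9.0 for SQL Server 2005), the second is the number of fields, and each data row gives: host field order, host data type, prefix length, data length, terminator, destination column number, destination column name, and collation.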
x2012x
Starting Member
4 Posts |
Posted - 2009-10-02 : 08:55:12
Thanks for the reply.

Q/A:
====
Q. It's the flat file that you're attempting to perform the bulk insertion on. Sometimes when the layout of the file data doesn't conform to the format file definition you'll receive that error.
A. We had considered this. However, since a re-run of the same flat file is successful, I'm not sure this comes into play.

Q. On other rare instances, the insertion process can fail but I've only seen this when the file is excessively large.
A. We have experienced these failures with small files as well, only a few KB.

Q. Experimenting with the datatype definitions within the format file and the datatype of the first field of your table would be a good place to start.
A. I suppose I'm still hung up on the fact that a re-run of the same flat file and format file ends up being successful.

Q. Did you build the format file from the command line using the table that you're performing the BULK INSERT against?
A. I come back to the same issue: a re-run is successful.
====

One thing to note: our process takes the original flat file and copies it to a staging folder, and the actual BULK INSERT points at the staged copy.

Example:
========
1. ABC.txt is ready to be processed.
2. Our code copies ABC.txt to a staging area for processing (example: ./Processing/ABC.13).
3. The BULK INSERT is actually performed on ./Processing/ABC.13, not the original ABC.txt file.
========

I wonder if the failure is the result of corruption of the flat file during the copy? I hadn't considered this as a potential cause, since I would assume we would see an error thrown during the copy operation, which we are not.

Any thoughts?
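One cheap way to rule the copy step in or out is to hash the file before and after the copy and fail loudly on any mismatch. This is a sketch under the example names above, not the poster's actual code:

```python
import hashlib
import shutil

def copy_verified(src, dst):
    """Copy src to dst, then raise if the bytes differ."""
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    shutil.copyfile(src, dst)
    if digest(src) != digest(dst):
        raise IOError("copy of %s to %s did not verify" % (src, dst))
```

If `copy_verified("ABC.txt", "./Processing/ABC.13")` never raises but the BULK INSERT still fails intermittently, the copy step can be ruled out.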
mivey4
Yak Posting Veteran
66 Posts |
Posted - 2009-10-02 : 09:14:58
OK, here are some other things to consider.

Is your BULK INSERT process pulling from the same server the file resides on, or across the network? Intermittent hiccups on the network can cause packets to drop and skew the integrity of the data.

When your file gets copied, is it a raw copy from point A to point B (i.e., no encoding changes occur when the file is copied)? Slight variances in file encoding between the source and the destination won't be detected by the naked eye but can sometimes make a significant difference within the file. Though I would think that if this were the case the failure would be consistent for the same file, I've seen stranger things.

Can you provide a sample copy of one of the files and the format file you're using for the BULK INSERT, or is the data sensitive? If not, you may e-mail the files to me at mivey @c-sg.com and I'd be happy to take a look.

The format files can sometimes be tweaked to rule out the possibility of them being the culprit. I've done this in the past, making changes to things like the native SQL datatypes, and it's worked in some cases.
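On the encoding point: one invisible difference a copy can introduce is a byte-order mark at the start of the file, which would shift the first column's bytes. A small sketch (invented function name) that sniffs for common BOMs:

```python
# Common byte-order marks, longest first so UTF-8 is checked
# before the two-byte UTF-16 marks.
BOMS = [
    (b"\xef\xbb\xbf", "utf-8-sig"),
    (b"\xfe\xff", "utf-16-be"),
    (b"\xff\xfe", "utf-16-le"),
]

def sniff_bom(path):
    """Return the name of any byte-order mark at the start of the
    file, or None if the file starts with plain data."""
    with open(path, "rb") as f:
        head = f.read(3)
    for bom, name in BOMS:
        if head.startswith(bom):
            return name
    return None
```

Comparing `sniff_bom` output (and file sizes) for the source file and the staged copy would confirm whether the copy is byte-for-byte raw.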
x2012x
Starting Member
4 Posts |
Posted - 2009-10-02 : 09:28:28
I agree the format file is likely not the problem, since it never changes and no other processes access it.

The flat file and format file are on the same local disk of the server the SQL Server instance resides on.

The data is sensitive, so I doubt I'll be able to send it. We are going to investigate a potential problem with the copy operation, since that is the only thing that seems to change between executions.

Thanks for your input, and I'll make sure to post our findings.
mivey4
Yak Posting Veteran
66 Posts |
Posted - 2009-10-02 : 09:49:32
Fair enough, and good luck!
x2012x
Starting Member
4 Posts |
Posted - 2009-10-02 : 13:38:40
Some additional info on our process: when the failure is encountered, the flat file being loaded from ./Processing is moved to a ./Failed directory.

Example:
========
1. ./ABC.txt is ready to be processed.
2. Our code copies ABC.txt to a staging area for processing (example: ./Processing/ABC.13).
3. The BULK INSERT is performed on ./Processing/ABC.13 but fails.
4. The failure is caught in our code, ./Processing/ABC.13 is moved to ./Failed/ABC.txt, and the original ./ABC.txt is deleted.
========

Given this, it doesn't look like we are experiencing corrupt file copies. When the BULK INSERT error is encountered and the flat file is moved to ./Failed, we can re-run the ./Failed/ABC.txt file successfully. If the copy operation were producing corrupt files in ./Processing, that corruption would carry over into the files moved to ./Failed.

Thanks again for any input.
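The four steps above can be sketched as follows. This is an illustration of the flow, not the poster's actual code: the function names and the `.13` staging suffix follow the example, and `bulk_insert` stands in for the real BULK INSERT that runs inside SQL Server.

```python
import os
import shutil

def process_file(src, staging_dir, failed_dir, bulk_insert):
    """Copy src to staging, attempt the load, and on failure move
    the staged copy to the failed directory for later re-runs."""
    base = os.path.basename(src)
    staged = os.path.join(staging_dir,
                          os.path.splitext(base)[0] + ".13")
    shutil.copyfile(src, staged)          # step 2: stage the file
    try:
        bulk_insert(staged)               # step 3: attempt the load
    except Exception:
        # Step 4: preserve the staged copy under its original name.
        shutil.move(staged, os.path.join(failed_dir, base))
        os.remove(src)
        raise
    os.remove(staged)
    os.remove(src)
```

Since re-running the preserved ./Failed copy succeeds, whatever corrupts the load would have to be transient at BULK INSERT time rather than baked into the staged bytes.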