
Huge Inserts Fail

indupriyav
Starting Member

13 Posts

Posted - 2013-12-05 : 05:13:59
A SQL proc takes care of decompressing a zip file and inserts data into tables from the files present in the decompressed zip file.

I have made some 10 attempts to upload data, but they failed. Last weekend it failed twice during decompression, and mostly it failed during huge record insertions into a couple of tables, after which the proc exited after logging the failure.

I had this problem before: loading the data can take a long time, and sometimes it fails during the huge insertions. After we increased the RAM, the uploads were faster.

Now I'm facing the issue again. The last upload completed, but I'm finding missing data in tables, without any "Upload failure" entry in the log for those particular tables. I haven't come across anything like this in the past 2.5 years of doing these uploads.

Please advise.




indupriyav

Kristen
Test

22859 Posts

Posted - 2013-12-05 : 05:54:56
We use BCP to bulk import data.
If at all possible we pre-sort the import file into Clustered Index order (and we provide a HINT to BCP that the data is ordered).

We use every error-reporting mechanism we know of to detect whether the process has failed.
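
For illustration, a minimal sketch of that kind of import (server, database, table, file path, and sort column are all hypothetical; -h "ORDER(...)" is BCP's hint that the file is already sorted, which lets SQL Server skip the sort when loading into the clustered index):

Exec [Master].[dbo].xp_CmdShell 'BCP MyDB.dbo.MyTable In C:\Import\MyTable.bcp -c -t"||" -r\n -SMyServerName -T -h "ORDER(MyKeyColumn ASC), TABLOCK"'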

indupriyav
Starting Member

13 Posts

Posted - 2013-12-05 : 06:17:46
The SQL-side upload proc runs for a long time. We can't change anything here. They do a BULK INSERT. So why might a BULK INSERT fail? Is it because of some memory corruption, or low memory?

indupriyav

Kristen
Test

22859 Posts

Posted - 2013-12-05 : 06:28:06
I would have thought it more likely because a Foreign Key value is not present, or the disk is full, or ... a host of such things.

If the process does not have every possible error trap then there is no way of knowing what is causing the problem.

Worth checking both SQL and EVENT logs in case anything was recorded.
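
From a Management Studio query window, the SQL log can be read with the undocumented xp_ReadErrorLog; for example, to search the current log for anything mentioning "BULK":

Exec [Master].[dbo].xp_ReadErrorLog 0, 1, N'BULK' -- 0 = current log, 1 = SQL log (2 = Agent log), then an optional search string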

But my approach is to "catch" the error and report it as fully as possible ... without good diagnostics there is no easy cure, only guesswork.
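
For illustration only, a minimal sketch of that sort of error trap, assuming a hypothetical dbo.UploadLog table (the table, column, and file names are made up):

BEGIN TRY
    BULK INSERT dbo.MyTable_Current
    FROM 'C:\Import\MyTable.bcp'
    WITH (FIELDTERMINATOR = '||', ROWTERMINATOR = '\n', MAXERRORS = 0, TABLOCK)
    -- read @@ROWCOUNT immediately; any intervening statement resets it
    INSERT INTO dbo.UploadLog (TableName, RowsInserted, ErrorText)
    VALUES ('MyTable', @@ROWCOUNT, NULL)
END TRY
BEGIN CATCH
    INSERT INTO dbo.UploadLog (TableName, RowsInserted, ErrorText)
    VALUES ('MyTable', 0,
            'Error ' + CONVERT(varchar(20), ERROR_NUMBER()) + ': ' + ERROR_MESSAGE())
END CATCH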

Do you want to post the code in the Proc and folk here make suggestions?

indupriyav
Starting Member

13 Posts

Posted - 2013-12-05 : 07:32:49
We have something like this.


Select @BCP = 'BCP ##' + @Table + '_' + DB_Name() + ' Out ' + @Zip_File_Location + @Table + '.bcp -c -t"||" -r\n -S' + @@ServerName + ' -T '


'Exec [Master].[dbo].xp_CmdShell ''' + @BCP + ''', No_Output ' +

'BULK INSERT ' + @Schema_Table + '_Current] FROM ''' + @Zip_File_Location + @Table + '.bcp'' ' +
'WITH ( ' +
'FIELDTERMINATOR = ''||'', ' +
'ROWTERMINATOR = ''\n'', ' +
'MAXERRORS = 0, ' +
'TABLOCK ' +
') ' +

And they are logging the @@ROWCOUNT value, which shows as '0' instead of the number of records.

So why does the BULK INSERT fail here? Is it failing to read from the data file due to low disk space, low RAM, or something else? Where can I check the SQL logs? I connect to the SQL Server DB through Management Studio, so I guess I can't take a look at the event logs.


indupriyav

Kristen
Test

22859 Posts

Posted - 2013-12-05 : 07:48:58
I would add (to the BCP command)

-o OutputFilename

parameter so you get a file of the Output, if any. That may well contain a useful error message.

Also redirect any BCP output to another file by appending

>>MyErrorFilename

But I'm not really sure what you are trying to do. If I have understood correctly, this is what you are trying to EXEC:

Exec [Master].[dbo].xp_CmdShell 'BCP ##MyTable_MyDBName Out C:\MyZip_File_Location\MyTable.bcp -c -t"||" -r\n -SMyServerName -T'
, No_Output
BULK INSERT [MySchema_Table_Current]
FROM 'C:\MyZip_File_Location\MyTable.bcp'
WITH ( FIELDTERMINATOR = '||', ROWTERMINATOR = '\n', MAXERRORS = 0, TABLOCK )

Are you trying to export from one database and import into another all in one command? Or have I misunderstood?

Why not use BCP for the import as well as the export? You could then use the "-o" parameter for an output file on the import too, as well as redirecting any screen output to an error file.
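
For example (paths hypothetical): -o captures BCP's normal screen output, and BCP's own -e switch writes any rows it could not load to an error file:

Exec [Master].[dbo].xp_CmdShell 'BCP MyDB.dbo.MyTable_Current In C:\MyZip_File_Location\MyTable.bcp -c -t"||" -r\n -SMyServerName -T -o C:\Logs\MyTable.out -e C:\Logs\MyTable.err'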

indupriyav
Starting Member

13 Posts

Posted - 2013-12-05 : 08:08:24
I can't change anything. I just want to know what is causing the insert failure. The proc does the following:

- Decompresses a zip file which has a lot of files with the extension .bcp.
- For instance, it reads from sample1.bcp and inserts records into the sample1 table in a SQL DB.
- Likewise, it reads from sample2.bcp and inserts into the sample2 table in the SQL DB, etc.
- As per the logs, when it reads from sample1.bcp and inserts the lines into the sample1 table, it fails with 0 records inserted.

Under what conditions would the reading from the sample1.bcp file into the sample1 table fail? That is what I'm looking for.

indupriyav


Kristen
Test

22859 Posts

Posted - 2013-12-05 : 08:49:39
quote:
Originally posted by indupriyav


For instance it reads from sample1.bcp and inserts records into sample1 table in a SQL db.


The PROC appears to be using BCP for export and BULK INSERT for import, rather than just importing (after the unzip)?

I suggest you run an experiment: import a file that has previously failed using BCP, adding the OutputFile and ErrorFile suggestions I made, and see whether that captures any errors when you get zero rows.

quote:
On what condition the reading from sample1.bcp file into sample1 table fail? That is what Im looking for.


Add more/better diagnostics, and then it is likely you will find the cause.

indupriyav
Starting Member

13 Posts

Posted - 2013-12-06 : 06:26:36
Pardon me, I have only recently started working with SQL Server. Three things:

1. You mentioned a full disk in your message.

The proc reads from a zip file in the folder \\SQLBox\Data and dumps the data into tables.

Which disk would be full: \\SQLBox\Data, or the disk where the table/database is mounted?

I'm finding little free space on \\SQLBox\Data.


2. In this code snippet, I believe this is what happens:

Select @BCP = 'BCP ##' + @Table + '_' + DB_Name() + ' Out ' + @Zip_File_Location + @Table + '.bcp -c -t"||" -r\n -S' + @@ServerName + ' -T '


'Exec [Master].[dbo].xp_CmdShell ''' + @BCP + ''', No_Output ' +

'BULK INSERT ' + @Schema_Table + '_Current] FROM ''' + @Zip_File_Location + @Table + '.bcp'' ' +
'WITH ( ' +
'FIELDTERMINATOR = ''||'', ' +
'ROWTERMINATOR = ''\n'', ' +
'MAXERRORS = 0, ' +
'TABLOCK ' +
') ' +

It does a BULK INSERT into mytable_Current from mytable.bcp in the location specified by @Zip_File_Location.

What does @BCP do? They create an error zip file of failed records. Is that what @BCP does?

3. Now the process inserts "0 records" into big tables. Before that, it gave this error and terminated:

"Bulk Insert Failure: Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)"."

What does this error mean?

indupriyav

Kristen
Test

22859 Posts

Posted - 2013-12-06 : 06:41:33
Regarding 1, 2, and 3:

You need to log more diagnostic information, as I have explained.

You also need to log the value in @BCP so you can see exactly what command is being issued, and likewise the command being sent to xp_CmdShell. Personally I would prepare that in a @WorkingVariable and then EXEC it; that way the exact contents of the working variable can be output for logging. The way it is currently, you have to concatenate the various pieces manually to work out what the whole command is, and that is error prone IME.
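
For illustration, a minimal sketch of that pattern, reusing the variables from the snippet you posted (dbo.UploadLog is a hypothetical log table):

Declare @BCP varchar(8000)

Select @BCP = 'BCP ##' + @Table + '_' + DB_Name() + ' Out ' + @Zip_File_Location + @Table + '.bcp -c -t"||" -r\n -S' + @@ServerName + ' -T'

-- log the exact command before running it, so a failure can be reproduced by hand
Insert Into dbo.UploadLog (LoggedAt, Command) Values (GetDate(), @BCP)

Exec [Master].[dbo].xp_CmdShell @BCP, No_Output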

It's a waste of your time and mine just guessing which disk might be full, why there are 0 rows inserted, or what errors might be occurring. Just put some logging in place and then you will be able to see the error message.

If you cannot alter the code, then get the people who provided it to add the logging. I would not accept the code they have provided without better logging, and I would expect them to rectify the code without any additional cost.
   
