AskSQLTeam
Ask SQLTeam Question
0 Posts
Posted - 2002-04-23 : 13:02:02
Scott writes: "I have to load large delimited files into tables numerous times per day. I set up a stored procedure similar to the one in your article on using BULK INSERT for text files. Are there limitations on the size of the files that can be loaded with this command? I have not yet worked with DTS, but may have to soon.

I have a file with 65000 rows of data. I set the BATCHSIZE to 500 and get to within the last 20 rows before I get an Error 7399. Is my file too big, or am I missing a parameter that needs to be set? I am pretty new to this, so please bear with me.

Thanks, Scott"
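For readers following along, a wrapper of the kind Scott describes might look like this. This is a sketch only: the procedure, table, and parameter names are illustrative, not Scott's actual code, and dynamic SQL is used because BULK INSERT cannot take its file name from a variable directly.

```sql
CREATE PROCEDURE usp_LoadDelimitedFile
    @FilePath varchar(255)
AS
    -- BULK INSERT cannot read its file name from a variable,
    -- so build and execute the statement dynamically.
    DECLARE @sql varchar(1000)
    SET @sql = 'BULK INSERT tmpImportTable FROM ''' + @FilePath + ''''
             + ' WITH (FIELDTERMINATOR = '','', BATCHSIZE = 500)'
    EXEC (@sql)
```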
graz
Chief SQLTeam Crack Dealer
4149 Posts
Posted - 2002-04-23 : 13:02:02
BULK INSERT will easily handle 65,000 rows of data; I use it to load much more than that every day. Was there any additional text with the error message? Checking the MS support site (always a good first place to start) for error 7399 yielded results about linked tables. Are you loading into a SQL Server table?
drman
Starting Member
38 Posts
Posted - 2002-04-23 : 13:15:42
I am pulling the information from the comma-delimited text file and importing it into a temporary table. The sample data that I am using contains 65536 records. I think I am missing one of the parameters, because if I don't set the BATCHSIZE, I get the error message and no imported data. If I set the BATCHSIZE to 30000, I get the error message after 60000 records have been imported. If I set the BATCHSIZE to 2, I get all of the records and no error message. I must be missing something like the ROWTERMINATOR parameter.

Here is my BULK INSERT statement:

    BULK INSERT tmpImportTable
    FROM 'C:\media manager\data\001803\sample.csv'
    WITH (
        BATCHSIZE = 2,
        FIELDTERMINATOR = ','
    )

Thanks in advance,
drman
dsdeming
479 Posts
Posted - 2002-04-23 : 14:13:57
What you've described is a situation where the procedure fails any time the row count isn't evenly divisible by the batch size. Try setting the row terminator in your BULK INSERT statement. Also take a look at the end of the input file for extra carriage returns and line feeds.
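Applied to drman's statement from above, that suggestion might look like this (a sketch: the BATCHSIZE value here is arbitrary, and '\n' assumes the file uses conventional newline-terminated rows):

```sql
BULK INSERT tmpImportTable
FROM 'C:\media manager\data\001803\sample.csv'
WITH (
    BATCHSIZE = 500,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'  -- explicit end-of-row marker
)
```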
drman
Starting Member
38 Posts
Posted - 2002-04-23 : 14:39:40
That appears to have solved the problem. I tried ROWTERMINATOR, and it did not make a difference. I then deleted everything at the end of the last data line, and that worked. The CSV file was created from Excel. How do I prevent that in the future?

Thanks for the help. Great site and effort.
Scott
MichaelP
Jedi Yak
2489 Posts
Posted - 2002-04-23 : 19:33:42
Maybe you can create a program or DLL that can "clean up" said CSV files before your bulk insert runs. To run EXEs, look up xp_cmdshell in Books Online. To call DLLs, look up sp_OACreate in Books Online.

Michael
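The shape of that approach, as a sketch: CleanCsv.exe below is hypothetical and stands in for whatever cleanup program you write; the table and path are drman's from earlier in the thread.

```sql
-- CleanCsv.exe is hypothetical: a utility you would write yourself
-- to strip trailing blank lines before the load runs.
EXEC master..xp_cmdshell 'C:\tools\CleanCsv.exe "C:\media manager\data\001803\sample.csv"'

BULK INSERT tmpImportTable
FROM 'C:\media manager\data\001803\sample.csv'
WITH (FIELDTERMINATOR = ',')
```

Note that xp_cmdshell runs the command on the server itself, so the cleanup program and the file path must exist on the server, not the client.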