SQL Server Forums
 Article: Using BULK INSERT to Load a Text File

AskSQLTeam
Ask SQLTeam Question

USA
0 Posts

Posted - 03/18/2001 : 18:23:52
This example combines dynamic SQL, BULK INSERT and the proper handling of double-quotes to solve a client's problem with loading various text file formats into a database.

Article Link.
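For flavor, here is a minimal sketch of the dynamic-SQL half of the technique; the staging table name, file path, and terminators below are placeholders, not the article's exact code:

-- Build the BULK INSERT statement at run time so the file name
-- and terminators can vary from file to file
DECLARE @file varchar(255), @sql varchar(1000)
SET @file = 'C:\import\data.txt'
SET @sql = 'BULK INSERT Staging FROM ''' + @file + ''''
         + ' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
EXEC (@sql)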

Anonymous
Starting Member

0 Posts

Posted - 04/06/2001 : 06:25:12
file header

I want to BULK INSERT a file into my database whose first row is in a different format from the rest of the file.

The problem is that I seem to have to put a blank row after the header row, with the same format as the rest, for the rest of the file to be inserted; otherwise BULK INSERT skips a row. BULK INSERT seems to check the format of the preceding row when deciding whether a row should be inserted. Is there another way around this than putting a blank row after the header? Is it a bug or a feature?

/Kajsa
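For reference, BULK INSERT has a FIRSTROW option that skips leading rows outright, which avoids the blank-row workaround. A sketch, with hypothetical table and file names:

BULK INSERT MyTable FROM 'C:\import\data.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')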


igor
Starting Member

0 Posts

Posted - 04/16/2001 : 15:36:06
Bulk Insert

How about inserting a file with non-uniform delimiters? Does anything work short of using a format file?
Example:
"B",12454,"CAINE","TOOL",1989,"AK"
"A",45896,"FURM","TOOL",2001,"FG"


Garth
SQLTeam Author

USA
119 Posts

Posted - 04/17/2001 : 13:54:00
Non-Uniform

Import everything between the commas and strip out what you don't need.

Garth
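A sketch of what that might look like, assuming a staging table with varchar columns sized for igor's sample data (all names here are made up for illustration):

-- Pull every field in as raw text, quotes included
BULK INSERT StagingRaw FROM 'C:\import\data.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- Then strip the quote characters from the character columns
UPDATE StagingRaw
SET RecType   = REPLACE(RecType, '"', ''),
    LastName  = REPLACE(LastName, '"', ''),
    FirstName = REPLACE(FirstName, '"', ''),
    State     = REPLACE(State, '"', '')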


cs
Starting Member

1 Posts

Posted - 04/25/2001 : 06:24:09
Timeout

Hi there.

I've tried the bulk copy feature as you've written it. It works perfectly!

The only problem is that when I try to bulk copy more than 150,000 rows (say 200,000 rows of 6 columns), I get a timeout like this:

Microsoft OLE DB Provider for ODBC Drivers error '80040e31'

[Microsoft][ODBC SQL Server Driver]Timeout expired

I use an ASP script to execute the stored procedure I've made, and I'm running SQL Server 7.0.


cs
Starting Member

1 Posts

Posted - 04/25/2001 : 08:43:38
Timeout - PART 2

It seems like it's not the bulk copy itself that times out, because all the data is saved in the temp table.

It's the insert statement which times out.

The strange thing is that it doesn't time out when I execute it in Enterprise Manager, only when I execute it from ASP.

Please help!


cs
Starting Member

1 Posts

Posted - 04/28/2001 : 10:59:10
Timeout - Part 3

Fixed the problem.

If you call the stored procedure from an ASP page, make sure your script timeout is large enough (Server.ScriptTimeout).

Your ADO command timeout needs to be set higher as well, so your DB connection won't time out either (Connection.CommandTimeout).

Hope this helps people who have the same problem I had.
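In ASP terms the fix looks roughly like this (a sketch; the timeout values, connection string, and procedure name are placeholders):

<%
Server.ScriptTimeout = 600          ' give the page itself 10 minutes

Set cn = Server.CreateObject("ADODB.Connection")
cn.ConnectionTimeout = 30           ' seconds allowed to open the connection
cn.CommandTimeout = 600             ' seconds allowed for the procedure to run
cn.Open strConn                     ' strConn = your connection string
cn.Execute "EXEC LoadTextFile"
%>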


ppynrde
Starting Member

2 Posts

Posted - 06/01/2001 : 10:23:21
Bulk insert

What is the best way of importing using the BULK INSERT command if there is a comma between the text delimiters?

e.g. if one line is like:

"Edwards, Rich",1978,22

This is really bugging me, so any help would be appreciated.

robvolk
Most Valuable Yak

USA
15658 Posts

Posted - 06/01/2001 : 10:41:07
quote:

Bulk insert

What is the best way of importing using the BULK INSERT command if there is a comma between the text delimiters?

e.g. if one line is like:

"Edwards, Rich",1978,22



The best way is to not use a comma as a delimiter. Technically, a delimiter is a character that is never data, at all, ever, so any occurrence of the delimiter character can (and will) be discarded.

If you have commas in your data that you want to keep, use something else as a delimiter, like a tab, vertical pipe (|), or tilde (~).

I've had no problems with a tab-delimited, no quotes, carriage return/line feed text format. IMHO this is the best text format to use.
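With that format the statement stays simple (table and file names here are hypothetical):

BULK INSERT MyTable FROM 'C:\import\data.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')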


ppynrde
Starting Member

2 Posts

Posted - 06/01/2001 : 10:46:48
Thanks, but the problem is that .csv is how the data is exported to me. Is there any way of using a format file to get around this?


robvolk
Most Valuable Yak

USA
15658 Posts

Posted - 06/01/2001 : 11:45:50
You can import it into an Access database; Access handles quotes and embedded commas. You can then transfer it straight from Access or export it into a better text format.

Don't know if this is an option for you, but if someone sends me data in a format I can't use, I ask them to change formats. If they refuse, I let it be known that I can't do my job because the vendor/data provider isn't cooperating :) Unless they're using ancient software, it should be no problem for them to alter their format to your specs.


phunkydude
Starting Member

Namibia
3 Posts

Posted - 06/04/2001 : 14:31:33
quote:

You can import it into an Access database; Access handles quotes and embedded commas. You can then transfer it straight from Access or export it into a better text format.

Don't know if this is an option for you, but if someone sends me data in a format I can't use, I ask them to change formats. If they refuse, I let it be known that I can't do my job because the vendor/data provider isn't cooperating :) Unless they're using ancient software, it should be no problem for them to alter their format to your specs.


Another, maybe easier, approach would be to massage the BULK INSERT format file. Try using the 'native' DATAFILETYPE and changing the column definitions in the format file.

naveen
Starting Member

1 Posts

Posted - 06/07/2001 : 03:53:46
How do I skip columns when loading the database using BULK INSERT?

I want to load only 3 columns out of 6 using BULK INSERT from a text file.

I am using SQL Server 7.0.

phunkydude
Starting Member

Namibia
3 Posts

Posted - 06/07/2001 : 10:33:28
quote:

How do I skip columns when loading the database using BULK INSERT?

I want to load only 3 columns out of 6 using BULK INSERT from a text file.

I am using SQL Server 7.0.



The format file defines the column mapping between the text file and the table. For each field you want to exclude from the import (or that has no matching table column), set the server column number to zero.
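As a sketch, a SQL Server 7.0 format file for a comma-delimited file with six fields, loading only fields 1, 3, and 5 into a three-column table (field names, lengths, and paths are hypothetical):

7.0
6
1   SQLCHAR   0   50   ","      1   ColA
2   SQLCHAR   0   50   ","      0   Skip1
3   SQLCHAR   0   50   ","      2   ColB
4   SQLCHAR   0   50   ","      0   Skip2
5   SQLCHAR   0   50   ","      3   ColC
6   SQLCHAR   0   50   "\r\n"   0   Skip3

Then point BULK INSERT at it:

BULK INSERT MyTable FROM 'C:\import\data.txt'
WITH (FORMATFILE = 'C:\import\data.fmt')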



vkul
Starting Member

2 Posts

Posted - 09/24/2001 : 17:26:48
Hi there

I have a CSV file with headers which has to be imported into a SQL Server table. What option/command do I use with bcp to ignore the headers?
Thank you.
quote:

file header

I want to BULK INSERT a file into my database whose first row is in a different format from the rest of the file.

The problem is that I seem to have to put a blank row after the header row, with the same format as the rest, for the rest of the file to be inserted; otherwise BULK INSERT skips a row. BULK INSERT seems to check the format of the preceding row when deciding whether a row should be inserted. Is there another way around this than putting a blank row after the header? Is it a bug or a feature?

/Kajsa





robvolk
Most Valuable Yak

USA
15658 Posts

Posted - 09/24/2001 : 17:39:51
You can add the -F (note the capital F) parameter to specify the first row that should be imported. -F2 would begin the import on line 2 (skipping line 1, in other words).

Books Online has all the documentation on the bcp utility.
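A minimal command-line sketch (the database, table, server, and file names are hypothetical):

bcp MyDb..MyTable in C:\import\headers.csv -c -t, -F2 -SMYSERVER -T

Here -c imports in character mode, -t, makes the comma the field terminator, -F2 starts at row 2, and -T uses a trusted connection. The BULK INSERT equivalent of -F is the FIRSTROW option.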


ddoremon
Starting Member

USA
1 Posts

Posted - 02/05/2002 : 13:04:20
Ask cs, re: Timeout - Part 3

I call a stored procedure from ASP. My Server.ScriptTimeout is the default 90 seconds. I'm testing the ADO timeout with
Connection.CommandTimeout = 1
Command.CommandTimeout = 1

But my runrs keeps running until IIS times out. I would think ADO should time out first... that's what I want. Can you help me?

Thanks a bunch


robvolk
Most Valuable Yak

USA
15658 Posts

Posted - 02/06/2002 : 11:32:25
There's no reason to set CommandTimeout to 1 second; if anything, that will aggravate any timeout problems you're having. Set them back to 30 or 60 seconds and see if the timeouts go away.

Are you using a loop with rs.MoveNext and Response.Write inside the loop? I guarantee that if you are, that is where your timeouts are occurring.

Take a look at these links:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=11019
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=11833

I am a HUUUUUUUUUUUUUUUUUUUUUGE proponent of using GetString and GetRows in ASP pages. I heartily encourage you to play around and learn how to use them; it is probably the best way to create high-performing ASP pages with heavy data requirements, and definitely worth your time. Follow all of the included links as well.
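A flavor of GetString, in case you haven't seen it (the connection cn and the column/table names are made up):

<%
' Fetch the rows, then build the whole HTML table body in one string
Set rs = cn.Execute("SELECT Name, Year FROM MyTable")
s = rs.GetString(2, , "</td><td>", "</td></tr><tr><td>", "&nbsp;")   ' 2 = adClipString
' GetString appends the row delimiter after the last row, so trim it off
s = Left(s, Len(s) - Len("</td></tr><tr><td>"))
Response.Write "<table><tr><td>" & s & "</td></tr></table>"
%>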


Steve@CSC
Starting Member

6 Posts

Posted - 07/29/2002 : 17:30:33
quote:

You can import it into an Access database; Access handles quotes and embedded commas. You can then transfer it straight from Access or export it into a better text format.

Don't know if this is an option for you, but if someone sends me data in a format I can't use, I ask them to change formats. If they refuse, I let it be known that I can't do my job because the vendor/data provider isn't cooperating :) Unless they're using ancient software, it should be no problem for them to alter their format to your specs.





I'm surprised that SQL Server doesn't handle this situation (importing a quoted text field with an embedded comma), since BOTH MS Access and FoxPro do... I've got the same situation, with limited options on the input file format. Another shame is that MS won't break the 2GB-per-table glass ceiling on VFP. If we're still forced to 'upsize' to SQL, this simple import functionality should at least be respected.

(Sorry for griping, but this is an old sore spot with me & MS...)

robvolk
Most Valuable Yak

USA
15658 Posts

Posted - 07/29/2002 : 20:46:38
Actually, you can do it; Garth has an article demonstrating it:

http://www.sqlteam.com/item.asp?ItemID=3207

And DTS can handle quotes too.

As for the 2GB table limit, that must be FoxPro only; SQL Server can handle far more. MS always seemed to treat FoxPro as a bastard child anyway; I got the impression they only included it in VS 6.0 to keep vocal FoxPro supporters from complaining. It was very obvious even back then that MS was gonna throw their chips into VB and VC++ exclusively... and they didn't even do that, they went .NET!
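The trick, roughly (a sketch, not the article's exact file): a format file whose terminators include the quote characters. For ppynrde's sample line "Edwards, Rich",1978,22 it might look like this, with hypothetical field names:

7.0
4
1   SQLCHAR   0   0     "\""     0   OpenQuote
2   SQLCHAR   0   100   "\","    1   Name
3   SQLCHAR   0   10    ","      2   Year
4   SQLCHAR   0   10    "\r\n"   3   Age

Field 1 swallows the opening quote and maps to server column 0, so it is thrown away; field 2 ends at the closing quote plus comma, which keeps the embedded comma intact.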


Steve@CSC
Starting Member

6 Posts

Posted - 07/30/2002 : 16:50:23
Thanks, but that wouldn't work in this case. The problem is that the comma field delimiter may sit between nulls or numerics too, and the only time character data HAS double quotes around it is when it contains an embedded comma.

Example:
Line1,7,,Happy Days,Formica,ThisLineIsFine
Line2,14,SomeText,Some More,"Twin, Evil",ThisLineChokesMeUp

SQL Server Forums © 2000-2009 SQLTeam Publishing, LLC