
 SQL BCP Error


JaseDytec
Starting Member

14 Posts

Posted - 2008-08-12 : 11:51:44
Hi all

I'm having problems using BCP to create a format file. I always end up with this error:

SQLState = 08001, NativeError = 2
Error = [Microsoft][SQL Native Client]Named Pipes Provider: Could not open a connection to SQL Server [2].
SQLState = HYT00, NativeError = 0
Error = [Microsoft][SQL Native Client]Login timeout expired
SQLState = 08001, NativeError = 2
Error = [Microsoft][SQL Native Client]An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.



I am running this on the local machine with SQL Server installed, but the machine does have two IP addresses. I don't think that is what's causing the error. How can I fix this?

Any help would be greatly appreciated, thanks.

SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2008-08-12 : 11:57:00
Create a text file manually.
Put text in it like this:
9.0
4
1 SQLCHAR 0 7 "," 1 PersonID ""
2 SQLCHAR 0 100 "," 0 ExtraField SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 100 "," 2 FirstName SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 100 "\r\n" 3 LastName SQL_Latin1_General_CP1_CI_AS
and save the text file.
Remember to turn on the switch (-f) to use the format file.
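
For example (with a hypothetical table dbo.Person in a database called MyDb, and placeholder file and server names), the import command with the format-file switch could look something like this:

```shell
bcp MyDb.dbo.Person in c:\data.csv -f c:\person.fmt -S yourservername -T
```

(-f points bcp at the format file, -S names the server, and -T uses a trusted connection; use -U and -P instead for a SQL login.)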

Also suggested to you here
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=108581



E 12°55'05.25"
N 56°04'39.16"

JaseDytec
Starting Member

14 Posts

Posted - 2008-08-13 : 08:26:33
Cheers Peso. I have managed to get bcp to create a format file; I needed to put the server name in the command. But now, when using the format file in a BULK INSERT query, I get an "invalid column number in format file" error.
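
For anyone hitting the same thing: a format-file creation command with an explicit server name looks something like this (database, table, path, and server names below are placeholders):

```shell
bcp MyDb.dbo.ETI format nul -c -f c:\fmt.fmt -S yourservername -T
```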

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-13 : 08:34:18
Hi JaseDytec

Can you post your format file and the whole command string that you are trying to run bcp with?

(Change your server names / table names / passwords if that is a concern. I just want to see the exact syntax you are trying to use.)

Regards,

-------------
Charlie

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-13 : 08:35:19
Scratch that,

I didn't notice that you have moved on to a BULK INSERT.

Can you post your SQL code instead?

Thanks,

-------------
Charlie

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-13 : 08:42:04
A sample data file would be good also. (In fact this, along with your format file, is the most important thing to have.)

-------------
Charlie

JaseDytec
Starting Member

14 Posts

Posted - 2008-08-13 : 10:24:13
Thanks for the offer Charlie. I have now managed to get it to work in a simple case, but I still have a problem. Basically I am inserting 3 columns of CSV data into a SQL table which has 5 columns. I can do this, but I need to have a set value in one of the additional columns.

Here is my table

CREATE TABLE [dbo].[ETI](
[Device ID] [int] IDENTITY(1,1) NOT NULL,
[Vehicle ID] [int] NULL,
[Device Type] [varchar](50) NULL,
[Duration] [decimal](24, 12) NULL,
[Units] [varchar](15) NULL
) ON [PRIMARY]

This is my Query

-- Select Database

USE [database]
GO

--Add data
BULK
INSERT ETI
FROM 'filepath.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FORMATFILE = 'c:\fmt.fmt'

)

GO

The Format file is

9.0
3
1 SQLCHAR 0 50 "," 3 Device_Type Latin1_General_CI_AS
2 SQLCHAR 0 50 "," 4 Duration ""
3 SQLCHAR 0 15 "\n" 5 Units Latin1_General_CI_AS

The problem I face at the moment is that for the Vehicle ID I need to use a set value, for example 1, so that when I execute this query, each row's Vehicle ID will be 1. Then, when I want to add another vehicle's data to the table, I could easily run a similar bulk insert with Vehicle ID set to 2, and so on.

Thanks

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-13 : 10:36:01
I take it changing the import CSV file to add a column for vehicleId is out of the question? This looks like it might become an automated process.

Erm...

You *could* wrap the bulk insert in the following.



BEGIN TRANSACTION

-- get the highest current Id (0 if the table is empty)
DECLARE @latestValue INT
SELECT @latestValue = ISNULL(MAX([Device ID]), 0) FROM ETI

--Add data
BULK
INSERT ETI
FROM 'filepath.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FORMATFILE = 'c:\fmt.fmt'

)

UPDATE ETI SET [Vehicle ID] = 2 WHERE [Device ID] > @latestValue

COMMIT TRANSACTION


And change the update value to suit. I don't know if this is 100% safe, but I think so.



-------------
Charlie

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-13 : 10:42:08
OR, if you are using the vehicle_Id as some sort of ascending surrogate key, you could do this...

BEGIN TRANSACTION

-- get the highest current Id (0 if the table is empty)
DECLARE @latestValue INT
SELECT @latestValue = ISNULL(MAX([Device ID]), 0) FROM ETI

-- get the highest current vehicleID (0 if the table is empty)
DECLARE @vehicleId INT
SELECT @vehicleId = ISNULL(MAX([Vehicle ID]), 0) FROM ETI

--Add more data
BULK
INSERT ETI
FROM 'filepath.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FORMATFILE = 'c:\fmt.fmt'

)

-- new batch gets the next vehicle number
UPDATE ETI SET [Vehicle ID] = @vehicleId + 1 WHERE [Device ID] > @latestValue

COMMIT TRANSACTION


-------------
Charlie

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2008-08-13 : 11:13:17
or you could even do this with OPENROWSET


-- get the highest current vehicleID
DECLARE @vehicleId INT
SELECT @vehicleId = MAX([Vehicle ID]) FROM ETI

INSERT INTO ETI (vehicleid,...)
SELECT @vehicleId,* FROM OPENROWSET(...)
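
For example, a fuller (untested) sketch using the BULK option, with the same hypothetical file and format-file paths as earlier in the thread, might be:

```sql
-- get the next vehicle id (1 if the table is empty)
DECLARE @vehicleId INT
SELECT @vehicleId = ISNULL(MAX([Vehicle ID]), 0) + 1 FROM ETI

-- column names in the SELECT come from the format file, not the target table
INSERT INTO ETI ([Vehicle ID], [Device Type], [Duration], [Units])
SELECT @vehicleId, [Device_Type], [Duration], [Units]
FROM OPENROWSET(BULK 'filepath.csv',
                FORMATFILE = 'c:\fmt.fmt',
                FIRSTROW = 2) AS src
```

(Untested; the format file may need tweaking, since OPENROWSET builds its rowset from the format file rather than from the target table.)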

JaseDytec
Starting Member

14 Posts

Posted - 2008-08-13 : 11:17:39
WOW, that's brilliant Charlie. What took me hours of attempts took you seconds.

Really appreciate the help.

Jase




Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-13 : 11:20:18
That looks like a much better solution visakh16.

I've never used OPENROWSET before, but I like what I've read about it and usually forget it exists.

(I don't run into situations where I have to automate a bulk upload very often, which is probably why I know more about bcp than BULK INSERT.)

JaseDytec: Check out OPENROWSET in Books Online -- looks like a winner!


-------------
Charlie

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2008-08-13 : 11:40:17
quote:
Originally posted by Transact Charlie

That looks like a much better solution visakh16.

I've never used OPENROWSET before, but I like what I've read about it and usually forget it exists.

(I don't run into situations where I have to automate a bulk upload very often, which is probably why I know more about bcp than BULK INSERT.)

JaseDytec: Check out OPENROWSET in Books Online -- looks like a winner!


-------------
Charlie


Not sure how it performs in comparison with BULK INSERT, but it's worth trying out.

JaseDytec
Starting Member

14 Posts

Posted - 2008-08-13 : 11:56:35
Will do, it looks quite promising. Cheers guys.

JaseDytec
Starting Member

14 Posts

Posted - 2008-08-14 : 06:25:14
Hey guys, when trying to insert a large CSV file containing 68 columns and 20,000 rows, I get an error (using Charlie's method above):

Bulk load data conversion error (truncation) for row 2, column 68 (Longitude Min).

If I load a smaller file it works perfectly.


JaseDytec
Starting Member

14 Posts

Posted - 2008-08-14 : 06:57:21
It seems to cut off at 3548 rows. Would OPENROWSET work for large CSV files?

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-14 : 07:31:36
Hi Jase,

It doesn't look like a number-of-rows issue, and I'd be surprised if bulk import / bcp got into a tizz over a small file with 20,000 rows in it.

I think that message is saying that one of your values would be truncated. Have you checked out row 2 of the big file you are trying to import? Does the smaller file not have this data row?



-------------
Charlie

JaseDytec
Starting Member

14 Posts

Posted - 2008-08-14 : 08:05:24
The files are in an identical format; the larger ones just have more rows.
If I add this to the bulk insert:

LASTROW = 3500

it will work, but it only copies that amount of data, and it wouldn't be an effective method for a very large file.

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-14 : 08:58:20
Hi Jase,

I still think there may be a data item in that file wider than you've allowed for in your format file.

You can try,

1) A batch size in your bulk insert:
BATCHSIZE [ = batch_size ]

2) Running without a format file, using syntax like this:
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
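
For option 1, the batch size just goes in the same WITH clause, e.g. (the value 1000 is only an example):

```sql
BULK INSERT ETI
FROM 'filepath.csv'
WITH
(
    FIRSTROW = 2,
    BATCHSIZE = 1000,      -- commit in batches of 1000 rows
    FORMATFILE = 'c:\fmt.fmt'
)
```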

and give me a shout about how you get on.


-------------
Charlie

JaseDytec
Starting Member

14 Posts

Posted - 2008-08-15 : 03:54:41
I have tried the solutions. BATCHSIZE does not work, but the simple bulk insert does. So I'm guessing that something about a large CSV file affects the format file, which works fine for under 3500 rows.

Transact Charlie
Master Smack Fu Yak Hacker

3451 Posts

Posted - 2008-08-15 : 04:22:01
quote:
Originally posted by JaseDytec

I have tried the solutions. BATCHSIZE does not work, but the simple bulk insert does. So I'm guessing that something about a large CSV file affects the format file, which works fine for under 3500 rows.



Hi Jase,

If the simple bulk insert without your format file worked, that indicates there *is* a data item in your import file that is too big (too wide for the column definitions you specified in your format file), because without the format file it doesn't check.

Easiest way I've found to check is to do this:

1) Make a table with enough columns to hold every item in your file.

2) Import using the non-format-file method I posted.

3) Select from the table where the LEN of the field is longer than that specified in your format file.

Something like (this isn't tested so syntax might not be 100%)


CREATE TABLE _testing (
[field1] VARCHAR(8000)
, [field2] VARCHAR(8000)
, [field3] VARCHAR(8000)
)

BULK
INSERT _testing
FROM 'c:\yourfilename.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)

SELECT
[field1]
, LEN([field1])
, [field2]
, LEN([field2])
, [field3]
, LEN([field3])
FROM
_testing
WHERE
LEN([field1]) > 50
OR
LEN([field2]) > 50
OR
LEN([field3]) > 15

(I've taken the lengths from your posted format file).

I'm sure there must be an item in your datafile that isn't the right length.

Regards,

-------------
Charlie