
 SOLVED: Insert Large Quantity of Data in Single Qry


badman3k
Starting Member

15 Posts

Posted - 2009-05-06 : 12:21:47
All,

I'm looking for some pointers and advice on processing a rather large CSV file. There are approximately 6,000 rows, and if I take the character count it comes out at close to 490,000 characters.

My first thoughts were to take each row individually and insert it into the database. That was definitely possible, but very time consuming, especially when I'm doing the calls from PHP.

So then I thought, okay, what if I pass the whole file as a single variable, split on the newline, and then use exec() to insert the rows dynamically, as each column is comma separated.

That didn't go too well - the variable I was trying to pass exceeded the maximum length SQL Server would accept for the varchar parameter, and then Apache crashed, which didn't help.

What are your thoughts? Has anyone had similar experiences that they can share their advice on?

I'm really looking for a simple case of: Upload CSV file, read in file, add all entries to DB, return success/failure.

Any advice would be greatly appreciated.

Many thanks in advance,
Richard

sql_er
Constraint Violating Yak Guru

267 Posts

Posted - 2009-05-06 : 15:32:49
Is this a PHP question or a SQL Server question?

Do you need to do it only once or on a regular basis with different files?

badman3k
Starting Member

15 Posts

Posted - 2009-05-06 : 16:04:07
sql_er:

Sorry, the description of the problem was a little vague. It's an SQL Server problem: processing the file in PHP causes the Apache server to hang, and converting the file to an array eats too much memory and time, so it's neither practical nor feasible. Also, the only SQL queries I can call are stored procedures, as that's all the security on the database will allow me, but I can create as many stored procedures as I want.

It is something that I wish to do on a monthly basis.

I believe I've come up with a solution that's a middle ground between SQL Server and PHP, which is as follows:
PHP uploads the file to a specific location on the server.
The stored procedure on SQL Server runs a BULK INSERT from the file at that location and puts the data into a temp table.
The SP then modifies the data as necessary, putting the final results into the final table.
PHP gets notification that all went well, whereupon it moves the file to an 'archive' location for future use - none at present.
This way PHP doesn't have to kill itself processing the file, and SQL Server can process the data faster, as that's what it was designed for.

So far things look very promising; the only issues I'm coming across are data type mismatches, but I'm sure I can iron those out in time.

I'd like to thank you for taking the time to look.

For reference my code is as follows:
create table #tmp_mobile_usage (date varchar(20), time varchar(20), called varchar(30),
    mobile_id varchar(20), country varchar(3), key_word varchar(20),
    description varchar(100), usage varchar(15), cost varchar(20))

-- load the file into the temp table; FIELDTERMINATOR must match the file's actual delimiter
BULK INSERT #tmp_mobile_usage
FROM 'C:\wamp\www\csv\Apr-09.csv'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

-- capture both values in one statement; a separate SET would reset @@error before it was read
select @usage_err = @@error, @rowcount = @@rowcount

insert into mobile_usage (date, time, called, mobile_id, country, key_word, description, usage, cost)
select cast(date as datetime), time, called, mobile_id, country, key_word, description, usage, cost
from #tmp_mobile_usage
order by mobile_id asc, date asc, time asc

drop table #tmp_mobile_usage
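
Since this will run every month with a different file name, I'll probably need to wrap the above in a stored procedure that takes the file path as a parameter. BULK INSERT won't accept a variable for the file name, so the statement has to be built as a string and run with EXEC(). Here's a rough, untested sketch - the procedure name and parameter names are just placeholders:

create procedure usp_load_mobile_usage
    @file_path varchar(255),
    @rowcount int output,
    @usage_err int output
as
begin
    set nocount on

    create table #tmp_mobile_usage (date varchar(20), time varchar(20), called varchar(30),
        mobile_id varchar(20), country varchar(3), key_word varchar(20),
        description varchar(100), usage varchar(15), cost varchar(20))

    -- BULK INSERT won't take a variable as the file name, so build the statement dynamically;
    -- the temp table created above is still visible inside EXEC()
    declare @sql varchar(1000)
    set @sql = 'BULK INSERT #tmp_mobile_usage FROM ''' + @file_path + ''''
             + ' WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'')'
    exec (@sql)

    -- capture both in one statement (still need to confirm these survive the EXEC)
    select @usage_err = @@error, @rowcount = @@rowcount

    insert into mobile_usage (date, time, called, mobile_id, country, key_word, description, usage, cost)
    select cast(date as datetime), time, called, mobile_id, country, key_word, description, usage, cost
    from #tmp_mobile_usage
    order by mobile_id asc, date asc, time asc

    drop table #tmp_mobile_usage
end

PHP would then just call the procedure with the month's file path and read back the two output parameters to know whether the load worked.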

Thanks again.
Richard