
All Forums » SQL Server 2005 Forums » Transact-SQL (2005) » Large import


Vinnie881
Master Smack Fu Yak Hacker

1231 Posts

Posted - 2007-05-11 : 01:42:33
I am importing data into my application from another one, which requires me to use a cursor (I must run numerous stored procedures for each record in order to properly import the information into the new application; a cursor was my easiest option to accomplish this).

Each record takes about 1/2 a second to import. However, there are 56,000 records. Does anyone see an issue with just letting the cursor run through all 56,000 records? Should I break it down into smaller groups?
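For reference, the cursor approach described above might look something like the sketch below. The table (`ImportStaging`) and procedure (`dbo.usp_ImportRecord`) names are hypothetical, and the progress `PRINT` is just one way to keep an eye on a long run:

```sql
-- Sketch only: ImportStaging and dbo.usp_ImportRecord are assumed names
DECLARE @Id INT, @Done INT
SET @Done = 0

DECLARE import_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT RecordId FROM dbo.ImportStaging ORDER BY RecordId

OPEN import_cur
FETCH NEXT FROM import_cur INTO @Id

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Runs the per-record import logic
    EXEC dbo.usp_ImportRecord @RecordId = @Id

    SET @Done = @Done + 1
    IF @Done % 1000 = 0
        PRINT 'Imported ' + CAST(@Done AS VARCHAR(10)) + ' records'

    FETCH NEXT FROM import_cur INTO @Id
END

CLOSE import_cur
DEALLOCATE import_cur
```

`FAST_FORWARD` keeps the cursor read-only and forward-only, which is about as cheap as a T-SQL cursor gets.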

Thanks.

harsh_athalye
Master Smack Fu Yak Hacker

5581 Posts

Posted - 2007-05-11 : 01:54:56
Why don't you import all the data into a staging table first, do all your validations on the data (avoiding a cursor if possible), and then move the corrected data to the final table? I think this is a standard approach.
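A minimal sketch of that pattern, assuming hypothetical `CustomerStaging`/`Customer` tables and a CSV source file, where the validation and the final move are both set-based:

```sql
-- Load the raw file into the staging table (assumed path and format)
BULK INSERT dbo.CustomerStaging
FROM 'C:\import\customers.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)

-- Set-based validation: flag bad rows instead of cursoring over them
UPDATE dbo.CustomerStaging
SET    IsValid = 0
WHERE  CustomerName IS NULL
   OR  LEN(LTRIM(RTRIM(Email))) = 0

-- Move only the rows that passed validation to the final table
INSERT INTO dbo.Customer (CustomerName, Email)
SELECT CustomerName, Email
FROM   dbo.CustomerStaging
WHERE  IsValid = 1
```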

Harsh Athalye
India.
"The IMPOSSIBLE is often UNTRIED"

Vinnie881
Master Smack Fu Yak Hacker

1231 Posts

Posted - 2007-05-11 : 02:31:13
The reason I did not do this is that there are roughly 60 tables that all need to be updated/inserted in a specific order. I could write all this code, but it was far easier to use a cursor and simply run the needed stored procedures. Considering this is a one-time process, it was the easier approach to accomplish what I wanted.

Kristen
Test

22859 Posts

Posted - 2007-05-11 : 04:47:36
We have this too. PITA. We specifically use the sprocs to do the inserts so that any future changes to logic can be centralised in the sprocs (and all our clients' various bulk import processes will take advantage of the revised logic).

But it's jolly slow!

We don't bother with a cursor though; we import into a staging table, select the columns from each record into @Variables, pass them to the sproc, and then loop round for the next record.

Not sure this is any different, performance-wise, to a Cursor, but it may have less resource overhead.
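That cursorless loop could be sketched as below. The staging table, its `RecordId` key, the columns, and the sproc name are all assumed for illustration:

```sql
-- Sketch only: dbo.ImportStaging keyed by RecordId, dbo.usp_ImportRecord assumed
DECLARE @RecordId INT, @Name VARCHAR(100), @Amount MONEY

SELECT @RecordId = MIN(RecordId) FROM dbo.ImportStaging

WHILE @RecordId IS NOT NULL
BEGIN
    -- Pull the current record's columns into variables
    SELECT @Name   = Name,
           @Amount = Amount
    FROM   dbo.ImportStaging
    WHERE  RecordId = @RecordId

    -- Pass them to the centralised import sproc
    EXEC dbo.usp_ImportRecord @Name = @Name, @Amount = @Amount

    -- Advance to the next key; no cursor is held open between iterations
    SELECT @RecordId = MIN(RecordId)
    FROM   dbo.ImportStaging
    WHERE  RecordId > @RecordId
END
```

With an index on `RecordId`, each `MIN(...) WHERE RecordId > @RecordId` seek is cheap, so this tends to perform similarly to a `FAST_FORWARD` cursor.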

EDIT: One thing we do do is remove all records from the staging table which have a zero-footprint update. The feeds we have tend to send us data when something unrelated has changed; we get rid of these before looping round the sproc that updates the main tables. No use for inserts, of course ...
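One way to sketch that zero-footprint cleanup (table and column names hypothetical) is to delete any staging row that is identical to the row already in the main table:

```sql
-- Sketch only: drop staging rows whose values already match the main table
DELETE S
FROM   dbo.ImportStaging AS S
       JOIN dbo.Customer AS C
         ON C.CustomerId = S.CustomerId
WHERE  C.CustomerName = S.CustomerName
AND    C.Email        = S.Email   -- extend the comparison to every imported column
```

Note the equality comparisons treat NULLs as "different"; for NULLable columns you would need explicit `(C.Col = S.Col OR (C.Col IS NULL AND S.Col IS NULL))` terms, or an `EXCEPT`-based comparison, to catch those rows too.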

Kristen
   
