 Table rename

lemondash
Posting Yak Master

159 Posts

Posted - 2006-08-03 : 10:23:50
Hi,

Need to insert 7,000,000 rows of data into a table every day.
I don't want to truncate and insert, because if the insert fails there is no rollback.
I'm thinking of building a staging table, inserting the data there, and switching the tables around by renaming them.
Does a table rename force all indexes to rebuild?

thanks


Srinika
Master Smack Fu Yak Hacker

1378 Posts

Posted - 2006-08-03 : 10:46:58
Just a few ideas:

* Why don't you insert the data checking for non-existence (using NOT EXISTS)?
* Why truncate? Do you need the data afresh from the source? How about an insert/update depending on existence?
* You could copy the destination table's data into a temp table, then truncate and insert; if the insert fails, restore the data from the temp table, else drop or truncate the temp table.
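The first two ideas might look something like this in T-SQL (a sketch only; `Destination`, `Staging`, and the `id` key column are hypothetical names, and SQL Server 2000 has no MERGE, so the "upsert" is an UPDATE followed by an INSERT):

```sql
-- Idea 1: insert only the rows that are not already present.
INSERT INTO Destination (id, payload)
SELECT s.id, s.payload
FROM Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM Destination AS d WHERE d.id = s.id);

-- Idea 2: insert/update depending on existence.
-- Update rows that already exist...
UPDATE d
SET d.payload = s.payload
FROM Destination AS d
JOIN Staging AS s ON s.id = d.id;

-- ...then insert the ones that don't.
INSERT INTO Destination (id, payload)
SELECT s.id, s.payload
FROM Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM Destination AS d WHERE d.id = s.id);
```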

Srinika

lemondash
Posting Yak Master

159 Posts

Posted - 2006-08-03 : 11:33:21
Hi,

Yes, I need to repopulate from fresh, and I'm trying to find the fastest way to do it.
The table needs a clustered index, but inserting 7,000,000 rows into the live table is slow with the index in place, and drop index / insert data / create index is also slow.
I thought the best way would be to create a parallel table and switch the names, but I was worried that a rename would rebuild the indexes anyway.

thanks

eyechart
Master Smack Fu Yak Hacker

3575 Posts

Posted - 2006-08-03 : 13:45:06
I don't think a table rename forces a rebuild of indexes. You can find out by doing a quick test :) You will probably want to insert into a table that is minimally indexed in the first place, so that might not be much of an issue anyway.

Maybe just a clustered index (if this table has one) on your staging table to start, adding additional indexes after the load. I say that because if you later add a clustered index, all the data from the table will have to be moved into it - you might as well start with the clustered index and avoid the additional work.

Anyway, the table-rename logic seems to be a valid way to do this.

1. create new staging table with minimal indexes
2. insert data
3. if data insert is successful, drop old table and rename staging to match
4. build additional indexes

steps 3 and 4 can probably be swapped around now that I think about it.
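The four steps above could be sketched like this (table, column, and index names are hypothetical; note that sp_rename only changes metadata, so the rename itself should not rebuild anything):

```sql
-- 1. Create the staging table with minimal indexes
--    (just the clustered index, per the advice above).
CREATE TABLE dbo.Reporting_Staging (
    id      INT         NOT NULL,
    payload VARCHAR(50) NOT NULL
);
CREATE CLUSTERED INDEX IX_Reporting_Staging ON dbo.Reporting_Staging (id);

-- 2. Insert the data (BULK INSERT, DTS, or INSERT ... SELECT).
-- ...

-- 3. If the load succeeded, swap the tables.
BEGIN TRANSACTION;
    DROP TABLE dbo.Reporting;
    EXEC sp_rename 'dbo.Reporting_Staging', 'Reporting';
COMMIT TRANSACTION;

-- 4. Build the additional non-clustered indexes.
CREATE NONCLUSTERED INDEX IX_Reporting_Payload ON dbo.Reporting (payload);
```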



-ec

Kristen
Test

22859 Posts

Posted - 2006-08-03 : 14:57:02
Does it need that much effort EC?

I reckon 7-Million rows should import in a few seconds, maybe a minute or so max., if the job is properly designed.

A Bulk Load with the import file pre-sorted into Clustered Index order should do the trick.
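On SQL Server 2000 that might look something like this (the file path, table, and key column are hypothetical; the ORDER hint tells the engine the file is already sorted on the clustered key, so it can skip an internal sort):

```sql
-- Assumes daily.txt is pre-sorted on the clustered index column (id)
-- and that the target table's clustered index is on (id).
BULK INSERT dbo.Reporting_Staging
FROM 'C:\loads\daily.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    ORDER (id ASC),   -- file matches the clustered index order
    TABLOCK           -- allows minimal logging under bulk-logged/simple recovery
);
```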

Kristen

eyechart
Master Smack Fu Yak Hacker

3575 Posts

Posted - 2006-08-03 : 16:39:19
quote:
Originally posted by Kristen

Does it need that much effort EC?

I reckon 7-Million rows should import in a few seconds, maybe a minute or so max., if the job is properly designed.

A Bulk Load with the import file pre-sorted into Clustered Index order should do the trick.




Yeah, 7 million rows is not that much data. But what if it was 70 million or 700 million?

I don't think the steps I outlined are that much work, actually, and I have used BULK INSERT, DTS, and SELECT statements for the data load step in the past. Obviously, BULK INSERT will be faster for certain data loads. One exception: if the data being loaded is coming from another database on the same server, then a SELECT will be faster. At least that has been my experience.
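For a same-server source, the load step can be a plain cross-database INSERT ... SELECT (database, table, and column names below are hypothetical):

```sql
-- Load the staging table directly from another database on the same server.
INSERT INTO Reports.dbo.Reporting_Staging (id, payload)
SELECT id, payload
FROM SourceDB.dbo.DailyExtract
ORDER BY id;   -- feed rows in clustered-index order
```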

The approach I outlined is a fairly typical way of solving a problem like this. The other issue that comes into play (though it wasn't mentioned as a requirement) is minimizing interruption to the reporting data. If you load directly into your reporting table, you have all these issues with people potentially reporting off incomplete data. The quick switch with the drop and rename solves that issue pretty decently (not perfectly, I know), and you also get around having to truncate the table, which was mentioned in the first post.



-ec

Kristen
Test

22859 Posts

Posted - 2006-08-04 : 04:00:36
All good points EC, but I personally wouldn't go to that much effort for 7M rows - unless I already had the templates on the shelf, or a similar, but bigger, job was on the horizon!

Kristen