
 data consolidation


slimt_slimt
Aged Yak Warrior

746 Posts

Posted - 2008-10-03 : 11:15:32
Hi,

I'm consolidating data from several different servers.
Every night a job runs to transfer new data and to look for updates to old data.

To find updates, I've created a timestamp column and look for non-matching data (based on the timestamp); any mismatches get uploaded.

The problem is that on 100 million rows this takes a long time. Transferring only the new rows (about 10,000 per day) is done quickly, but the update step (comparing timestamps for differences) takes way too much time.
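For reference, a minimal sketch of a watermark-based incremental pull, assuming a rowversion column `rv` on the source table and a small table that records how far the previous run got. All table and column names here (dbo.SourceTable, dbo.ETL_Watermark, rv) are hypothetical, not from the thread:

```sql
-- Pull only rows inserted or updated since the last run, using a
-- persisted high-water mark instead of comparing every row.
DECLARE @last binary(8), @current binary(8);

SELECT @last = LastRowversion
FROM dbo.ETL_Watermark
WHERE SourceName = 'Server1';

SELECT @current = @@DBTS;   -- highest rowversion issued in this database so far

SELECT *                    -- rows changed since the previous load
FROM dbo.SourceTable
WHERE rv > @last AND rv <= @current;

-- Remember how far we got for the next nightly run.
UPDATE dbo.ETL_Watermark
SET LastRowversion = @current
WHERE SourceName = 'Server1';
```

The point of the watermark is that the job never has to re-compare old rows; each night it only reads rows whose rowversion falls in the new window.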

Any ideas for improvements?

Thanks

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2008-10-03 : 11:19:55
Doesn't the timestamp get changed on updates as well? We maintain two columns, one for the creation time and one for the last modification time, in the tables that the nightly job uses to pick up incremental data changes.
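A minimal sketch of that two-audit-column design; the table, column, and trigger names here are hypothetical:

```sql
-- Two audit columns: CreatedAt is set once, ModifiedAt is kept current.
CREATE TABLE dbo.Orders
(
    OrderID    int IDENTITY(1,1) PRIMARY KEY,
    Amount     decimal(10,2) NOT NULL,
    CreatedAt  datetime NOT NULL DEFAULT (GETDATE()),
    ModifiedAt datetime NOT NULL DEFAULT (GETDATE())
);
GO
-- Refresh ModifiedAt on every update so the nightly job can find changes.
CREATE TRIGGER trg_Orders_Touch ON dbo.Orders
AFTER UPDATE
AS
    UPDATE o
    SET ModifiedAt = GETDATE()
    FROM dbo.Orders AS o
    JOIN inserted AS i ON i.OrderID = o.OrderID;
GO
-- The nightly job then picks up incremental changes with something like:
-- SELECT ... FROM dbo.Orders WHERE ModifiedAt > @LastRunTime;
```

With this design, new rows are found via CreatedAt and changed rows via ModifiedAt, so no row-by-row comparison against the target is needed.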

slimt_slimt
Aged Yak Warrior

746 Posts

Posted - 2008-10-03 : 11:43:42
Actually, it's a varbinary(8) copy of the timestamp, so it doesn't get updated. But finding the non-matching rows for the update still takes ages.
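One thing worth checking: if that varbinary(8) column isn't indexed, the nightly comparison has to scan the full table on one or both sides. A sketch, with hypothetical table and column names, of indexing it so a "changed since last run" predicate can seek instead of scan:

```sql
-- Index the copied rowversion column so range predicates like
-- "rv > @last" become index seeks rather than full table scans.
CREATE INDEX IX_TargetTable_rv
ON dbo.TargetTable (rv);
```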


Any other ideas?

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2008-10-03 : 11:45:42
quote:
Originally posted by slimt_slimt

Actually, it's a varbinary(8) copy of the timestamp, so it doesn't get updated. But finding the non-matching rows for the update still takes ages.


Any other ideas?


Can't you add an audit column, lastmodified, to the table? That will make the incremental load much easier.
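A hedged sketch of that retrofit, assuming an existing table named dbo.BigTable (the name and constraint names are illustrative only):

```sql
-- Add a lastmodified audit column to an existing table.
-- Note: on a very large table this ALTER can be a size-of-data operation.
ALTER TABLE dbo.BigTable
    ADD lastmodified datetime NOT NULL
        CONSTRAINT DF_BigTable_lastmodified DEFAULT (GETDATE());

-- Index it so the nightly delta query can seek on the audit column.
CREATE INDEX IX_BigTable_lastmodified
ON dbo.BigTable (lastmodified);
```

Application code (or an update trigger, as sketched earlier in the thread) would then keep lastmodified current, and the nightly job filters on it rather than diffing the whole table.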