
 How to merge more tables into a single one


ferpsql
Starting Member

4 Posts

Posted - 2008-11-19 : 18:13:01
Hi all,

I have 4 temporary databases (DB1..DB4), each containing the same table A. An external process fills table A in each temporary database. At the end I have to merge table A from databases DB1..DB4 into a single table A in a new target database, DB5.
In addition, during the merge I also have to change the primary key values coming from each temporary database DB1..DB4, in order to avoid duplicate keys in table A in database DB5.
I'm new to SQL Server and I'm trying to understand the best way to do this. Should I use DTS? Do you have any starting sample that I could build on to solve my case?

I thank you in advance.
Regards
ferpsql

sodeep
Master Smack Fu Yak Hacker

7174 Posts

Posted - 2008-11-19 : 18:21:55
Why are so many databases used in this process? So table A in each DB is filled differently by the external process? Can you elaborate, please?

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2008-11-19 : 18:38:51
INSERT INTO DB5.dbo.Table1 (...)
SELECT ... FROM DB1.dbo.Table1 UNION ALL
SELECT ... FROM DB2.dbo.Table1 UNION ALL
SELECT ... FROM DB3.dbo.Table1 UNION ALL
SELECT ... FROM DB4.dbo.Table1

You'll need to explain a bit more about the duplicate key thing. I can't see your system nor read your mind.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/

Subscribe to my blog

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2008-11-20 : 00:47:10
i think what you need is an identity column in your destination table, made the primary key. this will make sure you don't have problems with duplicate values coming from the other DBs in your business key column.
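
A minimal sketch of that suggestion. The real column list is not shown in the thread (the examples use "..."), so SourcePk and SomeValue below are hypothetical stand-ins:

```
-- Hypothetical target table: the surrogate key is generated by IDENTITY,
-- so rows merged from DB1..DB4 can never collide on it.
CREATE TABLE DB5.dbo.Table1
(
    pk        INT IDENTITY(1,1) PRIMARY KEY,
    SourcePk  INT NOT NULL,        -- original key from the source database
    SomeValue VARCHAR(100) NULL    -- stand-in for the real payload columns
)

-- Leave the identity column out of the insert list; SQL Server fills it in.
INSERT INTO DB5.dbo.Table1 (SourcePk, SomeValue)
SELECT pk, SomeValue FROM DB1.dbo.Table1 UNION ALL
SELECT pk, SomeValue FROM DB2.dbo.Table1 UNION ALL
SELECT pk, SomeValue FROM DB3.dbo.Table1 UNION ALL
SELECT pk, SomeValue FROM DB4.dbo.Table1
```

Keeping the original key in a separate column (SourcePk here) also lets you trace each merged row back to its source piece if needed.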

ferpsql
Starting Member

4 Posts

Posted - 2008-11-20 : 04:43:59
Hi Tara,

thanks a lot for your feedback. Here is some more information.

I have an external application, APP, that reads a file F and inserts its contents into DB5.dbo.Table1, using a sequence to fill the primary key. APP then reads the data back from Table1 and produces some statistics.
Now APP has to manage a big file F, and I noticed that it is not able to use more than 1 CPU, so processing the big file takes a long time. I have a test environment with 2 CPUs and production with 4 CPUs. I asked the vendor how to handle this, and in the meantime I received a workaround from them: split F into several pieces and run several instances of APP, each loading one piece into a temporary database (DB1..DB4).
So each running instance of APP inserts the contents of its piece into DB<i>.dbo.Table1 (still using a sequence to fill the primary key).
When I merge the data into DB5.dbo.Table1 I therefore also have to deal with duplicate key values.

So, starting from your example, I could do

UPDATE DB1.dbo.Table1 SET pk = pk + 100
...
UPDATE DB4.dbo.Table1 SET pk = pk + 400

INSERT INTO DB5.dbo.Table1 (...)
SELECT ... FROM DB1.dbo.Table1 UNION ALL
SELECT ... FROM DB2.dbo.Table1 UNION ALL
SELECT ... FROM DB3.dbo.Table1 UNION ALL
SELECT ... FROM DB4.dbo.Table1

and make it a little more 'parametric', because the number of splits (and the number of temporary databases) depends on the target environment.
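
A dynamic-SQL sketch of that 'parametric' merge, with hypothetical names (pk and a placeholder SomeValue column, since the real column list is not shown). It computes each offset from the current maximum key in DB5 rather than a fixed +100/+400 step, so it still works if one piece holds more rows than the step size:

```
-- Sketch only: assumes Table1 has columns (pk, SomeValue); adjust to the
-- real column list. The number of source databases is the only parameter.
DECLARE @n INT, @i INT, @offset INT, @sql NVARCHAR(MAX)
SET @n = 4      -- number of temporary databases DB1..DBn
SET @i = 1

WHILE @i <= @n
BEGIN
    -- Shift this source's keys above everything already merged into DB5.
    SELECT @offset = ISNULL(MAX(pk), 0) FROM DB5.dbo.Table1

    SET @sql = N'INSERT INTO DB5.dbo.Table1 (pk, SomeValue)
                 SELECT pk + @off, SomeValue
                 FROM DB' + CAST(@i AS NVARCHAR(10)) + N'.dbo.Table1'
    EXEC sp_executesql @sql, N'@off INT', @off = @offset

    SET @i = @i + 1
END
```

This also avoids updating the keys in place in DB1..DB4, which the fixed-offset UPDATE approach requires.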

Do you think the above approach is the best one?
(I know the real solution is for APP to be able to use more CPUs, but that behaviour may take a long time to be fixed...)

Thanks
ferpsql