Author |
Topic |
morphviper
Yak Posting Veteran
60 Posts |
Posted - 2009-03-17 : 11:38:43
|
Here is my challenge. We have an application that has its own database and sends data to another database. These two databases do not really reference each other.

Now here is where it gets confusing: the application runs processes that use a unique ID generated from an Identity field in the application database. When it puts data into the user database, it uses this process ID as a key. So for process 234, the application would insert data into the user DB using 234 as the key. Right now the user database has the field as an int.

The issue is that an upgrade to the application and its database restarted the Identity column at zero. So every time the application tries to insert data into the user database, it gets a duplicate key error.

Restrictions: I cannot touch the application's database. I have full access to the user database. The user database has many website and back-end connections to the data, so I must be careful when changing anything.

What are the best solutions to handle this? One suggestion brought to me: change the field in the user DB to a varchar and put an "H" (for History) on the old data in the user DB. The issue there is that we have many SPs that would need to be changed, and some applications that access this data, like reports, would also need changes. Any help would be great! |
|
mfemenel
Professor Frink
1421 Posts |
Posted - 2009-03-17 : 13:00:54
|
Do you have to have the identity column in sync between both systems? I'm wondering if you could update the code that inserts into your users DB to add an offset of 5 million to the ID from the application database. For example, a value of 2 from the application becomes 5000002. I'm also thinking some form of trigger could check your user table: if the incoming ID is a dupe, pad it out; if not, insert it as-is. You'd know for sure that anything over 5 million (using our example) was data inserted that way to prevent an error, so if you ever needed to match up to the app table you'd just subtract 5 million and be back in sync.

Mike
"oh, that monkey is going to pay" |
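The pad-on-collision idea above could be done with an INSTEAD OF INSERT trigger. This is only a sketch under assumptions: the table name `dbo.UserData`, the key column `ProcessID`, and the 5-million offset are all hypothetical stand-ins, since the real schema wasn't posted.

```sql
-- Sketch only: dbo.UserData and ProcessID are hypothetical names.
-- Remaps an incoming key that already exists by adding a fixed offset.
CREATE TRIGGER trg_UserData_PadDupes
ON dbo.UserData
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.UserData (ProcessID)  -- list the real columns here
    SELECT CASE
               WHEN EXISTS (SELECT 1
                            FROM dbo.UserData AS t
                            WHERE t.ProcessID = i.ProcessID)
                   THEN i.ProcessID + 5000000  -- pad duplicates past the offset
               ELSE i.ProcessID
           END
    FROM inserted AS i;
END;
```

One caveat with this sketch: if two new rows in the same batch share a key, the EXISTS check against the base table won't catch the in-batch collision, so a real version would need extra handling for multi-row inserts.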
 |
|
morphviper
Yak Posting Veteran
60 Posts |
Posted - 2009-03-17 : 13:14:27
|
The two databases do not have to sync up; that is the good thing. For troubleshooting, though, we might need the process number from an error so we can go back to our application system. We are unable to touch the code that inserts data into the user DB. Also, as for adding 5 million: within 6 months our process numbers are already in the millions, so adding 5 million might push us to our int limit faster than we would like, and we would have to start with an offset higher than 5 million anyway.

I know, there are some crazy limitations, and I am trying to think of all of them, but it is hard to remember them all. As for the trigger, it doesn't sound too bad, but we have a ton of tables this would have to be done on, and with foreign keys and so forth it could be a huge undertaking as well. I don't know if there is a perfect solution to this, but I am still looking. Thank you for your input thus far! |
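The int-ceiling worry above is easy to quantify: a SQL Server int tops out at 2,147,483,647. A quick check like the one below (with the same hypothetical `dbo.UserData`/`ProcessID` names as before) would show how much headroom an offset scheme actually leaves.

```sql
-- Sketch: dbo.UserData and ProcessID are hypothetical names.
-- int max in SQL Server is 2,147,483,647; subtract the current max key
-- to see how many IDs remain before an offset scheme overflows.
SELECT 2147483647 - MAX(ProcessID) AS RemainingIds
FROM dbo.UserData;
```

If the headroom looks tight, widening the column to bigint (roughly 9.2 quintillion max) would sidestep the ceiling, though that change carries the same ripple effects through SPs and reports as the varchar suggestion.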
 |
|
mfemenel
Professor Frink
1421 Posts |
Posted - 2009-03-17 : 13:29:03
|
Hmm. You've got a fun one here. What about replacing your table with a view? You could switch your structure around a little under the covers and not impact your other tables.

Mike
"oh, that monkey is going to pay" |
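One way the view swap could look, again with hypothetical names (`dbo.UserData`, `UserData_Base`, `ProcessID`): rename the real table and expose a view under the old name, so existing SPs, websites, and reports keep resolving the same object.

```sql
-- Sketch only: all object names here are hypothetical.
-- Rename the real table, then put a view in its place.
EXEC sp_rename 'dbo.UserData', 'UserData_Base';
GO
CREATE VIEW dbo.UserData
AS
SELECT ProcessID  -- list the real columns here
FROM dbo.UserData_Base;
GO
```

A single-table view like this stays updatable in SQL Server, so the application's inserts would still land in the base table, and an INSTEAD OF trigger could be attached to the view to remap duplicate IDs without the callers ever knowing.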
 |
|