Kristen
Test
22859 Posts
Posted - 2004-07-07 : 12:55:25
I need some advice / suggestions on XML, please.

We need to provide an XML toolset for clients to import data. Basically it's Stock details, Price details, Stock levels, Customer addresses and Order status.

So I figure to allow XML data that pretty much corresponds to our table schema (with documentation to help them understand what is supposed to go where - part of the "win" for us would be to make the data conversion their problem, and reduce the "Why didn't Row X get imported?" questions).

Then I plan to generate XSD from our MetaData (i.e. the database schema for Type/Size/FK etc., and our application's MetaData for RegEx constraints and other business rules).

So then the client can validate their XML data against our XSD. That will be GREAT, because it will get rid of 90% of the timewasting we currently get with "Why didn't this record get through data conversion?"

So then I planned to pull all the XML files into "transfer tables" - these will have the same-ish schema as the main tables, but much more lax rules (no constraints, no FKs, etc.)

[Same DB? "SandPit" DB? I don't really want to overload the Transaction Logs with this stuff, and a SIMPLE backup model would do.]

Then some data tests - i.e. will the data insert OK? Does it have all the necessary data to satisfy FKs etc.? I'm assuming I'm going to need to do some tests that go beyond what XSD could achieve in this regard - plus the customer may be uploading a "delta" of data changes, rather than the whole nine yards.

I would think we would make a comparison tool available at this point - compare Live data to Xfer data. That will allow the customer to get "comfortable" with the data they are about to load. It's an optional step; once something is proven it can then just be scheduled, but comparison after-the-fact would still exist for when it goes pear-shaped!

Next up is to do an UpSert into the live database: if a record exists, Update it; otherwise Insert it.

Now you can tell me the far better ways of doing it!

Kristen
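To make a couple of those steps concrete, here is a minimal T-SQL sketch in the spirit of the plan above: shred the client's XML into a lax transfer table with OPENXML, then UpSert from there into the live table. Every object name in it (dbo.Stock, dbo.StockXfer, StockCode and so on) is a hypothetical placeholder for the real schema, and since there is no single upsert statement on SQL 2000 it is an UPDATE followed by an INSERT of the leftovers.

-- Sketch only: (1) shred sample XML into a lax "transfer" table with
-- OPENXML, (2) UpSert from the transfer table into the live table.
-- All object names here are hypothetical placeholders.
DECLARE @hDoc int, @xml varchar(8000)

SELECT @xml = '<Stocks>
                 <Stock StockCode="ABC1" Description="Widget" Price="9.99"/>
                 <Stock StockCode="DEF2" Description="Grommet" Price="1.25"/>
               </Stocks>'

EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

-- (1) Load the transfer table: no constraints, no FKs, just get the data in
INSERT dbo.StockXfer (StockCode, Description, Price)
SELECT StockCode, Description, Price
FROM   OPENXML(@hDoc, '/Stocks/Stock', 1)
       WITH (StockCode   varchar(20),
             Description varchar(50),
             Price       money)

EXEC sp_xml_removedocument @hDoc

-- (2) UpSert: UPDATE the rows that already exist, then INSERT the rest
BEGIN TRANSACTION

UPDATE S
SET    S.Description = X.Description,
       S.Price       = X.Price
FROM   dbo.Stock AS S
       JOIN dbo.StockXfer AS X ON X.StockCode = S.StockCode

INSERT dbo.Stock (StockCode, Description, Price)
SELECT X.StockCode, X.Description, X.Price
FROM   dbo.StockXfer AS X
WHERE  NOT EXISTS (SELECT * FROM dbo.Stock AS S
                   WHERE S.StockCode = X.StockCode)

COMMIT TRANSACTION

Doing the UPDATE before the INSERT is deliberate: run the other way round, the UPDATE would immediately rewrite the rows the INSERT had just added.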
MichaelP
Jedi Yak
2489 Posts
Posted - 2004-07-08 : 14:08:53
Well, one thing that might make it a bit easier is to include some XML elements in the file they send you that tell you how many rows they expect to be imported. When you do your tests, if you get the same number, then move them into the live database. If you get a different number, drop a record into some sort of "exception" table and send them a message. You could even go so far as to say "I'm able to import the following records <insert record list here>" in that e-mail.

Other than that, it sounds like you are on the right track. The XSD is KEY here. That's going to solve most of your problems.

One other thing: maybe you give them a .Net app that loads that XML data into some Datasets that are set up the same way your database is. Then they can do all the checking on their end and on THEIR CPU. Once they get it right, upload it to your system and go through the processes above.

Michael

<Yoda>Use the Search page you must. Find the answer you will.</Yoda>
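The "here come N rows" check is only a couple of lines once the file has been shredded into the transfer table. A sketch, assuming the expected count arrives in a hypothetical <ExpectedRows> header element in the file, with dbo.StockXfer and dbo.ImportException as placeholder names:

-- Compare the row count the client declared in the file against what
-- actually landed in the transfer table; log any mismatch for follow-up.
DECLARE @ExpectedRows int, @ActualRows int

SELECT @ExpectedRows = 100   -- in practice, parsed from the <ExpectedRows> element
SELECT @ActualRows   = COUNT(*) FROM dbo.StockXfer

IF @ActualRows = @ExpectedRows
BEGIN
    PRINT 'Counts match - OK to move the batch into the live tables'
END
ELSE
BEGIN
    -- Log the mismatch, then e-mail the client the list of rows that did import
    INSERT dbo.ImportException (BatchDate, ExpectedRows, ActualRows)
    VALUES (GETDATE(), @ExpectedRows, @ActualRows)
END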
Kristen
Test
22859 Posts
Posted - 2004-07-08 : 14:18:06
T'rrific, thanks for that. I like the "Here come N rows" check & balance.

> .Net app that loads that XML data into some Datasets

I could do with a bit of help to understand this better. Is this just to display their data back to them in "my" format, so they can see how it looks, or something more sexy?

Some other questions I've been pondering and could do with some help on:

Should I expect to be able to validate the FKey stuff via XSD, or will it in practice be better to grab the data into a temporary table and then do some FKey checks?

Should I validate, say, a country code via XSD, or will that in practice "bloat" the XSD (I mean, I don't think they'll have any data for Timbuktu, but the code for that is in the lookup table ...) and also possibly require frequent updates to the XSD as new codes are added - and thus be better done once I've got the data into my temporary table?

Thanks

Kristen
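On the country-code question: once the data is sitting in a transfer table, the lookup check is a cheap anti-join, and it avoids re-issuing the XSD every time a new code lands in the lookup table. A sketch with hypothetical names (dbo.CustomerXfer for the incoming rows, dbo.Country for the live lookup):

-- Flag transfer rows whose country code has no match in the live lookup
-- table; anything this returns would fail the FK on the real insert.
SELECT X.CustomerCode,
       X.CountryCode
FROM   dbo.CustomerXfer AS X
WHERE  NOT EXISTS (SELECT *
                   FROM   dbo.Country AS C
                   WHERE  C.CountryCode = X.CountryCode)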
MichaelP
Jedi Yak
2489 Posts
Posted - 2004-07-08 : 15:00:12
Another solution is to create a Web Service that takes the data that you need. Then there's no need for all of this data scrubbing.

Michael

<Yoda>Use the Search page you must. Find the answer you will.</Yoda>
MichaelP
Jedi Yak
2489 Posts
Posted - 2004-07-08 : 15:03:11
The post above was not a response to your response.

That .Net app is "more sexy" in that it should do all of the same validation that you were going to do, on their end. It should then show the user how many records you got after that scrubbing. This might not work for you, since you have to do that FK validation. It might be best to have them tell you how many rows, and let you do the validation and scrubbing on your end in SQL.

Michael

<Yoda>Use the Search page you must. Find the answer you will.</Yoda>
Kristen
Test
22859 Posts
Posted - 2004-07-08 : 18:32:49
We plan to get this to be a Web Service, but we're not quite ready for that yet (available-resource issues and timing with other development rollout ... boring stuff!).

I've got the hang of the benefit of the .Net Datasets now, thanks. We've got some fairly slinky recordset display stuff, so I can give the users a pretty good experience once we've got their data into some transfer tables - and can display the anomalies in all their glory! (The top-and-bottom of it is that we've wasted gazillions of hours because of data errors at their end, and they don't really realise how bad their data continues to be!) Big mistake building a data conversion routine when we should have built this XML/XSD and data-scrubber gear instead ...

Kristen