
 Export large XML query to file


Kristen
Test

22859 Posts

Posted - 2005-02-14 : 11:04:49
What's the best way to get an XML resultset into a file?

Something like

SELECT
[Tag] = 1,
[Parent] = NULL,
[MyTable!1!MyColumn1!element] = MyColumn1,
[MyTable!1!MyColumn2!element] = MyColumn2,
...
FROM dbo.MyTable
FOR XML EXPLICIT


Can I just call an SProc from BCP to squirt the results of this to a file? Or is there a utility/code snippet which will connect to DB, make a STREAM and squirt the result to file?
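A sketch of the bcp route (server, database, and procedure names invented; bcp's queryout mode can run an arbitrary T-SQL batch, including a stored procedure call):

```shell
REM Hypothetical names throughout; -c writes plain character data to the file
bcp "EXEC MyDatabase.dbo.usp_ExportMetaData" queryout MetaData.xml -S MyServer -U MyUserID -P MyPassword -c
```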

Thanks

Kristen

X002548
Not Just a Number

15586 Posts

Posted - 2005-02-14 : 12:45:53
Create a view and bcp out the view?
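A minimal sketch of that idea, with invented names. One catch on SQL Server 2000: FOR XML isn't allowed inside a view definition, so the view has to return plain rows and the FOR XML clause goes in the bcp query instead:

```sql
-- Hypothetical names: one view UNIONing the tables into a common shape
CREATE VIEW dbo.vwMetaDataExport
AS
SELECT 'Table1' AS SourceTable, MyColumn1, MyColumn2 FROM dbo.Table1
UNION ALL
SELECT 'Table2', MyColumn1, MyColumn2 FROM dbo.Table2
GO

-- Then, from the command line, something like:
--   bcp "SELECT * FROM MyDatabase.dbo.vwMetaDataExport FOR XML AUTO" queryout Export.xml -S MyServer -T -c
```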



Brett

8-)

Kristen
Test

22859 Posts

Posted - 2005-02-14 : 13:05:17
I'll give that a go. There are 30 tables to export; the SProc has them one after another. But I suppose I could make a view that UNIONs them together ...

Here's what I get on the first one :-(

Error = [Microsoft][ODBC SQL Server Driver][SQL Server]Warning: Server data (3046 bytes) exceeds host-file field length (0 bytes) for field (1). Use prefix length, termination string, or a larger host-file field size. Truncation cannot occur for BCP output files.

I reckon all the XML [for the whole table] is output as one column, one row, so it is exceeding the width available. Can I work around that?

What I want to do is to have a client's web site be able to get new "Meta data" for about 30 tables via a web page - ideally by querying our "master web site". This is instead of having to have DTS files or, as we do at present:

DTS the tables to a "temporary" database
Backup TemporaryDatabase
(Maybe ZIP and) Copy Backup file to target machine
Restore Backup file to temporary database on target machine
Run script to selectively transfer data to the Live database.

The tables in question are subject to fairly frequent DDL changes - so the DTS stuff is a pain because it gets out of shape, and we also need to be able to import from a new "wider" set of tables to the "narrow" tables in older databases.

XML seems ideal for this, but it seems to be a right pain to actually produce XML for tables of any real size.

I built a web page that called my SProc. The SProc ran for 10 seconds, IIS on the server (and IE on the Client) ran for an hour at 100% CPU rendering the stuff - before I terminated it ...

Gotta be an easier way ...

Kristen

Kristen
Test

22859 Posts

Posted - 2005-02-14 : 13:20:53
Here's an example, if it helps:

bcp "SELECT * FROM Northwind.dbo.Products FOR XML auto" queryout junk.XML -S MyServer -U MyUserID -P MyPassword -c -r -t

Starting copy...
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC SQL Server Driver][SQL Server]Warning: Server data (3046 bytes) exceeds host-file field length (0 bytes) for field (1). Use prefix length, termination string, or a larger host-file field size. Truncation cannot occur for BCP output files.


Kristen

X002548
Not Just a Number

15586 Posts

Posted - 2005-02-14 : 15:22:45
What if you used osql and redirected the output to a file...

I think Nigel's got something...

I don't use XML all that much... umm, or at all

How are you gonna apply it?
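Roughly, with invented names - osql's -w widens the output line and -h-1 suppresses the column header row; note osql still hard-wraps anything longer than the -w width:

```shell
REM -Q runs the query and exits; -o writes the result to a file
osql -S MyServer -U MyUserID -P MyPassword -d Northwind -h-1 -w 8000 -Q "SELECT * FROM dbo.Products FOR XML AUTO" -o Products.xml
```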



Brett

8-)

byrmol
Shed Building SQL Farmer

1591 Posts

Posted - 2005-02-14 : 15:50:31
Kristen,

In the past, I have BCP'd as normal and then run an XSL template to transform the CSV into XML. SQL Server is happy about it and the transformation is fairly fast.

I'll see if I can dig up that old XSL template if you like...

DavidM

“And the answers fall easier from a barrel of a gun than it does from the lips of the beautiful and the dumb.
The world won’t end in darkness it will end in family fun, with Coca-Cola gloves behind a Big Mac sun.” The Beautiful South

spirit1
Cybernetic Yak Master

11752 Posts

Posted - 2005-02-14 : 16:10:14
how big are those tables?
if they're not more than 50-100k rows, maybe you should consider using the DataSet's WriteXml() method?
no fuss with anything.
just a thought....

Go with the flow & have fun! Else fight the flow

hgorijal
Constraint Violating Yak Guru

277 Posts

Posted - 2005-02-15 : 00:18:20
Kristen,
If frequent DDL changes are what's hindering you... why not try this:

1. Script table DDLs from Source (with DROP and CREATE)
2. BCP out data (Native/Char format; NOT XML)
3. Zip and Copy both Scripts and BCP data to Destination.
4. Execute Script
5. BCP in data
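A rough command-line sketch of steps 2 and 5 (server and file names invented; -n keeps bcp in native format, -T uses a trusted connection):

```shell
REM On the source server: export each table in native format
bcp MyDatabase.dbo.MyTable out MyTable.dat -n -S SourceServer -T

REM On the destination, after the DROP/CREATE script has run:
bcp MyDatabase.dbo.MyTable in MyTable.dat -n -S DestServer -T
```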

Why XML? Note that data files in XML format will be far larger than the standard bcp files.

HTH..

Hemanth Gorijala
Exchange a Dollar, we still have ONE each._______Exchange an Idea, we both have TWO each.

Kristen
Test

22859 Posts

Posted - 2005-02-15 : 01:16:16
Nightmare.

The error reported earlier is benign - I didn't think to actually look at the data file after getting a slew of errors!

The performance is dire ...

BCP whips the data out fast enough ...

... in my "import" SProc I have:

EXEC sp_xml_preparedocument @hDoc OUTPUT, @strXML

-- Create temporary table
SELECT [Action] = 'X', *
INTO #MyTable
FROM dbo.MyTable
WHERE 1=0

-- Get data from XML into temporary table
INSERT INTO #MyTable
SELECT 'X', *
FROM OPENXML (@hDoc, '/root/MyTable', 3)
WITH dbo.MyTable

-- Mark records which "Match" existing data
UPDATE U
SET Action = 'M'
FROM #MyTable U
JOIN dbo.[MyTable] S
ON U.[MyPK] = S.[MyPK]
WHERE NOT
(
(U.[MyCol1] <> S.[MyCol1]
OR (U.[MyCol1] IS NULL AND S.[MyCol1] IS NOT NULL)
OR (U.[MyCol1] IS NOT NULL AND S.[MyCol1] IS NULL)
)
...
)

so I can wind up with a temporary table which has the "latest" data in it. Then I can do a Delta update of the actual data.
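One step the sketch above omits, and worth adding regardless of the speed problem: the document handle from sp_xml_preparedocument holds its parsed DOM in memory until it is explicitly released, so the SProc should finish with:

```sql
-- Free the in-memory DOM created by sp_xml_preparedocument
EXEC sp_xml_removedocument @hDoc
```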

Timings (seconds elapsed from SProc start) are:

1.110 sp_xml_preparedocument
146.700 INSERT INTO TempTable SELECT * FROM @hDoc_XML_table
147.120 UPDATE to mark matching records

The XML file for this one table is 1.3MB - another 20MB of XML data exists in the job - so it's pretty dire :-(

Kristen

Kristen
Test

22859 Posts

Posted - 2005-02-15 : 02:14:43
"Why XML?"

Well I definitely need some help with the problem, rather than my perceived solution!

I need to get data from our central development system onto client's systems. I do this about 4 times a year for each client. We are taking on new clients at a rate of 2 a month, which will grow to between 4 and 10 a month by the end of the year.

The data in question controls our application - a set of application rules, if you like.

As I said above I currently DTS this data to a fresh database (here), and backup & transfer that DB to the client's SQL box, and then use a script to update their live database with any changes.

Getting a file onto some of the SQL boxes can be a nightmare - they are at ISPs, behind firewalls, have no FTP or physical access, and so on. So I use Terminal Services to get to the SQL box and IE to "pull" a ZIP file from some other web site. So I have to first upload it to some public-facing web server.

Alternatively I could just DTS the data from here direct - but the structure of the tables changes often enough that I would have to re-make the package each time. And the DTS process itself takes quite a long time (more than my Upload database/restore method) ...

So, I thought: why not just upload the data to their web site? Stick it into a web page which will pass the data to its SQL box, and then an SProc can import the data.

So I then thought XML.

However, the web pages have very limited file access; to prevent people usurping the web boxes the file permissions are set very tight. So storing data, from a web page, into a file is complicated to set up. And even if we did that, the SQL boxes have no (LAN) Share to the Web boxes - so no easy way to upload BCP files via the web that SQL could then "see".

Having said that, Native format BCP files would be much MUCH better than my current crummy XML stuff.

Any ideas would be appreciated.

Kristen

X002548
Not Just a Number

15586 Posts

Posted - 2005-02-15 : 10:30:11
I don't understand the changing structure component.

Is it changing on their end or yours?

Are the tables the same or different? If different, how do you know how to map the columns? What about new columns, or columns that disappeared?

If they're the same, how? Do you push out DDL to alter the destination database?

Could you generate a bcp out query from the catalog, along with format files?



Brett

8-)

Kristen
Test

22859 Posts

Posted - 2005-02-15 : 15:01:08
I agree we can change the DDL; it's just that it's a pain if we've added a new column for some additional functionality that old systems will ignore, and they just need the latest data to get on with what they currently do - only more up-to-date!

But mainly I thought the XML route would pump data out in a really straightforward manner, but it's turned out to be a nightmare. And I'm miffed that I spent two days trying to sort it out, and then decided that it wasn't going to be any good.

How can the parser be SO slow?

Kristen

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2005-02-15 : 15:58:35
How about bcp'ing something completely braindead such as this:

select '<oxml><x1>'+ltrim(id)+'</x1><z7>'+name+'</z7></oxml>' from sysobjects

you won't get the field-length error, at least...

rockmoose

Edit1: xml is useful for consultants paid by the hour.
Edit2: and hardware vendors

Kristen
Test

22859 Posts

Posted - 2005-02-16 : 14:43:44
Thanks Rocky, but getting it OUT is fine; it's getting it IN that is taking forever ...

Let me ask a different question.

Have any of you guys used sp_xml_preparedocument / OPENXML to process any XML [i.e. SQL-side] greater than say 10MB? 1MB then?

The performance is just PANTS. :-(

I'd love someone to prove me wrong on this.

I am currently reckoning on having an XML -> flat file tool that will present the data to BCP to import. That's just plain daft, but it looks like it's going to be much faster - albeit several days of [unnecessary] development.

Kristen

robvolk
Most Valuable Yak

15732 Posts

Posted - 2005-02-16 : 22:35:00
As daft as that sounds, I've done something similar in the past. Not only is the performance much better, it's actually easier to deal with, especially with large XML files. A simple XSL stylesheet is all you need to push it out as CSV or tab-delimited. The programming needed for it would be pretty small, maybe a dozen lines of VB or JavaScript.

Have you looked at the XML Bulk Load utility BTW? I haven't used it, but it's specifically designed for situations like yours.

Kristen
Test

22859 Posts

Posted - 2005-02-17 : 14:34:02
"XML Bulk Load"

Nope, knew nothing about it. Thanks for the pointer.

<fx:Toddles off to look>

Kristen