
 Send result set to a text file from a trigger

Author  Topic 

AskSQLTeam
Ask SQLTeam Question

0 Posts

Posted - 2002-02-18 : 09:23:39
John writes "How would you do this:

I need to be able to set up a trigger so that every time a new record is inserted, a query is executed and the result set is sent to a text file. We plan on writing a program that reads and processes the data found in the text file. We consider this "push" approach better than the "pull" alternative, where the application loops continuously, querying for any new records. It is also less intrusive, since our software won't need to log into the database server; the data file will be delivered to a public directory.

thanks in advance,
john"

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-02-18 : 09:46:30
I think you can do this, but I have some doubts about whether it's the right way. I'm mostly concerned about an INSERT that is later rolled back: the rows would already be written to the text file but WON'T exist in the table. It's not a question of "that's not a problem, we don't do anything like that"; sooner or later you will come across such a scenario. What you're really doing is maintaining two separate copies of the same data, and without replication controlling it you WILL end up with conflicts.

What does this other application do, and why does it need only the newly inserted rows? What happens to rows that are deleted or updated; wouldn't the other app need to know about them as well? It's problematic to split your data into these kinds of categories; as I said earlier, you'll end up with conflicting versions of your table's data at some point.

To answer your question, if you decide to go forward, this is how I'd go about doing it: put the new rows into a secondary "holding" table (that has the same structure), then have a scheduled job run at a certain interval to output this table to a file. You can clean up the holding table at the same time, or clear it out on a daily basis (you didn't say what happens to the file as rows are added).

CREATE TRIGGER CopyNewRows ON myTable FOR INSERT AS
SET NOCOUNT ON
-- Copy every newly inserted row into the holding table
INSERT INTO holdingTable SELECT * FROM inserted
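To create the holding table with the same structure, one option (a sketch; myTable and holdingTable are the names used above) is a SELECT INTO with a predicate that is never true, which copies the column definitions but no rows:

```sql
-- Creates holdingTable with myTable's columns; WHERE 1 = 0 means no rows are copied
SELECT * INTO holdingTable FROM myTable WHERE 1 = 0
```

Note that SELECT INTO does not carry over constraints, defaults, or indexes, which is usually fine for a staging table like this.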


You can then create a DTS job to transfer the holding table to a text file, and schedule it to run at any interval you choose. WARNING: you DO NOT want to incorporate the export in the trigger itself; it will hold resources for too long and slow the process down unacceptably.
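Instead of DTS, the scheduled job could also use bcp for the export. A rough sketch, assuming the holding table is in a database called myDB (the file path, server name, and delimiter are all placeholders you'd change):

```sql
-- Step 1 (CmdExec job step): export the holding table to a comma-delimited text file
--   bcp "SELECT * FROM myDB.dbo.holdingTable" queryout "C:\export\newrows.txt" -c -t, -S myServer -T
-- Step 2 (T-SQL job step): clear out the rows that were just exported
TRUNCATE TABLE holdingTable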

Again, I think you should reconsider this approach, and look at a pull strategy. You can always add a column that lists the date and time a row was inserted, and have a query use that as a guideline. If you index this column, the performance will be fine, and you will always have the option of reconstructing a (lost) day's worth of insert activity.
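A sketch of what that pull approach might look like; the column name InsertedAt and the @LastPolledAt variable are placeholders, not anything from your schema:

```sql
-- Add a timestamp column; the DEFAULT populates existing rows as well
ALTER TABLE myTable ADD InsertedAt datetime NOT NULL DEFAULT GETDATE()

-- Index it so the polling query stays fast
CREATE INDEX IX_myTable_InsertedAt ON myTable (InsertedAt)

-- The polling application remembers the latest timestamp it has seen
-- and asks only for rows inserted after that point:
SELECT * FROM myTable WHERE InsertedAt > @LastPolledAt ORDER BY InsertedAt
```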
