
 Log actions to tables within transactions


Daniel_Buchholz
Starting Member

28 Posts

Posted - 2002-04-02 : 07:57:56
Hi all!

I want to have a log table of all actions performed during a rather complex set of stored procedures. Most of them use transactions. Currently I have a procedure that I call as needed to do the logging.

Now my problem is: if one of the transactions fails, a rollback occurs and I consequently lose all of my log entries made within that transaction. Does anyone have a better solution for logging?

I would really like to be able to keep the logging outside the scope of any transaction, which I don't think is possible, since I want to log from inside them.

Any idea?

Thanks!

Daniel

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-04-02 : 08:09:49
See if this helps:

http://www.sqlteam.com/item.asp?ItemID=2290


AndrewMurphy
Master Smack Fu Yak Hacker

2916 Posts

Posted - 2002-04-02 : 08:23:24
I have done something similar to this in a DB2 application, but the same principle applies.


start a loop
    if failure-variable = "Y" then
        log appropriate error message (record number, status, program stage, etc.)
        clear failure-variable
        skip normal processing logic
    if failure-variable <> "Y" then
        process record as normal
        if rollback-situation encountered then
            set failure-variable = "Y"
end loop


Basically, your process is driven by an attempt to process an input table in some manner. In the event of a failure/rollback, you want to re-process the same input record, but this time only logging the previously discovered failure. After logging (commit to the log), you skip forward to the next input record.
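A rough T-SQL rendering of the single-record case (all table and column names here are invented for illustration); the key point is that the error-log INSERT runs after the ROLLBACK, so it is not undone:

```sql
-- Sketch only: InputTable, ErrorLog, and the processing step are placeholders.
DECLARE @id int

DECLARE rec CURSOR LOCAL FOR SELECT id FROM InputTable
OPEN rec
FETCH NEXT FROM rec INTO @id

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRAN
    -- ... normal processing of record @id goes here ...
    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRAN
        -- logged after the rollback, so this row survives
        INSERT INTO ErrorLog (record_id, status) VALUES (@id, 'failed')
    END
    ELSE
        COMMIT TRAN

    FETCH NEXT FROM rec INTO @id
END

CLOSE rec
DEALLOCATE rec
```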


Extra logic needs to be added if you are processing batches of records within one transaction. In that case a rollback takes back multiple good input records along with the one bad record. You then need to work forward a second time through the good input records in that batch and commit this (reduced) set at the end, before you do anything with the bad input record. Then treat the failing record as a separate batch, just logging the error, and proceed to a new batch of hopefully good input records, starting one record after the known failure.


If any of the above needs clarifying, just ask.


Daniel_Buchholz
Starting Member

28 Posts

Posted - 2002-04-02 : 08:36:27
quote:

See if this helps:

http://www.sqlteam.com/item.asp?ItemID=2290





I guess that's what I was looking for. Will give it a try. Thanks!

Daniel

Daniel_Buchholz
Starting Member

28 Posts

Posted - 2002-04-03 : 08:19:49
It's me again! :-)

I have spent some more thought on this topic and just want to verify that I am not running in the wrong direction.

I want to use Andy Pope's approach to logging.

We have several nested procedures here (up to 4 or 5 levels deep). Of course I want to create the 'logging connection' only once, and I don't want to pass the object token to every single procedure that I call (a day has only 24 hours).

So my idea was to do a 'create function' after I set up the log connection. I want this function to return the object token for the log connection. Then all of the log routines would call the function to get the token and do the logging. So, you see, what I'm trying to do is a workaround for not having some kind of global variables.

Sounds easy. But are there any implications I don't see at the moment?

Daniel

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-04-03 : 09:02:12
I think this might be better:

Write a stored procedure that accepts the log parameters you want to record. The SP will log it using Andy's SQL-DMO technique to ensure it won't be rolled back. It will log in, insert the row, and close the connection. The logging table is a regular, permanent table so you don't have to worry about scope problems with temp tables.

Each time you want to log the progress of your other SPs, call this logging SP, for example:

DECLARE @err int

INSERT INTO myTable SELECT * FROM otherTable
SET @err = @@ERROR  -- capture immediately; @@ERROR is reset by the very next statement

IF @err = 0
    EXEC LogProgress 'Step 5', 'OK', 0
ELSE
    EXEC LogProgress 'Step 5', 'Failed', @err


In this case you don't need global variables or a global initialization routine, and you don't need to maintain a connection between calls.
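A hedged sketch of what such a logging SP could look like, using the OLE Automation stored procedures; the table name LogTable, the trusted-connection choice, and the lack of quote-escaping and full error handling are all simplifications for illustration:

```sql
CREATE PROCEDURE LogProgress
    @step varchar(50), @status varchar(20), @errno int
AS
DECLARE @srv int, @hr int, @server sysname, @sql varchar(500)

-- open a brand-new SQL-DMO connection, outside any transaction
EXEC @hr = sp_OACreate 'SQLDMO.SQLServer', @srv OUT
IF @hr <> 0 RETURN

SET @server = @@SERVERNAME   -- can't pass @@SERVERNAME directly as an EXEC argument
EXEC @hr = sp_OASetProperty @srv, 'LoginSecure', 1
EXEC @hr = sp_OAMethod @srv, 'Connect', NULL, @server

SET @sql = 'INSERT INTO LogTable (step, status, errno) VALUES ('''
         + @step + ''', ''' + @status + ''', ' + CONVERT(varchar, @errno) + ')'
EXEC @hr = sp_OAMethod @srv, 'ExecuteImmediate', NULL, @sql

-- always clean up, or the server will start smoking
EXEC sp_OAMethod @srv, 'DisConnect'
EXEC sp_OADestroy @srv
```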

Now, I admit that this could have some serious performance drawbacks, but you should test to find out for sure. You're instantiating a COM object repeatedly, and creating and destroying connections. Normally that's death and should be avoided at all costs. MAKE ABSOLUTELY CERTAIN that you do the proper cleanup and object destruction/connection closing, or your server will start smoking.

Let us know how this works out!


Daniel_Buchholz
Starting Member

28 Posts

Posted - 2002-04-04 : 07:59:37
Heyho!

It works!

Fearing performance problems, I tried my approach and not yours, robvolk, sorry. I really don't want to open a fresh connection for each log entry. In case of errors in our procedures we easily get several thousand (!) error log messages.

So I try to reuse one connection, and here is how I think it works (and in fact it does work here; the server is fine):

I have one procedure for establishing the log connection:

LogConstructor
  • Establish and verify a new server connection using SQL-DMO

  • Dynamically create a user function that returns the object token for the connection (the token obtained from the sp_OACreate call)


Simple. This procedure consists mainly of Andy's code.
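A minimal sketch of what LogConstructor might look like, assuming a trusted connection and an invented function name dbo.fn_LogToken; the connection part follows Andy's technique:

```sql
CREATE PROCEDURE LogConstructor
AS
DECLARE @srv int, @hr int, @server sysname, @sql varchar(500)

-- establish and verify the SQL-DMO connection
EXEC @hr = sp_OACreate 'SQLDMO.SQLServer', @srv OUT
IF @hr <> 0 RETURN

SET @server = @@SERVERNAME
EXEC @hr = sp_OASetProperty @srv, 'LoginSecure', 1
EXEC @hr = sp_OAMethod @srv, 'Connect', NULL, @server

-- the 'global variable' workaround: a function that returns the token
SET @sql = 'CREATE FUNCTION dbo.fn_LogToken() RETURNS int AS '
         + 'BEGIN RETURN ' + CONVERT(varchar, @srv) + ' END'
EXEC (@sql)
```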

Of course there is another procedure that does some cleanup:

LogDestructor
  • Destroy the connection represented by the user function's return value

  • Drop the user function
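A sketch of what the body of LogDestructor might look like, assuming the dynamically created function is named dbo.fn_LogToken (an invented name):

```sql
DECLARE @srv int
SET @srv = dbo.fn_LogToken()        -- the token stored by LogConstructor

EXEC sp_OAMethod @srv, 'DisConnect' -- close the SQL-DMO connection
EXEC sp_OADestroy @srv              -- release the COM object

DROP FUNCTION dbo.fn_LogToken       -- remove the pseudo-global
```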



This works fine. But you have to make sure that LogConstructor and LogDestructor get called within the same batch.

Between the two calls you can log whatever you want with a log procedure that goes something like this (again some of Andy's code there):

LogMessage
  • Call the user function to get the object token for the log connection

  • Set the log connection's database

  • Build up a dynamic sql statement that will do the logging

  • Execute the statement using sp_OAMethod

  • sp_OADestroy the new return values that turned up in the proc (Database and QueryResults)
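A sketch of the body of LogMessage, assuming the dynamically created function is named dbo.fn_LogToken and the log lives in an invented database/table LogDB.LogTable:

```sql
DECLARE @srv int, @db int, @qr int, @hr int, @sql varchar(500)

SET @srv = dbo.fn_LogToken()   -- reuse the one long-lived connection

-- point the connection at the database that holds the log table
EXEC @hr = sp_OAMethod @srv, 'Databases.Item', @db OUT, 'LogDB'

SET @sql = 'INSERT INTO LogTable (msg) VALUES (''step completed'')'
EXEC @hr = sp_OAMethod @db, 'ExecuteWithResults', @qr OUT, @sql

-- sp_OADestroy the by-product tokens (Database and QueryResults) or they leak
EXEC sp_OADestroy @qr
EXEC sp_OADestroy @db
```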



That's about all. I have run some tests here and so far it works, and I have not experienced any performance hit yet. But I will run more tests for sure...

Comments appreciated...

Daniel

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-04-04 : 09:46:32
NICE JOB!

I forgot that you were doing this in one batch, so you're right, performance won't be an issue.

Can you post the code? It would be a great addition to the SQL Team library! You can put it in the Script Forum, or in the comments portion of the article:

http://www.sqlteam.com/item.asp?ItemID=2290


Daniel_Buchholz
Starting Member

28 Posts

Posted - 2002-04-05 : 02:43:30
Code is up!

http://www.sqlteam.com/Forums/topic.asp?TOPIC_ID=14623

Daniel
   
