
 Returning to old problem (for gurus)


barmalej
Starting Member

40 Posts

Posted - 2005-10-15 : 13:07:18
Hello,
This is the old problem of saving some log data from a stored procedure even if the enclosing transaction is rolled back.

One known solution is to use something like:
select @sql = 'isql.exe -Q "INSERT INTO errortable VALUES(101,''My Error Description'')"'
exec master..xp_cmdshell @sql
--------
Problems here could be access permissions and performance.
I can also think of another way: trigger a SQL Server job which will hopefully save the data so it is not removed by the rollback. I am not at all sure about its performance though.

Any fresh ideas?

robvolk
Most Valuable Yak

15732 Posts

Posted - 2005-10-15 : 13:13:13
Have you read this one yet?

http://www.sqlteam.com/item.asp?ItemID=2290

barmalej
Starting Member

40 Posts

Posted - 2005-10-15 : 13:19:08
Very interesting, thanks.

Kristen
Test

22859 Posts

Posted - 2005-10-15 : 13:29:25
We have a Log Table containing errors and informational stuff inserted by our SProcs as they run, and we want to preserve this information if we roll back.

If we need to roll back, we first query all the relevant data from the Log Table into an @TableVar (we have a "session ID" in the log table, so we can easily get the relevant data - never more than 20-50 rows).

Then we Rollback, and then we insert into the Log Table from the @TableVar anything which is missing (we also change the value in one of the columns to indicate that it was rolled back).

Note that an @TableVar is NOT changed by a Rollback.
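A minimal sketch of that copy-out / rollback / re-insert pattern (table and column names here are hypothetical, assuming the log table carries a SessionID and a RolledBack flag):

```sql
-- Hypothetical schema: LogTable(SessionID int, LogDate datetime, Msg varchar(255), RolledBack bit)
DECLARE @SessionID int
SET @SessionID = 42  -- example session id

DECLARE @SavedLog TABLE (LogDate datetime, Msg varchar(255))

BEGIN TRAN
    -- ... real work here, which also inserts rows into LogTable for @SessionID ...

    -- before rolling back, copy this session's log rows into the table variable
    INSERT @SavedLog (LogDate, Msg)
    SELECT LogDate, Msg FROM LogTable WHERE SessionID = @SessionID
ROLLBACK  -- removes the LogTable rows, but @SavedLog is unaffected

-- re-insert the preserved rows, flagged as rolled back
INSERT LogTable (SessionID, LogDate, Msg, RolledBack)
SELECT @SessionID, LogDate, Msg, 1 FROM @SavedLog
```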

Kristen

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2005-10-15 : 13:35:26
You can also insert the log data into a table variable.
After a rollback, the data in the table variable is not rolled back,
and you can then insert it into the real log table.

declare @errlog table(logdata varchar(35))

begin tran
insert @errlog values('somestuff')
rollback

select logdata from @errlog


that seems easier than the sp_OA stuff...

rockmoose


time to go watch the telly, Kristen's in the house, fastest sql slinging yak around

barmalej
Starting Member

40 Posts

Posted - 2005-10-15 : 13:54:08
Yes, the table-variable solution looks easier. However, since I am investigating problem spots where deadlocks happen, or where bad, long-running queries slow the database down (so that killing them is the only way to keep production going), it is a good question whether the table variable will always survive until its contents can be saved :-).

Thank you very much.


rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2005-10-15 : 14:23:06
quote:
Originally posted by barmalej

Yes, the table-variable solution looks easier. However, since I am investigating problem spots where deadlocks happen, or where bad, long-running queries slow the database down (so that killing them is the only way to keep production going), it is a good question whether the table variable will always survive until its contents can be saved :-).

Thank you very much.



Depending on how paranoid you are about losing the log data, another solution could be to stream the data into a text file (very rollback-safe).
Then use bcp/BULK INSERT to get it back into the log table at a later time.

Create the file C:\err.log, then:

exec master..xp_cmdshell 'echo 123;my message >> C:\err.log', NO_OUTPUT
exec master..xp_cmdshell 'echo 456;my message >> C:\err.log', NO_OUTPUT
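A sketch of loading such a file back later, assuming a target table matching the two echoed fields (table and column names are hypothetical; the echo above writes semicolon-separated values):

```sql
-- Hypothetical target table for the two fields echoed above
CREATE TABLE errortable (errnum int, errmsg varchar(255))

-- load the semicolon-delimited file written via xp_cmdshell
BULK INSERT errortable
FROM 'C:\err.log'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n')
```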

rockmoose

Kristen
Test

22859 Posts

Posted - 2005-10-16 : 01:56:58
"solution could be to stream the data into a text file"

Have you any experience of how fast, or otherwise, that is, Rocky?

Definitely something I would like to consider ...

I mean, if every single SProc did a "Hi I've started" insert to log, and a "Hi I've finished" UPDATE of that log row on exit, would ECHO to text file be quicker?

Kristen

barmalej
Starting Member

40 Posts

Posted - 2005-10-16 : 05:20:40
Joining Kristen, and also asking whether anybody has tried this idea (and measured its speed):
1. Write an SP that clones the not-yet-cloned records in the log table. (There must be a small trick here so that the SP does not have to wait on unfinished transactions that are also writing to this table.)
2. Create a SQL Server job that calls this SP.
3. In the SP being debugged, write the log record to the log table and start the job with sp_start_job.

A ROLLBACK, or killing the SQL process, will remove the original log records, but hopefully the cloned records remain. And this is a pure SQL solution that doesn't demand writing a DOS file, extra permissions, etc.

Gennadi

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2005-10-16 : 06:23:45
quote:
Originally posted by Kristen

"solution could be to stream the data into a text file"

Have you any experience of how fast, or otherwise, that is Rocky?

Definitely something I would like to consider ...

I mean, if every single SProc did a "Hi I've started" insert to log, and a "Hi I've finished" UPDATE of that log row on exit, would ECHO to text file be quicker?

Kristen



Can't say. You would have to TEST.
My guess, thinking out loud, is that it is about the same.

If possible, putting the logging table on its own physical disk would help performance, I reckon.

Gennadi,
I did not understand your last post.

barmalej
Starting Member

40 Posts

Posted - 2005-10-16 : 07:26:00
The idea is in fact similar to http://www.sqlteam.com/item.asp?ItemID=2290
There, an external process writes "Message from Inside Transaction", and thus it is not removed during rollback, right?
But instead of bothering with SQL-DMO I thought of (though have not tested yet, hoping it is a known approach)
another way of saving this data:
a. Write "Message from Inside Transaction 1" to the log table (it will possibly be rolled back).
b. Immediately call sp_start_job; via the job, CLONE "Message from Inside Transaction 1" as "Message from Inside Transaction 1 Cloned", so the clone is created by the job, not by my process.

After the rollback I hope the "Message from Inside Transaction 1 Cloned" row will still be in the table.

In addition, I can arrange for the other process (= the job) not to wait until the main transaction is committed/rolled back/killed before writing its own record (= "Message from Inside Transaction 1 Cloned") into the log table. But that is another question.

Gennadi

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2005-10-16 : 07:55:41
A call to sp_start_job would fail if the job is already running.
You would need the job to do a dirty read from the log table (NOLOCK) while the transaction is still running.
And you would do at least twice the logging: once for the non-cloned row and once for the cloned row, plus any logging the job itself does in msdb.
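A sketch of what the clone step inside the job might look like (all names here are hypothetical). The NOLOCK hints are what let the job see rows written by the still-open transaction before it is rolled back:

```sql
-- Hypothetical schema: logtable(msg varchar(255), Cloned bit)
-- Job step: clone any not-yet-cloned rows, reading past open transactions.
INSERT logtable (msg, Cloned)
SELECT src.msg + ' Cloned', 1
FROM logtable src WITH (NOLOCK)
WHERE src.Cloned = 0
  AND NOT EXISTS (SELECT * FROM logtable c WITH (NOLOCK)
                  WHERE c.msg = src.msg + ' Cloned')
```

The debugged SP would kick this off right after writing its own log row, e.g. `EXEC msdb..sp_start_job @job_name = 'CloneLogJob'` (job name hypothetical), subject to the already-running caveat above.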

Kristen
Test

22859 Posts

Posted - 2005-10-16 : 07:56:18
"Can't say. You would have to TEST"

What's a good way to compare the two?

We already have "Log that I started" and "Update log that I finished" SProcs that are being called from pretty much all our application SProcs.

I'm inclined to add to those to [additionally] append to Log File, I could then see how average execution time of all SProcs slowed down, if at all, and whether we get contention for the Log Text File.

Then I could replace the "append to file" with "Insert to Log Table" [a new one, not the existing one] and see how that compared.

Finally I could look at whether we could really remove the real-time logging, or if we have some stuff that is dependent on it (I don't think so, but ...)

Kristen

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2005-10-16 : 08:30:25
Maybe this simple test settles it:
appending to a file was not very fast...

--create table errlog(id int identity primary key, logdate datetime not null default getdate(), msg varchar(255) not null)
--drop table errlog
--truncate table errlog

set nocount on

declare @i int, @d datetime, @logcmd varchar(255)
select @i = 0, @d = getdate()

-- time 1000 plain inserts into the log table
PRINT 'to table, 1000 inserts'
while @i < 1000
begin
set @i = @i + 1
insert errlog(msg) values( 'some message' )
end

select datediff(ms,@d,getdate()) msecs

select @i = 0, @d = getdate()

-- time 1000 appends to a text file via xp_cmdshell
PRINT 'to file, 1000 writes'
while @i < 1000
begin
set @i = @i + 1
set @logcmd = 'echo ' + ltrim(@i) + ';' + convert(varchar(23),getdate(),121) + ';' + 'some message' + ' >> C:\errlog.log'
exec master..xp_cmdshell @logcmd, NO_OUTPUT
end
select datediff(ms,@d,getdate()) msecs

-----------------------------------------------------
to table, 1000 inserts
msecs
-----------
266

to file, 1000 writes
msecs
-----------
15686

Kristen
Test

22859 Posts

Posted - 2005-10-16 : 12:01:04
Blast!

Kristen
   
