Author |
Topic |
barmalej
Starting Member
40 Posts |
Posted - 2005-10-15 : 13:07:18
|
Hello. This is the old problem of saving some log data from an SP even if the enclosing transaction is rolled back. One known solution is to use something like:

select @sql = 'isql.exe -Q "INSERT INTO errortable VALUES(101,''My Error Description'')"'
exec master..xp_cmdshell @sql

Problems here could be access permissions and performance.

I can also think of another way: trigger a SQL Server job which will hopefully save the data, and which a rollback will not remove. I am not at all sure about good performance, though.

Any fresh ideas? |
|
robvolk
Most Valuable Yak
15732 Posts |
Posted - 2005-10-15 : 13:13:13
|
Have you read this one yet?
http://www.sqlteam.com/item.asp?ItemID=2290 |
|
|
barmalej
Starting Member
40 Posts |
Posted - 2005-10-15 : 13:19:08
|
very interesting, thanks. |
|
|
Kristen
Test
22859 Posts |
Posted - 2005-10-15 : 13:29:25
|
We have a Log Table containing errors and informational stuff inserted by our SProcs as they run, and we want to preserve this information if we roll back.

If we need to roll back, we first query all the relevant data from the Log Table into an @TableVar (we have a "session ID" in the log table, so we can easily get the relevant data - never more than 20-50 rows). Then we Rollback, and then we insert into the Log Table from the @TableVar anything which is missing (we also change the value in one of the columns to indicate that it was rolled back).

Note that an @TableVar is NOT changed by a Rollback.

Kristen |
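A minimal sketch of the pattern Kristen describes. The LogTable schema, column names, and the @SessionID variable are assumptions for illustration, not the real system:

```sql
-- Hypothetical schema: LogTable(LogID, SessionID, Msg, WasRolledBack)
DECLARE @SessionID int
SET @SessionID = 42   -- example session ID

DECLARE @Saved TABLE (LogID int, SessionID int, Msg varchar(255), WasRolledBack bit)

-- 1. Copy this session's log rows into the table variable;
--    a table variable is not affected by ROLLBACK.
INSERT @Saved (LogID, SessionID, Msg, WasRolledBack)
SELECT LogID, SessionID, Msg, 1        -- flag the copies as rolled back
FROM   LogTable
WHERE  SessionID = @SessionID

ROLLBACK TRANSACTION                   -- @Saved keeps its rows

-- 2. Re-insert anything the rollback removed.
INSERT LogTable (LogID, SessionID, Msg, WasRolledBack)
SELECT s.LogID, s.SessionID, s.Msg, s.WasRolledBack
FROM   @Saved s
WHERE  NOT EXISTS (SELECT * FROM LogTable t WHERE t.LogID = s.LogID)
```

This assumes LogID is not an IDENTITY column; if it were, the re-insert would need SET IDENTITY_INSERT or would have to let new IDs be generated.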
|
|
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-10-15 : 13:35:26
|
You can also insert the log data into a table variable. After a rollback, the data in the table variable is not rolled back, and you can insert it into the real log table.

declare @errlog table(logdata varchar(35))
begin tran
insert @errlog values('somestuff')
rollback
select logdata from @errlog

That seems easier than the sp_OA stuff...

rockmoose
time to go watch the telly, Kristen's in the house, fastest sql slinging yak around |
|
|
barmalej
Starting Member
40 Posts |
Posted - 2005-10-15 : 13:54:08
|
Yes, the table-variable solution looks easier. However, since I am investigating places where deadlocks happen, or where bad long-running queries slow the database so badly that killing them is the only way to continue production, it is a good question whether the table variable will always survive until it can be saved :-). Thank you very much. |
|
|
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-10-15 : 14:23:06
|
quote: Originally posted by barmalej Yes, solution with table variable looks easier though as I am investigating problem places where deadlocks happen or bad and long running queries slow database (so that killing them is the only way to continue production) it is a good question if table variable will survive always untill could be saved :-). Thank you very much.
Depending on how paranoid you are about losing the log data, another solution could be to stream the data into a text file (very rollback-safe), then use bcp/bulk insert to get it back into the logtable at a later time.

Create the file C:\err.log:

exec master..xp_cmdshell 'echo 123;my message >> C:\err.log', NO_OUTPUT
exec master..xp_cmdshell 'echo 456;my message >> C:\err.log', NO_OUTPUT

rockmoose |
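The "get it back into the logtable at a later time" step rockmoose mentions could look like this; the staging table name and two-column layout are assumptions based on the `id;message` format of the echo commands above:

```sql
-- Hypothetical staging table matching the 'id;message' lines in C:\err.log
CREATE TABLE errorlog_import (id varchar(20), msg varchar(255))

-- Load the file back in later, outside any doomed transaction
BULK INSERT errorlog_import
FROM 'C:\err.log'
WITH (FIELDTERMINATOR = ';',   -- matches the ';' separator in the echo commands
      ROWTERMINATOR  = '\n')
```

From errorlog_import the rows can then be copied into the real log table and the file deleted or truncated.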
|
|
Kristen
Test
22859 Posts |
Posted - 2005-10-16 : 01:56:58
|
"solution could be to stream the data into a text file"

Have you any experience of how fast, or otherwise, that is, Rocky? Definitely something I would like to consider...

I mean, if every single SProc did a "Hi I've started" insert to log, and a "Hi I've finished" UPDATE of that log row on exit, would ECHO to a text file be quicker?

Kristen |
|
|
barmalej
Starting Member
40 Posts |
Posted - 2005-10-16 : 05:20:40
|
Joining Kristen, and also asking if anybody has tried the idea (and measured the speed) of:
1. Make an SP that clones not-yet-cloned records in the log table. (There must be a small trick here so that the SP does not have to wait for unfinished transactions that are also writing to this table.)
2. Create a SQL Server job that calls this SP.
3. In the SP being debugged, write the log record to the log table and start the job using sp_start_job.

A ROLLBACK, or killing the SQL process, will remove the original log records, but hopefully we still have the cloned records left. And this is a pure SQL solution, not demanding writing a DOS file, extra permissions, etc.

Gennadi |
|
|
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-10-16 : 06:23:45
|
quote: Originally posted by Kristen "solution could be to stream the data into a text file"Have you any experience of how fast, or otherwise, that is Rocky?Definitely something I would like to consider ...I mean, if every single SProc did a "Hi I've started" insert to log, and a "Hi I've finished" UPDATE of that log row on exit, would ECHO to text file be quicker?Kristen
Can't say, you would have to TEST. What I think is that it is about the same (thinking out loud). If possible, putting the logging table on its own physical disk would help performance, I reckon.

Gennadi, I did not understand your last post. |
|
|
barmalej
Starting Member
40 Posts |
Posted - 2005-10-16 : 07:26:00
|
The idea is in fact similar to http://www.sqlteam.com/item.asp?ItemID=2290. There an external process writes "Message from Inside Transaction", and thus it is not removed during rollback, right? But instead of bothering with SQL-DMO I thought of (but have not tested yet, hoping it is a known technique) another way of saving this data:
a. Write "Message from Inside Transaction 1" to the log table (it will possibly be rolled back).
b. Immediately call sp_start_job, and via the job CLONE "Message from Inside Transaction 1" to "Message from Inside Transaction 1 Cloned"; the clone will then be created by the job, not by my process. After the rollback I hope that "Message from Inside Transaction 1 Cloned" will still be in the table.

In addition I can make it possible that the other process (= the job) does not wait until the main transaction is committed/rolled back/killed before writing its own record (= "Message from Inside Transaction 1 Cloned") into the log table. But that is another question.

Gennadi |
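As a rough sketch of steps a and b (untested, as Gennadi says; the table, column, procedure, and job names here are all hypothetical):

```sql
-- Hypothetical SP a SQL Server Agent job would call to clone log rows.
CREATE PROCEDURE CloneLogRecords
AS
    -- Dirty read (NOLOCK) so the job does not block on the still-open
    -- transaction that wrote the original row.
    INSERT logtable (msg, cloned)
    SELECT msg + ' Cloned', 1
    FROM   logtable (NOLOCK)
    WHERE  cloned = 0
    -- A real version would also need to mark originals as already cloned,
    -- or track a high-water mark, to avoid cloning the same row twice.
GO

-- Inside the monitored transaction:
INSERT logtable (msg, cloned) VALUES ('Message from Inside Transaction 1', 0)
EXEC msdb..sp_start_job @job_name = 'CloneLogJob'   -- job runs CloneLogRecords
-- A later ROLLBACK removes the original row, but the job's own transaction
-- has committed the cloned copy independently.
```

As rockmoose notes below in the thread, sp_start_job fails if the job is already running, so this sketch ignores that race.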
|
|
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-10-16 : 07:55:41
|
A call to sp_start_job would fail if the job is already running. You would need to do a dirty read (NOLOCK) from the logtable in the job while the transaction is running. And you would perform at least twice the logging: once for the non-cloned row, then for the cloned row, plus any logging in msdb that the job does. |
|
|
Kristen
Test
22859 Posts |
Posted - 2005-10-16 : 07:56:18
|
"Can't say. You would have to TEST"

What's a good way to compare the two? We already have "Log that I started" and "Update log that I finished" SProcs that are being called from pretty much all our application SProcs.

I'm inclined to add to those to [additionally] append to a Log File; I could then see how much the average execution time of all SProcs slowed down, if at all, and whether we get contention for the Log Text File. Then I could replace the "append to file" with "Insert to Log Table" [a new one, not the existing one] and see how that compared.

Finally I could look at whether we could really remove the real-time logging, or if we have some stuff that is dependent on it (I don't think so, but...)

Kristen |
|
|
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-10-16 : 08:30:25
|
Maybe this simple test settles it: appending to file was not very fast...

--create table errlog(id int identity primary key, logdate datetime not null default getdate(), msg varchar(255) not null)
--drop table errlog
--truncate table errlog
set nocount on
declare @i int, @d datetime, @logcmd varchar(255)

select @i = 0, @d = getdate()
PRINT 'to table, 1000 inserts'
while @i < 1000
begin
    set @i = @i + 1
    insert errlog(msg) values( 'some message' )
end
select datediff(ms,@d,getdate()) msecs

select @i = 0, @d = getdate()
PRINT 'to file, 1000 writes'
while @i < 1000
begin
    set @i = @i + 1
    set @logcmd = 'echo ' + ltrim(@i) + ';' + convert(varchar(23),getdate(),121) + ';' + 'some message' + ' >> C:\errlog.log'
    exec master..xp_cmdshell @logcmd, NO_OUTPUT
end
select datediff(ms,@d,getdate()) msecs

Results:
to table, 1000 inserts: 266 msecs
to file, 1000 writes: 15686 msecs |
|
|
Kristen
Test
22859 Posts |
Posted - 2005-10-16 : 12:01:04
|
Blast! Kristen |
|
|
|