rajani
Constraint Violating Yak Guru
367 Posts

Posted - 2004-06-28 : 21:36:51
SQL 2000 - SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

We have a large query that takes considerable time. To speed up response time, we are thinking of adding SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED to our stored procedures. Does anyone have any positive or negative experience using SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED?

Thank you,
Rajani

Cheers
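For reference, this is roughly what the change being asked about looks like. A minimal sketch only: the procedure, table, and column names here are hypothetical, not from the original post.

```sql
-- Hypothetical procedure: the isolation level is set once at the top and
-- applies to the statements that follow within this connection.
CREATE PROCEDURE dbo.usp_GetAssignmentSummary
AS
BEGIN
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

    -- This SELECT will not take shared locks and will not block on
    -- writers, but it may return uncommitted ("dirty") data.
    SELECT AssignmentID, SUM(HoursSpent) AS TotalHours
    FROM dbo.TimeEntries
    GROUP BY AssignmentID
END
```

The per-table hint `WITH (NOLOCK)` on individual tables in a query has the same effect as READ UNCOMMITTED for just those tables.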
derrickleggett
Pointy Haired Yak DBA
4184 Posts

Posted - 2004-06-28 : 21:45:23
If this is important data, and you can't afford to have people making decisions based on the SELECT when something has changed, then it's a really, really bad idea. On the other hand, if the consistency of the data isn't important, you can sometimes really speed up queries by doing this. The best thing, though, is to correct the design so you don't have this problem.

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.
rajani
Constraint Violating Yak Guru
367 Posts

Posted - 2004-06-28 : 21:54:46
Thanks for the quick reply, derrick. The data is actually our projects, current assignments, the time spent on those assignments, etc., and the application is for internal use only. The situation is that every time we query from the front-end application, I display a summary of the total time spent on the queried assignments. As you know, these summary calculations always take time, and I want to speed the process up. By the way, could you tell me what's the worst that can happen, apart from what you said in your post?

Many thanks

Cheers
derrickleggett
Pointy Haired Yak DBA
4184 Posts

Posted - 2004-06-28 : 22:59:16
Well, if you are in the middle of updating and there are locks on the table, it will just read through them, so the data won't be accurate anyway. You might consider breaking your sums out: you could run the sums every 15 minutes or so and select from that. It doesn't sound like that complicated a query, though. If you wouldn't mind posting it, you might get one of the brilliant minds on here (that wouldn't be me) to help you out.

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.
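The "run sums every 15 minutes" idea above could be sketched like this: a summary table plus a refresh procedure that a SQL Server Agent job calls on a schedule. All object names are hypothetical.

```sql
-- Hypothetical summary table, refreshed on a schedule (e.g. every 15
-- minutes via a SQL Server Agent job). Front-end queries read from this
-- table instead of recomputing the sums each time.
CREATE TABLE dbo.AssignmentSummary (
    AssignmentID int          NOT NULL PRIMARY KEY,
    TotalHours   decimal(9,2) NOT NULL,
    RefreshedAt  datetime     NOT NULL
)
GO

CREATE PROCEDURE dbo.usp_RefreshAssignmentSummary
AS
BEGIN
    BEGIN TRANSACTION
        -- Rebuild the whole summary in one transaction so readers never
        -- see a half-populated table.
        DELETE FROM dbo.AssignmentSummary

        INSERT INTO dbo.AssignmentSummary (AssignmentID, TotalHours, RefreshedAt)
        SELECT AssignmentID, SUM(HoursSpent), GETDATE()
        FROM dbo.TimeEntries
        GROUP BY AssignmentID
    COMMIT TRANSACTION
END
```

The trade-off is that the summary can be up to 15 minutes stale, but the front-end SELECT becomes a trivial lookup.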
rajani
Constraint Violating Yak Guru
367 Posts

Posted - 2004-06-28 : 23:07:21
Actually, I'd like to post the entire query here. I'm very busy at the moment so I cannot do so right now, but I'll definitely do it soon. By the way, I read the following tip on www.sql-server-performance.com:

"If your application needs to retrieve summary data often, but you don't want the overhead of calculating it on the fly every time it is needed, consider using a trigger that updates summary values in a summary table after each transaction. While the trigger has some overhead, overall it may be less than having to calculate the data every time the summary data is needed. You may have to experiment to see which method is fastest for your environment."

Could I ask for your expert comments, please?

Cheers
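The trigger approach from that tip might look something like the sketch below. It assumes the hypothetical dbo.TimeEntries and dbo.AssignmentSummary tables from earlier in the thread, and only handles INSERTs; a real implementation would also need UPDATE and DELETE triggers.

```sql
-- Hypothetical AFTER INSERT trigger that keeps the summary table in sync.
-- Note: "inserted" is the pseudo-table of new rows, and a single INSERT
-- can affect many rows, so we aggregate it before joining.
CREATE TRIGGER trg_TimeEntries_Insert
ON dbo.TimeEntries
AFTER INSERT
AS
BEGIN
    UPDATE s
    SET s.TotalHours = s.TotalHours + i.AddedHours
    FROM dbo.AssignmentSummary s
    INNER JOIN (
        SELECT AssignmentID, SUM(HoursSpent) AS AddedHours
        FROM inserted
        GROUP BY AssignmentID
    ) i ON s.AssignmentID = i.AssignmentID
END
```

Unlike the scheduled refresh, the summary is always current, at the cost of a small amount of extra work on every write.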
derrickleggett
Pointy Haired Yak DBA
4184 Posts

Posted - 2004-06-28 : 23:38:11
It's better to control your point of entry and do the sums in the procedures that are inserting/updating, if you can manage your processes well enough. This is my opinion, of course; I'm sure there are plenty on here who will disagree.

MeanOldDBA
derrickleggett@hotmail.com
When life gives you a lemon, fire the DBA.
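The "point of entry" alternative above could be sketched as a single procedure that performs the insert and the summary update in one transaction, so no trigger is needed. Object names are again hypothetical; this only works if all writes go through the procedure.

```sql
-- Hypothetical single point of entry: insert the time entry and maintain
-- the summary in the same transaction, so both succeed or both roll back.
CREATE PROCEDURE dbo.usp_AddTimeEntry
    @AssignmentID int,
    @HoursSpent   decimal(9,2)
AS
BEGIN
    BEGIN TRANSACTION
        INSERT INTO dbo.TimeEntries (AssignmentID, HoursSpent)
        VALUES (@AssignmentID, @HoursSpent)

        UPDATE dbo.AssignmentSummary
        SET TotalHours = TotalHours + @HoursSpent
        WHERE AssignmentID = @AssignmentID
    COMMIT TRANSACTION
END
```

The risk is exactly the one named in the post: if any code writes to dbo.TimeEntries without going through this procedure, the summary silently drifts, which is why triggers are sometimes preferred despite their overhead.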