esthera
Master Smack Fu Yak Hacker
1410 Posts |
Posted - 2008-07-02 : 01:58:01
|
I just deleted 989,279 rows from a table. After a big delete like this, is there anything I should do to reindex, or does SQL Server handle it by itself? |
|
sodeep
Master Smack Fu Yak Hacker
7174 Posts |
Posted - 2008-07-02 : 08:09:45
|
Update the statistics on that table, and run a reindex during off-hours if you see fragmentation. |
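A minimal sketch of both steps, assuming SQL Server 2005 and a hypothetical table name `dbo.BigTable` (the DMV below reports fragmentation per index, which tells you whether the reindex is worth doing):

```sql
-- Check fragmentation after the big delete (SQL Server 2005+).
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.BigTable'), NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id;

-- Refresh the statistics on the table so the optimizer sees the new row counts.
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;
```

`WITH FULLSCAN` is the thorough option; a sampled update (the default) is cheaper on very large tables.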
|
|
esthera
Master Smack Fu Yak Hacker
1410 Posts |
Posted - 2008-07-02 : 08:22:55
|
Are there any risks of losing data by doing these? |
|
|
sodeep
Master Smack Fu Yak Hacker
7174 Posts |
Posted - 2008-07-02 : 08:26:02
|
You have already deleted the data. Updating statistics refreshes them so the optimizer can build better execution plans. |
|
|
esthera
Master Smack Fu Yak Hacker
1410 Posts |
Posted - 2008-07-02 : 08:28:50
|
But the table still has over a million records - can running update stats possibly cause any data loss? |
|
|
sodeep
Master Smack Fu Yak Hacker
7174 Posts |
Posted - 2008-07-02 : 08:29:58
|
No. Updating statistics only recomputes metadata about the data distribution; it never modifies the data itself. |
|
|
esthera
Master Smack Fu Yak Hacker
1410 Posts |
Posted - 2008-07-02 : 08:39:28
|
And what about reindexing? What do they each actually do? |
|
|
sodeep
Master Smack Fu Yak Hacker
7174 Posts |
Posted - 2008-07-02 : 08:41:29
|
As you do massive updates/deletes/inserts, your index pages get fragmented. To fix that, you have to rebuild your indexes, especially in a high-OLTP environment. A rebuild updates the statistics on those indexes as well. |
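For example, on SQL Server 2005 the rebuild itself might look like this (the table name `dbo.BigTable` is hypothetical):

```sql
-- Rebuild every index on the table; this defragments the pages and
-- refreshes the statistics on those indexes as a side effect.
ALTER INDEX ALL ON dbo.BigTable REBUILD;

-- Lighter-weight alternative for moderate fragmentation; it stays
-- online, but does NOT update statistics.
ALTER INDEX ALL ON dbo.BigTable REORGANIZE;
```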
|
|
tfountain
Constraint Violating Yak Guru
491 Posts |
Posted - 2008-07-02 : 10:12:43
|
We have a very busy transactional database. I currently run sp_updatestats every 20 minutes through the day, and we have nightly jobs during the week that perform online index defrags. Late at night on the weekend we perform a full index rebuild. Pretty standard stuff. The usage of sp_updatestats was a later (recent) addition, but the rest has been in place for years and works like a charm. With 2005, sp_updatestats has been optimized to only update stats if they have reached a threshold, so having it run every 20 minutes does not impact our performance at all. It usually finishes in 1-3 minutes (and this is going across 14 user databases). |
|
|
sodeep
Master Smack Fu Yak Hacker
7174 Posts |
Posted - 2008-07-02 : 10:21:54
|
quote: Originally posted by tfountain We have a very busy transactional database. I currently run sp_updatestats every 20 minutes through the day [...] With 2005, sp_updatestats has been optimized to only update stats if they have reached a threshold, so having it run every 20 minutes does not impact our performance at all.
So you mean auto-update statistics is turned off in all your databases? |
|
|
tfountain
Constraint Violating Yak Guru
491 Posts |
Posted - 2008-07-02 : 10:48:24
|
quote: Originally posted by sodeep So you mean auto-update statistics is turned off in all your databases?
We have auto update statistics on, and the auto update async option off. But even then we still have issues with statistics getting out of whack before SQL Server updates them (at its leisure). Even the async version will not help us in this regard, since it does not benefit the currently executing query. However, forcing the stats to be updated provides an immediate performance boost. See, the way these mechanisms are designed is 'reactive' in nature: they update statistics based on detecting they are out of date *when a request tries to use them*. Initiating the update stats on a schedule allows us to be more 'proactive' and avoid any temporary performance hits. Note that the primary need for this is that we have several large processes written in T-SQL that are basically 'self-degrading' in terms of performance. By 'self-degrading' I mean that, if you look at how SQL Server creates and uses execution plans, these processes mess up the plans and stats that the procs started with, and keep hitting the same tables over and over again with massive inserts/updates/deletes. |
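As a sketch, the proactive schedule described above can be as simple as a SQL Agent job step per database that runs:

```sql
-- Run in each user database on a schedule (e.g. every 20 minutes).
-- In 2005, sp_updatestats only touches statistics whose modification
-- counters have crossed SQL Server's internal staleness threshold, so
-- frequent runs are cheap when little has changed.
EXEC sp_updatestats;
```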
|
|
esthera
Master Smack Fu Yak Hacker
1410 Posts |
Posted - 2008-07-03 : 01:53:43
|
OK, I reindexed. Not sure how to tell if it made a difference or not. How often would you recommend reindexing? |
|
|
TRACEYSQL
Aged Yak Warrior
594 Posts |
Posted - 2008-07-04 : 11:43:58
|
tfountain, do you have a couple of your update-stats scripts for doing all databases? And do you put that in a "SYSTEMADMIN" DB and run the sp from there, or in master? Cheers |
|
|
rmiao
Master Smack Fu Yak Hacker
7266 Posts |
Posted - 2008-07-04 : 20:03:39
|
>> how often would you recommend to reindex? Depends on how you modify data; you can do it weekly, on the weekend. |
|
|
tfountain
Constraint Violating Yak Guru
491 Posts |
Posted - 2008-07-05 : 23:02:02
|
quote: Originally posted by esthera OK, I reindexed. Not sure how to tell if it made a difference or not. How often would you recommend reindexing?
As rmiao stated, it depends on the activity of your database. In our shop, we have a very busy database and we perform index defrags nightly and index rebuilds weekly. |
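A common rule of thumb for choosing between the nightly defrag and the weekly rebuild (thresholds vary by shop; `dbo.BigTable` and the cutoffs are illustrative, and index_id 1 is the clustered index):

```sql
-- Reorganize at moderate fragmentation, rebuild when it gets heavy.
DECLARE @frag float;

SELECT @frag = avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.BigTable'), 1, NULL, 'LIMITED');

IF @frag > 30.0
    ALTER INDEX ALL ON dbo.BigTable REBUILD;
ELSE IF @frag > 5.0
    ALTER INDEX ALL ON dbo.BigTable REORGANIZE;
```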
|
|
tfountain
Constraint Violating Yak Guru
491 Posts |
Posted - 2008-07-05 : 23:07:27
|
quote: Originally posted by TRACEYSQL tfountain, do you have a couple of your update-stats scripts for doing all databases? And do you put that in a "SYSTEMADMIN" DB and run the sp from there, or in master? Cheers
I have a database on each of our servers just for DBA-related stuff. No one has access except for me. I have a simple stored procedure that uses sp_MSforeachdb and simply calls the update-stats system procedure. I personally avoid putting things in the master database unless absolutely necessary. IMO, the only legitimate reason would be to have the ability to call a proc that acts on any object in the context of the database it's executed in. But even then you can get a little fancy and use dynamic statements to execute commands in the context of a specific database. |
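A minimal sketch of the kind of wrapper described (the procedure name is hypothetical; sp_MSforeachdb is an undocumented but widely used system procedure whose '?' placeholder is replaced with each database name):

```sql
-- Kept in a dedicated DBA database, not in master.
CREATE PROCEDURE dbo.usp_UpdateStatsAllDbs
AS
BEGIN
    EXEC sp_MSforeachdb N'
        IF DB_ID(''?'') > 4   -- skip master, tempdb, model, msdb
            EXEC [?].dbo.sp_updatestats;';
END;
```

Qualifying the call as `[?].dbo.sp_updatestats` makes the system procedure run in the context of each user database in turn.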
|
|
TRACEYSQL
Aged Yak Warrior
594 Posts |
Posted - 2008-07-06 : 08:28:40
|
sp_MSforeachdb - I'll give that a try. Cheers |
|
|
TRACEYSQL
Aged Yak Warrior
594 Posts |
Posted - 2008-07-06 : 08:34:19
|
Do you do the full one each night? And is it correct that a full run only does the ones that need it? |
|
|
tfountain
Constraint Violating Yak Guru
491 Posts |
Posted - 2008-07-07 : 13:50:10
|
quote: Originally posted by TRACEYSQL Do you do the full one each night? And is it correct that a full run only does the ones that need it?
I issue the statement EXEC sp_updatestats @resample = 'resample'; In 2005, sp_updatestats has some optimizations built in, so it does not process an index/statistic that does not meet the thresholds SQL Server uses internally to determine whether a stat is out of date. With that, I have found no adverse impact from running this across the entire database. |
|
|
|