I have a large database in which one table holds approximately 240 GB of data. When I rebuild the indexes for the whole database, every table rebuilds successfully... except that one 240 GB table.
It's a SQL Server 2008 instance, and the tables were created a few years back.
What are the best ways to investigate that specific table? What could the problem be? What should I focus on in that table? What could actually be preventing the index rebuild?
If that table never gets reindexed, then something in the reindex script is excluding it. It's not uncommon for custom rebuild scripts to explicitly skip the largest tables in a database. You'll have to go through the script and find out why that table is excluded.
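One way to confirm the table is actually being skipped is to check its fragmentation directly after the maintenance job runs; if the rebuild were happening, fragmentation would drop. A sketch, where YourDatabase and dbo.YourBigTable are placeholder names you'd substitute:

```sql
-- High avg_fragmentation_in_percent right after the maintenance job
-- suggests the table is being skipped by the rebuild script.
SELECT  i.name AS index_name,
        ps.index_type_desc,
        ps.avg_fragmentation_in_percent,
        ps.page_count
FROM    sys.dm_db_index_physical_stats(
            DB_ID('YourDatabase'),
            OBJECT_ID('dbo.YourBigTable'),
            NULL, NULL, 'LIMITED') AS ps
JOIN    sys.indexes AS i
        ON  i.object_id = ps.object_id
        AND i.index_id  = ps.index_id;
```

The 'LIMITED' scan mode keeps the check cheap, which matters on a table this size.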
One possible reason to look for: in SQL Server 2008, indexes containing LOB columns (image, text, ntext, varchar(MAX), nvarchar(MAX), varbinary(MAX), xml) cannot be rebuilt with the ONLINE = ON option. A table as large as yours could easily have this configuration. You can still rebuild the index with ONLINE = OFF, but you'll want to schedule it for a window when you won't mind the table being unavailable.
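A sketch of how to check for that, again with dbo.YourBigTable as a placeholder:

```sql
-- 1) List LOB columns that would block an ONLINE rebuild in SQL 2008.
--    max_length = -1 indicates varchar/nvarchar/varbinary(MAX).
SELECT  c.name, t.name AS type_name, c.max_length
FROM    sys.columns AS c
JOIN    sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE   c.object_id = OBJECT_ID('dbo.YourBigTable')
  AND  (t.name IN ('image', 'text', 'ntext', 'xml')
        OR c.max_length = -1);

-- 2) If LOB columns are present, rebuild offline in a maintenance window.
ALTER INDEX ALL ON dbo.YourBigTable
REBUILD WITH (ONLINE = OFF, SORT_IN_TEMPDB = ON);
```

SORT_IN_TEMPDB = ON moves the sort work into tempdb, which can reduce pressure on the user database's files during a rebuild this large; it's optional.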