Hi! I recently worked on a data warehouse server that has performance issues. When I dug into the database, I found that the frequently used tables each have more than 5 billion records. What steps should I take to optimize fetching records from such tables? All of these tables already have indexes, and statistics are updated daily.
Look at archiving old data and splitting large queries into separate steps. Also consider creating aggregate tables. If the tables are indexed well, performance should depend more on the amount of data being retrieved than on the size of the table, and you should rarely need to return thousands of rows.
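As a minimal sketch of the aggregate-table idea, assuming a hypothetical fact table `SalesFact` (the table and column names are illustrative, not from your system):

```sql
-- Hypothetical example: pre-aggregate daily totals from a large fact table
-- so reports read a small summary table instead of scanning billions of rows.
CREATE TABLE SalesDailySummary (
    SaleDate     date           NOT NULL,
    ProductId    int            NOT NULL,
    TotalQty     bigint         NOT NULL,
    TotalAmount  decimal(18, 2) NOT NULL,
    CONSTRAINT PK_SalesDailySummary PRIMARY KEY (SaleDate, ProductId)
);

-- Refresh one day's totals as part of the nightly load
-- (@LoadDate would be supplied by your ETL process).
INSERT INTO SalesDailySummary (SaleDate, ProductId, TotalQty, TotalAmount)
SELECT CAST(SaleTime AS date), ProductId, SUM(Qty), SUM(Amount)
FROM SalesFact
WHERE SaleTime >= @LoadDate
  AND SaleTime <  DATEADD(day, 1, @LoadDate)
GROUP BY CAST(SaleTime AS date), ProductId;
```

Any report that only needs daily totals can then query `SalesDailySummary`, which stays several orders of magnitude smaller than the base table.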