dewacorp.alliances
452 Posts |
Posted - 2006-09-08 : 02:28:50
|
Hi there

We have a database for data warehousing, and every night we pump data into it. Prior to that, we delete the records in the fact table (3 million records). So, does indexing help the initial processing of the data that uses that fact table, given that we are deleting the records first anyway? When exactly does the indexing kick in: is it when the new records get inserted, or at some other point?

Thanks. |
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2006-09-08 : 02:32:15
|
Indexes are used in the execution plans of queries, so we'd have to see the queries in order to tell you whether the index is being used. If you are deleting that many rows each time, you might want to run DBCC DBREINDEX on the table, as your data may be highly fragmented. You should also consider running UPDATE STATISTICS.

Tara Kizer |
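The two maintenance commands Tara mentions might look like this in SQL Server 2000-era T-SQL. The table name `dbo.FactSales` is hypothetical; substitute your own fact table:

```sql
-- Rebuild all indexes on the fact table after the nightly load.
-- '' = all indexes; 90 = fill factor (adjust or omit as needed).
DBCC DBREINDEX ('dbo.FactSales', '', 90)
GO

-- Refresh optimizer statistics with a full scan of the table.
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN
GO
```

Note that DBCC DBREINDEX also updates the statistics for the indexes it rebuilds, so the separate UPDATE STATISTICS mainly helps column statistics not covered by an index.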
 |
|
dewacorp.alliances
452 Posts |
Posted - 2006-09-08 : 02:39:06
|
Hi Tara

Basically, this is what we do in DTS:
1. Delete all fact table rows (3 million records).
2. Pump the data from the source database into these fact tables.
3. Manipulate/massage the data (delete, insert, create YTD, totals, etc.).
4. The data warehouse is ready to go.

So, will your recommendation help here? Does this mean I should run DBCC DBREINDEX and UPDATE STATISTICS as part of the data manipulation step (3)?

Thanks |
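One way to wire Tara's suggestion into the DTS flow above is an Execute SQL task added after the load and massage steps, so the indexes are rebuilt once the data is in its final shape. A minimal sketch, again assuming a hypothetical fact table `dbo.FactSales`:

```sql
-- Contents of an Execute SQL task placed after step 3 (data massage),
-- before the warehouse is opened for the day's queries.
DBCC DBREINDEX ('dbo.FactSales')
UPDATE STATISTICS dbo.FactSales
```

Running it after step 3 rather than inside it avoids rebuilding indexes that the massage step is about to fragment again.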
 |
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2006-09-08 : 02:50:59
|
I can't tell you whether it's going to help or not; you'll have to test it. I'm just making suggestions based on my experience.

Tara Kizer |
 |
|
anilkdanta
Starting Member
25 Posts |
Posted - 2006-09-09 : 08:00:00
|
Hi,

I don't suggest deleting 3 million records every time you push data to the warehouse, nor transferring 3 million (in future, more) records from source to destination each time.

Instead of deleting all the records, think about a solution, with a DTS package or whatever your existing setup is, that takes only the data added after the last move. This will reduce your processing time, index updates, etc.

Also try looking at the replication architectures if you think one of them suits your requirements. |
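The incremental-load idea above can be sketched as a T-SQL extract that pulls only rows changed since the last run. Every object here is hypothetical (a `LastModified` column on the source, a `dbo.LoadHistory` tracking table, and a `dbo.FactSales` target); the point is the pattern, not the names:

```sql
-- Find the high-water mark of the previous load.
DECLARE @LastLoad datetime
SELECT @LastLoad = MAX(LoadDate) FROM dbo.LoadHistory

-- Pull only rows added or changed since then, instead of the full 3M.
INSERT INTO dbo.FactSales (SaleID, SaleDate, Amount)
SELECT s.SaleID, s.SaleDate, s.Amount
FROM SourceDB.dbo.Sales s
WHERE s.LastModified > @LastLoad

-- Record this run so the next load starts from here.
INSERT INTO dbo.LoadHistory (LoadDate) VALUES (GETDATE())
```

This only works if the source reliably stamps rows with a modification time (or an ever-increasing key); otherwise replication or full reloads remain the fallback.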
 |
|
blindman
Master Smack Fu Yak Hacker
2365 Posts |
Posted - 2006-09-09 : 23:41:16
|
valdyv, you haven't actually described any problem.

Is the nightly processing taking too long?
Are OLAP queries during the day taking too long?

"I have HAD it with these muthu-f$#%in' cursors in my muthu-f$#%in' database!" |
 |
|