
 Indexing and DataWarehousing database


dewacorp.alliances

452 Posts

Posted - 2006-09-08 : 02:28:50
Hi there

We have a database used for data warehousing. Every night we pump in the data, and prior to that we delete the records in the fact table (about 3 million records). So, is indexing going to help with the initial processing of the data that uses that fact table, given that we delete the records at the start anyway?

So when exactly does the indexing kick in? Is it when the new records get inserted, or ... ?
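To give a rough picture, the nightly load is basically along these lines (the table and column names here are only simplified placeholders, not our real schema):

-- dbo.FactSales and SourceDB.dbo.SalesStaging are placeholder names
-- Step 1: clear the fact table (about 3 million rows)
DELETE FROM dbo.FactSales

-- Step 2: pump the data back in from the source database
INSERT INTO dbo.FactSales (DateKey, ProductKey, Quantity, Amount)
SELECT DateKey, ProductKey, Quantity, Amount
FROM SourceDB.dbo.SalesStaging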

Thanks.

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2006-09-08 : 02:32:15
Indexes are used for the execution plans of queries. We'd have to see the queries in order to tell you if the index is being used.
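For example, on SQL Server 2000 you can look at the plan without actually running the query, something along these lines (the table and query here are just placeholders; substitute one of your real warehouse queries):

SET SHOWPLAN_TEXT ON
GO
-- placeholder query against a placeholder fact table
SELECT SUM(Amount) FROM dbo.FactSales WHERE DateKey = 20060908
GO
SET SHOWPLAN_TEXT OFF
GO

If the output shows an Index Seek or Index Scan on your index instead of a Table Scan, the index is being used for that query.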

If you are deleting that many rows each time, you might want to do a DBCC DBREINDEX on the table, as your indexes may be highly fragmented. You should also consider running UPDATE STATISTICS.
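Something like this after the nightly load would be a starting point (the table name is a placeholder and the fill factor is just a value to test with):

-- dbo.FactSales is a placeholder; use your fact table name
-- rebuild every index on the fact table with a 90% fill factor
DBCC DBREINDEX ('dbo.FactSales', '', 90)

-- refresh the statistics on the table
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN

DBCC DBREINDEX already refreshes the statistics on the indexes it rebuilds, so UPDATE STATISTICS mainly matters for column statistics and any indexes you leave alone.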

Tara Kizer

dewacorp.alliances

452 Posts

Posted - 2006-09-08 : 02:39:06
Hi Tara

Basically, this is what we do in DTS:
1. Delete all rows from the fact tables - 3 million records
2. Pump the data from the source database into these fact tables
3. Manipulate/massage the data (deletes, inserts, creating YTD figures, totals, etc.)
4. Data warehouse ready to go.

So ... is your recommendation going to help here? Does this mean I have to run DBCC DBREINDEX and UPDATE STATISTICS in one of the data manipulation steps (step 3)?
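If it does help, I'm guessing it would just be an Execute SQL Task at the end of step 3, along these lines (FactSales is only a placeholder for our fact table name):

-- dbo.FactSales is a placeholder table name
DBCC DBREINDEX ('dbo.FactSales', '', 90)
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN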

Thanks



tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2006-09-08 : 02:50:59
I can't tell you if it's going to help or not. You have to test it. I just made suggestions per my experience.

Tara Kizer

anilkdanta
Starting Member

25 Posts

Posted - 2006-09-09 : 08:00:00
Hi,

I don't suggest deleting 3 million records every time you push the data to the warehouse, nor transferring 3 million (and in future, more) records every time from source to destination.

Instead of deleting all the records, think about a solution with your DTS package (or whatever your existing process is) that takes only the data added since the last load. This will reduce your processing time, index updates, etc.
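For example, if the source rows carry something like a LoadDate or ModifiedDate column, an incremental pull could look roughly like this (table and column names are only placeholders for illustration):

-- dbo.FactSales and SourceDB.dbo.SalesStaging are placeholder names
-- find the point the warehouse was last loaded up to
DECLARE @LastLoad datetime
SELECT @LastLoad = MAX(LoadDate) FROM dbo.FactSales

-- bring over only the rows added since then
INSERT INTO dbo.FactSales (DateKey, ProductKey, Quantity, Amount, LoadDate)
SELECT DateKey, ProductKey, Quantity, Amount, LoadDate
FROM SourceDB.dbo.SalesStaging
WHERE LoadDate > @LastLoad

That way only the new rows get moved and indexed each night, instead of deleting and reinserting all 3 million.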

Try looking at Replication architectures if you think one of them suits your requirements.



blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2006-09-09 : 23:41:16
valdyv, you haven't actually described any problem.

Is the nightly processing taking too long?
Are OLAP queries during the day taking too long?

"I have HAD it with these muthu-f$#%in' cursors in my muthu-f$#%in' database!"
   
