
 table size and performance degradation


metser
Starting Member

27 Posts

Posted - 2007-10-21 : 04:35:17
Hello,

Working with SQL Server 2000, I have a table with the following structure:
ID (INT)
userID (INT, foreign key)
productID (INT)
productQTY (DECIMAL(5,2))
purchaseDate(smalldatetime)

I have about 1,000 users, each entering about 20-30 rows per day, i.e. ~20,000 - 30,000 new rows per day. The table might be queried with a simple SELECT for the products a user ordered per day or over a given time frame (purchaseDate column).
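
For reference, here is roughly the DDL (the table name is just a placeholder, and I have assumed NOT NULL columns and an IDENTITY on ID):

CREATE TABLE dbo.UserPurchases (
    ID           INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    userID       INT NOT NULL,        -- foreign key to the users table (constraint not shown)
    productID    INT NOT NULL,
    productQTY   DECIMAL(5,2) NOT NULL,
    purchaseDate SMALLDATETIME NOT NULL
)
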
My question (finally) is - when should I expect to see performance degradation? Is there anything I can do to prevent it (e.g. splitting this table somehow into several tables)?

Thank you all in advance

Kristen
Test

22859 Posts

Posted - 2007-10-21 : 07:05:40
So long as your queries are optimised, appropriate indexes are in place, and the queries are using them, I shouldn't imagine you will have a problem.
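
For example - using the column names from your post, with a made-up table name - an index along these lines should cover the per-user / per-date-range query you describe:

-- Composite index to support "what did user X order in this date range"
CREATE INDEX IX_UserPurchases_userID_purchaseDate
    ON dbo.UserPurchases (userID, purchaseDate)

-- Typical query that should seek on that index
DECLARE @userID int, @startDate smalldatetime, @endDate smalldatetime
SELECT @userID = 123, @startDate = '20071001', @endDate = '20071101'

SELECT productID, productQTY, purchaseDate
FROM dbo.UserPurchases
WHERE userID = @userID
  AND purchaseDate >= @startDate
  AND purchaseDate <  @endDate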

10 or 100 million rows is not a huge amount for SQL Server, but:

1) I would want to put the Archiving solution in place now so that it can be "shaken down" as the system grows, rather than becoming an emergency fix at some future point (there is a rough sketch of this below).

2) I would want to have a plan in place now for appropriate hardware down the line: multiple drives, with database data / log / backups split across them, and so on.

3) I would also want to have a disaster recovery plan in place - a 10 GB database is not really a simple restore after a hardware failure (see the backup/restore sketch below).
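
Re 1), this is the sort of thing I mean by an archiving solution - the archive table name and the 12-month cut-off are only examples, and for these volumes you would want to run it in batches during a quiet period:

BEGIN TRANSACTION

-- Copy rows older than the retention period into an archive table
-- (assumes dbo.UserPurchasesArchive has the same columns, with no IDENTITY on ID)
INSERT INTO dbo.UserPurchasesArchive (ID, userID, productID, productQTY, purchaseDate)
SELECT ID, userID, productID, productQTY, purchaseDate
FROM dbo.UserPurchases
WHERE purchaseDate < DATEADD(month, -12, GETDATE())

-- Then remove them from the live table
DELETE FROM dbo.UserPurchases
WHERE purchaseDate < DATEADD(month, -12, GETDATE())

COMMIT TRANSACTION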
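
Re 3), the backup side of it boils down to something like this (database name and paths are only examples, and it assumes the database is in Full recovery) - the important part is rehearsing the RESTORE on another server so you know how long it actually takes:

-- Regular full backup plus frequent log backups
BACKUP DATABASE MyShop TO DISK = 'E:\Backups\MyShop_Full.bak' WITH INIT
BACKUP LOG MyShop TO DISK = 'E:\Backups\MyShop_Log.trn' WITH INIT

-- Recovery on the replacement server: restore the full backup, then the log(s)
RESTORE DATABASE MyShop FROM DISK = 'E:\Backups\MyShop_Full.bak' WITH NORECOVERY
RESTORE LOG MyShop FROM DISK = 'E:\Backups\MyShop_Log.trn' WITH RECOVERY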

Kristen

metser
Starting Member

27 Posts

Posted - 2007-10-21 : 09:06:45
Thank you for your good advice, Kristen.
   
