mark.s
Starting Member
3 Posts
Posted - 2007-07-20 : 03:58:05
There may or may not be an upper limit for the number of rows in a table, but is there any performance-related limit?

I'm designing a database that stores results acquired from a number of devices. Each device provides a set of data measurements every 10 minutes, so each year a device will produce 52000 sets of results.

If I design a table to store a row for each set of measurements from a device (PK is based on the timestamp and the deviceID), and if there are 100 devices recording for 5 years, there will be 52000 x 100 x 5 rows. Would I get a performance increase by separating this data into one table per year? Perhaps the year could be appended to the table name to identify the particular tables.

A secondary issue is that some devices can also be configured to produce a different set of measurements every 10 seconds. In this case there will be hundreds of millions of rows over a 5-year period. Therefore I am considering bulking the results into an array for each 10-minute period and storing that array as a blob every 10 minutes. Is this going to be faster or slower than having hundreds of millions of rows?

Thanks in advance for any advice,
Mark.
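For reference, a minimal sketch of the kind of table described above, written for SQL Server; the table and column names (dbo.DeviceMeasurement, Value1, Value2) are illustrative assumptions, not taken from the post:

-- Hypothetical schema: one row per device per 10-minute measurement set.
-- Column names and types are assumptions for illustration only.
CREATE TABLE dbo.DeviceMeasurement (
    DeviceID    int       NOT NULL,
    MeasuredAt  datetime  NOT NULL,
    Value1      float     NULL,
    Value2      float     NULL,
    CONSTRAINT PK_DeviceMeasurement
        PRIMARY KEY CLUSTERED (DeviceID, MeasuredAt)
);

Clustering on (DeviceID, MeasuredAt) keeps each device's readings physically together, which suits range queries such as "all readings for device X during 2006".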
spirit1
Cybernetic Yak Master
11752 Posts
Posted - 2007-07-20 : 04:18:48
A few million rows per table is nothing. Just keep them properly indexed, keep statistics updated, and keep the indexes defragmented, and you'll be fine.

_______________________________________________
Causing trouble since 1980
blog: http://weblogs.sqlteam.com/mladenp
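A minimal sketch of that routine maintenance in T-SQL, assuming the hypothetical dbo.DeviceMeasurement table and PK_DeviceMeasurement index sketched earlier in the thread:

-- Defragment the clustered index (REORGANIZE for light fragmentation,
-- REBUILD for heavy fragmentation).
ALTER INDEX PK_DeviceMeasurement ON dbo.DeviceMeasurement REORGANIZE;

-- Refresh the optimizer's statistics for the table.
UPDATE STATISTICS dbo.DeviceMeasurement WITH FULLSCAN;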
rmiao
Master Smack Fu Yak Hacker
7266 Posts
Posted - 2007-07-20 : 10:38:18
It's limited by storage if you don't take performance into account.
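For what it's worth, the space a table actually consumes can be checked with sp_spaceused, shown here against the hypothetical table from earlier in the thread:

-- Reports row count, reserved, data, index_size, and unused space.
EXEC sp_spaceused 'dbo.DeviceMeasurement';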