SQL Server Forums
 Huge Table

newty25
Starting Member

21 Posts

Posted - 03/11/2002 :  14:33:48
I have an MS SQL Server database that contains a table that has gotten, and is still getting, out of control. As a matter of fact, there is so much data in the table now that querying it has become quite a lengthy process.

I have considered breaking the table into yearly tables and creating a view to link them all (using the table's original name as the view name...), but I believe there has to be a better way. I'm just not sure that a view will speed up a query. Is there a better way to do this without a major rewrite of the code I have deployed to interact with the database?
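Just to illustrate the idea I'm describing (the table and column names below are made up, not my real schema), a UNION ALL view over yearly tables might look roughly like this; with a CHECK constraint on the date column in each yearly table, the optimizer can skip the tables that can't match a dated query:

-- Hypothetical yearly tables; names and columns are illustrative only.
CREATE TABLE dbo.Orders2000 (
    OrderID   int      NOT NULL PRIMARY KEY,
    OrderDate datetime NOT NULL
        CONSTRAINT CK_Orders2000_Year
        CHECK (OrderDate >= '20000101' AND OrderDate < '20010101'),
    Amount    money    NOT NULL
)

CREATE TABLE dbo.Orders2001 (
    OrderID   int      NOT NULL PRIMARY KEY,
    OrderDate datetime NOT NULL
        CONSTRAINT CK_Orders2001_Year
        CHECK (OrderDate >= '20010101' AND OrderDate < '20020101'),
    Amount    money    NOT NULL
)
GO

-- The view keeps the original table name so existing code still works.
CREATE VIEW dbo.Orders
AS
SELECT OrderID, OrderDate, Amount FROM dbo.Orders2000
UNION ALL
SELECT OrderID, OrderDate, Amount FROM dbo.Orders2001
GO

A plain view like this doesn't make a query faster by itself; the win only comes if the CHECK constraints let the optimizer touch just the relevant year.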

I became the database administrator by default... (I'm really just a programmer) since I'm the one coding up the projects that interact with the data, but I have little background or experience with these types of problems.

Thanks in advance,
newty25

smccreadie
Aged Yak Warrior

USA
505 Posts

Posted - 03/11/2002 :  14:47:18
Well, you don't say how big your table is, so it's hard to make specific recommendations. SQL Server supports huge tables just fine, and they can even be split over multiple disks.

Probably the biggest benefit for queries is proper indexing. Make sure your join fields and fields in a where clause are indexed. You're allowed one clustered index per table - use it.

Other than that, you can look at the database design and determine if there's a better structure. Sometimes backing off from a fully normalized design will give you much better query performance by minimizing joins.
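As a rough sketch of that indexing advice (the table and column names here are invented, not from your post):

-- Clustered index on the column most queries filter or range over:
CREATE CLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate)

-- Nonclustered indexes on join keys and common WHERE-clause columns:
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
    ON dbo.Orders (CustomerID)

Putting the clustered index on the heavily filtered column lets range queries read rows in order instead of scanning the whole table.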




robvolk
Most Valuable Yak

USA
15654 Posts

Posted - 03/11/2002 :  14:51:29
It may not really be the volume of data, but how you access it and how the table is indexed (or not). I personally have had 100,000-row SQL Server tables that were queried for 1,000 rows and returned results in less than 5 seconds, on low-end hardware. Many people here have had even larger tables that perform just as well or better. It's just a question of tuning the data and hardware you have now. If you can answer these questions for us:

-Specifically, how many tables are involved, and how many rows do you have?
-What kind of hardware is SQL Server running on? How much RAM does it have?
-Is the SQL Server used for anything else, like Exchange or IIS?
-How is the data queried (Query Analyzer, VB/C++ app, web app? Using ADO? Stored procedures?)
-Are there indexes on the table(s)?
-Are you using cursors?
-How many people are hitting the server at one time?

...we should be able to give you some guidelines on improving it. If you can post code, table definitions/DDL statements, whatever you can, it would be a huge help.
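For the index and size questions above, a couple of standard system procedures will pull the answers from Query Analyzer quickly (the table name here is just a placeholder):

-- Replace dbo.Orders with the real table name.
EXEC sp_helpindex 'dbo.Orders'   -- lists the indexes on the table
EXEC sp_spaceused 'dbo.Orders'   -- row count and space used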

Hey graz, we need a bullseye emoticon cause the SNIPING is pretty heavy!

Edited by - robvolk on 03/11/2002 14:52:28

newty25
Starting Member

21 Posts

Posted - 03/11/2002 :  17:10:15
There is only one table involved; it is indexed and has a clustered index. The volume of data is on the order of 6.74 million rows. I'm not sure about the CPU, but I know the server has 512 MB of RAM (it's a multi-processor platform, probably 2 x 500 MHz). The server isn't used for anything else... I don't know about the cursoring... mostly one user at a time, but no more than five users at a time (on an extremely heavy day)... and mainly VB is used to interact with the database, but some analysts use MS Access and link to the table via ODBC.

I hope that helps... but I have a feeling it's futile. Thanks again for the help!
newty25


chadmat
The Chadinator

USA
1974 Posts

Posted - 03/11/2002 :  17:22:53
I was going to suggest a distributed partitioned view, but that is probably overkill for such a small user base.

Have you run perfmon to see where your bottlenecks are? I would bet memory (just a WAG) is the bottleneck; that is a pretty bare-bones machine.
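If it's easier than firing up perfmon, SQL Server also exposes its counters through master..sysperfinfo, so you can at least eyeball memory pressure from Query Analyzer. Just a sketch of that idea:

-- A sustained low buffer cache hit ratio usually points at memory pressure.
-- (The ratio is reported as two raw counters: the value and its base.)
SELECT object_name, counter_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE counter_name LIKE 'Buffer cache hit ratio%'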

-Chad


rrb
SQLTeam Poet Laureate

Australia
1479 Posts

Posted - 03/11/2002 :  17:59:53
quote:

I personally have had 100,000 row SQL Server tables that were queried for 1,000 rows, and returned results in less than 5 seconds, on low-level hardware.


Hey rob, I get a sub-1 second response retrieving a 1000 row result from a table with 320 000 rows. Did I read you right?

Newty - when you say slow, how slow? The Access connections could be causing some pain, but I'm sure chadmat can sort that out for you.

* How about timing something from QA, say a SELECT COUNT(*) against the table, and seeing how long that takes? (There's a quick sketch below.)
* Is the response the same at different times of the day?
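A quick timing sketch for Query Analyzer (substitute the real table name):

-- Elapsed and CPU times show up in the Messages pane.
SET STATISTICS TIME ON
SELECT COUNT(*) FROM dbo.Orders
SET STATISTICS TIME OFF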

--
I hope that when I die someone will say of me "That guy sure owed me a lot of money"

robvolk
Most Valuable Yak

USA
15654 Posts

Posted - 03/11/2002 :  18:04:26
Yeah, that was on a 400 MHz single-processor machine running SQL 6.5 (not to mention the ADO and ASP overhead), so I imagine you'd get MUCH better performance with a real machine.


rrb
SQLTeam Poet Laureate

Australia
1479 Posts

Posted - 03/11/2002 :  18:05:46
quote:

Yeah, that was on a 400 MHz single-processor machine running SQL 6.5 (not to mention the ADO and ASP overhead), so I imagine you'd get MUCH better performance with a real machine.



Just out of interest, how fast were you pedaling?

--
I hope that when I die someone will say of me "That guy sure owed me a lot of money"

robvolk
Most Valuable Yak

USA
15654 Posts

Posted - 03/11/2002 :  18:07:23
Not me, the hamsters


Marcio
Starting Member

USA
2 Posts

Posted - 03/14/2002 :  16:18:56
1) For a situation like that, I believe you need to consider your indexes. How many indexes do you have, and, moreover, are the indexed columns actually being used in the WHERE clause? If not, the indexes are there but your queries aren't using them.
2) Does your table have a lot of inserts and updates? If so, you may need to rethink your indexes again, since indexes are maintained on every insert or update, and with a table that large your indexes may be fragmented.
NOTE: In summary, for a table like that you really need to start reading about "tuning" and "index optimization".
3) Consider moving the part of the data you need into another table, for example every two hours. SQL Server is very good with DTS packages, and you may be able to pre-stage the data you query most often (a rough sketch of that follows).
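A crude sketch of that last idea, with invented names (the real thing could live in a DTS package or a scheduled job):

-- Refresh a small reporting table every couple of hours with just the
-- slice of data that gets queried most often.
TRUNCATE TABLE dbo.Orders_Recent

INSERT INTO dbo.Orders_Recent (OrderID, OrderDate, Amount)
SELECT OrderID, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate >= DATEADD(month, -1, GETDATE())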
