SQL Server Forums

 SQL Server Limitation on Data Size

DestinyJack
Starting Member

22 Posts

Posted - 08/04/2005 :  05:38:47
Greetings everyone, this is my first post in this forum. Nice to meet you all.

I have a Visual Basic .NET application that uses SQL Server as its database. There is a lot of data in the database (about 193 tables, around a quarter of which have at least 1 million records). There is also a so-called core table (with 1.9 million records) in this database that every user has to access to perform certain tasks.

So the question is: if there are 13 concurrent users running the application, will it cause performance issues? Like making the application unresponsive?

Kristen
Test

United Kingdom
22403 Posts

Posted - 08/04/2005 :  08:01:42
Hi DestinyJack, Welcome to SQL Team!

Well ... "it depends"

If all interaction with the database is through Stored Procedures, and the tables, indexes and queries are carefully optimised, and the database is on a dedicated SQL Server box with decent hardware, I reckon you'll be fine.

But if the quality of the application, the database design, the hardware it's running on or the skills of the DBA etc. are "rubbish" then you probably haven't got a hope, to be honest.

Kristen

Stalker
Yak Posting Veteran

Russia
80 Posts

Posted - 08/04/2005 :  08:06:08
:)

It depends on so many factors...

For example, it is possible to kill performance even with several thousand rows. On the other hand, if you've designed your database the right way, it will work with billions of rows (on the Microsoft website you can find information about a database 5.3 TB in size with about 33 billion rows).

DestinyJack
Starting Member

22 Posts

Posted - 08/04/2005 :  22:38:04
I see, thanks for the reply.

The interaction with the database is through Stored Procedures. The stored procedure is just a simple "INSERT INTO (blah blah blah...) SELECT something FROM TableA". TableA here is the table that has 1.9 million records and 13 users accessing it concurrently. This table has no primary key, and it has only 1 composite index, which consists of 4 columns.

I believe that the problem comes from here, and I am thinking of adding a unique identifier as the primary key and splitting the composite index into 4 separate indexes instead, but I am not sure whether this will help. Does anyone have any clues that could support this solution?
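In case it helps, here is roughly what I have in mind (TableA and the column and index names below are just placeholders, not my real schema):

```sql
-- Add a surrogate key column and make it the primary key
-- (TableA, RowID, ColN and index names are placeholders)
ALTER TABLE dbo.TableA ADD RowID INT IDENTITY(1,1) NOT NULL
ALTER TABLE dbo.TableA ADD CONSTRAINT PK_TableA PRIMARY KEY (RowID)

-- Drop the 4-column composite index and replace it
-- with 4 single-column indexes
DROP INDEX TableA.IX_TableA_Composite
CREATE INDEX IX_TableA_Col1 ON dbo.TableA (Col1)
CREATE INDEX IX_TableA_Col2 ON dbo.TableA (Col2)
CREATE INDEX IX_TableA_Col3 ON dbo.TableA (Col3)
CREATE INDEX IX_TableA_Col4 ON dbo.TableA (Col4)
```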

Kristen
Test

United Kingdom
22403 Posts

Posted - 08/04/2005 :  23:57:33
Looks like there is plenty of room for improvement then!

We don't create tables here without Primary Keys - there's always something that can utilise them, even if it's only a 3rd party data comparison tool!

If the unique ID has no particular purpose, it sounds like you want a non-clustered Primary Key.

The composite index is fine provided that all the queries hit on the keys in that index, starting from the first key and optionally using the rest.

Replacing it with 4 separate indexes will make them more general purpose for a variety of queries. However, if the most commonly used query only uses the 4 columns in that index then that query is "covered", and splitting the index into 4 will reduce its performance.
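To illustrate, assuming a composite index on four placeholder columns (Col1 to Col4) of a placeholder table dbo.TableA, only queries that filter starting from the leading key can seek on it:

```sql
-- Assumes a composite index on (Col1, Col2, Col3, Col4) of dbo.TableA
-- (all names here are placeholders)

-- Can seek on the composite index: it filters on the leading key Col1
SELECT Col1, Col2, Col3, Col4
FROM dbo.TableA
WHERE Col1 = 1 AND Col2 = 2

-- Cannot seek on it (Col3 is not the leading key), so it will scan
-- instead - a separate index on Col3 would help this query
SELECT Col3
FROM dbo.TableA
WHERE Col3 = 3
```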

But I reckon your best bet would be to look at the performance of some of your queries. Here's what I do:

I isolate a piece of code that I want to test which will run stand-alone in Query Analyser.

Then I surround it with the following code:

-- Clear cache (for level playing field)
-- Do **NOT** use these on a Production server
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE

-- Comment IN the SHOWPLAN to see the Query Plan
-- or comment IN the STATISTICS to see the logical stats.

-- SET SHOWPLAN_TEXT ON
GO
-- SET STATISTICS IO ON; SET STATISTICS TIME ON

-- ... put query here e.g.

SELECT * FROM Northwind.dbo.Products

SET STATISTICS IO OFF; SET STATISTICS TIME OFF
GO
SET SHOWPLAN_TEXT OFF
GO

The STATISTICS will show the Scan count and Logical reads. Work on getting these numbers as low as possible. The physical timings will vary depending on server load and data caching - clear the cache for a "ground zero" test (but that's not a good idea on a production system!)
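For the Northwind example above, the STATISTICS IO output takes a form something like this (the exact figures will of course vary with your server):

```
Table 'Products'. Scan count 1, logical reads 4, physical reads 0, read-ahead reads 0.
```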

Kristen
SQL Server Forums © 2000-2009 SQLTeam Publishing, LLC