
 Updating a Web App


chilluk
Starting Member

24 Posts

Posted - 2002-01-28 : 10:56:51
My website is pure ASP, with a SQL Server 2000 backend. The business requirement is that it must be updated every 2 hours with stock changes etc.

The source of the data is an old DOS based system, so I export that data en masse, and then FTP it to the site.

The data is then picked up by a DTS package, which puts it into holding tables (copies of the live ones in most instances).

Then a series of queries runs to insert, update, and delete against the live tables using the imported data. Some of the queries are more complex, as they compile sales charts etc. from the data.

Finally the full text indexes are re-populated to pick up the data changes.

This all works as designed, but with one problem - the site grinds to a halt for 10 minutes whilst the update runs - which is obviously unacceptable.

I have no easy way to export only the changed data, as there aren't any flags to denote when records have been altered, so I can't easily limit the amount of work the process does.

When the process is running, the task monitor very often shows the server running at full capacity - and this is on a dual processor box with 2 GB RAM!!

Has anyone got any ideas on more efficient ways to do this? Am I missing some config settings, or some locking issues?

Chris Hill
chris@softwarefirst.com

andre
Constraint Violating Yak Guru

259 Posts

Posted - 2002-01-28 : 11:25:58
You might want to try BULK INSERTing the data into temporary tables and then, when that's finished, copying the data into the live tables. BULK INSERT is the fastest way of populating tables I know of.
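A rough sketch of that approach - the staging table `Stock_Staging`, the live table `Stock`, the file path, and the delimiters are all hypothetical and would need to match your actual export:

```sql
-- Load the FTP'd export file into an empty, unindexed staging table.
TRUNCATE TABLE Stock_Staging

BULK INSERT Stock_Staging
FROM 'D:\ftp\stock_export.txt'
WITH (
    FIELDTERMINATOR = ',',   -- adjust to match the DOS export format
    ROWTERMINATOR   = '\n',
    TABLOCK                  -- table lock helps get a minimally logged load
)

-- Then replace the live data in one short, set-based step.
BEGIN TRANSACTION
    DELETE Stock
    INSERT Stock SELECT * FROM Stock_Staging
COMMIT TRANSACTION
```

The point is that the slow part (reading and parsing the file) happens against a table nobody is querying; the live table is only touched at the very end.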


mfemenel
Professor Frink

1421 Posts

Posted - 2002-01-28 : 13:24:41
One thing you might consider is setting up a "data hub": a second server, or at least a powerful desktop, to grind the data down to its final form and then move it over to production. We've implemented that solution where I work, and it's been working great just using a Dell desktop. Just a thought.

Mike
"..still no custom title"

aiken
Aged Yak Warrior

525 Posts

Posted - 2002-01-28 : 15:33:24
Those both sound like good recommendations.

First, though, I'd look at *why* the app grinds to a halt. Is it really CPU and/or memory usage, or is some query being blocked by the inserts? Depending on your requirements, you may be able to get away with using the WITH (NOLOCK) table hint on selects.
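For example, on read-only pages where slightly stale data is acceptable, the hint goes on each table in the SELECT (table and column names hypothetical):

```sql
-- Dirty read: takes no shared locks, so it won't be blocked by the
-- update batch - but it can see uncommitted (possibly rolled-back) rows.
SELECT TOP 10 Title, Format, Sales
FROM Chart WITH (NOLOCK)
ORDER BY Sales DESC
```

That trade-off is usually fine for something like a sales chart, but don't use it anywhere correctness matters.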

Use sp_who2 to figure out who's blocking whom during those 10-minute intervals, and see if you can fix it. Failing that, you could do something creative like creating whole new tables, populating them, and then renaming the tables to swap the new ones into production.
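A sketch of the rename swap, with hypothetical table names - the expensive build happens against a table the site never reads, and the live table is only unavailable for the instant the renames take:

```sql
-- While the refresh runs, check the BlkBy column for blocking chains.
EXEC sp_who2

-- 1. Build the replacement table while the site keeps reading Chart.
SELECT * INTO Chart_New FROM Chart_Staging   -- however you compile it

-- 2. Swap the tables; renames are metadata-only and near-instant.
BEGIN TRANSACTION
    EXEC sp_rename 'Chart', 'Chart_Old'
    EXEC sp_rename 'Chart_New', 'Chart'
COMMIT TRANSACTION

DROP TABLE Chart_Old
```

One caveat: permissions, indexes, and any foreign keys on the old table don't follow SELECT INTO, so you'd need to recreate those on the new copy before the swap.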

Cheers
-b


chadmat
The Chadinator

1974 Posts

Posted - 2002-01-28 : 16:27:53
Are the tables heavily indexed? If so, that could be why it 'grinds to a halt' during large-scale inserts.

Usually I would suggest dropping the indexes, doing the import, and recreating the indexes, but I'm not sure that is feasible every 2 hours.
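Sketched with hypothetical index and table names, the drop/reload/recreate pattern looks like this - the load runs without maintaining the nonclustered indexes row by row, and the rebuild at the end is one sequential operation:

```sql
-- Drop the nonclustered indexes before the bulk load...
DROP INDEX Stock.IX_Stock_Title
DROP INDEX Stock.IX_Stock_Format

-- ...run the import into Stock here...

-- ...then rebuild them in one pass afterwards.
CREATE INDEX IX_Stock_Title  ON Stock (Title)
CREATE INDEX IX_Stock_Format ON Stock (Format)
```

Whether this pays off every 2 hours depends on how long the rebuilds take relative to the load itself.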

I like the suggestion of importing into a (non-indexed) staging server, but that still leaves you with the problem of getting it into the production table.

HTH
-Chad


chilluk
Starting Member

24 Posts

Posted - 2002-01-29 : 09:29:42
Thanks for the replies so far - some useful info. I will probably look further into the staging area idea.

I suspect it may be because I am trying to update or delete records that are being selected. The slowest part of the refresh seems to be the chart compilation, which builds the top X titles per format into a temp table and then updates the live chart table with the results. Whilst that process runs, the existing data is constantly being selected by the site, as the chart data appears on every page in one form or another.
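One way to shrink that contention window, sketched with hypothetical names: do all the expensive top-X aggregation against the holding tables first, and only touch the live chart table inside one short transaction at the end:

```sql
-- Heavy work runs against the staging data, which nothing selects from.
SELECT TOP 50 Title, Format, SUM(Qty) AS Sales
INTO #ChartWork
FROM Sales_Staging
GROUP BY Title, Format
ORDER BY SUM(Qty) DESC

-- The live table is locked only for this brief replace.
BEGIN TRANSACTION
    DELETE Chart
    INSERT Chart (Title, Format, Sales)
    SELECT Title, Format, Sales FROM #ChartWork
COMMIT TRANSACTION

DROP TABLE #ChartWork
```

The site's selects then block for the length of the final delete/insert rather than the whole compilation.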

Does anyone else run SQL Server under a busy site? How do other people cope with these demands?

Chris Hill
chris@softwarefirst.com
   
