
 Update from large (800,000 row) table


sumo
Starting Member

45 Posts

Posted - 2003-04-18 : 08:02:38
We have a process that updates live data about every hour with around 8,000 rows. This update generally doesn't take long or cause problems. Sometimes, though, we have to run the same process in the morning on around 800,000 rows. Again, it doesn't take very long (~10 minutes) to run. However, if a user of our web site tries to run a query that accesses the data while it's running, they get a SQL timeout error. The server is a 4-CPU box with 2 GB of RAM. Here's the procedure that's run:

CREATE PROCEDURE dbo.UpdateInventory_Quick WITH RECOMPILE

AS

SET NOCOUNT ON

--@@ERROR is reset by every statement (including the IF that tests it),
--so capture it into a variable right after each statement
DECLARE @err int

BEGIN TRAN UpdateInv

--Build table with invkey's rather than part #'s
create table #newinv
(
invkey int,
warehouse int,
onhand int
)

SET @err = @@ERROR
IF @err <> 0 GOTO ErrorHandler

insert into #newinv
select distinct
i.invkey,
im.whsecd as warehouse,
(im.curboh - im.resboh) as onhand
from
ejbellmove.dbo.invmas im
inner join inventory i
on im.vendor = i.vendnum and im.partno = i.partnum
inner join onhand oh
on i.invkey = oh.invkey and warehouse = oh.warehouse

SET @err = @@ERROR
IF @err <> 0 GOTO ErrorHandler

--Fix bad data in onhand column
update #newinv
set onhand = 0
where onhand < 0 or onhand is null

SET @err = @@ERROR
IF @err <> 0 GOTO ErrorHandler

update dbo.onhand
set dbo.onhand.onhand = ni.onhand
from #newinv ni
where dbo.onhand.invkey = ni.invkey
and dbo.onhand.warehouse = ni.warehouse

SET @err = @@ERROR
IF @err <> 0 GOTO ErrorHandler

drop table #newinv
COMMIT TRAN UpdateInv
SET NOCOUNT OFF
RETURN 0

ErrorHandler:
ROLLBACK TRAN UpdateInv
RETURN @err

GO


Is there anything about this procedure that will cause grief? Should I remove the transaction and the WITH RECOMPILE?

Michael Sumerano
Technical Web Administrator
eKeystone.com
http://www.ekeystone.com/

nr
SQLTeam MVY

12543 Posts

Posted - 2003-04-18 : 08:41:39
You don't need the transaction, as the only thing it would affect is the update at the end - that's a single statement, so it's already atomic on its own.

For the large import you might want to put indexes on the temp table.

You could also do the updates in batches so that the impact is less. You would need to drive this from a permanent table rather than a temp table: you couldn't run the whole thing in one transaction, and after a crash you would need to know which rows had already been updated so you could continue.

Depends on the logic as to whether this is viable.
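
A rough sketch of the batching idea on SQL Server 2000 (untested - the staging table dbo.stage_newinv, its identity column id, and the processed flag are all made-up names for illustration):

--Assumes a permanent staging table along these lines:
--create table dbo.stage_newinv
--(id int identity primary key, invkey int, warehouse int, onhand int, processed bit default 0)

DECLARE @min int, @batch int
SET @batch = 10000

SELECT @min = MIN(id) FROM dbo.stage_newinv WHERE processed = 0

WHILE @min IS NOT NULL
BEGIN
    --Update one slice of the live table
    UPDATE o
    SET o.onhand = ni.onhand
    FROM dbo.onhand o
    INNER JOIN dbo.stage_newinv ni
        ON o.invkey = ni.invkey AND o.warehouse = ni.warehouse
    WHERE ni.id >= @min AND ni.id < @min + @batch

    --Record progress so a crashed run can pick up where it left off
    UPDATE dbo.stage_newinv
    SET processed = 1
    WHERE id >= @min AND id < @min + @batch

    SELECT @min = MIN(id) FROM dbo.stage_newinv WHERE processed = 0
END

Each pass touches at most @batch rows, so locks are held briefly and readers get a chance in between slices.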

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.

sumo
Starting Member

45 Posts

Posted - 2003-04-18 : 08:52:35
I took your recommendations and here's the code now:

CREATE  PROCEDURE dbo.UpdateInventory_Quick

AS

SET NOCOUNT ON

--Build table with invkey's rather than part #'s
create table #newinv
(
invkey int,
warehouse int,
onhand int
)

--Insert data into temp table
insert into #newinv
select distinct
i.invkey,
im.whsecd as warehouse,
(im.curboh - im.resboh) as onhand
from
ejbellmove.dbo.invmas im
inner join inventory i
on im.vendor = i.vendnum and im.partno = i.partnum
inner join onhand oh
on i.invkey = oh.invkey and warehouse = oh.warehouse

--Create an index on the temp table to make update faster
create nonclustered index newinv1 on #newinv (invkey, warehouse)

--Fix bad data in onhand column
update #newinv
set onhand = 0
where onhand < 0 or onhand is null

--Update inventory
update dbo.onhand
set dbo.onhand.onhand = ni.onhand
from #newinv ni
where dbo.onhand.invkey = ni.invkey
and dbo.onhand.warehouse = ni.warehouse

drop table #newinv


SET NOCOUNT OFF

GO


If anything is adversely affected, I'll post again, but most likely this will take care of any issues. Thank you for pointing out the obvious!

quote:

You don't need the transaction, as the only thing it would affect is the update at the end - that's a single statement, so it's already atomic on its own.

For the large import you might want to put indexes on the temp table.

You could also do the updates in batches so that the impact is less. You would need to drive this from a permanent table rather than a temp table: you couldn't run the whole thing in one transaction, and after a crash you would need to know which rows had already been updated so you could continue.

Depends on the logic as to whether this is viable.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.



Michael Sumerano
Technical Web Administrator
eKeystone.com
http://www.ekeystone.com/

nr
SQLTeam MVY

12543 Posts

Posted - 2003-04-18 : 09:07:31
You might try moving the create index to just before the final update and making it

create nonclustered index newinv1 on #newinv (invkey, warehouse, onhand)

This should make the index covering, so the update doesn't need to touch the data pages in the temp table.

Just noticed that those are all the columns in the temp table, so you might make it clustered instead - not sure which would be best.
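
If you do go clustered, the statement would just be (untested):

create clustered index newinv1 on #newinv (invkey, warehouse)

A clustered index stores the rows themselves at its leaf level, so every column in the table is covered without having to name onhand in the key.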

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.

Edited by - nr on 04/18/2003 09:09:31