
 All Forums
 SQL Server 2012 Forums
 Transact-SQL (2012)
 How can this be improved?


TheKai
Starting Member

16 Posts

Posted - 2014-01-11 : 22:27:19
I'm using a DB to persist an in-memory store of "objects". I might need to update or insert in the neighborhood of 10,000 objects in each save cycle. Each object can correspond to several rows in the DB, so a save cycle might involve 20k - 30k updates/inserts. I'm concerned about performance. I have so far chosen to do this using MERGE, but I'm wondering whether the following could be improved upon for performance's sake. Any input would be greatly appreciated. I have listed one of the MERGE blocks, but the entire SP has 5 of these blocks (with 20k - 30k updates/inserts per block).





-- Int Properties
MERGE Items_tblPropertiesInt AS IntsTarget
USING @ItemPropertyInts AS IntsSource
    ON IntsSource.PropertyOwner = IntsTarget.OwnerID
   AND IntsSource.PropertyId = IntsTarget.PropertyID
WHEN MATCHED THEN
    UPDATE SET IntsTarget.PropertyValue = IntsSource.PropertyValue
WHEN NOT MATCHED BY TARGET AND IntsSource.PropertyId != 0 THEN
    INSERT (OwnerID, PropertyID, PropertyValue, PropertyName)
    VALUES (IntsSource.PropertyOwner, IntsSource.PropertyId,
            IntsSource.PropertyValue, IntsSource.PropertyName)
WHEN NOT MATCHED BY SOURCE
    AND IntsTarget.OwnerID IN (SELECT PropertyOwner FROM @ItemPropertyInts) THEN
    DELETE;

visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2014-01-12 : 01:38:44
Sorry, without any details on what the columns contain and what data you're going to apply, it's hard to suggest anything.
Post details in the format below, and then somebody may be able to suggest an alternative.

http://weblogs.sqlteam.com/brettk/archive/2005/05/25/5276.aspx

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs

TheKai
Starting Member

16 Posts

Posted - 2014-01-12 : 01:56:35
Sorry about that. Here is the table type:


CREATE TYPE [dbo].[ItemPropertyInt] AS TABLE(
    [PropertyOwner] [uniqueidentifier] NULL,
    [PropertyId] [int] NULL,
    [PropertyValue] [int] NULL,
    [PropertyName] [nvarchar](128) NULL
)


And here is the table I'm updating:



CREATE TABLE [dbo].[Items_tblPropertiesInt](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [OwnerID] [uniqueidentifier] NOT NULL,
    [PropertyID] [int] NOT NULL,
    [PropertyValue] [int] NOT NULL,
    [PropertyName] [nvarchar](128) NULL,
    CONSTRAINT [PK_Items_tblPropertiesInt] PRIMARY KEY CLUSTERED
    (
        [OwnerID] ASC,
        [PropertyID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)


visakh16
Very Important crosS Applying yaK Herder

52326 Posts

Posted - 2014-01-12 : 02:01:33
What about details on the data you're trying to apply, the expected number of records affected, the available indexes, etc.?


Lamprey
Master Smack Fu Yak Hacker

4614 Posts

Posted - 2014-01-13 : 12:35:39
It looks like you are using a table variable. If that table variable is holding 20-30k rows, you might want to try a temp table instead to see if that has any effect. The optimizer may be able to make use of the statistics on a temp table (assuming it is indexed), whereas a table variable carries no column statistics.
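A minimal sketch of that approach, assuming the rows currently arrive in the @ItemPropertyInts parameter (the temp table and index names here are illustrative, not from the original SP):

```sql
-- Copy the table-valued parameter into a temp table so the optimizer
-- has real statistics to work with.
SELECT PropertyOwner, PropertyId, PropertyValue, PropertyName
INTO #ItemPropertyInts
FROM @ItemPropertyInts;

-- Index matching the MERGE join keys (mirrors the target's clustered PK).
CREATE CLUSTERED INDEX IX_ItemPropertyInts
    ON #ItemPropertyInts (PropertyOwner, PropertyId);

-- The MERGE then reads from the temp table instead of the table variable.
MERGE Items_tblPropertiesInt AS IntsTarget
USING #ItemPropertyInts AS IntsSource
    ON IntsSource.PropertyOwner = IntsTarget.OwnerID
   AND IntsSource.PropertyId = IntsTarget.PropertyID
WHEN MATCHED THEN
    UPDATE SET IntsTarget.PropertyValue = IntsSource.PropertyValue
WHEN NOT MATCHED BY TARGET AND IntsSource.PropertyId != 0 THEN
    INSERT (OwnerID, PropertyID, PropertyValue, PropertyName)
    VALUES (IntsSource.PropertyOwner, IntsSource.PropertyId,
            IntsSource.PropertyValue, IntsSource.PropertyName)
WHEN NOT MATCHED BY SOURCE
    AND IntsTarget.OwnerID IN (SELECT PropertyOwner FROM #ItemPropertyInts) THEN
    DELETE;
```

Comparing the actual execution plans of the two versions should show whether the row estimates (and therefore join strategy) change.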
   
