 Problems inserting millions of rows into IDENTITY


jhilb007
Starting Member

17 Posts

Posted - 2014-08-21 : 11:40:52
I have a number of large tables using GUIDs for record Ids. Performance is ok. They are not sequential GUIDs.

I'm trying to switch to a BIGINT IDENTITY as the clustered PK instead, for performance.

However, my inserts into the IDENTITY BIGINT keyed tables (typically millions of rows per table a day) simply lock up or fail with an out-of-locks error.

Adding TABLOCKX helps, but that's not what I want to do. A process that took 3 minutes a day using non-sequential GUIDs jumped to 18 minutes when changed to BIGINT, and required a TABLOCKX on the INSERT to get it done.

Everything I read says to get off GUIDs, but the alternative, BIGINT IDENTITY, seems to be counter-productive, and not possible in some cases.

Breaking the INSERT into smaller batches is not an option. I need to be able to push millions of rows into the table at a time.
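
For what it's worth, this is roughly the shape of the table and the daily load. The table and column names below are just placeholders, not my real schema:

-- Destination table keyed on BIGINT IDENTITY as the clustered PK
CREATE TABLE dbo.EventLog
(
    EventId   BIGINT IDENTITY(1,1) NOT NULL,
    Payload   VARCHAR(200) NOT NULL,
    CreatedAt DATETIME2 NOT NULL,
    CONSTRAINT PK_EventLog PRIMARY KEY CLUSTERED (EventId)
);

-- The daily load; it only completes in a reasonable time with the TABLOCKX hint
INSERT INTO dbo.EventLog WITH (TABLOCKX) (Payload, CreatedAt)
SELECT Payload, CreatedAt
FROM dbo.SourceRows;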

Jeff

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2014-08-21 : 12:35:16
Is it in one transaction? Adding millions of rows to a bigint identity column should not be an issue. You've got something wrong in the environment.

Tara Kizer
SQL Server MVP since 2007
http://weblogs.sqlteam.com/tarad/

jhilb007
Starting Member

17 Posts

Posted - 2014-08-21 : 12:46:12
Yes, one transaction. However, I just realized that the queries having the performance issue are all INSERTs where the data source is another database (not the one being inserted into). Actually they are from a view on another database. I suspect that has something to do with it. I will try pulling from the VIEW into a local table first, then inserting into my BIGINT keyed table, to see if there is a difference.
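
Roughly what I plan to try (database, view, and table names below are placeholders):

-- Stage the remote view's rows in a local temp table first
SELECT Payload, CreatedAt
INTO #Staging
FROM OtherDb.dbo.vSourceRows;

-- Then do the keyed insert from local data only
INSERT INTO dbo.EventLog (Payload, CreatedAt)
SELECT Payload, CreatedAt
FROM #Staging;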

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2014-08-21 : 12:47:57
If you are inserting millions of rows in one transaction, then I would suggest changing that process so that it uses a bulk approach (bcp/BULK INSERT, SSIS, or batched commits) rather than a single plain INSERT.
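
For example, if the load really has to stay in T-SQL, one way to change the process is to commit in batches so no single transaction ever needs millions of locks. This is only a sketch with placeholder names, and it assumes the source rows have an increasing SourceId key:

DECLARE @BatchSize INT    = 100000,
        @LastId    BIGINT = 0,
        @NextId    BIGINT;

WHILE 1 = 1
BEGIN
    -- Find the key value at the upper edge of the next batch.
    SELECT @NextId = MAX(b.SourceId)
    FROM (SELECT TOP (@BatchSize) SourceId
          FROM dbo.SourceRows
          WHERE SourceId > @LastId
          ORDER BY SourceId) AS b;

    IF @NextId IS NULL BREAK;   -- nothing left to load

    -- Each batch is its own (implicit) transaction, so lock escalation
    -- never has to cover millions of rows at once.
    INSERT INTO dbo.EventLog (Payload, CreatedAt)
    SELECT Payload, CreatedAt
    FROM dbo.SourceRows
    WHERE SourceId > @LastId AND SourceId <= @NextId;

    SET @LastId = @NextId;
END;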

Tara Kizer
SQL Server MVP since 2007
http://weblogs.sqlteam.com/tarad/

SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2014-08-21 : 16:14:09
Try making the table a heap (drop the clustered attribute on the PK).
Then you should avoid some of the page contention problems. See http://kejser.org/clustered-indexes-vs-heaps/
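
Something along these lines (constraint and table names are placeholders):

-- Drop the clustered PK, then recreate it as NONCLUSTERED so the table becomes a heap
ALTER TABLE dbo.EventLog DROP CONSTRAINT PK_EventLog;

ALTER TABLE dbo.EventLog ADD CONSTRAINT PK_EventLog
    PRIMARY KEY NONCLUSTERED (EventId);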




Microsoft SQL Server MVP, MCT, MCSE, MCSA, MCP, MCITP, MCTS, MCDBA
   
