
 flush insert results for big queries


yosiasz
Master Smack Fu Yak Hacker

1635 Posts

Posted - 2011-03-11 : 18:02:03
Greetings

Is there a way in SQL Server, when inserting millions of rows, to tell it to flush the result set as the query runs instead of waiting until every row is returned from the query?

And in case it errors out, to roll back?

Thanks

If you don't have the passion to help people, you have no passion

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2011-03-11 : 18:10:51
You would do the inserts in batches and not one big transaction.
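For example, a minimal sketch of batching (table and column names invented for illustration): insert TOP (N) rows per pass, so each batch commits on its own instead of one giant transaction.

```sql
-- Hypothetical: copy dbo.big_source into dbo.big_dest 50,000 rows per batch.
-- Each INSERT is its own implicit transaction, so the log space can be reused
-- between batches rather than held by one huge open transaction.
DECLARE @batchSize INT = 50000;

WHILE 1 = 1
BEGIN
    INSERT INTO dbo.big_dest (id, val)
    SELECT TOP (@batchSize) s.id, s.val
    FROM dbo.big_source AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.big_dest AS d WHERE d.id = s.id);

    IF @@ROWCOUNT = 0 BREAK; -- nothing left to copy
END
```

The NOT EXISTS test is just one way to keep already-copied rows out of the next batch; a key range or a flag column works too.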

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/


robvolk
Most Valuable Yak

15732 Posts

Posted - 2011-03-11 : 18:16:45
You may want to issue a CHECKPOINT between batches to make sure everything is flushed to disk.

yosiasz
Master Smack Fu Yak Hacker

1635 Posts

Posted - 2011-03-11 : 18:26:17
excellent. now how do I break the huge query up so it returns batches of data? any good way to do this? I am using MERGE, and I wonder if it has issues with big data sets.

Thank u much!


robvolk
Most Valuable Yak

15732 Posts

Posted - 2011-03-11 : 18:35:26
Basically you add TOP (100000) (or whatever number of rows you want to process per batch) to your statement.

You also need a WHERE clause that will exclude all rows that get MERGEd into the destination table(s) for each batch. Otherwise you'll process the same data over and over again.

You can then wrap this in a WHILE loop:
WHILE EXISTS (SELECT * FROM myTable WHERE meets_condition)
BEGIN
    MERGE TOP (100000) ...
    WHERE meets_condition;
    CHECKPOINT -- flush to disk
END


yosiasz
Master Smack Fu Yak Hacker

1635 Posts

Posted - 2011-03-11 : 19:04:53
robvolk, you are awesome!!!

WHILE EXISTS (SELECT * FROM myTable WHERE meets_condition)
BEGIN
    MERGE TOP (100000) ...
    WHERE meets_condition;
    CHECKPOINT -- flush to disk
END


the WHERE clause being in the SELECT statement that populates the MERGE target?

as in
[code]
MERGE TOP (100000) [database].[dbo].[table] AS target
USING (
SELECT id, space, the, final, frontier FROM nebula WHERE meets_condition
) AS source
[/code]


robvolk
Most Valuable Yak

15732 Posts

Posted - 2011-03-11 : 19:48:00
Yes.

yosiasz
Master Smack Fu Yak Hacker

1635 Posts

Posted - 2011-03-11 : 19:57:55
this is very nice, but WHILE EXISTS(SELECT * FROM myTable WHERE meets_condition) is killing me because I am stuck in a sort of infinite loop. I will have to look for another way to keep the loop terminating.


robvolk
Most Valuable Yak

15732 Posts

Posted - 2011-03-11 : 21:29:06
That's what I meant earlier: if you use the loop (you don't have to), you have to write the WHERE clause in such a way that processing each batch removes those rows from the results to be merged.
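For example (table and column names are made up here), a NOT EXISTS test against the target guarantees every merged row drops out of the next pass, so the loop can't spin forever:

```sql
-- Hypothetical tables: dbo.nebula is the source, dbo.stars the MERGE target.
-- Rows already merged fail the NOT EXISTS test, so each batch shrinks the work.
WHILE EXISTS (SELECT 1 FROM dbo.nebula AS n
              WHERE NOT EXISTS (SELECT 1 FROM dbo.stars AS s WHERE s.id = n.id))
BEGIN
    MERGE TOP (100000) INTO dbo.stars AS target
    USING (SELECT n.id, n.payload
           FROM dbo.nebula AS n
           WHERE NOT EXISTS (SELECT 1 FROM dbo.stars AS s WHERE s.id = n.id)
          ) AS source
    ON target.id = source.id
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (id, payload) VALUES (source.id, source.payload);

    CHECKPOINT; -- flush dirty pages between batches
END
```

Note that TOP on a MERGE limits the number of rows affected per statement, which is exactly what you want for batching.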

You may not want to use MERGE for this, or at least not MERGE using the original tables. It might be better to SELECT each batch into a table variable or temp table and MERGE from that. You can then clear the temp table between each batch so that you're not processing the same data again.
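One way to sketch that temp-table variant (again, names hypothetical): walk the source in key order, pull one batch into a temp table, MERGE from it, record the high-water mark, and clear the temp table before the next pass.

```sql
-- Hypothetical: process dbo.nebula in id order, one 100,000-row batch at a time.
DECLARE @lastId INT = 0, @rows INT = 1;

CREATE TABLE #batch (id INT PRIMARY KEY, payload VARCHAR(100));

WHILE @rows > 0
BEGIN
    INSERT INTO #batch (id, payload)
    SELECT TOP (100000) id, payload
    FROM dbo.nebula
    WHERE id > @lastId
    ORDER BY id;

    SET @rows = @@ROWCOUNT;
    IF @rows > 0
    BEGIN
        MERGE dbo.stars AS target
        USING #batch AS b
        ON target.id = b.id
        WHEN MATCHED THEN UPDATE SET target.payload = b.payload
        WHEN NOT MATCHED THEN INSERT (id, payload) VALUES (b.id, b.payload);

        SELECT @lastId = MAX(id) FROM #batch; -- remember where this batch ended
        TRUNCATE TABLE #batch;                -- don't reprocess the same rows
        CHECKPOINT;
    END
END
```

Keying the loop on `id > @lastId` means the exit condition no longer depends on the MERGE logic at all, which sidesteps the infinite-loop problem.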

yosiasz
Master Smack Fu Yak Hacker

1635 Posts

Posted - 2011-03-13 : 14:42:35
ok, implemented and working beautifully! this also helps me refine other bulky inserts.

Appreciate the help.
