
 now that's a lot of pages!


jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-04-14 : 00:52:06
10 percent processed.
20 percent processed.
30 percent processed.
40 percent processed.
50 percent processed.
60 percent processed.
70 percent processed.
80 percent processed.
90 percent processed.
Processed 227540000 pages for database 'NASNA', file 'NASNA' on file 1.
100 percent processed.
Processed 14 pages for database 'NASNA', file 'NASNA_log' on file 1.
BACKUP DATABASE successfully processed 227540014 pages in 18353.542 seconds (96.856 MB/sec).

Usually I don't back up such large databases, because in my case they store the results of calculations that I could repeat if need be. The input to the calculations IS backed up.

But in this case I needed a backup. It took 5 hours!


elsasoft.org

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2012-04-14 : 02:40:46
Do you use any of the performance parameters of the BACKUP command? We were able to reduce the backup time pretty significantly with those on a VLDB. There's a SQL CAT article with their recommendations for local backups and for network backups.
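
(The options in question are things like BUFFERCOUNT and MAXTRANSFERSIZE, which come up again later in this thread. A minimal sketch of the shape of such a command — the database name, path, and values below are illustrative placeholders, not the SQL CAT recommendations:)

BACKUP DATABASE [MyDB]
TO DISK = 'R:\Backups\MyDB.bak'
WITH
    BUFFERCOUNT = 64,           -- number of I/O buffers the backup uses
    MAXTRANSFERSIZE = 4194304,  -- bytes per transfer request; 4 MB is the maximum
    BLOCKSIZE = 65536,          -- physical block size written to the backup media
    STATS = 10;                 -- progress message every 10 percent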

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/

Subscribe to my blog

jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-04-14 : 08:41:04
I wasn't aware of any perf params in the BACKUP command.

I'll be sad to hear that WITH GO_SLOW is the default though. :(

elsasoft.org

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2012-04-14 : 15:11:52
I'll post what they are early next week when I get in front of my backup code.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/

Subscribe to my blog

jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-04-14 : 23:00:29
Cool, thanks.


elsasoft.org

Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2012-04-16 : 09:03:21
If it is SQL Server 2008 Enterprise Edition, or 2008 R2 Standard or Enterprise Edition, you should turn backup compression on. That will usually save a lot of time on backups.

You should also consider backing up to multiple output files in parallel. I'm not exactly sure why, but they always seem to run faster. For example:
BACKUP DATABASE [MyDB]
TO
    DISK = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_001.BAK',
    DISK = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_002.BAK',
    DISK = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_003.BAK',
    DISK = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_004.BAK',
    DISK = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_005.BAK'
WITH
    COMPRESSION,  -- only on 2008 Enterprise or 2008 R2 Standard/Enterprise
    INIT,         -- overwrite any existing backup sets in these files
    STATS = 5     -- progress message every 5 percent
-- (BACKUP DATABASE produces a full backup by default; there is no TYPE option.)

CODO ERGO SUM

robvolk
Most Valuable Yak

15732 Posts

Posted - 2012-04-16 : 10:13:12
Multiple backup files use multiple threads. One of the SQL CAT whitepapers has a detailed analysis of how many files/threads give the best performance. Here are some of them:

http://sqlcat.com/sqlcat/b/msdnmirror/archive/2008/03/02/backup-more-than-1gb-per-second-using-sql2008-backup-compression.aspx
http://sqlcat.com/sqlcat/b/whitepapers/archive/2009/08/13/a-technical-case-study-fast-and-reliable-backup-and-restore-of-a-vldb-over-the-network.aspx
http://sqlcat.com/sqlcat/b/technicalnotes/archive/2008/04/21/tuning-the-performance-of-backup-compression-in-sql-server-2008.aspx
http://sqlcat.com/sqlcat/b/technicalnotes/archive/2009/02/16/tuning-backup-compression-part-2.aspx

I'm not sure those are the ones I'm thinking of, but spend some time on the SQL CAT site; it's awesome.

jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-04-16 : 12:47:06
Great, thanks everyone. Maybe move this thread out of the Corral. I wasn't looking for help, but I certainly got schooled! :)


elsasoft.org

jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-06-06 : 15:02:36
Had to do this again with a somewhat larger db, and this time I used WITH COMPRESSION and 8 .bak files. The rate was about 2x faster (192 MB/sec vs. 97 MB/sec before). The db is 2.6 TB, and the backup finished in 3.5 hours.

So thanks for the tips!

BACKUP DATABASE successfully processed 310004810 pages in 12571.842 seconds (192.645 MB/sec).
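
(For reference, the command was shaped roughly like this — the share and file names below are made-up placeholders; the COMPRESSION option and the 8-way split are as described above:)

BACKUP DATABASE [MyDB]
TO
    DISK = '\\san\backups\MyDB_01.bak',
    DISK = '\\san\backups\MyDB_02.bak',
    DISK = '\\san\backups\MyDB_03.bak',
    DISK = '\\san\backups\MyDB_04.bak',
    DISK = '\\san\backups\MyDB_05.bak',
    DISK = '\\san\backups\MyDB_06.bak',
    DISK = '\\san\backups\MyDB_07.bak',
    DISK = '\\san\backups\MyDB_08.bak'
WITH
    COMPRESSION,  -- roughly doubled the throughput here vs. the uncompressed run
    INIT,
    STATS = 10;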


elsasoft.org

robvolk
Most Valuable Yak

15732 Posts

Posted - 2012-06-06 : 15:48:15
Did you try changing MAXTRANSFERSIZE or BUFFERCOUNT? Are you backing up to a local drive or a SAN?

jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-06-07 : 00:28:51
SAN. I don't have local drives that big.

I did not specify MAXTRANSFERSIZE or BUFFERCOUNT; I just went with the defaults.
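
(Aside, if you ever want to see which defaults it picked: trace flags 3605 + 3213 are commonly used to write the backup's buffer configuration to the SQL Server error log. A sketch, with a placeholder path — enable them only temporarily:)

DBCC TRACEON (3605, 3213, -1);  -- 3213: trace backup/restore configuration; 3605: route it to the errorlog
BACKUP DATABASE [MyDB] TO DISK = '\\san\backups\MyDB_test.bak' WITH COMPRESSION, INIT;
DBCC TRACEOFF (3605, 3213, -1);
-- The errorlog then shows the BufferCount and MaxTransferSize the backup used.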


elsasoft.org

nigelrivett
Master Smack Fu Yak Hacker

3385 Posts

Posted - 2012-06-07 : 04:21:24
Do you have a lot of static (old) data that could be placed on read-only filegroups?

==========================================
Cursors are useful if you don't know sql.
SSIS can be used in a similar way.
Beer is not cold and it isn't fizzy.

jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-06-07 : 13:03:21
Yes, I could do that with much of it. Would that matter for backup times?


elsasoft.org

robvolk
Most Valuable Yak

15732 Posts

Posted - 2012-06-07 : 13:12:06
Yeah, if the read-only portion is a decent percentage of the total, you could do filegroup backups of just the live data instead.
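
(A minimal sketch of the idea, assuming the static data has been moved to a read-only filegroup — the filegroup name and paths here are hypothetical:)

-- Back up the read-only filegroup once, after marking it read-only:
BACKUP DATABASE [MyDB]
FILEGROUP = 'Static'
TO DISK = '\\san\backups\MyDB_static.bak'
WITH INIT;

-- From then on, partial backups cover only the read/write filegroups:
BACKUP DATABASE [MyDB]
READ_WRITE_FILEGROUPS
TO DISK = '\\san\backups\MyDB_rw.bak'
WITH COMPRESSION, INIT, STATS = 10;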

jezemine
Master Smack Fu Yak Hacker

2886 Posts

Posted - 2012-06-07 : 22:25:05
Oh, I get it, yes. This was the first time the db was backed up, though, and likely the last.


elsasoft.org
   
