Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
michaelb
Yak Posting Veteran
69 Posts
Posted - 2008-11-19 : 20:25:07
Hi,

We currently use Symantec Backup Exec to back up our SQL databases to disk and tape. I want to start backing up to an FTP location as well, so we have another secure offsite option in case of tape failure. Symantec offers the Symantec Protection Network, which is online storage for Backup Exec jobs, but it's only available in the USA/Canada and we are in Australia.

Does anyone know of any software that will let us perform incremental backups of our SQL databases to an FTP location? Or another way to make Backup Exec do this?

Thanks,
Michael
nr
SQLTeam MVY
12543 Posts
Posted - 2008-11-19 : 20:59:19
Why not just FTP the disc backup?

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
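nr's suggestion above — back up to local disk first, then push the file offsite — can be sketched in a few lines of Python. This is only an illustration, not anything Backup Exec-specific; the FTP host, credentials, and paths are hypothetical placeholders, and the `.bak` file is assumed to have been produced by the existing backup job.

```python
import os
from datetime import date
from ftplib import FTP


def backup_filename(db, when=None, kind="FULL"):
    """Build a dated backup file name, e.g. Sales_FULL_20081119.bak."""
    when = when or date.today()
    return f"{db}_{kind}_{when:%Y%m%d}.bak"


def upload_backup(path, host, user, password, remote_dir="/"):
    """Push one backup file to an FTP server in binary mode."""
    with FTP(host) as ftp:  # hypothetical server details
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        with open(path, "rb") as f:
            # STOR under the file's own name on the remote side
            ftp.storbinary(f"STOR {os.path.basename(path)}", f)


if __name__ == "__main__":
    # upload_backup(r"D:\Backups\Sales_FULL_20081119.bak",
    #               "ftp.example.com", "user", "secret")
    print(backup_filename("Sales", date(2008, 11, 19)))
```

Binary mode (`storbinary`) matters here: transferring a `.bak` file in ASCII mode would corrupt it.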
michaelb
Yak Posting Veteran
69 Posts
Posted - 2008-11-19 : 21:09:58
Because we have two databases that total around 30 GB when compressed... uploading 30 GB every night would cost a fortune in bandwidth and take forever.
nr
SQLTeam MVY
12543 Posts
Posted - 2008-11-20 : 16:23:33
Won't you still need to transfer that data whether it goes to the FTP site or to disc first?
michaelb
Yak Posting Veteran
69 Posts
Posted - 2008-11-20 : 16:44:05
We would need to upload the whole 30 GB to the FTP site initially, but from then on we would only upload the changes... hence the incremental backup.
nr
SQLTeam MVY
12543 Posts
Posted - 2008-11-20 : 17:37:32
Still don't understand. How were you expecting to get the 30 GB file to the FTP site in the first place?

Once that's done, just take the differentials to disk and then FTP them. You would still have to transfer a full backup whenever you took one, but unless you can use transaction log backups I don't see any way round that.

Must admit I wouldn't have thought FTP would be a good solution for this, but then I've never tried it with the sort of volumes you would get from backups, even on small databases like yours. Even if you did get this working, I would be concerned about what happens when your throughput increases.

Have you thought about partitioning the database onto read-only files? That could reduce the amount of data you have to transfer if you have old data that's static.
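The schedule nr describes — an occasional full backup plus nightly differentials, each FTPed after it lands on disk — could be driven by a small script. A minimal sketch, with several assumptions not in the thread: the database name, the backup folder, a "full on Sunday, differential otherwise" policy, and the idea of handing the generated T-SQL to `sqlcmd`.

```python
from datetime import date


def backup_sql(db, path, differential=False):
    """Build the BACKUP DATABASE statement for a full or differential backup."""
    options = "WITH DIFFERENTIAL, INIT" if differential else "WITH INIT"
    return f"BACKUP DATABASE [{db}] TO DISK = N'{path}' {options};"


def nightly_statement(db, folder, when):
    """Full backup on Sundays, differential the rest of the week (assumed policy)."""
    diff = when.weekday() != 6  # Monday == 0, Sunday == 6
    kind = "DIFF" if diff else "FULL"
    path = f"{folder}\\{db}_{kind}_{when:%Y%m%d}.bak"
    return backup_sql(db, path, differential=diff)


if __name__ == "__main__":
    stmt = nightly_statement("Sales", r"D:\Backups", date(2008, 11, 20))
    # Hypothetical next steps: run the statement with sqlcmd, then FTP the
    # resulting .bak file, e.g.
    #   subprocess.run(["sqlcmd", "-E", "-Q", stmt], check=True)
    print(stmt)
```

A differential backup contains only the extents changed since the last full backup, so the nightly upload stays small until the next full — which is exactly the behaviour michaelb is after, at the cost of needing both the full and the latest differential to restore.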