mikebird
Aged Yak Warrior
529 Posts
Posted - 2008-04-07 : 08:04:20
There's a 24 GB live production database which I'm not allowed to alter or slow down. I urgently need a cut of it onto my 7 GB test environment. The last cut was taken on 20/03/2008 and was OK until a week or so ago, but is now not up to date enough. My boss is away. I thought the best way to do this would be transactional replication, but it slowed production right down, so that's out of the question.

Who has suggestions on how to do it? I have no idea what defined this test cut to make it 7 GB out of the 24 GB of data. Where can I find that definition? Maybe it was done with DTS? I've had a look on both servers and there's no process for it. There are no backups on production in SQL Server; it's all in backup software, on tapes, so there are no .bak files anywhere on the live server. All there is is this little test copy, with the .bak files from March that produced it. How can I update it from live? You DBAs must have a bulletproof, standard answer (or many) to that!

It's all SQL Server 2000, not 2005.
RickD
Slow But Sure Yak Herding Master
3608 Posts
Posted - 2008-04-07 : 08:23:11
Yes, get last night's tape and restore from that. This kills two birds with one stone: you get to test that the tape backups are valid as well as getting your cut. I have seen too many places that rely solely on tapes suddenly realise that the tape backups were useless. I would suggest always keeping a copy on disk. 24 GB isn't all that big, btw.
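Once the .bak file is off the tape and on disk, SQL Server 2000 can sanity-check it before you commit to a full restore. A minimal sketch; the file path is a placeholder:

```sql
-- Show what the backup file contains (database name, backup date, type)
RESTORE HEADERONLY
FROM DISK = 'D:\Restores\Prod_full.bak'

-- Check that the backup set is complete and readable, without restoring it
-- (on SQL 2000 this verifies readability, not page-level integrity)
RESTORE VERIFYONLY
FROM DISK = 'D:\Restores\Prod_full.bak'
```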
 |
sodeep
Master Smack Fu Yak Hacker
7174 Posts
Posted - 2008-04-07 : 09:18:35
1) Back up the database, copy the .bak to the test server, and restore it there.
2) Your backup file will be smaller than the actual size of the database.
3) Don't try transactional replication; it will lock the tables/database while creating the first snapshot.
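That sequence might look like the following in T-SQL; the database name, logical file names, and paths here are placeholders (get the real logical names from `RESTORE FILELISTONLY`). Note that although the .bak file is smaller than the database, the restore recreates the data files at their full size, so the test box still needs room for all 24 GB:

```sql
-- On production: full backup to disk (readers/writers stay online)
BACKUP DATABASE ProdDb
TO DISK = 'E:\Backups\ProdDb_full.bak'
WITH INIT

-- On the test server, after copying the file over:
-- first list the logical file names inside the backup
RESTORE FILELISTONLY
FROM DISK = 'D:\Restores\ProdDb_full.bak'

-- then restore under the test name, relocating the files
RESTORE DATABASE TestDb
FROM DISK = 'D:\Restores\ProdDb_full.bak'
WITH MOVE 'ProdDb_Data' TO 'D:\Data\TestDb.mdf',
     MOVE 'ProdDb_Log'  TO 'D:\Data\TestDb_log.ldf',
     REPLACE
```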
 |
mikebird
Aged Yak Warrior
529 Posts
Posted - 2008-04-07 : 09:19:26
But my main question is finding the logic that was used to get the smaller chunk from the larger database. This really matters: I have put together several stored procedures over the last few weeks, and I need to be working against an up-to-date version of this database, but without the extra tables, views or procedures that have since been introduced to live and are irrelevant here. I'm faced with the huge full backup from tapes, and I need to produce a fresh, compliant test environment from it. Can I make it update using the definitions already there?
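There's no single place SQL Server records how a cut-down copy was built, but comparing the object lists of live and test is one way to reverse-engineer it, assuming the 7 GB cut was made by dropping objects (it may also have filtered rows). A sketch against the SQL 2000 catalog; run it in both databases and diff the results:

```sql
-- User tables, views and stored procedures in the current database
SELECT name, xtype
FROM sysobjects
WHERE xtype IN ('U', 'V', 'P')   -- U = table, V = view, P = procedure
  AND name NOT LIKE 'dt_%'       -- skip system-generated objects
ORDER BY name
```

Anything that appears on live but not on test was presumably excluded by whatever process made the cut.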
 |
mikebird
Aged Yak Warrior
529 Posts
Posted - 2008-04-07 : 09:23:47
sodeep, it would be nice to have confirmation that creating a backup (.bak file) from live would not cost the users a performance hit. I'd assume it wouldn't, because the process writes to disk while the database is still queryable and modifiable?
 |
sodeep
Master Smack Fu Yak Hacker
7174 Posts
Posted - 2008-04-07 : 09:29:02
Does your test environment need real-time synchronization?
 |
RickD
Slow But Sure Yak Herding Master
3608 Posts
Posted - 2008-04-07 : 09:37:45
quote: Originally posted by mikebird: sodeep, it would be nice to have confirmation that creating a backup (.bak file) from live would not cost the users a performance hit. I'd assume it wouldn't, because the process writes to disk while the database is still queryable and modifiable?
Have a look in Books Online for backups. You will find that users shouldn't be adversely affected by taking a backup.
 |
sodeep
Master Smack Fu Yak Hacker
7174 Posts
Posted - 2008-04-07 : 09:41:22
24 GB is not that big. As RickD said, nothing will be affected while taking a backup.
 |
rmiao
Master Smack Fu Yak Hacker
7266 Posts
Posted - 2008-04-07 : 23:16:49
I don't know how you'd restore a 24 GB database onto 7 GB of disk. Try copying just the data you need across with DTS, or with bcp.
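If a full restore won't fit, bcp can move individual tables (or just recent rows) between the servers. A command-line sketch; the server, database, table and column names are all placeholders, and `-T` assumes a trusted (Windows) connection:

```shell
# Export a whole table from production in native format (-n)
bcp ProdDb.dbo.Orders out Orders.dat -n -S PRODSERVER -T

# Or export only recent rows, to keep the copy small (queryout takes a query)
bcp "SELECT * FROM ProdDb.dbo.Orders WHERE OrderDate >= '20080320'" queryout Orders_recent.dat -n -S PRODSERVER -T

# Load the file into the matching table on the test server
bcp TestDb.dbo.Orders in Orders_recent.dat -n -S TESTSERVER -T
```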
 |