Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Author: raaj (Posting Yak Master, 129 Posts)
Posted - 2009-04-02 : 01:56:58
Hi guys,

I have a table, say Table1, with columns column1, column2 and column3.

1. I have to add two new columns named column2_backup and column3_bakup. After altering, Table1 will have the following columns: column1 int primary key, column2 varchar(20), column3 varchar(20), column2_backup varchar(20), column3_bakup varchar(20). Upon altering, the new columns will be NULL in the table.

2. I have to copy all the values of column2 into column2_backup and column3 into column3_bakup. Since the new columns already hold NULLs, I am doing it like this:

update Table1 set column2_backup = column2, column3_bakup = column3 where column1 in (select column1 from Table1)

This way I am updating the new columns with the old column values. Now my point is that I have some millions of records in the table, and if I run the above query in one go I will run into transaction-log-full problems. So I have to update, say, 100000 rows at a time. My question is: how can I update 100000 rows at a time in a clean manner and avoid transaction-log-full problems?

3. Later I will have to update the old column values (column2 and column3) by mapping them to fields of another table. That should also be done 100000 rows at a time. Steps 2 and 3 are almost similar.

Thanks,
Raaj
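One common way to keep each transaction small is to loop with UPDATE TOP, using the fact that the backup columns start out NULL to pick up only rows not yet copied. This is a sketch, not a definitive answer; it assumes SQL Server 2005 or later and uses the table and column names from the post above:

```sql
-- Copy column2/column3 into the backup columns 100000 rows per batch,
-- so the transaction log never has to hold the whole multi-million-row
-- update at once.
DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    UPDATE TOP (100000) Table1
    SET column2_backup = column2,
        column3_bakup  = column3
    WHERE column2_backup IS NULL   -- only rows not yet copied
      AND column2 IS NOT NULL;     -- rows with NULL source need no copy

    SET @rows = @@ROWCOUNT;        -- 0 when nothing was left to update

    -- In FULL recovery, take a log backup here (or CHECKPOINT in
    -- SIMPLE recovery) so log space can be reused between batches.
END
```

The same loop shape works for step 3: replace the SET clause with the mapping update against the other table, and adjust the WHERE clause so each pass skips rows already processed.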
Author: thiyagu_rind (Starting Member, 46 Posts)
Posted - 2009-04-02 : 03:15:24
Hi,

Why can't you use the DBCC command to shrink the transaction log file before doing the update statement? To update 100000 records at a time, use some identifier with the help of the RANK() function.

Regards,
Thiyagarajan
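Following the suggestion of driving the batches off an identifier, a minimal sketch that walks the primary key in fixed-size ranges (assuming column1 is an int primary key, as described in the original post; with sparse keys some ranges will simply touch fewer than 100000 rows):

```sql
-- Key-range batching on the primary key. Each iteration updates one
-- range of column1 values, bounding the log usage per transaction.
DECLARE @from int, @to int, @max int, @batch int;
SET @batch = 100000;
SELECT @from = MIN(column1), @max = MAX(column1) FROM Table1;

WHILE @from <= @max
BEGIN
    SET @to = @from + @batch - 1;

    UPDATE Table1
    SET column2_backup = column2,
        column3_bakup  = column3
    WHERE column1 BETWEEN @from AND @to;

    SET @from = @to + 1;
END
```

Ranging on the key avoids re-scanning already-updated rows, since each batch seeks directly into the next slice of the primary key index.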