cjturner
Starting Member
1 Post
Posted - 2007-05-15 : 12:32:08
Hello,

I need to add a column to a big table (500,000,000 rows) and populate it so that every row has the same value (1). If I add the column and then carry out a straightforward update, the transaction log fills before the query finishes, and I can't make the transaction log any bigger. Can anyone think of a way I can get around this?

I have the space to create a copy of the table, so I was considering creating a new table with just one column and one row, containing the value I need to add, and then cross joining this with the main table. Any chance this might work?

Thanks for reading.
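For what it's worth, the cross-join idea described above could be sketched roughly like this using SELECT INTO, which is minimally logged under the SIMPLE or BULK_LOGGED recovery model (table and column names here are hypothetical, not from the original post):

```sql
-- Hypothetical one-row table holding the constant value to append.
CREATE TABLE dbo.onevalue (newcol INT NOT NULL);
INSERT INTO dbo.onevalue (newcol) VALUES (1);

-- SELECT INTO is minimally logged under SIMPLE/BULK_LOGGED recovery,
-- so this cross join avoids filling the transaction log the way a
-- straight UPDATE of 500 million rows would.
SELECT t.*, v.newcol
INTO dbo.bigtable_new
FROM dbo.bigtable AS t
CROSS JOIN dbo.onevalue AS v;
```

Afterwards the old table can be dropped and the new one renamed with sp_rename, though indexes and constraints would have to be recreated by hand.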
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2007-05-15 : 12:44:14
Here's one way that comes to mind: export the table using bcp queryout with the new column value, then import the result into the new table using bcp again (use the -b flag to limit the batch size to something your tlog can handle, say -b100000).

The export would look something like this:

bcp.exe "select *, 'myval' from mydb.dbo.mytable" queryout file.dat -c -T -SMYSERVER

The import would look like this:

bcp.exe mydb.dbo.mynewtable in file.dat -c -T -SMYSERVER -hTABLOCK -b100000

Drop all indexes on mynewtable before the import and recreate them after it's loaded; otherwise the import will be much slower.

www.elsasoft.org
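A minimal sketch of the index handling mentioned above (the index and column names are made up for illustration, not taken from the thread):

```sql
-- Before the bcp import: drop nonclustered indexes on the target so
-- each batch loads faster and generates less log.
DROP INDEX IX_mynewtable_somecol ON dbo.mynewtable;

-- ... run the bcp import here ...

-- After the load completes: rebuild the index in a single pass,
-- which is cheaper than maintaining it row by row during the load.
CREATE INDEX IX_mynewtable_somecol ON dbo.mynewtable (somecol);
```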
rockmoose
SQL Natt Alfen
3279 Posts
Posted - 2007-05-15 : 13:00:14
Add it with a default:

ALTER TABLE bigtable ADD newcol INT NOT NULL CONSTRAINT DF_bigtable__newcol DEFAULT(1)

rockmoose
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2007-05-15 : 15:02:08
My assumption was that the OP had already tried that, and it was blowing the tlog...

www.elsasoft.org
dinakar
Master Smack Fu Yak Hacker
2507 Posts
Posted - 2007-05-15 : 15:07:25
An ALTER TABLE on such a big table could also blow up the log. Jesse's suggestion is the right way to do it. In addition to what Jesse suggested, have a job running that truncates the log every minute, to keep a check on the log while you are doing the import.

Dinakar Nethi
SQL Server MVP
************************
Life is short. Enjoy it.
************************
http://weblogs.sqlteam.com/dinakar/
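A minimal sketch of the log-truncation step described above, as it would have looked on the SQL Server 2000/2005 versions current when this thread was written (the database name is hypothetical; note that TRUNCATE_ONLY was removed in SQL Server 2008, where switching the database to the SIMPLE recovery model is the equivalent):

```sql
-- Scheduled as a job every minute while the import runs: discards the
-- inactive portion of the log without backing it up.
-- WARNING: this breaks the log backup chain.
BACKUP LOG mydb WITH TRUNCATE_ONLY;
```

Since this breaks the log backup chain, a full database backup should be taken once the load finishes.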