I have a process that builds reports by breaking data apart into different tables.

Step 1: Use bcp to load files into a TempTable on the database.

Step 2: Go over every column and update the relevant tables with the new data. (I'll add here that I created an index on every column in the TempTable, which improved the insert time by 70%.)

This is an example of one of the TempTable indexes, together with the column definition and its default constraint:

```sql
CREATE NONCLUSTERED INDEX [IX_Recipient] ON [dbo].[TargetTable]
(
    [Recipient] ASC
)
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF,
      IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF,
      ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO

ALTER TABLE [dbo].[TargetTable]
    ADD CONSTRAINT [DF_TargetTableNEW_Recipient_1] DEFAULT (N'') FOR [Recipient]
GO

-- Column definition:
[Recipient] [nvarchar](25) NOT NULL,
```
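(For context, the bcp load in Step 1 is roughly equivalent to the T-SQL sketch below; the file path, terminators, and options are placeholder assumptions, not my actual command.)

```sql
-- Sketch only: a BULK INSERT equivalent of the bcp load in Step 1.
-- File path, field/row terminators, and TABLOCK are assumptions.
BULK INSERT dbo.TempTable
FROM 'C:\reports\input.dat'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    TABLOCK          -- allows minimally logged load where possible
);
```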
Step 3: For each such column, I want to replace the value (e.g. a recipient name) with the matching ID from a table that contains all recipients.

The problem starts here. In my load test I have 7.7 million rows. When I try to run:

```sql
UPDATE t1
SET    t1.UserName = CAST(t2.Id AS NVARCHAR(20))
FROM   TargetTable t1
       INNER JOIN Dyn_User t2 WITH (NOLOCK) ON t1.UserName = t2.Name
```
it takes 1.5 hours, and that is for just one column out of ~20. Any idea how I can improve this update?

One thing I have noticed is that there is a Sort operator in the execution plan that accounts for 41% of the cost. Can I avoid this?

Thanks,
Peleg
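P.S. One variant I'm considering is breaking the update into batches so each transaction stays small. This is only a sketch, not something I've tested: the @BatchSize value is an assumption, and it relies on already-converted rows no longer matching Dyn_User.Name.

```sql
-- Sketch only: the same update, run in batches of @BatchSize rows.
-- Assumes rows whose UserName has already been replaced by an ID
-- no longer join to Dyn_User.Name, so the loop converges.
DECLARE @BatchSize INT = 50000;  -- assumed batch size

WHILE 1 = 1
BEGIN
    UPDATE TOP (@BatchSize) t1
    SET    t1.UserName = CAST(t2.Id AS NVARCHAR(20))
    FROM   TargetTable t1
           INNER JOIN Dyn_User t2 ON t1.UserName = t2.Name;

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to convert
END
```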