tool
Starting Member
26 Posts

Posted - 2002-09-16 : 12:10:26
Hi,

I have an ASP page in which a user can edit multiple records (the number of records they update varies) and submit them to the web server. On the web server I loop through the records, calling a sproc to perform the update for each record.

I now need to include all the individual updates in a single transaction. I know I have the option of using transactions within the ASP page, but I have read and been told to stay away from those if possible. All other transactions for the app are also maintained in SQL Server, so I would like to be consistent.

The problem is I don't know how to send all the records for update to SQL Server so they can be included in one transaction. Has anyone done something like this before? Any suggestions would be appreciated.

Thanks
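A minimal sketch of the pattern being asked about (table and column names here are invented for illustration; the CSV parsing itself is left as a placeholder): pass the values for every record in a single call, and let the stored procedure wrap the whole set-based update in its own transaction, so the ASP page never has to manage one.

```sql
-- Hypothetical sketch: MyTable(RecID, Val1) is an invented schema.
-- The ASP page makes ONE call; the sproc owns the transaction, so
-- either every record's update commits or none of them do.
CREATE PROCEDURE UpdateManyRecords
    @IDList  varchar(8000),  -- CSV of record IDs, e.g. '1,3,7'
    @ValList varchar(8000)   -- CSV of new values, e.g. '10,20,30'
AS
SET NOCOUNT ON

DECLARE @Pairs table (RecNum int identity(1,1), RecID int, Val1 int)
-- ...parse @IDList / @ValList into @Pairs here (CSV splitting)...

BEGIN TRANSACTION

UPDATE m
SET    m.Val1 = p.Val1
FROM   MyTable m
INNER JOIN @Pairs p ON p.RecID = m.RecID

-- @@ERROR must be checked immediately after the statement it guards
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRANSACTION
    RETURN 1
END

COMMIT TRANSACTION
RETURN 0
GO
```

Since all the work happens inside one `BEGIN TRANSACTION`/`COMMIT` in the sproc, a failure on any row rolls back every update, which is exactly the all-or-nothing behavior being asked for.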
robvolk
Most Valuable Yak
15732 Posts
tool
Starting Member
26 Posts

Posted - 2002-09-16 : 16:09:47
Thanks for the response Rob!

I took a look through the articles you suggested, and this is what I have so far:

```sql
CREATE PROCEDURE UpdateTestValues
    @ValList1 varchar(8000),
    @ValList2 varchar(8000)
AS
DECLARE @ValTable1 table (RecNum int identity(1,1), Val1 int)
DECLARE @ValTable2 table (RecNum int identity(1,1), Val2 varchar(100))

INSERT INTO @ValTable1 (Val1)
SELECT NullIf(SubString(',' + @ValList1 + ',', TallyID,
       CharIndex(',', ',' + @ValList1 + ',', TallyID) - TallyID), '') AS Val1
FROM   Tally
WHERE  TallyID <= Len(',' + @ValList1 + ',')
  AND  SubString(',' + @ValList1 + ',', TallyID - 1, 1) = ','
  AND  CharIndex(',', ',' + @ValList1 + ',', TallyID) - TallyID > 0

INSERT INTO @ValTable2 (Val2)
SELECT NullIf(SubString(',' + @ValList2 + ',', TallyID,
       CharIndex(',', ',' + @ValList2 + ',', TallyID) - TallyID), '') AS Val2
FROM   Tally
WHERE  TallyID <= Len(',' + @ValList2 + ',')
  AND  SubString(',' + @ValList2 + ',', TallyID - 1, 1) = ','
  AND  CharIndex(',', ',' + @ValList2 + ',', TallyID) - TallyID > 0

--INSERT INTO MyTable (Val1, Val2)
SELECT Val1, Val2
FROM   @ValTable1 v1
INNER JOIN @ValTable2 v2 ON (v1.RecNum = v2.RecNum)
GO

DECLARE @ValList1 varchar(8000)
DECLARE @ValList2 varchar(8000)
SET @ValList1 = '1,3,7'
SET @ValList2 = 'Val 2 for rec 1,Val 2 for rec 3,Val 2 for rec 7'
EXEC UpdateTestValues @ValList1, @ValList2
```

I really like the set-based approach you described in your article, but as far as I can tell, to get it to work with multiple fields I needed to create table variables with identity fields and join them to reconstruct the records in the sproc.

The sproc would actually need to be even longer than this, since there are 10 fields in the actual table I am updating (e.g. 10 table variables and 10 joins). Am I way off here? Is there a simpler way to do this? I was thinking of using a WHILE statement to loop through the values in each of the CSV variables, but with 10 of them I think it would be just as tedious.

If anyone has some advice on how I should proceed, I'd appreciate it!

Thanks
robvolk
Most Valuable Yak
15732 Posts

Posted - 2002-09-16 : 16:16:17
This thread shows how to do it using fixed-length instead of CSV values:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=12538

AjarnMark has some code on doing it with CSVs too:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=19565

I don't think it's all that difficult to use Mark's technique, and it will probably perform just as well. Temp tables don't use up THAT much overhead; whenever there's a performance issue with temp tables, it's because of the number of calls. And if you're using SQL 2000, you can substitute table variables.
tool
Starting Member
26 Posts

Posted - 2002-09-16 : 16:43:14
It looks like Mark's solution is very similar to mine, except he uses temp tables. I guess I'll create the procedure for the actual table and see how it performs. I just worry about having 10 table variables and making 10 joins for what seems like such a simple operation.

We'll see how it goes. Thanks for the suggestions!
robvolk
Most Valuable Yak
15732 Posts

Posted - 2002-09-16 : 16:59:04
quote: It looks like Mark's solution is very similar to mine except he uses temp tables.

Yeah, right after I posted it I realized "Damn, they're the same!" but they don't have an "undo post" feature in Snitz.

quote: I just worry about having 10 table variables and making 10 joins for what seems such a simple operation.

Don't worry about it, it'll look worse than it really is. Unless you have thousands of rows involved, the performance should be fast enough, especially with table variables. No matter what, it will sure beat an ASP solution.

Here's an idea, if you want to try it. You should be able to do this with one or two temp tables only, if you can pass the data as one string. You'd have one delimiter for columns and another for rows. Let's say a comma for columns and a semicolon for rows:

```sql
SET @data = 'Fred,Flintstone,Father;Wilma,Flintstone,Mother;Barney,Rubble,Father;Betty,Rubble,Mother;Pebbles,Flintstone,Daughter'

CREATE TABLE #names (ID int IDENTITY(1,1), data varchar(8000) NOT NULL)

-- use the CSV parsing on the @data variable to split the rows using
-- the semicolon delimiter and insert them into #names
-- use the parsing again on the #names table to split on the comma
```

That might be faster because only two parsing operations have to take place. The only problem is building the string to be passed, and accommodating any missing values.
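To make the two-delimiter idea concrete, here is a rough sketch of the two-pass parse (hypothetical; it assumes the same Tally table of sequential integers used earlier in the thread, and the second pass uses a ParseName trick that only works for up to four comma-separated columns with no embedded periods in the data):

```sql
-- Sketch of the two-pass parse. Pass 1 splits the big string on ';'
-- so each row of #names holds one record; pass 2 splits each record
-- on ',' into its columns.
DECLARE @data varchar(8000)
SET @data = 'Fred,Flintstone,Father;Wilma,Flintstone,Mother'

CREATE TABLE #names (ID int IDENTITY(1,1), data varchar(8000) NOT NULL)

-- Pass 1: semicolon split via the Tally table, same pattern as the
-- CSV parsing shown earlier in the thread.
INSERT INTO #names (data)
SELECT SubString(';' + @data + ';', TallyID,
       CharIndex(';', ';' + @data + ';', TallyID) - TallyID)
FROM   Tally
WHERE  TallyID <= Len(';' + @data + ';')
  AND  SubString(';' + @data + ';', TallyID - 1, 1) = ';'
  AND  CharIndex(';', ';' + @data + ';', TallyID) - TallyID > 0

-- Pass 2: comma split. ParseName() splits on periods and counts
-- parts from the RIGHT, so replace commas with periods first.
SELECT n.ID,
       ParseName(Replace(n.data, ',', '.'), 3) AS FirstName,
       ParseName(Replace(n.data, ',', '.'), 2) AS LastName,
       ParseName(Replace(n.data, ',', '.'), 1) AS Role
FROM   #names n

DROP TABLE #names
```

With 10 columns per record, the ParseName shortcut in pass 2 would not work (four-part limit), so pass 2 would need the same Tally-based split keyed to each row of #names instead; but the overall shape stays two passes regardless of column count.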