
 SPROC dynamic data passed from ASP


huge
Starting Member

31 Posts

Posted - 2002-03-15 : 14:07:39
Hey,
I have an internal message system I'm working on using SQL Server and ASP. The inbox is working well; now I need to let the user delete messages from his inbox.

So beside every message in the inbox there is a checkbox the user can check. He can check as many messages as he wishes, and the checkbox is in the format

<input type="checkbox" name="chk_<%=i%>" value="<%=ORs("MessageID")%>">

where i is a loop counter over all the messages in the user's inbox and MessageID is the MessageID from the database. Both pieces of info are passed to the delete page.

Now on the delete page I have the following:

NumMsg = Request.Form("NumMsg")
mysql = "Execute sproc_DeleteMessage '"
For i = 1 To CInt(NumMsg)
    If Request.Form("chk_" & i) <> "" Then
        currentMsgID = Request.Form("chk_" & i)
        mysql = mysql & currentMsgID & "','"
    End If
Next
mysql = Left(mysql, Len(mysql) - 2)  ' trim the trailing ",'"

With this I am left with an SQL statement like the following:

Execute sproc_DeleteMessage '2','3','5','11','13'

Those numbers ('2','3','5', etc.) are the message IDs. I pass this into a sproc and want to be able to handle a variable number of parameters: sometimes there will be 3 messages to delete, other times there may be 8, and I don't know in advance. Based on this data, is there anything I can do in the sproc?

I know another way to do it, but it is intensive on the database: loop in ASP and hit the database once for every record that has to be deleted, instead of in one big shot.

I hope you understand
HuGE

smccreadie
Aged Yak Warrior

505 Posts

Posted - 2002-03-15 : 14:15:45
Pass the values in as a CSV string. Use the procedure at http://www.sqlteam.com/downloads/sp_parsearray.sql to parse it, then use a DELETE statement in a sproc with an IN clause.
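
A minimal sketch of that approach, for SQL Server 2000. The proc name, the Messages table, and the MessageID column are assumptions for illustration, not from the original post; the CSV would be passed as a single parameter, e.g. Execute sproc_DeleteMessageList '2,3,5,11,13'.

```sql
-- Sketch only: Messages table and MessageID column are assumed names.
CREATE PROCEDURE sproc_DeleteMessageList
    @MessageList varchar(8000)   -- e.g. '2,3,5,11,13'
AS
SET NOCOUNT ON

DECLARE @IDs TABLE (MessageID int)
DECLARE @Pos int

SET @MessageList = @MessageList + ','     -- trailing delimiter simplifies the loop
WHILE LEN(@MessageList) > 1
BEGIN
    SET @Pos = CHARINDEX(',', @MessageList)
    INSERT @IDs (MessageID)
        VALUES (CAST(LEFT(@MessageList, @Pos - 1) AS int))
    SET @MessageList = SUBSTRING(@MessageList, @Pos + 1, 8000)
END

-- One set-based delete instead of one round trip per message.
DELETE FROM Messages
WHERE MessageID IN (SELECT MessageID FROM @IDs)
GO
```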


AjarnMark
SQL Slashing Gunting Master

3246 Posts

Posted - 2002-03-15 : 16:02:12
And if you name your checkboxes with the same name instead of an incrementing name, the web form will return the values separated by commas.

------------------------
GENERAL-ly speaking...

Onamuji
Aged Yak Warrior

504 Posts

Posted - 2002-03-15 : 16:14:47
<edit>
LOL, damnit you beat me to it since I sat here and typed this novel :-p
</edit>

All this database CSV-parsing stuff has to stop... seriously... ever consider just doing a loop to run the statements? If you are using ASP it's simple. Maybe you should rethink your naming, too: if you give form fields the same name, the values come in as CSV already...

<input type="checkbox" name="delete_msg" value="1">
<input type="checkbox" name="delete_msg" value="2">
<input type="checkbox" name="delete_msg" value="3">

If you check all of them and submit, in ASP you get this:

Dim deleteMsgs: deleteMsgs = Request("delete_msg")
Response.Write deleteMsgs  ' <- outputs "1, 2, 3" (classic ASP joins multiple values with a comma and a space)
'**********************************************************************
' Change the above to this:
Dim deleteMsgs: deleteMsgs = Split(Request("delete_msg"), ", ")  ' note the comma-space delimiter
Dim intLength: intLength = UBound(deleteMsgs)
Dim cn: Set cn = Server.CreateObject("ADODB.Connection")
Dim intPos

' I would suggest writing a function that takes the message id and
' a connection object so you can use transactions properly.
Call cn.Open(g_strConnectionString)
Call cn.BeginTrans()
On Error Resume Next

For intPos = 0 To intLength
    Call DeleteMessage(cn, deleteMsgs(intPos))
Next

If cn.Errors.Count > 0 Then
    Call cn.RollbackTrans()
Else
    Call cn.CommitTrans()
End If

That's much easier to deal with and maintain than trying to parse the string in SQL... save yourself the hassle...
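
The single-row proc that the DeleteMessage helper above would call might look like this. The sproc_DeleteMessage name comes from the original post; the table and column names are assumptions:

```sql
-- Sketch only: Messages table and MessageID column are assumed names.
CREATE PROCEDURE sproc_DeleteMessage
    @MessageID int
AS
SET NOCOUNT ON
DELETE FROM Messages
WHERE MessageID = @MessageID
GO
```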

Edited by - onamuji on 03/15/2002 16:16:41

huge
Starting Member

31 Posts

Posted - 2002-03-15 : 16:34:43
Thanks a lot to everyone.

I figured out another way, using a similar approach to smccreadie's post.
Thanks again.

Also, thanks for the tip on looping through and keeping the checkboxes the same name. I will do it next time, but for now everything else is already done and it would be a hassle to implement.

HuGE


robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-03-15 : 17:17:41
quote:
all this database CSV format crap has to stop ... seriously ... ever consider just doing a loop to insert the values
...
That's much easier to deal with and maintain than trying to parse in SQL the string... save yourself the hassle...


Well, let me ask you this: what if I want to INSERT 1,000 rows at a time? And I have a trigger on the table that's pretty involved? Not to mention a tenuous web connection/server that might stay open for 1 minute if I'm lucky?

The reason I ask is because I HAD a situation like that, and there was absolutely NO WAY I could let ASP/ADO handle a transaction with even 100 rows to be inserted; the overhead was way too much, the user would literally have to wait 10 minutes for it to finish.

There are some other methods available for handling CSV's, if I may humbly suggest my own article:

http://www.sqlteam.com/item.asp?ItemID=2652

It may seem like overkill if you will never exceed 10 rows, but with anything more than 50 it becomes a lifesaver. All the code you need is in the article, so you can copy and paste it and be done with it. (It's pretty amazing to consider that 2,000 rows...or more...could be passed with one call to the database!)
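
The article linked above covers the details; a rough sketch of the set-based idea is below. It assumes a pre-populated Numbers table of sequential integers (1 through 8000), which is not shown in this thread, and the proc/table names are illustrative only:

```sql
-- Sketch only: Numbers(n int, 1..8000) and Messages are assumed names.
CREATE PROCEDURE sproc_DeleteMessagesCSV
    @List varchar(8000)   -- e.g. '2,3,5,11,13'
AS
SET NOCOUNT ON
SET @List = ',' + @List + ','   -- wrap so every value sits between two commas

-- Each n pointing at a comma marks the start of the next value;
-- the whole list is split and deleted in a single statement.
DELETE FROM Messages
WHERE MessageID IN (
    SELECT CAST(SUBSTRING(@List, n + 1,
                CHARINDEX(',', @List, n + 1) - n - 1) AS int)
    FROM Numbers
    WHERE n < LEN(@List)
      AND SUBSTRING(@List, n, 1) = ','
)
GO
```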


joldham
Wiseass Yak Posting Master

300 Posts

Posted - 2002-03-15 : 19:05:07
I second what robvolk has to say, and would add that by processing the rows within a stored procedure instead of on the ASP side, the DBA can control and manipulate the transactional processing without having a programmer re-write the ASP code. I will agree that chances are (at least in my case) the programmer and the DBA are the same person.

Also, I would think that letting the database parse the string would be faster than sending each individual set of data from the web server to the database many, many times. I remember robvolk having a post or an article somewhere on the site about being able to return over 1,000 rows in less than a second, including the ASP processing of the page. So maybe he could shed some light on whether or not it is faster to process this within the database.

Just a thought.

Jeremy


Onamuji
Aged Yak Warrior

504 Posts

Posted - 2002-03-17 : 00:17:20
In all my procedures I check the TRAN count: if there isn't a transaction I start it and end it in that proc; if there is, I just use a rollback in case anything happens... I'd just really like to know how you got 2,000 records in CSV format into a stored procedure... considering that starting at 1 and using a comma delimiter all the way to 2000, that is 8,892 characters... far exceeding varchar's 8,000 limit... anyway, heh, just thought I'd point that out :-)
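
The transaction pattern described above is roughly this (a sketch; the proc name and body are placeholders, not from the thread):

```sql
-- Sketch of the "only own the transaction if nobody else started one" pattern.
CREATE PROCEDURE sproc_Example
AS
DECLARE @StartedTran bit
SET @StartedTran = 0

IF @@TRANCOUNT = 0
BEGIN
    BEGIN TRAN
    SET @StartedTran = 1
END

-- ... do the work here ...

IF @@ERROR <> 0
    ROLLBACK TRAN        -- on failure, roll back whether or not we own the tran
ELSE IF @StartedTran = 1
    COMMIT TRAN          -- commit only the transaction this proc started
GO
```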

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-03-17 : 08:26:08
If you use fixed-length numbers you can fit 2,000 4-digit numbers into a varchar(8000). I sometimes use fixed-length strings concatenated into one huge one, and you can also use multiple varchar(8000) parameters. This thread illustrates it better:

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=12538
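
A sketch of the fixed-width variant: 2,000 values x 4 digits = exactly 8,000 characters, which just fits in a varchar(8000) with no delimiters. As before, the Numbers table and all names here are assumptions for illustration:

```sql
-- Sketch only: @List is 4-digit zero-padded IDs with no delimiters,
-- e.g. '000200030005'. Numbers(n int, 1..8000) and Messages are assumed names.
CREATE PROCEDURE sproc_DeleteMessagesFixed
    @List varchar(8000)
AS
SET NOCOUNT ON

-- Value k occupies characters (k-1)*4+1 through k*4.
DELETE FROM Messages
WHERE MessageID IN (
    SELECT CAST(SUBSTRING(@List, (n - 1) * 4 + 1, 4) AS int)
    FROM Numbers
    WHERE n <= LEN(@List) / 4
)
GO
```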

quote:
I remember robvolk having a post or an article somewhere on the site about being able to return over 1,000 rows in less than a second, including the ASP processing of the page.


I don't know if I've been able to get 1,000 rows returned to a web page in less than 1 second. I use GetString to write the rows into an HTML table. I have gotten this to work in just under 10 seconds; about half of that is HTML render time, and the remaining 5 seconds is split evenly between data retrieval and ASP response generation. When I run the same query in QA it will return the 1,000 rows in about 1 to 1.5 seconds. If the table is in the data cache it can return the rows in less than 0.5 second.

As far as passing rows into the database, oh yeah, it's MUCH faster than repetitive looping. When I first had to apply this, I had a master-child table setup where the trigger on the master table populated rows into the child table. It added between 0.25 and 1 second to the INSERT operation, depending on the server load. If I inserted 1,000 rows in one operation, it would add about 3-4 seconds in total.

Now, if I did repetitive INSERT ops inside an ASP loop, it would've been at least 1,000 x 0.25 seconds of extra processing time: 250 seconds, vs. 3-4 seconds. Then there's the added overhead of multiple ADO calls and network traffic, and the resources needed to maintain a transaction over 1,000 INSERT operations...

Even if you'll never get close to these numbers, performing the operation in one pass lets you scale up from 1 to 10 to 100 rows with at most a 20% increase in processing time. That's far better than linear scaling; using loops in ASP increases cost in a fairly linear manner and will kill performance very quickly.


Onamuji
Aged Yak Warrior

504 Posts

Posted - 2002-03-17 : 22:23:38
That's what updatable recordsets are for :-) and the new and improved DataReader for .NET ... but whatever you're more comfortable with ...