
 fire a trigger from asp


sameerv
Starting Member

29 Posts

Posted - 2002-10-14 : 12:13:54
Hi guys,
Here's my problem.

I am calling a stored procedure from an ASP page.
This stored procedure reads data from a text file row by row and passes it through numerous validations before updating it. Since the volume of rows in the file is huge, the stored proc takes up to 45 minutes or more to execute.
This gives me a script timeout error. I do not wish to increase the timeout beyond 20 minutes.

Since I do not need the ASP page to wait while the SP is executing, I execute an update query from ASP on a dummy table, which in turn fires an update trigger, which then begins executing the stored proc. I expected the ASP page to simply fire the trigger and carry on to the next statement, while the backend proceeds with its execution.

However, I find that the ASP page waits until the entire SP has completed executing, which causes a script timeout.
So my problem remains.

Is there any way I can simply fire the trigger from ASP and then continue executing the script without waiting for the SP to complete its processing?


robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-10-14 : 12:37:00
You should put the stored procedure into a job and use sp_start_job to run it. Then you can simply call sp_start_job from your ASP page. See Books Online for more on sp_start_job.
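If you haven't set up a job before, a one-step job would look roughly like this (the job name, step command, and database name below are just placeholders; see Books Online for the full parameter lists):

exec msdb..sp_add_job @job_name = 'ProcessUploadFile'

exec msdb..sp_add_jobstep @job_name = 'ProcessUploadFile',
    @step_name = 'Run upload proc',
    @subsystem = 'TSQL',
    @command = 'exec dbo.MyLongRunningProc',
    @database_name = 'MyDatabase'

exec msdb..sp_add_jobserver @job_name = 'ProcessUploadFile'

-- then, whenever the ASP page needs to kick it off:
exec msdb..sp_start_job @job_name = 'ProcessUploadFile'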

Additionally:
quote:
This stored procedure reads data from a text file row by row and passes it through numerous validations before updating it. Since the volume of rows in the file is huge, the stored proc takes up to 45 minutes or more to execute.
This is an extremely inefficient way to process a large file. You would be far better off importing the entire file into a SQL Server table (staging table) and then process it internally to perform all of the validations. The validated data would then be inserted into the final destination tables.

As long as you use the current approach you will always have poor performance. Unless you're dealing with 50 million rows or more there's no reason for SQL Server to spend 45 minutes processing the data.
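The shape of the import would be something like this (table, file, and column names here are made up for illustration):

BULK INSERT UploadStaging
FROM 'c:\data\upload.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- run each validation as one set-based statement against UploadStaging, then:
INSERT INTO FinalTable (Col1, Col2)
SELECT Col1, Col2
FROM UploadStaging
WHERE Col1 IS NOT NULL  -- i.e., only the rows that passed all validations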


csphard
Posting Yak Master

113 Posts

Posted - 2002-10-14 : 13:28:48
I'm reading about sp_start_job. Do you just make a connection to the database that the job will run against and then

conn.Execute "sp_start_job @job_name = 'Nightly Backup'"

or do you make a connection to msdb?



AjarnMark
SQL Slashing Gunting Master

3246 Posts

Posted - 2002-10-14 : 17:05:03
You can connect to your own database and use syntax like:

exec msdb..sp_start_job @job_name = 'MyJobName'


SamC
White Water Yakist

3467 Posts

Posted - 2002-10-14 : 21:41:06
This is a very cool way to spin off a task in .NET

http://www.dotnetjunkies.com/howto/default.aspx?id=49

SamC


sameerv
Starting Member

29 Posts

Posted - 2002-10-15 : 09:57:35
Hi guys,
Thanks a lot for your suggestions. I'll try it using a job.

Hi robvolk,

I am not reading the data directly from the text file (although I had erroneously said so). Actually, I bcp the text file data into a (staging??) table and then process each record internally to perform all of the validations. If a record passes, it is appended to the final table and the next record is fetched from the staging table.
Only the latter process (i.e., excluding the bcp) takes so long.

For a text file containing 15,000 records, processing takes approx. 40 minutes if the final table already has about 60,000 records in it. If the final table has no records, or fewer than a thousand, it takes about 9-10 minutes.

I am posting my stored proc, which does the processing.
If possible, please let me know where I can optimize the code for better performance.


CREATE PROCEDURE Upload_BankBranch_Details @UserID varchar(15), @UploadID int, @SuccessCount int output, @FailureCount int output
AS

--******************** Declare working variables (declarations were missing from the posted code; types assumed) ******
declare @tmpBankCode varchar(15), @tmpBranchCode varchar(15), @tmpBranchName varchar(100), @tmpReason varchar(255), @superuser varchar(15)
select @SuccessCount = 0, @FailureCount = 0

--******************** Open cursor on staging table, which contains records bcp'ed from the text file ******

Declare Cur_UplBankBranch CURSOR
FOR
select
ULBB_BM_Code, ULBB_BBrM_Code, ULBB_BBrM_Name, ULBB_Reason
from
CMS_UploadLog_BankBranchMaster

open Cur_UplBankBranch

fetch next from Cur_UplBankBranch
into @tmpBankCode, @tmpBranchCode, @tmpBranchName, @tmpReason

while @@FETCH_STATUS = 0
begin
--******************** all validations checked (validation logic omitted from this post) ******

if @tmpReason is not null -- placeholder for the omitted validation result
begin
set @FailureCount = @FailureCount + 1
end

--******************** insert into final table ******
else
begin
set @SuccessCount = @SuccessCount + 1

insert into
CMS_BankBranchMaster (BBrM_BM_Code, BBrM_Code, BBrM_Name)
values
(@tmpBankCode, @tmpBranchCode, @tmpBranchName)

--******************** Also, insert current user and branch code into the permission table ******

insert into
CMS_User_Branch_Mapping
values
(@UserID, @tmpBankCode, @tmpBranchCode)

--******************** Also, insert all users designated as 'Superuser' plus the ******
--******************** branch code into the permission table (using a nested cursor) ******

Declare Cur_UpdtSuperUser CURSOR
FOR
select UM_UserID from cms_groupmaster, cms_usermaster
where UM_UserID <> @UserID and UM_GM_Code = GM_Code and GM_Description = 'Superuser'

open Cur_UpdtSuperUser
fetch next from Cur_UpdtSuperUser into @superuser

while @@FETCH_STATUS = 0
begin
insert into CMS_User_Branch_Mapping values (@superuser, @tmpBankCode, @tmpBranchCode)
fetch next from Cur_UpdtSuperUser into @superuser
end
close Cur_UpdtSuperUser
deallocate Cur_UpdtSuperUser
end

--******************** Update staging table record with Reason using the original cursor's "where current of" ******
update
CMS_UploadLog_BankBranchMaster
set
ULBB_Reason = @tmpReason
where current of Cur_UplBankBranch

fetch next from Cur_UplBankBranch
into @tmpBankCode, @tmpBranchCode, @tmpBranchName, @tmpReason
end

close Cur_UplBankBranch
deallocate Cur_UplBankBranch


sameerv
Starting Member

29 Posts

Posted - 2002-10-15 : 10:31:44
quote:

You should put the stored procedure into a job and use sp_start_job to run it. Then you can simply call sp_start_job from your ASP page. See Books Online for more on sp_start_job.


robvolk,
I presume that, to fire sp_start_job from ASP, I will have to execute it in a stored proc and call the stored proc from ASP. In this case too, won't the ASP page wait until the entire job is completed before proceeding to execute its script?
If this is the case, is there any other way to call sp_start_job from ASP?

quote:

This stored procedure reads data from a text file row by row and passes it through numerous validations before updating it. Since the volume of rows in the file is huge, the stored proc takes up to 45 minutes or more to execute.

quote:
This is an extremely inefficient way to process a large file. You would be far better off importing the entire file into a SQL Server table (staging table) and then process it internally to perform all of the validations. The validated data would then be inserted into the final destination tables.

As long as you use the current approach you will always have poor performance. Unless you're dealing with 50 million rows or more there's no reason for SQL Server to spend 45 minutes processing the data.



I am not reading the data directly from the text file (although I had erroneously said so). Actually, I bcp the text file data into a (staging??) table and then process each record internally to perform all of the validations. If a record passes, it is appended to the final table and the next record is fetched from the staging table.
Only the latter process (i.e., excluding the bcp) takes so long.

For a text file containing 15,000 records, processing takes approx. 40 minutes if the final table already has about 60,000 records in it. If the final table has no records, or fewer than a thousand, it takes about 9-10 minutes.

I have posted my stored proc, which does the processing, as a reply in this topic.
If possible, please let me know where I can optimize the code for better performance.



AjarnMark
SQL Slashing Gunting Master

3246 Posts

Posted - 2002-10-15 : 19:32:25
Sameer,

Regarding sp_start_job: you can just execute that command from an ADO Connection object (like oConn.Execute("sp_start_job ...")) without needing a stored procedure. The great thing about this is that once the job is started, the "start_job" routine is done and returns to the page. It does not wait for the job to end, because all it checks is that the job got STARTED successfully.
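In the ASP page it would look roughly like this (connection string details omitted; 'MyJobName' is a placeholder as before):

Set oConn = Server.CreateObject("ADODB.Connection")
oConn.Open sConnectionString
oConn.Execute "exec msdb..sp_start_job @job_name = 'MyJobName'"
oConn.Close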

As for your stored procedure, I would suggest you do away with the cursor if at all possible. You could run a series of set-based commands instead, perhaps adding a STATUS field to the table, and do something like this...

UPDATE Staging SET STATUS = 'PENDING'
GO

UPDATE Staging SET STATUS = 'REJECT'
WHERE ... -- Put your test here like StartDate IS NULL, or ISNUMERIC(Field1) = 0, etc.

UPDATE Staging SET STATUS = 'REJECT'
WHERE ... -- Continue with next test, and so on...

Finally...
INSERT INTO FinalTable ... SELECT ... FROM Staging WHERE STATUS = 'PENDING'

Depending on your tests, these might be able to be merged together into fewer statements. To the degree that you can use set-based operations like this instead of cursors, you should see significant gains in performance.
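Incidentally, the nested "superuser" cursor in your proc can collapse into a single INSERT...SELECT the same way. Sketching from the column names you posted (untested, and reusing the suggested STATUS field on the staging table):

INSERT INTO CMS_User_Branch_Mapping
SELECT UM_UserID, ULBB_BM_Code, ULBB_BBrM_Code
FROM cms_usermaster, cms_groupmaster, CMS_UploadLog_BankBranchMaster
WHERE UM_GM_Code = GM_Code
  AND GM_Description = 'Superuser'
  AND STATUS = 'PENDING'  -- one row per superuser per accepted branch

That single statement replaces both loops of cursor fetches for the superuser mappings.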


sameerv
Starting Member

29 Posts

Posted - 2002-10-16 : 00:40:37
quote:

Sameer,

Regarding the sp_start_job, you can just execute that command from an ADO Connection Object (like oConn.Execute("sp_start_job ...")) and not need a stored procedure. The great thing about this is that once the job is started, then the "start_job" routine is done and returns to the page. It does not wait for the job to end, because all it checks is that the job got STARTED successfully.

As for your stored procedure, I would suggest you do away with the cursor if at all possible. Perhaps you could do a series of set-based commands, perhaps adding a STATUS field to the table and do something like this...

UPDATE Staging SET STATUS = 'PENDING'
GO

UPDATE Staging SET STATUS = 'REJECT'
WHERE ... -- Put your test here like StartDate IS NULL, or ISNUMERIC(Field1) = 0, etc.

UPDATE Staging SET STATUS = 'REJECT'
WHERE ... -- Continue with next test, and so on...

Finally...
INSERT INTO FinalTable ... SELECT ... FROM Staging WHERE STATUS = 'PENDING'

Depending on your tests, these might be able to be merged together into fewer statements. To the degree that you can use set-based operations like this instead of cursors, you should see significant gains in performance.





Hi,
Thanks a lot for your help. Just one more question regarding this.
When calling sp_start_job from ASP, can I pass a couple of parameters (variables) that I need to use in the stored procedure called by the job?
