How to Asynchronously Execute a DTS package from ASP or ASP.NET
Written by Guest Authors on 27 March 2005
This article is written by Davide Mauri. It describes a clever yet simple way to asynchronously execute a DTS package from inside SQL Server, which can easily be called from ASP or ASP.NET.
Data Transformation Services is a powerful tool, and sometimes its features are so useful that you’d like to invoke a DTS package not only from within SQL Server but also from an external program.
To do this you have several choices: you can use the DTSRun.exe tool, or you can leverage the SQL-DMO features.
Unfortunately, if you’re developing a web application (ASP, ASP.NET, or whatever you use), neither of them seems to be the right choice: too many problems, too much effort, and very modest results. In addition, none of these solutions can be called asynchronously: if you need to implement a “fire-and-forget” technique, you simply cannot do it!
An example of such a situation is when you have to export or import data while allowing the user to control the procedure from the web. For an export feature, the user should be able to select the query whose results he wants exported in Excel format, and then receive the result in his email inbox.
Note: If you’re thinking that a better tool for this kind of work already exists, and it’s called Reporting Services, I agree with you. I’ll get to that in the conclusion at the end of this article.
So, how is it possible to obtain this behavior? The trick is to use jobs, creating them dynamically.
Jobs can run DTS packages using the DTSRun utility; this also allows you to pass parameters to the package in the form of package global variables.
The idea is to create a wrapper stored procedure that will take care of creating the job on the fly, setting it up, and executing it.
So, to get the job done in a proper manner, we want to create a category for our jobs.
exec msdb.dbo.sp_add_category 'JOB', 'LOCAL', 'ASP'
Using a category to classify dynamically created jobs will help to maintain our system clean and easy to administer.
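Note that sp_add_category raises an error if the category already exists, so in a script that may be run more than once you may want to guard the call. A minimal sketch:

```sql
-- Create the 'ASP' job category only if it is not already there
if not exists (select * from msdb.dbo.syscategories where name = 'ASP')
    exec msdb.dbo.sp_add_category @class = 'JOB', @type = 'LOCAL', @name = 'ASP'
```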
Now it’s time to create the job that will execute our DTS Package. We need to go through four steps.
First of all we need to set up the command string that will be executed by the job. The string contains the call to the DTSRun utility, along with the correct parameters:
DTSRun.exe /S "(local)" /N "<package name>" /W "0" /E
- /S specifies the server where the package resides
- /N specifies the name of the package to execute
- /W specifies whether to write the job's outcome to the Windows event log
- /E specifies that we want to use integrated authentication (so the code inside the DTS package will run under the SQL Server Agent account or the proxy account, depending on the job owner; the job owner will be the account that executes the stored procedure).
Since we also want to pass parameters to our package, we need to add their values to the command string using global variables:
/A [global variable name]:[global variable type]="[global variable value]"
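For example, with a hypothetical package named "DTS From ASP" and a single string global variable GV1 (type id 8), the fully assembled command line would look like this:

```
DTSRun.exe /S "(local)" /N "DTS From ASP" /W "0" /E /A GV1:8="someone@example.com"
```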
Here is the T-SQL code to do all of this:
-- Specify DTS to be executed
declare @dts varchar(128)
set @dts = 'DTS From ASP'
-- Initialize command
declare @cmd varchar(4000)
set @cmd = '"C:\PROGRAM FILES\MICROSOFT SQL SERVER\80\TOOLS\BINN\Dtsrun.exe"'
set @cmd = @cmd + ' /S "(local)" /N "' + @dts + '" /W "0" /E '
-- Specify global variable values to be passed to the DTS package through DTSRun
set @cmd = @cmd + '/A GV1:8="' + @GlobalVar1 + '" '
set @cmd = @cmd + '/A GV2:8="' + @GlobalVar2 + '" '
Now we’re ready for the second step: create the job. To do this we have to use the sp_add_job procedure. Since every job must have its own name, we’re going to use the NewId() function to generate a unique name.
-- ALLOW CONCURRENT DTS EXECUTIONS
declare @jname varchar(128)
set @jname = cast(newid() as char(36))
This will allow us to execute our stored procedure more than once concurrently. Be aware that with this behavior your DTS package must handle concurrent executions correctly. If you do not like or need this method, just use a fixed job name:
-- ALLOW TO RUN ONLY ONE DTS INSTANCE AT TIME
set @jname = 'AspDtsJob'
In this case, if you try to execute the stored procedure while the first call has not yet finished, SQL Server will generate an error, since a job with that name already exists. Remember to handle that error!
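With the fixed-name variant, one simple way to handle this (a minimal sketch; the job name must match the fixed name used above) is to check msdb.dbo.sysjobs before creating the job:

```sql
-- If the fixed-name job still exists, the previous execution
-- has not finished yet: bail out with a clear message
if exists (select * from msdb.dbo.sysjobs where name = 'AspDtsJob')
begin
    raiserror('The previous DTS execution is still running. Try again later.', 16, 1)
    return
end
```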
Ok, now we can create the job:
-- Create job
declare @jid uniqueidentifier
exec msdb.dbo.sp_add_job
    @job_name = @jname,
    @enabled = 1,
    @category_name = 'ASP',
    @delete_level = 1,
    @job_id = @jid OUTPUT
The trick here is the @delete_level parameter. A value of 1 means that the job will be automatically deleted once it finishes successfully. With this solution we don’t have to bother deleting jobs manually, and if a job fails we can read its log to find out what went wrong.
The third step is to bind the job to a server:
exec msdb.dbo.sp_add_jobserver
    @job_id = @jid,
    @server_name = '(local)'
And the last one is to add the step that will execute the command string we set up before:
exec msdb.dbo.sp_add_jobstep
    @job_id = @jid,
    @step_name = 'Execute DTS',
    @subsystem = 'CMDEXEC',
    @command = @cmd
Now we just have to execute the job:
-- Start job
exec msdb.dbo.sp_start_job @job_id = @jid
The complete script can be found here. You just have to customize it for your own needs.
Once you've created that stored procedure, you can call it from your web application.
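Putting the pieces above together, the wrapper could look like the following sketch (the procedure name, parameter names, package name, and global variable names are illustrative; error handling is omitted for brevity):

```sql
create procedure dbo.usp_RunDtsAsync
    @GlobalVar1 varchar(256),
    @GlobalVar2 varchar(256)
as
begin
    -- Build the DTSRun command string
    declare @dts varchar(128)
    set @dts = 'DTS From ASP'

    declare @cmd varchar(4000)
    set @cmd = '"C:\PROGRAM FILES\MICROSOFT SQL SERVER\80\TOOLS\BINN\Dtsrun.exe"'
    set @cmd = @cmd + ' /S "(local)" /N "' + @dts + '" /W "0" /E '
    set @cmd = @cmd + '/A GV1:8="' + @GlobalVar1 + '" '
    set @cmd = @cmd + '/A GV2:8="' + @GlobalVar2 + '" '

    -- Unique job name: allows concurrent executions
    declare @jname varchar(128)
    set @jname = cast(newid() as char(36))

    declare @jid uniqueidentifier

    -- Create the job (deleted automatically on success thanks to @delete_level = 1)
    exec msdb.dbo.sp_add_job
        @job_name = @jname,
        @enabled = 1,
        @category_name = 'ASP',
        @delete_level = 1,
        @job_id = @jid OUTPUT

    -- Bind the job to the local server
    exec msdb.dbo.sp_add_jobserver
        @job_id = @jid,
        @server_name = '(local)'

    -- Add the step that runs DTSRun
    exec msdb.dbo.sp_add_jobstep
        @job_id = @jid,
        @step_name = 'Execute DTS',
        @subsystem = 'CMDEXEC',
        @command = @cmd

    -- Fire and forget: start the job and return immediately
    exec msdb.dbo.sp_start_job @job_id = @jid
end
```

From the web application, all that is needed is a single call such as exec dbo.usp_RunDtsAsync @GlobalVar1 = 'someone@example.com', @GlobalVar2 = 'Orders' (the values here are just examples); the call returns immediately while the job runs in the background.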
Now there is just one thing left to say: is there a better way to solve the problem posed in the example (exporting data to Excel format)?
Yes, there is. For this kind of operation you’re encouraged to use SQL Server Reporting Services, which has built-in export to Excel (and many other formats, PDF included) along with other reporting features.
I’ve used this script to satisfy customer and stakeholder requests while setting up the Reporting Services infrastructure, where I’ll have a lot more reporting firepower.