Monday, March 12, 2012

programmatically setting package variables in job step command line

We are trying to start a server job running an SSIS package and supply some parameters to the package when we start the job using SMO.

What we have now is this:

string cmdLine = job.JobSteps[0].Command;
cmdLine += @" /SET \Package\GetGroupRatingYear_Id.Variables[User::RatingId].Value;1";
cmdLine += @" /SET \Package\GetGroupRatingYear_Id.Variables[User::GroupId].Value;1";
cmdLine += " /SET \\Package.Variables[User::period].Value;\"" + periodEndDate + "\"";
job.JobSteps[0].Command = cmdLine;
job.Start();

It appears that when the job is run, the modified command line is not used.

What is needed to supply runtime parameters to a job step when starting the job via SMO?

Thanks,

Managing a job in this way seems a bit of a pain. Why not let the package go and get the values from an external location when they are required?

One example would be to use a package configuration, perhaps using a SQL Server configuration. You could update the table values, and then just design the package to use that configuration value, assigning the values to the variables as required. Read up on package configurations if you are not familiar with them.
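If you go the SQL Server configuration route, updating the stored value before starting the job is a plain UPDATE against the configuration table. A minimal sketch, assuming the default table name ([SSIS Configurations]) and a hypothetical filter name "Calculate" chosen when the configuration was created:

```csharp
using System.Data.SqlClient;

class ConfigUpdater
{
    // Sketch only: writes a new value into an SSIS SQL Server configuration row.
    // Table name, filter, and package path are assumptions - match them to the
    // configuration you actually created in the package designer.
    public static void SetConfiguredValue(string connectionString,
                                          string filter,
                                          string packagePath,
                                          string newValue)
    {
        const string sql =
            @"UPDATE [SSIS Configurations]
                 SET ConfiguredValue = @value
               WHERE ConfigurationFilter = @filter
                 AND PackagePath = @path";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@value", newValue);
            cmd.Parameters.AddWithValue("@filter", filter);
            cmd.Parameters.AddWithValue("@path", packagePath);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

For a variable, the PackagePath column takes the form \Package.Variables[User::RatingId].Properties[Value]. The package then picks the value up through its configuration at load time, so nothing in the job definition changes.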

A variation on the theme is to do the work yourself. You could use any table, not just one in the configuration-table format. Use an Execute SQL Task to query for the values and, with the Result Set option, return them and map them to variables on the Result Set page of the task.

---

Yes, it's been a learning curve working out how to do what we're trying to do. The application is driven by a web page where the user says 'run this job, use these parameters'. However, you can't have multiple instances of one predefined job on the server, each with its own set of parameters - which needs to be possible because of the application requirements.

What we are doing now, which seems to work, is creating a new job, setting the step type to SSIS, setting the command line to specify the package and parameters, and then starting the job. The job is also set to auto-delete on success.

The other way we thought of but decided against was to have the job pick up its runtime parameters from a queue - but then we'd have to create and manage the queue.

The 'create new job' approach lets us run now or set a schedule to run later, all the instances are visible as jobs on the server (filtered by category for the UI), and they clean up after themselves if they run successfully.

NB: if anyone is curious, changing the command line of an existing job requires calling the Alter() method to persist the change back to the server; otherwise the job just runs with the original command. Like this:

string cmdLine = job.JobSteps[0].Command;
cmdLine += @" /SET \Package\GetGroupRatingYear_Id.Variables[User::RatingId].Value;1";
job.JobSteps[0].Command = cmdLine;
job.JobSteps[0].Alter();
job.Start();

However, this permanently changes the command line stored in the job, of course, and you have to deal with that.
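One way to deal with it, if you stay with the modify-in-place approach rather than creating a new job, is to save the original command and restore it once the run finishes. A rough sketch, with the caveat that polling CurrentRunStatus like this is crude and the approach still breaks down if two runs overlap:

```csharp
// Sketch: temporarily modify the step command, run the job, then restore
// the original command. Assumes a single, non-overlapping run.
JobStep step = job.JobSteps[0];
string originalCmd = step.Command;

step.Command = originalCmd +
    @" /SET \Package\GetGroupRatingYear_Id.Variables[User::RatingId].Value;1";
step.Alter();   // persist the modified command to the server
job.Start();    // the agent runs the step asynchronously

// Wait for the run to finish before putting the command back.
do
{
    System.Threading.Thread.Sleep(1000);
    job.Refresh();
} while (job.CurrentRunStatus != JobExecutionStatus.Idle);

step.Command = originalCmd;
step.Alter();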

The code that we're using to dynamically create the job and supply the parameters is pretty close to this:

string jobName = "the name to give to the new job";
string cmdLine = "the command line to run the package and set parameters";
ServerConnection svrConnection = new ServerConnection(sqlConnection);
Server svr = new Server(svrConnection);
JobServer agent = svr.JobServer;

// Drop any leftover job with the same name.
if (agent.Jobs.Contains(jobName))
{
    agent.Jobs[jobName].Drop();
}

Job job = new Job(agent, jobName);
job.DeleteLevel = CompletionAction.OnSuccess;  // auto-delete when the run succeeds
job.Category = "Calculate";                    // lets the UI filter these jobs

JobStep js = new JobStep(job, "Step 1");
js.SubSystem = AgentSubSystem.Ssis;
js.Command = cmdLine;

// Create the job, then the step, then target the local server.
job.Create();
js.Create();
job.ApplyToTargetServer("(local)");
job.Alter();
job.Start();
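For completeness, the command line handed to the SSIS subsystem uses dtexec-style arguments. A sketch of assembling one, where the package path, server name, and variable names are placeholders, and the variables are assumed to live at the package level:

```csharp
using System;
using System.Text;

class CommandLineBuilder
{
    // Builds a dtexec-style argument string for an SSIS job step.
    // Assumes the package is stored in msdb (/SQL) and each variable
    // lives directly under \Package in the User namespace.
    public static string Build(string packagePath, string server,
                               params (string name, string value)[] sets)
    {
        var sb = new StringBuilder();
        sb.Append("/SQL \"").Append(packagePath).Append('"');
        sb.Append(" /SERVER ").Append(server);
        foreach (var (name, value) in sets)
        {
            sb.Append(" /SET \\Package.Variables[User::")
              .Append(name).Append("].Value;").Append(value);
        }
        return sb.ToString();
    }
}
```

For example, Build(@"\MyFolder\MyPackage", "(local)", ("RatingId", "1")) yields /SQL "\MyFolder\MyPackage" /SERVER (local) /SET \Package.Variables[User::RatingId].Value;1. Values containing spaces need the quoted form shown in the period example earlier.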
