With all due respect, I don't think these reasons justify not handling this within BSA with some minor engineering enhancement(s). Here's why:
(1) There are a ton of outcomes or statuses an AO workflow can have, versus the few choices a BSA job can have.
Fair enough, but BSA understands two basic exit codes: 0 (success) and anything else (failure). It wouldn't be too difficult to enable BSA to handle a default context item with a default XML structure, such as "output", as the indicator of success, failure, and the reason for failure. Look at how CLM AO callouts work: CLM expects certain elements to exist in the returned XML structure.
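For illustration, such a standardized "output" context item might look something like this (a purely hypothetical sketch, not an actual BSA or CLM schema; all element names here are assumptions):

```xml
<!-- Hypothetical "output" context item a parent workflow could be
     required to return; element names are illustrative only. -->
<output>
  <status>FAILURE</status>                     <!-- SUCCESS or FAILURE -->
  <exit-code>1</exit-code>                     <!-- 0 = success, non-zero = failure -->
  <reason>Target server unreachable</reason>   <!-- populated on failure -->
</output>
```

BSA would only need to parse this one agreed-upon structure to map any workflow outcome onto its own success/failure model.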
(2) An AO parent process triggered by a Workflow Job can spin off child processes that run asynchronously after the parent job has completed. So what would you tell BSA about the status of the workflow?
Same approach as #1: standardize and require a single context item to be returned by the parent workflow. The burden falls on the WF design to do this elegantly.
(3) WFJ is an integration job for external systems, including build systems, so we wanted a job dedicated to external-system integration that performs certain related tasks. A classic example: run a WFJ that triggers a workflow which monitors the build system. Once a build is available, the workflow moves the build to a particular location on the server. WFJ can then call Deploy on an on-demand basis, supplying dynamic input such as the target server.
To me, an experienced user of BSA and BAO, it seems a missed opportunity not to leverage WFJ as a way to expand the breadth of what BSA can do. Only a small set of use cases works when the context-item input to the called WF can be hard-coded (your example: a WFJ dedicated to each external-system integration). Most use cases want to take some greater context into consideration (such as the plethora of BSA properties, custom or default, in the Property Dictionary, or simply ??TARGET.NAME??) and to do more fluid, complex system integrations within BAO instead of BSA.
My $0.02!
I don't know if you have already found a solution to this problem, but I am implementing something similar to what you require by using the blcli WorkflowJob assignInputParameters command that Scott Dunbar mentioned.
As you say, when creating a WorkflowJob you have to provide a FIXED value for the parameters. The solution I found was to dynamically create BSA WorkflowJobs (each of them internally having FIXED values for the parameters).
Example: a BAO WF (BAO-BSA-get_properties_to_file) receives a server name (serverName) as input. In BSA, to create a job that invokes this workflow, I would have to assign a fixed value to the parameter serverName.
What I do is create a BSA NSH Script Job that accepts as a parameter the hostname I want to execute the BAO WF against (NSH jobs can take as many parameters as you like). My NSH script creates a "BAO WF execute job" in BSA and assigns the input parameter to the invoked workflow (it is easier to show than to explain).
I create the NSH script like:
# Name of the dynamic Workflow Job to be created (example value; $1 is the hostname passed to the script)
WORKFLOWJOB_NAME="BAO-WF-$1"
# Job group in which the Workflow Job is stored (example value)
WORKFLOWJOB_GROUP="/Workspace/WorkflowJobs"
# AO process name (example value)
PROCESS_NAME=":MyModule:BAO-BSA-get_properties_to_file"
# Parameter values for the workflow (assumed format; here the single
# input "serverName" gets the hostname passed to the script)
PARAMETERS_LIST="serverName=$1"
#(1) Create the Workflow Job
WORKFLOWJOB_DBKEY=`blcli_execute WorkflowJob createWorkflowJob $WORKFLOWJOB_NAME $WORKFLOWJOB_GROUP $PROCESS_NAME`
#(2) Assign the Workflow Job parameter values. Note that parameter3 is a multi-value parameter.
PARAMETERS_LIST=`blcli_execute WorkflowJob assignInputParameters $WORKFLOWJOB_DBKEY $PARAMETERS_LIST`
#(3) Execute the job
WORKFLOWJOB_RUN_KEY=`blcli_execute WorkflowJob executeJobAndWait $WORKFLOWJOB_DBKEY`
This script takes just the hostname as its input parameter.
1. Creates a new AO Workflow Job (one is created every time you call the NSH job, so every time you execute it against a different host)
2. Assigns the parameters to the WorkflowJob (the server name, as we want)
3. Executes it
So you create the NSH script and store it in the Depot, assigning it an input parameter "%h" (the host it is executed against), and create a new NSH Script Job that executes that script.
Then you can execute that NSH job against the desired targets, which will launch the desired workflow with the different server names.
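To tie the steps together, here is a sketch of the wrapper as a shell function (all names are hypothetical example values; the blcli invocations are only echoed so the sketch runs anywhere — in the real NSH script you would run them via blcli_execute as shown above):

```shell
#!/bin/sh
# Hypothetical sketch: assemble the blcli commands the wrapper would run
# for one target host. Job name, group, and process path are example values.
create_and_run_wfjob() {
    HOST="$1"
    WORKFLOWJOB_NAME="BAO-BSA-get_properties_to_file-${HOST}"   # one job per host
    WORKFLOWJOB_GROUP="/Workspace/WorkflowJobs"                 # assumed job group
    PROCESS_NAME=":MyModule:BAO-BSA-get_properties_to_file"     # assumed AO process path
    # Echoed instead of executed; a live run would capture the job DBKey
    # from createWorkflowJob and pass it to the next two commands.
    echo "WorkflowJob createWorkflowJob $WORKFLOWJOB_NAME $WORKFLOWJOB_GROUP $PROCESS_NAME"
    echo "WorkflowJob assignInputParameters <jobDBKey> serverName=${HOST}"
    echo "WorkflowJob executeJobAndWait <jobDBKey>"
}

# When run as an NSH Script Job, "%h" supplies the target host:
create_and_run_wfjob web01
```

Each target host thus gets its own WorkflowJob with the serverName parameter baked in, which is exactly what the FIXED-parameter limitation requires.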
[The original answer was re-edited after I realized it was submitted with too messy wording...]
Thanks for sharing, Javier Prieto Sabugo! It's basically the same recipe you would need to call an NSH Script Job with specific parameter values from BAO: you'd have to create an ExecutionTask or a copy of an NSH Script Job to set the parameters you need, then execute it. It's the same logic, just in reverse (BSA to BAO), in a sense... Good to know!