You can create deploy jobs from an NSH script with the help of the BLCLI; you just need to build the logic around your requirement. If you look at the DeployJob BLCLI namespace you will get a good idea of how to create one, and you can write your script based on that.
As Santosh mentioned, the BLCLI help will give you a better idea of how to build a script for your requirement. Here is a small example:
Server.txt is a file with a list of servers. This assumes you have only one package to deploy; if you have multiple packages, you can create another file of package names and loop over it in the same way.
servers=$(cat Server.txt)
pkgkey=$(blcli -v defaultProfile BlPackage getDBKeyByGroupAndName /BLCLITestPkg TestPkgFile)
groupid=$(blcli -v defaultProfile JobGroup groupNameToId /BLCLITest)
for i in $servers; do
  blcli -v defaultProfile DeployJob createDeployJob TestJob_$i $groupid $pkgkey $i true true false
done
In the above script, the values used are:
BLCLITestPkg = package group name
TestPkgFile = package name
BLCLITest = Job Group name
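For the multiple-package case mentioned above, the same calls can be nested in a second loop over a packages file. A minimal sketch follows; the file names, the group paths, and the stub blcli function (added only so the loop logic can be exercised outside an NSH session) are all assumptions to adapt to your environment:

```shell
# Stand-in for the real blcli, which is only available in an NSH session
# with a configured authentication profile; it records each call so the
# loop can be checked locally. Delete this stub in the real job script.
blcli() { echo "blcli $*" >> blcli_calls.log; echo STUB_KEY; }
: > blcli_calls.log

printf 'server1\nserver2\n' > servers.txt    # assumed input files
printf 'PkgA\nPkgB\n'       > packages.txt

groupid=$(blcli -v defaultProfile JobGroup groupNameToId /BLCLITest)
for pkg in $(cat packages.txt); do
  pkgkey=$(blcli -v defaultProfile BlPackage getDBKeyByGroupAndName /BLCLITestPkg "$pkg")
  for srv in $(cat servers.txt); do
    # one deploy job per package/server pair
    blcli -v defaultProfile DeployJob createDeployJob "TestJob_${pkg}_${srv}" "$groupid" "$pkgkey" "$srv" true true false
  done
done
```

With the stub removed and real group/package names substituted, this creates one job per package/server combination.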
Why do you have to create 300+ different deployment jobs? Can you give me an example of the type of jobs you're talking about? Is there a way you can use Compliance with remediation to handle all the deployment packages and targets for you automatically?
I need to deploy some scripts to about 170+ servers and collect back the logs that the scripts generate.
Each server has its own customised script, so by my understanding at least 170 x 2 (deploy + log collection) jobs would need to be created.
I would think collecting the logs can be handled by a single NSH script job.
Just place the log files within a folder in the staging directory, for example. Passing server properties like STAGING_DIR to both the Deploy Job and the NSH Script Job should enable you to do that.
In case there is some logic behind where the files end up, I suspect that combining it into one script job should be possible.
So you should be able to handle this with 171 scripts at most (170 custom deploy scripts plus one collection script).
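A single collection job along those lines could look like the sketch below. In a real NSH script job the source path would be "//$host$STAGING_DIR/logs" (NSH routes //host/... paths through the RSCD agent); here two hosts are simulated with local directories so the copy loop can be run anywhere, and the paths and "logs" subfolder layout are assumptions:

```shell
staging_root=./sim_staging     # simulates each target's staging dir
collect_dir=./collected_logs   # where the logs land on the collector

# Simulate two targets whose deploy jobs dropped logs in staging.
for host in server1 server2; do
  mkdir -p "$staging_root/$host/logs"
  echo "log from $host" > "$staging_root/$host/logs/run.log"
done

# The actual collection loop: one folder per source host.
for host in server1 server2; do
  src="$staging_root/$host/logs"   # NSH: src="//$host$STAGING_DIR/logs"
  mkdir -p "$collect_dir/$host"
  cp "$src"/*.log "$collect_dir/$host/"
done
```

Swapping the simulated source for the //host path and reading the host list from a file turns this into the one collection job.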
Are the collection scripts really completely different from each other? Can you maybe use Server properties in your script to determine which commands are needed, so that you can at least get some grouping done?
I would expect to have the same script for the same OS, perhaps further divided by organizational restrictions - having one script per server seems like overkill.
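The per-OS grouping could be a single script that branches on a server property passed into the job (for example ??TARGET.OS?? as $1). The property values and log paths below are assumptions, and the selection is factored into a function only so it is easy to check:

```shell
# Map an OS property value to the log file to collect; the values and
# paths are hypothetical placeholders for your own environment.
pick_log_file() {
  case "$1" in
    Linux|RedHat*)  echo /var/log/app/app.log ;;
    SunOS|Solaris*) echo /var/adm/app/app.log ;;
    Windows*)       echo "C:/app/logs/app.log" ;;
    *) return 1 ;;
  esac
}

os="${1:-Linux}"   # default only so the sketch runs standalone
log_file=$(pick_log_file "$os") || { echo "no collection rule for OS: $os" >&2; exit 1; }
echo "collecting $log_file"
```

One such script, parameterized by the OS property, would replace a whole family of near-identical per-server scripts.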
Why not use remote execution, keeping all the scripts on the app server and executing remote commands on each server?
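That approach could be sketched as below with nexec, the NSH remote-execution utility. The scripts/<host>.sh layout is an assumption, and this version only generates the command list into a file so it can be reviewed before being run inside an NSH session (where cp understands //host paths and nexec is available):

```shell
mkdir -p scripts
printf 'server1\nserver2\n' > servers.txt   # assumed server list
: > push.nsh

for host in $(cat servers.txt); do
  script="scripts/${host}.sh"
  touch "$script"                           # placeholder per-server scripts
  {
    echo "cp $script //$host/tmp/run.sh"    # NSH copy to the target
    echo "nexec $host sh /tmp/run.sh"       # execute on the target
  } >> push.nsh
done
```

Reviewing push.nsh and then executing it from NSH keeps all the scripts in one place on the app server, with no deploy jobs at all.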