Here's some more info and what I think the root cause is...
If I telnet to the UNIX host and log in with ID 'xyz', my commands work fine. If I pass the command through my BLPackage/Job, or run the command through NSH (nexec su xyz), I get the error below.
server_middle_man: /pmart/informatica811/server/bin/pmrep: error while loading shared libraries: libACE.so.5.4.7: cannot open shared object file: No such file or directory
I believe this error is related to an Informatica shared library and is the result of environment variables not being fully loaded (there are big differences in the environment between an NSH login and a telnet login).
How do I set/force the ID associated with the role I use to load all environment variables and settings (all the dot files that load during a telnet login)?
This seems especially critical to set up and verify when interacting with UNIX targets.
I was about to say check the env variables...
BladeLogic doesn't source any of the profile files when we map to a user (unless you do an nexec -l through nsh, I think), because those profiles could be interactive, which would hang the connection as that user (waiting for interactive input).
So you'll have to manually source those files in the external command in your BLPackage. And I believe you will have to do this in each external command box if you have multiple.
I've asked engineering about having a checkbox like "source user profiles" in RBAC, but that hasn't gotten into the product yet. If you can open a ticket about doing that too, squeaky wheel and all...
OK, seems we're on the same page here...
If I want that user's .profile to load, can I just reference it as the first command in the external command, and will that work?
I think you will have to manually set $HOME in the external command, or explicitly reference the path to the file (/home//.profile); $HOME isn't picked up either.
I should've clarified: I'm attempting this manually within NSH first, not from the script.
From within the user's home directory, .profile and .infa_profile appear to be the two I need to load.
I run ./.profile and ./.infa_profile (both, or either/or) and the command-line prompt updates (it appears to load/do something).
I check my ENV and nothing is updated. Did I miss something?
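One likely culprit, independent of BladeLogic: running `./.profile` executes the file in a child shell, so anything it exports dies when that child exits. You have to source it with `.` (or `source`) for the settings to land in your current shell. A minimal demo with a throwaway file (file and variable names are made up for illustration):

```shell
# Create a throwaway profile that exports one variable.
cat > /tmp/demo.profile <<'EOF'
DEMO_VAR=loaded
export DEMO_VAR
EOF
chmod +x /tmp/demo.profile

/tmp/demo.profile                        # executed: runs in a child shell
echo "after execute: ${DEMO_VAR:-unset}" # prints "after execute: unset"

. /tmp/demo.profile                      # sourced: runs in the CURRENT shell
echo "after source: $DEMO_VAR"           # prints "after source: loaded"
```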
If you're doing it with nexec, you need to do something like:
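(The example that followed wasn't captured in the thread; the sketch below is a reconstruction from the later posts. The host name, repository, and credentials are placeholders, and it needs a BladeLogic agent on the target to actually run.)

```shell
# Sketch only -- 'targethost', 'ETL_REPO', and the credentials are placeholders.
# Wrapping everything in one sh -c keeps the profile sourcing and the
# pmrep call in the same remote shell, so the variables are still set
# when pmrep runs.
nexec targethost sh -c '. $HOME/.profile; . $HOME/.infa_profile; pmrep connect -r ETL_REPO -n xyz -x secret -h localhost -o 6000'
```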
The support desk came back the below, and it works...
I had some discussion around this issue and did some testing in a test environment. The best way for the user executing your deploy job to gain the required environment variables will be to source the profile files on the target, as we were beginning to test.
In the external command we can add something similar to:
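(The exact command from the support desk wasn't captured above; the following is a sketch of the kind of thing meant, with a placeholder home directory and the pmrep path and credentials from elsewhere in this thread substituted in.)

```shell
# External command body (runs under /bin/sh on the target).
# The home directory, repository name, and credentials are placeholders.
. /export/home/xyz/.profile
. /export/home/xyz/.infa_profile
/pmart/informatica811/server/bin/pmrep connect -r ETL_REPO -n xyz -x secret -h localhost -o 6000
```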
This will source the profile file and set the required variables. This should then allow the application to execute and have the required environment variables.
Once you have an opportunity, let me know how this works in your environment and we can confirm this addresses the issue. As this is a high-priority issue, I will follow up with you again by Thursday, 06-11-2009.
And this then brings me to another question/scenario...
Can I automate the import of objects (files) into BL through an NSH command, and can I pass a parameter into that NSH command? If yes, then I believe I can automate this...
Parameter = Job Number (12345)
Files always reside at: //serverX/dirA/$parameter/$parameter.XML
1. Set the parameter (pass it to the NSH command)
2. NSH command imports the file(s) / creates objects (using the parameter in naming)
3. blcli commands create a package from the objects (using the parameter in naming)
4. blcli commands create a job from the package (using the parameter in naming)
5. blcli commands execute the job (using the parameter)
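The parameter-to-path plumbing in steps 1-2 is plain shell; here is a minimal sketch of that part. The blcli namespaces and command names vary by release (and some are unreleased), so they are only indicated in comments rather than spelled out:

```shell
# Hypothetical wrapper: $1 is the job number, e.g. 12345.
PARAM=${1:-12345}

# NSH-style source path built from the parameter
# (serverX/dirA are the placeholder names used in this thread).
SRC_XML="//serverX/dirA/$PARAM/$PARAM.XML"
SRC_TXT="//serverX/dirA/$PARAM/$PARAM.TXT"

echo "importing $SRC_XML"
# Steps 3-5 would then call blcli with $PARAM in the object/job names,
# roughly:  blcli <namespace> <command> ... "$PARAM" ...
# (exact namespaces/commands depend on your BladeLogic release; check
# the released and unreleased blcli command docs for 7.4.5).
```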
Is this crazy talk??? It looks like some of the commands to import objects may be unsupported commands, but do they actually work in 7.4.5?
Depending on the type of package you want to create - if it's CustomSoftware, then there are blcli commands for this.
If you want to make a BLPackage, you first need to create a Component Template w/ all the parts, then you should be able to use the blcli to create a BLPackage from the CT. I haven't found a way to create a BLPackage w/ the blcli w/o a CT yet.
As for the 'unreleased commands': they work for the most part, they just aren't officially supported, but we use them often to do things like this where there aren't released commands.
OK. I think I’ve greatly simplified how to accomplish this; but still have a question that I believe ends up being a permissions issue (not sure if it can be resolved as desired).
I've written an NSH script and job that execute as desired. I'm passing parameters into the script as desired (setting the source directory of files, login values, etc.). The target is a UNIX host. If I log in to ConfMgr as admin and run the job, so far no error (not all lines in the script are active yet). If I log in to Config Mgr using the desired role (which has rights to read the script and to read, modify, and execute the job), I get the error: Access Denied Server.ExecuteNSHScript on UTDDSLABWAD04.
This role is mapped to a local ID on the UNIX host. I know the role is properly set up, as it works as desired when used to run BLPackage jobs and executes correctly as the mapped user.
I recall some differences in how the mapping/permissions work with NSH.... I'm sure that's my issue here...
The users file looks correct. The exports entry is * rw.
Server.ExecuteNSHScript is actually a CM GUI permission. Make sure your role has that permission (or Server.*) on the target server, and also has it listed in your role's Authorizations.
Ya know, I checked the permissions and every variation on all objects EXCEPT the server itself. The initial settings were there, but none of the edits.
Lesson learned; I'll not forget this anytime soon.
The commands I want to run (the source commands to load the dot environment files, and the subsequent commands) all succeed from an Execute Command within a BLPackage. Unfortunately, passing the variables and other workflow is a bit cumbersome in a package, so I've resigned myself to running this fully within an NSH script.
Except, now I find the commands that work in External Commands don't have similar results from NSH.
Here's what I've got so far... (I have commented out some bits at the end as I'm troubleshooting; no error codes, comments, etc. are added, just base code.)
# Import XML Data to Informatica for DataWarehouse
# 6/12/2009 by Larry Eichenbaum for Morgan Stanley

copy_files ()
{
    cp $SOURCE_TXT //$SERVER/$TMP_DIR
    cp $SOURCE_XML //$SERVER/$TMP_DIR
    cp $SOURCE_DTD //$SERVER/$TMP_DIR
}

open_shell ()
{
    :   # body not included in the post
}

inf_connect ()
{
    /$LINUX_ID/informatica811/server/bin/pmrep connect -r $ETL_SVR -n $INF_ID -x $INF_PW -h localhost -o 6000
}

inf_import ()
{
    /$LINUX_ID/informatica811/server/bin/pmrep ObjectImport -i /tmp/stage/$CHANGE_RQST.XML -c /tmp/stage/$CHANGE_RQST.TXT
}

for SERVER in $SERVERLIST
do
    # echo $SERVER $CHANGE_RQST $ETL_SVR
    # echo $SOURCE_TXT $SOURCE_XML $SOURCE_DTD $TMP_DIR
    :   # loop body not included in the post
done
Anything run in the 'external command' will be executed by /bin/sh on the target system, in the context of the target system.
Any NSH scripts are going to execute in the context of where nsh was executed from (e.g. the appserver or your workstation); to execute anything on a target system you must use nexec.
So when you do the 'source' commands, it's sourcing those variables into the nsh shell you're operating in. To really execute the pmrep command you should nexec it, but you'd need the env variables, so you should do something like:
nexec $SERVER "source <profile>; pmrep ..."
I think, if you are going to run this through nsh.
OK... this should work. Running the below command within a session directly on the target server is successful when logged in as ID userX (I've masked some bits to protect the innocent)...
source //informatica811/server/bin/pmrep ObjectImport -i /tmp/stage/CM99999.XML -c /tmp/stage/CM99999.TXT"
However, here's what happens...
The 1st of the strung-together commands is successful... (else the subsequent ones would fail with no path to the executables).
The 2nd of the strung-together commands is successful... (the feedback in the job deploy log indicates a successful connection).
The 3rd of the strung-together commands fails. The pmrep command is found (proper path from the 1st command); however, it reports that the connection wasn't found and says to first connect to the repository using the connect command (command #2). Command 3 is then reported as having failed.
I've also tried stringing the commands together with && (no better luck).
Any other ways to ensure the commands run within the same shell, even if not directly strung together?
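One approach worth trying (a sketch, not verified against BladeLogic): instead of stringing the commands on one line, write them into a small wrapper script, push it to the target, and nexec the script once, so everything from the profile sourcing through ObjectImport runs in a single /bin/sh process. Host, user, repository, and credentials below are placeholders:

```shell
# Build the wrapper locally (all values are placeholders).
cat > /tmp/run_import.sh <<'EOF'
#!/bin/sh
. $HOME/.profile
. $HOME/.infa_profile
pmrep connect -r ETL_REPO -n userX -x secret -h localhost -o 6000
pmrep ObjectImport -i /tmp/stage/CM99999.XML -c /tmp/stage/CM99999.TXT
EOF
chmod +x /tmp/run_import.sh

# Then, from NSH (not runnable outside BladeLogic):
#   cp /tmp/run_import.sh //targethost/tmp/run_import.sh
#   nexec targethost /tmp/run_import.sh
```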