
-by Joe Goldberg, Solutions Marketing Manager, Control-M Solutions Marketing, BMC Software Inc.


You’re implementing a new application and you have either downloaded a new Application Integrator job type from the Application Hub or built a job type for it. How is your life better than if you had not done that?



You can build a Connection Profile which contains all the general information about your application such as which server it’s on, which ports or libraries it uses and any credentials that may be required to run its jobs.


This means none of your jobs have to specify this detailed information and if it ever changes, you just update the connection profile instead of tons of jobs or scripts.


Your auditors and ultimately your management love you because you don’t fail audits and you don’t expose potentially sensitive and thus dangerous information.



When you build jobs, it’s IDENTICAL to the way you build jobs today. Grab the job type from the job palette and drop it into the flow wherever you need it. The forms ensure you specify correct application characteristics because you can select from a list that is retrieved from the application in real time. And if you ever need help with a particular application, the contact information for the person or group that supports it is part of the job type. If you use Control-M Workload Change Manager, you can even specify site standards so that these application jobs are built correctly and in adherence to your operational requirements.



When you are running these jobs and someone in the business tells you to hold all jobs for “application x”? NO SWEAT! Find, filter or create a viewpoint (your choice) that shows you all the jobs in the active environment for that application and hold them or delete them or whatever you need to do. Because Application Integrator created a new “job type” you can search for it easily. And if that same person in the business told you to change something in a bunch of those jobs? You can use Find and Update to do that quickly.



What about looking at output when something fails? Well, if your developers wrote their own scripts, they probably put the output somewhere where they can get it. The problem is the person analyzing the current problem may not know where that is or worse not have access to it. And each script could be different and your developers spent a lot of time writing that code. Tell them they don’t have to bother because Control-M will take care of it for them when flows are built using Control-M job types instead of scripts that run as “black boxes”.


Finally, considering you are probably seeing more and more new applications and they’re coming at a faster and faster rate, all the above benefits may sound better and better.


So why not visit the Application Hub today and grab a new job type for a new application you are implementing? If there isn’t one there, perhaps you can build it and help the next person who may need that very same job type. Remember, you have the technology!


The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC

-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.


MapReduce, Pig, Hive and Sqoop are already “legacy” Big Data applications that require workflow management, with Spark, Flume, Kafka and dozens of others arriving on a regular basis. And to make matters more interesting, embedded workload solutions like Oozie and Sparrow don’t know how to handle anything outside their own world. So how do you cope with complex workflows that include many of these technologies, especially since new ones keep arriving on the scene?


Control-M with Application Integrator is the BMC answer. Control-M provides core application integration out of the box for MapReduce, Pig, Hive, Sqoop and HDFS operations together with every major commercial platform, technology and application from the traditional enterprise computing world. And using Control-M Application Integrator, all other applications not included in the previous lists can be easily supported by building a job type with a simple web-based designer tool.  Or even better, operations and developers can look for job types that may have already been built by other users and shared via the unique Control-M Application Hub. This approach enables Control-M to deliver a complete workflow management platform that meets all your current and future requirements.


BMC Control-M has strongly committed to the Big Data Market and is continuing its investment by:

  1. Platinum sponsorship at Hadoop Summit San Jose 2015
  2. Releasing Application Integrator
  3. Joining the Open Data Platform consortium


Stop by booth P9 at Hadoop Summit and learn more about the most comprehensive workflow solution in the Big Data and Enterprise market - BMC Control-M.



The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software


-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.


Whether you are already doing Hadoop or just trying to learn, the #HadoopSummit in Brussels April 15-16 is for you. This is the premier community event in Europe with over 1,000 attendees expected. There are 6 tracks with dozens of sessions covering everything you should be thinking about as you try to determine if Hadoop is right for your organization or how best to implement it if you are already committed to the technology.


This event is also a great opportunity to learn how you can build Big Data Insights faster, as much as 30% faster. And once you get them running, make sure they operate more reliably so developers can sleep at night and be productive in the morning instead of being bleary-eyed from fixing problems all night.


We are joining our partner @Hortonworks, who is hosting this event. Visit @BMCControlM at Booth 12 and come listen to my session, “Oozie or Easy: Managing Hadoop Workflows the EASY Way”.


The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software


Interest in Big Data has gone global with organizations around the world aggressively jumping onto the Hadoop platform. The leader in open source Hadoop is Hortonworks and BMC is proud to be their partner. We have just completed joint promotion of Hadoop workload management with BMC Control-M at #Strataconf in Santa Clara and will continue to spread the word of Control-M workflow management for Hadoop through our participation in the Hortonworks Modern Data Architecture RoadShow. In the last several months this program generated great interest with sold-out attendance in Atlanta, Dallas, New York, Boston, Washington DC, Chicago, Seattle, San Francisco and London.


The global tour continues with events scheduled for:

  • Paris – March 3
  • Munich – March 5
  • Tokyo – March 10
  • Los Angeles – March 24
  • Houston – March 26


Each event is a full day of Hadoop information for both business and technical audiences, focusing on how organizations can unlock the potential of Hadoop, including case studies and a customer speaker.


The cost of attendance is either free or a nominal $99. This makes the event very accessible so the demand will be high. Be sure to register using this link as soon as you can.


Come and join us to learn how Control-M can help your organization harness the power of Hadoop. Accelerate deployment of applications, run them with the highest level of service quality and gain the agility to quickly get from Big Data to Big Answers to power your business.


The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software

-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.


Anybody familiar with Kerberos? How about Hive Server 2 or the Streaming API or Tajo? Great if you are, but it’s OK if you're not. Ask 100 IT folks and most likely the majority won’t know either.


So why is it a big deal that BMC Control-M has just announced support for these technologies especially if relatively few are using them today? Thought you’d never ask!


Most would agree that Hadoop and Big Data are some of the hottest terms in the IT world today. And even if many organizations still are just kicking the tires or scratching their heads about how exactly to operationalize this technology, many are already deploying it and getting massive value from their investments. And the big news is that by 2020, this will be a $50 Billion market which means almost everybody will be using this stuff.


Now for the sixty-four-thousand-dollar question. When you get around to evaluating a new technology, do you want to bet on a vendor that is a laggard and just recently joined the party, or on the world’s best solution that’s been supporting that technology for years with deep experience and expertise?


Ok, yes that was rhetorical.


If and more likely when Hadoop is on your plate, you should know that Control-M for Hadoop was originally released in June 2013.  And since then Control-M for Hadoop has seen significant global adoption across multiple companies and industries driven by continuous delivery of valuable enhancements. As of this latest update, support is provided for:


And yes, the Big Data world is increasingly populated with new “animals” and projects on an almost-daily basis. Rest assured that the BMC Control-M team is fully committed to supporting this exciting and expanding ecosystem.  If you want to extract every benefit for enterprise batch processing with Hadoop today, and be assured of having the best of all possible worlds as you expand and manage this composite business workload into the future, call on Control-M.


Read, watch and learn more by visiting here.

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.


I hate to wait as much as the next guy. But some of the claims we are seeing these days in the Big Data market are really just a bit much. Do you really believe you will be processing or “visualizing” petabytes of data instantaneously? If you do, I’ve got some property in Florida I’d like to tell you about.


Now there are all kinds of tried and true approaches for “pre-processing”, caching, cleansing, filtering and other similar techniques which result in subsequent queries running really, really fast. But if you think you will be able to ingest petabytes of raw, unknown data and “instantaneously” derive actionable, insightful data, well you are just setting yourself up for disappointment.


The worst part is that this pattern of unattainable expectations leading to expensive and bitter disappointment is repeated with just about every new technology that comes along. Gartner Inc. calls it the Hype Cycle and it contains the Peak of Inflated Expectations followed by the Trough of Disillusionment.


Specifically, I am referring to managing batch in Hadoop. This is a fundamental management discipline that you have to get right to be successful. Assuming you won’t need it and therefore ignoring it is a recipe for something considerably less than success.



Hadoop 1 was predominantly a batch MapReduce environment. With Hadoop 2, we are seeing the unrealistic embracing of real-time and instant to the exclusion of batch as if these two computing models are mutually exclusive! Think about the whirlwind adoption of ERPs, arguably a computing revolution equal to this one caused by Big Data and still ongoing. Everything was going to be transaction-oriented and happen in real time. The reality is that ERPs are HUGE generators of batch workload.



And it seems obvious to me that no matter how powerful and sophisticated computing gets, “batch” or non-interactive processing will continue to hold a position of prominence. After all, one of the reasons we have computers is to free humans from repetition and drudgery. The smarter computers get, the more they can do on their own (which is actually a pretty good definition of “batch”).



So, as computers and computing get smarter and smarter, the way those computers do “batch” should be evolving too. That’s the idea of BMC Control-M: “smart batch”. Smart enough to manage Hadoop and non-Hadoop as a single, cohesive business service. Smart enough to manage service levels no matter how complex the collection of tasks. Smart enough to help you build and deliver your applications in the shortest possible time and smart enough to be used by even novice users from traditional desktops or mobile devices.



We showed smart batch for the 21st century with BMC Control-M at Hadoop Summit in San Jose June 3-5. We were thrilled by the enthusiastic response. It seems there still ARE a lot of folks who DO agree that batch is today and will continue to be a critical part of their enterprise and Big Data computing after all.



The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software


-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.

In only a few short years since Hadoop and Big Data burst onto the enterprise IT landscape, those technologies have gone from a novelty with promise to strategic components of a modern data architecture. In fact, according to Gartner Inc., almost 70% of all organizations have either already invested or are planning to invest in Big Data.


It is because Hadoop is at the heart of this accelerated evolution and corporate embrace of Big Data that we are excited to announce our partnership with Open Source Hadoop leader Hortonworks. As enterprises seek to unlock the value of Big Data, they discover that the Hadoop ecosystem must coexist and interoperate with their traditional technologies and applications. To be successful, they must be able to manage Hadoop with the same rigor and to the same levels of service as their current offerings. BMC Software is a recognized leader in enterprise IT Management and with our new solutions for Hadoop we are helping customers realize the value of their Big Data initiatives.


Control-M for Hadoop simplifies and automates Hadoop and non-Hadoop batch services. By eliminating the need for scripting, applications can be delivered faster. Quality of service is increased with robust operational facilities such as notification, incident management and auditing. Rapid business change is enabled by providing business architects with comprehensive application support for traditional tools like Informatica, IBM Datastage, IBM Cognos, file transfer, ERP and relational databases as well as for popular Hadoop projects like MapReduce, Pig, Hive and Sqoop.


Atrium Discovery and Dependency Mapping helps organizations restore service faster by replacing dependence on tribal knowledge with reliable configuration and relationship data, minimizes change risk by empowering your Change Advisory Board with trusted dependency data to evaluate change impact, reduces software audit prep time by making it easy to prove inventory accuracy to vendors (reducing the risk of non-compliance penalties), and prevents outages when moving data center assets for consolidation, cloud and virtualization projects.


Additional solutions from BMC, including Bladelogic Server Configuration, Release Lifecycle Management and Cloud Lifecycle Management, enable companies to add Hadoop into their enterprise application fabric and manage it using the same mature and robust tools they already own and are familiar with.


Hortonworks is a leader in the Hadoop market, supplying the industry’s only 100-percent open source enterprise Hadoop distribution. The combined software offering from BMC and Hortonworks empowers customers with big data management products and services to help drive their businesses forward.


Listen here to a joint webinar discussing this partnership and the value it brings to customers. Use these links to learn more about BMC and Hortonworks.

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software


-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.


Imagine you’re an IT Operations dude or dudette. You’ve been working twelve hour shifts for three nights straight and tomorrow is Friday. You have a great long weekend planned.


Suddenly, you get a notification that job PYPWCR12 aborted. You snap into action and call the duty programmer, anxiously waiting until the problem is fixed and you can breathe a deep sigh of relief.


You are a dedicated employee and your company’s production is important to you but you also happen to know that PYPWCR12 is the job that makes the bank deposit that will ensure you can cash your payroll check and truly enjoy that weekend you have planned. Not only do YOU know it but that duty programmer also knew it and probably responded just a little faster than normal; perhaps for similar reasons to yours.


How did you know? Well, because you have standards in your installation. The first two characters of a job name must be a valid application code (PY is payroll), the third character “P” means it’s a production job (this is for real and not just some programmer running tests), and “WCR” tells you this is the Weekly job that does Check Reconciliation.
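Decoding such a name is mechanical, which is exactly why it works under pressure. Here is a minimal shell sketch; the codes are from the hypothetical standard in this story, not a real site standard:

```shell
# Decode a job name under the hypothetical standard above:
# chars 1-2 = application, char 3 = environment, chars 4-6 = function.
job="PYPWCR12"

app=$(printf '%s' "$job" | cut -c1-2)    # PY  -> Payroll
env=$(printf '%s' "$job" | cut -c3)      # P   -> Production
func=$(printf '%s' "$job" | cut -c4-6)   # WCR -> Weekly Check Reconciliation

echo "$app $env $func"
```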


Of course you know standards are important for more than just letting IT Ops folks plan for a great weekend.


Standards are critical for all kinds of reasons such as ensuring consistency, simplifying administration and security, and making important information known to all interested parties. But how exactly are standards enforced? For batch job and workload definitions, the answer is usually “manually”, and that is not a great answer. Let’s look at a simple example from the scenario above.


What if your organization enforces security for job access? That on-call programmer can only look at jobs that are part of the production payroll application. Recently, a change was made to one of the jobs and instead of starting the job name with PYP…, a typographical error was made and the job name was PYBWCR12. When that job failed, perhaps it was not identified correctly as being part of production payroll? If it was properly identified, perhaps the on-call programmer could not access the output. Perhaps the failure itself was due to some check for the job name (very common in z/OS environments).


I mention z/OS because job name validation in this platform is a common practice that has evolved over decades but is almost completely absent in other environments. Standards enforcement is largely a manual process subject to all the challenges that manual processes present.
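The check itself is trivial to script; what is usually missing is a place to run it automatically. Here is a hedged sketch, where the pattern encodes the hypothetical standard from the story above rather than any actual site standard:

```shell
# Validate a production payroll job name: "PY", then "P",
# then three letters and two digits (hypothetical standard).
is_valid_payroll_job() {
  printf '%s' "$1" | grep -Eq '^PYP[A-Z]{3}[0-9]{2}$'
}

is_valid_payroll_job "PYPWCR12" && echo "PYPWCR12: ok"
is_valid_payroll_job "PYBWCR12" || echo "PYBWCR12: violates standard"
```

PYBWCR12, the typo from the story, fails because its third character is not “P”.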


I’ve focused on job names but of course standards are critical for almost every character or attribute of a job.

  • If the “Run As” user is invalid, the job will likely fail security validation
  • If the filepath is invalid, the required script will not be found or even worse, the wrong script may be executed
  • If the wrong email address is used for failure notification, no one will respond to correct a critical problem
  • If the application or sub-application name is incorrect, the job may not be seen by its owners or the folks responsible for fixing problems
  • If documentation is omitted, Operations may not have proper instructions for handling the job


The list really does go on and on.


What makes this situation even more challenging is that many organizations have multiple standards. z/OS job names are eight characters and may be very similar to PYPWCR12, but in the very same flow there may be Unix jobs with names like “Payroll_Check_Reconciliation_Bank_Deposit”. It’s common to have different standards for Windows versus Unix/Linux versus ERP versus File Transfers and so on. This makes it exceedingly difficult for application developers who submit requests, or for centralized operations teams who build or modify job definitions, to remember all of this and get it right.


A great way to deal with this problem is to make the workload automation solution smart enough to know your standards and to enforce them when job requests are submitted and when jobs are modified or built. That’s the idea behind the Site Standards facility of BMC Control-M Workload Change Manager. You can define any number of Site Standards you require. You can allow users to select standards or you can set a specific standard for a folder. If request submitters, usually application developers or business analysts, are very familiar with the site standards, you can insist the requests comply with the standards, or, if your users are not that knowledgeable, you can accept requests with errors.


Here’s the thing: standards are also enforced for schedulers working with the Workload Automation client, so validation must be performed and passed before changes are committed. No more guessing, hoping and relying on human eyeballs alone to stand between job definition changes and job failure.


If you have a solution that you have implemented or if you are suffering with a process that is less than perfect, tell us all about it and share with the community.



The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software


-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.

Google calls online decision-making for product purchases the Zero Moment of Truth, or ZMOT. It is the trend we see all around us that has completely changed the way we purchase goods and services both as consumers and as enterprises. BMC Control-M Workload Automation is bringing this approach to enterprise workload automation. We began with a test drive for Control-M Self Service and are now extending it to our newest component BMC Control-M Workload Change Manager.


Every organization that manages workload automation (basically every company that manages an IT environment) deals with a constant challenge when it comes to implementing changes to batch services. Usually, Application Development and IT Operations must collaborate and these groups do not speak the same language. When the time comes to make such changes, AppDev submits a request to IT using some negotiated process that almost never has any connection to the workload automation tools used by the organization. Instead, Excel spreadsheets, Word documents, email and other such methods are used to submit information. Finally, job definitions are built by schedulers. The process is completely manual. The requestors usually don’t know all the information that is required so mistakes are common. Because the jobs are hand built, yet another opportunity for errors is introduced. And if any required information is omitted, the request is rejected and the process repeats this entire cycle, frequently many times.


Meanwhile, the AppDev requestors are frustrated. The IT schedulers receiving erroneous or incomplete requests are frustrated. Operations, which has to fix failed jobs resulting from manual entry errors, is frustrated. And worst of all, the business is not getting its new applications and digital services, so the CxOs and the shareholders are frustrated.


ENOUGH, you say. YES, I have that problem and I’m sick and tired of it. Can you show me a better way?


Well, I thought you’d never ask! BMC DOES have a better way and we want you to see it for yourself! Take a few minutes and take the Control-M Workload Change Manager Test Drive (“faire un essai” or “Probefahrt”; yes, it is available in French and German too). I guarantee that within ten minutes you will be creating job flows even if you have never used Control-M before. And, after you have had that experience, ask all your AppDev folks who submit those requests today to also take our Test Drive. I know they will thank you and your business will thank you. And you can claim some street cred with your procurement folks for exercising your electronic consumer rights and bringing ZMOT to your workload automation acquisition process.


The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software


-by Joe Goldberg, Solutions Marketing Manager, Control-M Solutions Marketing, BMC Software Inc.


This blog is a wrap-up of all the questions asked during Q&A in the March 18, 2014 webinar titled “Self Service Workflows at the Speed of Your Applications”. You can view a recording of this webinar here.


Q:     Can you pull from the active schedule like a workspace?

It is assumed this question relates to whether you can access existing folders (tables in pre-V8 lingo) from the Workload Change Manager web application. If that assumption is correct, the answer is yes. Standard EM authorizations apply to determine what users can see and access. Once logged in, the "Open Existing job flows" option on the Home Page displays a selection dialog of all the folders to which the user has access.


Q:     Is there an additional cost for it?

BMC Control-M Workload Change Manager is an add-on component. I strongly recommend you discuss the topic of cost with your BMC Account Manager. If you do not know who that is, please drop me a note and I will make sure the appropriate person contacts you.


Q:     Does the creation of the job request include a place for information such as what to do when the job fails, rerun instructions, etc.?

Generally in Control-M, actions to be performed when a job fails can be provided in a variety of ways, including text for human consumption in either the description or documentation fields, as well as automated actions in the "Actions" section of the job definition. With Workload Change Manager, the availability of any or all of these Control-M functions can be controlled via site customizations. Fields can be hidden or shown and can be forced to be required (even if not required by Control-M in general). Additionally, Workload Change Manager provides the Notes function, which is intended as a dialog between the requestor and the scheduler. If the requestor is not familiar with the Control-M parameters, for example how to code a DO Action to kill or re-run a job, or you do not wish users to code such parameters because they may not have sufficient information to do so properly, Notes can be used to informally describe the requirements and the scheduler can then implement the required functionality.


Q:     Would scripts that are in the job definition be able to be changed with the change manager

If jobs contain embedded scripts and your site customization has been configured to show the embedded script, then requestors will be able to modify or create embedded scripts for jobs contained in the requests they submit.


View a recording of this webinar here.


-by Joe Goldberg, Lead Solutions Marketing Manager, BMC Software Inc.

Occasionally you may have to do some scripting to achieve some specific automation or integration goal. In a previous post, I mentioned the Control-M/Enterprise Manager API, which is the best choice when programming in Java and similar languages. In this post, I discuss a command line utility, called “cli”, which provides many of the same capabilities and is the best choice for shell scripts, batch files and other script-like environments like event management and other data center automation solutions.

There are a variety of other utilities many users are familiar with such as ctmorder and ctmcreate. However, these are platform specific and must be executed on either a specific agent or Control-M Server. If you desire or need solutions that are platform neutral, “cli” is the better choice.

Here are the functions you can perform with this utility:


  • Upload or download folders
  • Order or force folders
  • Order or force jobs
  • Force jobs in a folder
  • Upload or download calendars
  • Delete job processing definitions from folders



cli [{(-U emUser -P emPass) | -pf passwordFile}] -h hostName [-t timeout] [-DDMM] [-BY_FORCE] <cmd> [ACTION_REASON <reason for taking an audit action> [ACTION_NOTE <descriptive reason for audit action>]]...



em cli [{(-U <emUser> -P <emPass>) | -pf <passwordFile>}] -h <hostName> [-t <timeout>] [-DDMM] [-BY_FORCE] <cmd> [ACTION_REASON <reason for taking an audit action> [ACTION_NOTE <descriptive reason for audit action>]]...


Valid values for <cmd>:



This utility is installed with the Control-M client.
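For example, forcing all the jobs in a folder under the current order date might look like the following sketch. The server, data center and folder names are placeholders, and the command string is only echoed here rather than executed:

```shell
# Placeholder values -- substitute your own GUI server,
# Control-M Server (data center) and folder names.
EMHOST="emserver01"
DC="CTMSRV"
FOLDER="PayrollDaily"

# -TABLE_FORCE orders every job in the folder under the current
# order date (ODAT); -pf names a password file so credentials do
# not appear on the command line.
CMD="cli -pf /path/to/passwordfile -h $EMHOST -TABLE_FORCE $DC $FOLDER ODAT"
echo "$CMD"
```

In a real script you would run the command and check its exit status, as the Perl sample below does with $?.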

Sample Perl Code

Here is a sample section of Perl code using the cli utility. It is from a script that defines users in the Enterprise Manager and then sets up user-specific demo jobs. This code was used to provide self-service user registration for a Control-M Self Service Test Drive demo environment. A template set of jobs, as an XML document, is customized for the new user being defined. That XML is imported into the EM database using the emdef utility, uploaded using the cli utility (no longer necessary if using automatic synchronization in Control-M version 8 and above) and then immediately ordered into the active environment. In this way, if a brand new user is defined and then logs in, there are already sample jobs for that user to experiment on and learn how to use the newly deployed environment. You can get the full Perl script here.

@EMDEFResp = `emdef deftable -u $emuser -p $empass -s $server -src_file $XMLoutput`;
$EMDEFResultCode = $?;
print "The result for DefTable is: $EMDEFResultCode\n";

if ($EMDEFResultCode > 0) {
  # The import failed; print each response line for diagnosis
  for ($idx = 0; $idx < ($#EMDEFResp + 1); $idx += 1) {
    print "Response".$idx.": ".$EMDEFResp[$idx]."\n";
  }
}
else {
  # Import succeeded; upload the folder so the definitions take effect
  @CLIResp = `cli -u $emuser -p $empass -H $server -TABLE_UPLOAD $DC $Company`;
  $CLIResultCode = $?;
  print "The result for Table Upload is: $CLIResultCode\n";

  if ($CLIResultCode > 0) {
    for ($idx = 0; $idx < ($#CLIResp + 1); $idx += 1) {
      print "Response".$idx.": ".$CLIResp[$idx]."\n";
    }
  }
  else {
    # Upload succeeded; order the folder into the active environment
    @FORCEResp = `cli -u $emuser -p $empass -H $server -TABLE_FORCE $DC $Company ODAT`;
    $FORCEResultCode = $?;
    print "The result for Table Force is: $FORCEResultCode\n";
  }
}


The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software

Perl script to create a BMC Control-M Enterprise Manager user


-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.


You’ve installed BMC Control-M Self Service and run a pilot. All the users who saw it absolutely LOVE it! And now you are planning your production rollout. Or, you’ve been running it for a while and a bunch of your business systems are being re-branded, so you want to change the service names or some other attributes. You could do these things manually, but perhaps you have a whole bunch and each one is a bit different, so there’s no pattern that would apply (if there were, you might be able to use Service Rules and make your job even easier). Or, you are making relatively few simple changes but you need to push those updates together with other changes via your change management process, and manual updates don’t fit that methodology too well.


For all those situations, you can define or modify Service Definitions using XML documents and the emdef utility.


If you’re not familiar with the XML syntax and your DTD reading skills are a bit dusty, a quick and dirty way to learn is to build a service definition manually via the interactive wizard available from the Workload Automation client (“Service Definitions” in the Tools domain) and then export it. With that approach in mind, I mention export first.


Export Syntax

Execute the following command from a Command Prompt or terminal window (or read the doc):


emdef exportdefservice

The response gives you usage information as follows:


Description: exportdefservice


The exportdefservice utility exports services from the CONTROL-M/EM database to a text file, according to the criteria supplied in an XML file.



emdef exportdefservice [-USERNAME <user> [-PASSWORD <password>] | -PASSWORD_FILE <password file>] -HOST <GUI Server Name> -ARG_FILE <args file name> -OUT_FILE <file name>




emdef exportdefservice [-u <user> [-p <password>] | -pf <password file>] -s <GUI Server Name> -arg <args file name> -out <file name>

Switches can be specified in any order.




/? Displays utility's description and available options.

/a Accept all. Directs the utility to automatically reset the Author parameter to the current CONTROL-M/EM user when these two values do not match.


If not specified, the utility skips every job definition whose Author does not match the currently logged in user.

So, to export service definitions, here is a possible command line:

emdef exportdefservice -u <em username> -p <password> -s hou-ctmwsvr-01 -arg <argfile> -out serv.out

Here is a sample “arg file”. This XML document tells the utility which services you want to export. In the example below, all services are exported (SERVICE NAME="*").


Arg File


<?xml version="1.0" encoding="UTF-8"?>
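To make that concrete, here is a sketch of a filter that matches every service. The TERM and PARAM element names below are assumptions modeled on the filter syntax the other emdef utilities use, so confirm them against the exportdefservice DTD in the utilities documentation before relying on them:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical filter: export every service (SERVICE NAME = "*").
     Element and attribute names are assumptions; check them against
     your version's exportdefservice DTD. -->
<TERMS>
  <TERM>
    <PARAM NAME="SERVICE NAME" OP="LIKE" VALUE="*"/>
  </TERM>
</TERMS>
```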












Out File

The output is also an XML file. In my case, a portion of it looks like this:

<?xml version='1.0' encoding='ISO-8859-1' ?>

  <SERVICE CREATED_BY="JoeG" CREATE_TIME="20110706181040UTC" INSTANTIATION_TYPE="Filter" LAST_UPDATE_TIME="20110706181501UTC" NAME="BAO Jobs" ORDERABLE="0">

    <TERM>

Once you have either built or modified an xml document, you can then import service definitions with the same utility. This time, the syntax is similar to:


Import Syntax

emdef defservice [-USERNAME <user> [-PASSWORD <password>] | -PASSWORD_FILE <password file>] -HOST <GUI Server Name> -SRC_FILE <XML file name> [/a]


– or –

emdef defservice [-u <user> [-p <password>] | -pf <password file>] -s <GUI Server Name> -src <XML file name> [/a]

Audit Annotation:

        [-action_reason <reason for taking an audit action>] [-action_note <descriptive reason for audit action>]
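Putting the pieces together, here is a minimal sketch of the export–edit–import round trip for the re-branding scenario from the opening. The service name, file names, and connection details are all hypothetical, and the emdef calls themselves are shown as comments because they need a live Enterprise Manager:

```shell
# Hypothetical round trip: export, re-brand a service name, import.
# Against a live EM you would first export (connection details are examples):
#   emdef exportdefservice -u emuser -p empass -s hou-ctmwsvr-01 \
#       -arg arg.xml -out serv.out

# Stand-in for the exported file, so the edit step can be shown end to end:
cat > serv.out <<'EOF'
<?xml version='1.0' encoding='ISO-8859-1' ?>
<SERVICE NAME="Payroll Legacy" ORDERABLE="0">
</SERVICE>
EOF

# Apply the re-branding to the XML instead of clicking through the wizard
sed 's/NAME="Payroll Legacy"/NAME="Payroll HR Cloud"/' serv.out > serv.new

grep -o 'NAME="[^"]*"' serv.new   # prints: NAME="Payroll HR Cloud"

# Then push the change back through your change process:
#   emdef defservice -u emuser -p empass -s hou-ctmwsvr-01 -src serv.new
```

Because the edit is a plain text transformation, it can ride along with other artifacts in your change management tooling instead of being a manual GUI step.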


If you have any questions about the usage of this utility or how to administer Service Definitions, drop me a note or post a comment.


In the next volume of this blog, I'll discuss some other nifty utilities that don't get as much attention or use as they may warrant. I hope you drop by for a view.



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.

BMC Control-M is a very rich and comprehensive solution for enterprise workload automation. I have been privileged to work with this solution and watch it evolve for over two decades. During that time, I’ve encountered lots of interesting technical challenges and have either learned of, or been able to participate in finding, interesting solutions. I’ve decided to start this blog series to describe some of the goodies I’ve gathered and hopefully encourage some of you to share yours. If you have a challenge of your own, please start up a discussion and let’s put the collective power of this community to work on solving it.


For my first topic, I’d like to discuss the Control-M Workload Automation API. I decided to start with this topic because I have recently fielded some questions from very experienced users who clearly were not familiar with the existence of this capability.


There is a Java API that is a standard part of Control-M/Enterprise Manager. That very same API is also exposed in the Business Process Integration Suite (BPI) via Messaging and Web Services. The BPI component is included in the current pricing models but there may still be some organizations that have not licensed this component. Assuming BPI is available, choose the technology you are most comfortable or familiar with.


It seems most folks today are pretty comfortable with Web Services, and there are some great tools that let you very easily prototype Web Services to learn the request structure and examine the responses. One such tool is SoapUI, which is free and really simple to use. You can easily experiment without investing any programming time and then launch into coding once you know exactly what you need to do.


So what can you do with this API? There are a number of requests related to manipulating job definitions and folders. However, I believe the most likely functions you will use are operational actions normally performed interactively in the Monitoring Domain of the Workload Automation client that can be accomplished programmatically with this API. You can find all the information you will need in the Control-M Workload Automation 8.0.00 API Guide. The publication name differed slightly in previous versions (Control-M/Enterprise Manager API Developer Guide).


  • Order: Inserts a job or folder into the Active Environment, either without considering scheduling criteria (force) or subject to scheduling criteria (order)
  • Job creation: Creates a job, SMART Folder, or sub-folder in the Active Jobs database
  • Add condition: Adds conditions
  • Delete condition: Deletes conditions
  • Job actions in Active Jobs database: Performs actions on jobs in the active environment:
    • Hold
    • Free
    • Confirm
    • Kill
    • Set to OK
  • Job tracking: Polls Control-M/Enterprise Manager for job completion
  • Retrieve jobs: Retrieves information about jobs in the active environment
  • Change alert status: Changes the status of an alert
  • Retrieve BIM Services list: Gets a list of currently active services


The following are a few use cases that I have implemented where I found this API indispensable:

Web Store: A typical online store allows consumers to purchase goods and services. At some point, either for each order or after some threshold has been reached, batch processing is required to complete the fulfillment of these orders. The Web Services API is used to “order” a job flow and to inject variable information collected from users as job variables.


The Web Store implementation drove the Web Services API with some straightforward Java code.
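As a rough illustration of what such a request carries on the wire, here is a sketch of a SOAP envelope that orders a job and injects user-collected values as variables. The operation name, element names, and namespace below are illustrative assumptions only; the authoritative names come from the WSDL published by your Control-M/EM web services (BPI) interface, which is also what SoapUI would generate requests from:

```xml
<!-- Illustrative only: operation and element names are assumptions and
     must be taken from your version's WSDL, not from this sketch. -->
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:ctm="http://www.bmc.com/ctmem/example">
  <soapenv:Body>
    <ctm:request_order>
      <ctm:user_token>TOKEN-FROM-REGISTRATION</ctm:user_token>
      <ctm:control_m>PRODCTM</ctm:control_m>
      <ctm:folder>WebStoreFulfillment</ctm:folder>
      <ctm:job>ProcessOrders</ctm:job>
      <ctm:variables>
        <ctm:variable>
          <ctm:name>ORDER_BATCH</ctm:name>
          <ctm:value>20240401A</ctm:value>
        </ctm:variable>
      </ctm:variables>
    </ctm:request_order>
  </soapenv:Body>
</soapenv:Envelope>
```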

Job Actions from an Event Management Tool: Job alerts are sent to an event management tool. The operations staff who use that tool are not familiar with Control-M, and the job types are a mix of z/OS and distributed jobs. Some actions, sometimes automated by the tool and other times initiated by an operator, need to be performed, such as killing or rerunning a job. The Web Services API is used to retrieve job information to verify the target job, and then the desired job action is performed.


I would love to hear about your experience(s) and your use case(s) if you have used these APIs, or if you are just thinking about it and want to discuss it.



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.


Backups are a fundamental part of business continuity and storage management. Organizations and users rely on backups to protect their data.


But what if many of the backups taken are not really useful?


Why would that be, you may rightly ask? Because the state or status of data has a huge impact on the value of backups and your backup solution can’t determine that on its own. Yes, I will repeat that; your backup solution NEEDS HELP figuring out what and when to back stuff up.


It’s pretty well known that active databases, for example, need special treatment. If data is being actively updated, you need some special processing to back it up reliably. But the very same applies to “regular” files that are used by most applications. Submitted for your consideration is the following scenario: you have an application that updates some files. The update cycle runs for several hours, so you want a backup of the files at the end of each cycle.


If you schedule those backups with your backup solution, how do you know when to run the backup? If you use an arbitrary time and the application execution has changed for any reason (business requirements, increased volume, hardware errors), you wind up backing up useless data and perhaps even interfering with the application.


The simple answer is to let the same solution that schedules the execution of the business application also schedule the backup. That is the simple logic behind Control-M for Backup.


Even simple workloads present this challenge and lots of applications are far from simple. It can be difficult if not impossible to replicate the complex logic flows and dependencies of today’s applications within the crude and simple scheduler of your backup solution. And even if you succeed, it means you now have multiple schedules to maintain and synchronize. Not only is that approach likely to eventually fail and cause problems but it consumes lots of staff resources to administer and operate.


Control-M for Backup addresses these issues. If you add backups to your Control-M application flows, natural dependencies control if and when backups run. If some steps in the application fail, the backups won’t run until those errors have been corrected. If the application is delayed for any reason, again, the backups run only once the complete application has ended successfully. You can even ensure that a new cycle of the application does not start until backups have completed successfully. If any errors do occur or if you need to analyze the execution for any reason, you won’t need to bounce back and forth between different tools. Control-M for Backup helps you not only schedule and run backups but also captures the output and makes it available from all Control-M user interfaces including the Workload Automation Client, the Control-M Self Service web client and iOS and Android mobile apps.



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.


You’ve just been assigned to develop some new business application using Hadoop. You’re excited to learn this new technology or even if you are an experienced Hadoop veteran, each new Big Data application offers interesting challenges and the opportunity to expand your knowledge, never mind enhancing an already impressive resume.


Truth is, it doesn't matter what kind of apps you are building, eventually you're going to run them. That is kind of the point after all.


What do you do today? You may initially run your code by hand, typing in command lines, piping the output to a file then using some “cat” or "more" or “grep” to read your log and figure out what went wrong.


Eventually, or maybe right from the start, you write a quickie script, which you then debug, and then enhance, and then debug again, and so on.


Once you are all done, you may be ready to go on to your next project. But wait; SOMEBODY is running that script and every once in a while it fails. They call you and then it's back to the debug/enhance cycle. You or someone may ask “can't this thing just send you an email when it fails?”. Guess who's back in the enhance/debug loop.


Then someone may say "hey when other stuff around here fails, we open an incident in our Help Desk system. Can you do that?”. You guessed it - back to the cycle.


This can be an endless process and instead of going on to the next project guess what you wind up doing.


BMC Software can fix this. WE HAVE THE TECHNOLOGY! This has been done by thousands of companies and it's quick, easy, robust, reliable and best of all you don't have to code or debug a single thing. It's what we do for a living and it's called BMC Control-M.


You may be tempted to say "Yeah, but it costs money!" Yep, it sure does; but a lot less than all the programming time you are spending, all the time your company is NOT getting to use the great application you are not writing because you are stuck in the "enhance/debug" loop, and all the time your competition IS using THEIR Hadoop applications to extend their competitive advantage. And that doesn't even begin to take into consideration operational delays, or integration with BA/BI/ETL, file transfer, relational database, cloud and virtualization tools that somebody has to build.


Finally, Hadoop is not any different from lots of technologies that have come before it. You can choose to re-invent the wheel or you can leverage mature solutions to accelerate your development and shorten the time it takes to deliver services your business is desperately waiting for. BMC Control-M is that solution. Your company will thank you, your development manager will thank you and you will be able to spend more time honing your Hadoop skills instead of re-inventing the wheel.


The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software
