
BMC Control-M



-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.

 

Whether you are already doing Hadoop or just trying to learn, the #HadoopSummit in Brussels April 15-16 is for you. This is the premier community event in Europe, with over 1,000 attendees expected. There are six tracks with dozens of sessions covering everything you should be thinking about as you try to determine if Hadoop is right for your organization, or how best to implement it if you are already committed to the technology.

 

This event is also a great opportunity to learn how you can build Big Data insights faster, as much as 30% faster. And once you get them running, make sure they operate more reliably so developers can sleep at night and be productive in the morning instead of being bleary-eyed from fixing problems all night.

 

We are joining our partner @Hortonworks, who is hosting this event. Visit @BMCControlM at Booth 12 and come listen to my session, "Oozie or Easy: Managing Hadoop Workflows the EASY Way."

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



Interest in Big Data has gone global, with organizations around the world aggressively jumping onto the Hadoop platform. The leader in open source Hadoop is Hortonworks, and BMC is proud to be their partner. We have just completed joint promotion of Hadoop workload management with BMC Control-M at #Strataconf in Santa Clara, and we will continue to spread the word about Control-M workflow management for Hadoop through our participation in the Hortonworks Modern Data Architecture RoadShow. In the last several months this program has generated great interest, with sold-out attendance in Atlanta, Dallas, New York, Boston, Washington DC, Chicago, Seattle, San Francisco and London.

 

The global tour continues with events scheduled for:

  • Paris – March 3
  • Munich – March 5
  • Tokyo – March 10
  • Los Angeles – March 24
  • Houston – March 26

 

Each event is a full day of Hadoop information for both business and technical audiences, focusing on how organizations can unlock the potential of Hadoop, including case studies and a customer speaker.

 

Attendance is either free or a nominal $99, which makes the event very accessible, so demand will be high. Be sure to register using this link as soon as you can.

 

Come and join us to learn how Control-M can help your organization harness the power of Hadoop: accelerate deployment of applications, run them with the highest level of service quality and gain the agility to quickly get from Big Data to Big Answers to power your business.

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software


-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.

 

Anybody familiar with Kerberos? How about Hive Server 2 or the Streaming API or Tajo? Great if you are, but it's OK if you're not. Ask 100 IT folks and most likely the majority won't know either.

 

So why is it a big deal that BMC Control-M has just announced support for these technologies, especially if relatively few are using them today? Thought you'd never ask!

 

Most would agree that Hadoop and Big Data are some of the hottest terms in the IT world today. And even if many organizations are still just kicking the tires or scratching their heads about how exactly to operationalize this technology, many are already deploying it and getting massive value from their investments. And the big news is that by 2020 this will be a $50 billion market, which means almost everybody will be using this stuff.

 

Now for the sixty-four-thousand-dollar question: when you get around to evaluating a new technology, do you want to bet on a vendor that is a laggard and just recently joined the party, or on the world's best solution, one that has been supporting that technology for years with deep experience and expertise?

 

Ok, yes that was rhetorical.

 

If, and more likely when, Hadoop is on your plate, you should know that Control-M for Hadoop was originally released in June 2013. Since then, Control-M for Hadoop has seen significant global adoption across multiple companies and industries, driven by continuous delivery of valuable enhancements. As of this latest update, support is provided for:

  • Kerberos
  • Hive Server 2
  • the Streaming API
  • Tajo

 

And yes, the Big Data world is increasingly populated with new “animals” and projects on an almost-daily basis. Rest assured that the BMC Control-M team is fully committed to supporting this exciting and expanding ecosystem.  If you want to extract every benefit for enterprise batch processing with Hadoop today, and be assured of having the best of all possible worlds as you expand and manage this composite business workload into the future, call on Control-M.

 

Read, watch and learn more by visiting here.

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software

 



-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.

 

I hate to wait as much as the next guy. But some of the claims we are seeing these days in the Big Data market are really just a bit much. Do you really believe you will be processing or "visualizing" petabytes of data instantaneously? If you do, I've got some property in Florida I'd like to tell you about.

 

Now there are all kinds of tried and true approaches for “pre-processing”, caching, cleansing, filtering and other similar techniques which result in subsequent queries running really, really fast. But if you think you will be able to ingest petabytes of raw, unknown data and “instantaneously” derive actionable, insightful data, well you are just setting yourself up for disappointment.

 

The worst part is that this pattern of unattainable expectations leading to expensive and bitter disappointment is repeated with just about every new technology that comes along. Gartner Inc. calls it the Hype Cycle and it contains the Peak of Inflated Expectations followed by the Trough of Disillusionment.

 

Specifically, I am referring to managing batch in Hadoop. This is a fundamental management discipline that you have to get right to be successful. Assuming you won’t need it and therefore ignoring it is a recipe for something considerably less than success.

 

 

Hadoop 1 was predominantly a batch MapReduce environment. With Hadoop 2, we are seeing an unrealistic embrace of real-time and instant processing to the exclusion of batch, as if these two computing models were mutually exclusive! Think about the whirlwind adoption of ERPs, arguably a computing revolution equal to the one caused by Big Data and still ongoing. Everything was going to be transaction-oriented and happen in real time. The reality is that ERPs are HUGE generators of batch workload.

 

 

And it seems obvious to me that no matter how powerful and sophisticated computing gets, "batch" or non-interactive processing will continue to hold a position of prominence. After all, one of the reasons we have computers is to free humans from repetition and drudgery. The smarter computers get, the more they can do on their own (which is actually a pretty good definition of "batch").

 

 

So, as computers and computing get smarter and smarter, the way those computers do "batch" should be evolving too. That's the idea behind BMC Control-M: "smart batch". Smart enough to manage Hadoop and non-Hadoop as a single, cohesive business service. Smart enough to manage service levels no matter how complex the collection of tasks. Smart enough to help you build and deliver your applications in the shortest possible time, and smart enough to be used by even novice users from traditional desktops or mobile devices.

 

 

We showed smart batch for the 21st century with BMC Control-M at Hadoop Summit in San Jose, June 3-5, and we were thrilled by the enthusiastic response. It seems there still ARE a lot of folks who DO agree that batch is today, and will continue to be, a critical part of their enterprise and Big Data computing.

 

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.


In only a few short years since Hadoop and Big Data burst onto the enterprise IT landscape, those technologies have gone from a novelty with promise to strategic components of a modern data architecture. In fact, according to Gartner Inc., almost 70% of all organizations have either already invested or are planning to invest in Big Data.

 

It is because Hadoop is at the heart of this accelerated evolution and corporate embrace of Big Data that we are excited to announce our partnership with open source Hadoop leader Hortonworks. As enterprises seek to unlock the value of Big Data, they discover that the Hadoop ecosystem must coexist and interoperate with their traditional technologies and applications. To be successful, they must be able to manage Hadoop with the same rigor and to the same levels of service as their current offerings. BMC Software is a recognized leader in enterprise IT management, and with our new solutions for Hadoop we are helping customers realize the value of their Big Data initiatives.

 

Control-M for Hadoop simplifies and automates Hadoop and non-Hadoop batch services. By eliminating the need for scripting, applications can be delivered faster. Quality of service is increased with robust operational facilities such as notification, incident management and auditing. Rapid business change is enabled by providing business architects with comprehensive application support for traditional tools like Informatica, IBM Datastage, IBM Cognos, file transfer, ERP and relational databases as well as for popular Hadoop projects like MapReduce, Pig, Hive and Sqoop.

 

Atrium Discovery and Dependency Mapping helps organizations restore service faster by replacing dependence on tribal knowledge with reliable configuration and relationship data. It minimizes change risk by empowering your Change Advisory Board with trusted dependency data to evaluate change impact, reduces software audit prep time by making it easy to prove inventory accuracy to vendors (reducing the risk of non-compliance penalties), and prevents outages when moving data center assets for consolidation, cloud and virtualization projects.

 

Additional solutions from BMC, including BladeLogic Server Configuration, Release Lifecycle Management and Cloud Lifecycle Management, enable companies to add Hadoop into their enterprise application fabric and manage it using the same mature and robust tools they already own and are familiar with.

 

Hortonworks is a leader in the Hadoop market, supplying the industry’s only 100-percent open source enterprise Hadoop distribution. The combined software offering from BMC and Hortonworks empowers customers with big data management products and services to help drive their businesses forward.

 

Listen here to a joint webinar discussing this partnership and the value it brings to customers. Use these links to learn more about BMC and Hortonworks.

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.

 

Imagine you’re an IT Operations dude or dudette. You’ve been working twelve hour shifts for three nights straight and tomorrow is Friday. You have a great long weekend planned.

 

Suddenly, you get a notification that job PYPWCR12 aborted. You snap into action and call the duty programmer, anxiously waiting until the problem is fixed and you can breathe a deep sigh of relief.

 

You are a dedicated employee and your company's production is important to you, but you also happen to know that PYPWCR12 is the job that makes the bank deposit that will ensure you can cash your payroll check and truly enjoy that weekend you have planned. Not only do YOU know it, but that duty programmer also knows it, and probably responded just a little faster than normal, perhaps for reasons similar to yours.

 

How did you know? Well, because you have standards in your installation. The first two characters of a job name must be a valid application code (PY is payroll), the next character "P" means it's a production job (this is for real and not just some programmer running tests), and the rest tells you it's the Weekly job that does Check Reconciliation.

 

Of course you know standards are important for more than just letting IT Ops folks plan for a great weekend.

 

Standards are critical for all kinds of reasons, such as ensuring consistency, simplifying administration and security, and making important information known to all interested parties. But how exactly are standards enforced? For batch job and workload definitions, the answer is usually "manually", and that is not a great answer. Let's look at a simple example in the scenario above.

 

What if your organization enforces security for job access, so that the on-call programmer can only look at jobs that are part of the production payroll application? Recently, a change was made to one of the jobs and, instead of starting the job name with PYP..., a typographical error was made and the job name became PYBWCR12. When that job failed, perhaps it was not identified correctly as being part of production payroll. If it was properly identified, perhaps the on-call programmer could not access the output. Perhaps the failure itself was due to some check of the job name (very common in z/OS environments).

 

I mention z/OS because job name validation on that platform is a common practice that has evolved over decades, but it is almost completely absent in other environments, where standards enforcement is largely a manual process subject to all the challenges that manual processes present.
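To make that concrete, here is a small illustrative sketch, in Perl (the language used for the utility examples later in this blog), of the kind of automated check a tool could apply to the PYPWCR12-style convention described above. The pattern and application codes are invented for illustration, not a real site standard:

#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';

# Illustrative site standard: AAPFCC99 -> application code (2 chars),
# P=production/T=test, frequency (D/W/M), function (2 chars), 2-digit suffix.
# The application codes below are invented for this example.
my %valid_apps = (PY => 'Payroll', AR => 'Accounts Receivable');

sub job_name_error {
    my ($name) = @_;
    # Capture the application code while validating the overall pattern
    my ($app) = $name =~ /^([A-Z]{2})[PT][DWM][A-Z]{2}\d{2}$/
        or return "'$name' does not match the site standard";
    return "'$name' uses unknown application code '$app'"
        unless $valid_apps{$app};
    return;    # undef means the name passes
}

say job_name_error('PYPWCR12') // 'PYPWCR12 is valid';
say job_name_error('PYBWCR12') // 'PYBWCR12 is valid';   # the typo from the story is caught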

 

I’ve focused on job names but of course standards are critical for almost every character or attribute of a job.

  • If the "Run As" user is invalid, the job will likely fail security validation
  • If the file path is invalid, the required script will not be found or, even worse, the wrong script may be executed
  • If the wrong email address is used for failure notification, no one will respond to correct a critical problem
  • If the application or sub-application name is incorrect, the job may not be seen by its owners or the folks responsible for fixing problems
  • If documentation is omitted, Operations may not have proper instructions for handling the job

 

The list really does go on and on.

 

What makes this situation even more challenging is that many organizations have multiple standards. z/OS job names are eight characters and may look like PYPWCR12, but in the very same flow there may be Unix jobs with names like "Payroll_Check_Reconciliation_Bank_Deposit". It's common to have different standards for Windows versus Unix/Linux versus ERP versus file transfers and so on. This makes it exceedingly difficult for application developers who submit requests, and for centralized operations teams who build or modify job definitions, to remember all of this and get it right.

 

A great way to deal with this problem is to make the workload automation solution smart enough to know your standards and to enforce them when job requests are submitted and when jobs are modified or built. That's the idea behind the Site Standards facility of BMC Control-M Workload Change Manager. You can define any number of Site Standards. You can allow users to select a standard, or you can set a specific standard for a folder. If request submitters, usually application developers or business analysts, are very familiar with the site standards, you can insist that requests comply with them; if your users are not that knowledgeable, you can accept requests that contain errors.

 

Here's the thing: standards are also enforced for schedulers working with the Workload Automation client, so validation must be performed and passed before changes are committed. No more guessing, hoping and relying on human eyeballs alone to stand between job definition changes and job failure.

 

If you have a solution that you have implemented or if you are suffering with a process that is less than perfect, tell us all about it and share with the community.

 

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Control-M Solutions Marketing, BMC Software Inc.


Google calls online decision-making for product purchases the Zero Moment of Truth, or ZMOT. It is the trend we see all around us that has completely changed the way we purchase goods and services, both as consumers and as enterprises. BMC Control-M Workload Automation is bringing this approach to enterprise workload automation. We began with a test drive for Control-M Self Service and are now extending it to our newest component, BMC Control-M Workload Change Manager.

 

Every organization that manages workload automation (basically every company that manages an IT environment) deals with a constant challenge when it comes to implementing changes to batch services. Usually, Application Development and IT Operations must collaborate and these groups do not speak the same language. When the time comes to make such changes, AppDev submits a request to IT using some negotiated process that almost never has any connection to the workload automation tools used by the organization. Instead, Excel spreadsheets, Word documents, email and other such methods are used to submit information. Finally, job definitions are built by schedulers. The process is completely manual. The requestors usually don’t know all the information that is required so mistakes are common. Because the jobs are hand built, yet another opportunity for errors is introduced. And if any required information is omitted, the request is rejected and the process repeats this entire cycle, frequently many times.

 

Meanwhile, the AppDev requestors are frustrated. The IT schedulers receiving erroneous or incomplete requests are frustrated. Operations, which has to fix failed jobs resulting from manual entry errors, is frustrated. And worst of all, the business is not getting its new applications and digital services, so the CxOs and the shareholders are frustrated.

 

ENOUGH, you say. YES, I have that problem and I’m sick and tired of it. Can you show me a better way?

 

Well, I thought you'd never ask! BMC DOES have a better way and we want you to see it for yourself! Take a few minutes and take the Control-M Workload Change Manager Test Drive ("faire un essai" or "Probefahrt"; yes, it is available in French and German too). I guarantee that within ten minutes you will be creating job flows, even if you have never used Control-M before. And after you have had that experience, ask all your AppDev folks who submit those requests today to also take our Test Drive. I know they will thank you and your business will thank you. And you can claim some street cred with your procurement folks for exercising your electronic consumer rights and bringing ZMOT to your workload automation acquisition process.

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Solutions Marketing Manager, Control-M Solutions Marketing, BMC Software Inc.

 

This blog is a wrap-up of all the questions asked during Q&A in the March 18, 2014 webinar titled "Self Service Workflows at the Speed of Your Applications". You can view a recording of this webinar here.

 

Q:     Can you pull from the active schedule like a workspace?

It is assumed this question relates to whether you can access existing folders (tables, in pre-V8 lingo) from the Workload Change Manager web application. If that assumption is correct, the answer is yes. Standard EM authorizations apply to determine what users can see and access. Once logged in, "Open Existing job flows" on the Home Page displays a selection dialog of all the folders to which the user has access.

 

Q:     Is there an additional cost for it?

BMC Control-M Workload Change Manager is an add-on component. I strongly recommend you discuss the topic of cost with your BMC Account Manager. If you do not know who that is, please drop me a note (joe_goldberg@bmc.com) and I will make sure the appropriate person contacts you.

 

Q:     Does the creation of the job request include a place for information such as what to do when the job fails, rerun instructions, etc.?

Generally in Control-M, actions to be performed when a job fails can be provided in a variety of ways, including text for human consumption in either the description or documentation fields, as well as automated actions in the "Actions" section of the job definition. With Workload Change Manager, the availability of any or all of these Control-M functions can be controlled via site customizations. Fields can be hidden or shown and can be forced to be required (even if not required by Control-M in general). Additionally, Workload Change Manager provides the Notes function, which is intended as a dialog between the requestor and the scheduler. If the requestor is not familiar with the Control-M parameters, for example how to code a DO Action to kill or re-run a job, or if you do not wish users to code such parameters because they may not have sufficient information to do so properly, Notes can be used to informally describe the requirements and the scheduler can then implement the required functionality.

 

Q:     Would scripts that are in the job definition be able to be changed with Workload Change Manager?

If jobs contain embedded scripts and your site customization has been configured to show the embedded script, then requestors will be able to modify or create embedded scripts for jobs contained in the requests they submit.

 

View a recording of this webinar here.



-by Joe Goldberg, Lead Solutions Marketing Manager, BMC Software Inc.


Occasionally you may have to do some scripting to achieve some specific automation or integration goal. In a previous post, I mentioned the Control-M/Enterprise Manager API, which is the best choice when programming in Java and similar languages. In this post, I discuss a command line utility, called “cli”, which provides many of the same capabilities and is the best choice for shell scripts, batch files and other script-like environments like event management and other data center automation solutions.


There are a variety of other utilities many users are familiar with, such as ctmorder and ctmcreate. However, these are platform-specific and must be executed on either a specific Agent or a Control-M/Server. If you want or need a solution that is platform-neutral, "cli" is the better choice.


Here are the functions you can perform with this utility:

 

  • Upload or download folders
  • Order or force folders
  • Order or force jobs
  • Force jobs in a folder
  • Upload or download calendars
  • Delete job processing definitions from folders

 

Windows


cli [{(-U emUser -P emPass) | -pf passwordFile}] -h hostName [-t timeout] [-DDMM] [-BY_FORCE] <cmd> [ACTION_REASON <reason for taking an audit action> [ACTION_NOTE <descriptive reason for audit action>]...

 

Unix/Linux


em cli [{(-U <emUser> -P <emPass>) | -pf <passwordFile>}] -h <hostName> [-t <timeout>] [-DDMM] [-BY_FORCE] <cmd> [ACTION_REASON <reason for taking an audit action> [ACTION_NOTE <descriptive reason for audit action>]...

 

Valid values for <cmd>:


-JOB_ORDER
-JOB_FORCE
-JOB_ORDER_INTO
-JOB_FORCE_INTO
-SUB_FOLDER_FORCE_INTO
-FOLDER_ORDER
-FOLDER_FORCE
-FOLDER_UPLOAD
-FOLDER_DOWNLOAD
-CAL_UPLOAD
-CAL_DOWNLOAD
-SUB_FOLDER_DELETE
-JOB_DELETE
-MEM_DELETE

 

This utility is installed with the Control-M client.
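For example, following the same argument pattern as the TABLE_* calls in the Perl sample below (Control-M/Server name, folder/table name, order date), a hypothetical invocation that forces a folder for the current order date might look like this; the credentials, host, server and folder names are all placeholders:

cli -U joeg -P mypass -h emserver01 -FOLDER_FORCE ctmsrv01 DailyBilling ODAT

Check the utility reference for the exact positional parameters of each <cmd>.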

Sample Perl Code


Here is a sample section of Perl code using the cli utility. It is from a script that defines users in the Enterprise Manager and then sets up user-specific demo jobs. This code was used to provide self-service user registration for a Control-M Self Service Test Drive demo environment. A template set of jobs, as an XML document, is customized for the new user being defined. That XML is imported into the EM database using the emdef utility, uploaded using the cli utility (no longer necessary if using automatic synchronization in Control-M version 8 and above) and then immediately ordered into the active environment. In this way, if a brand new user is defined and then logs in, there are already sample jobs for that user to experiment on and learn how to use the newly deployed environment. You can get the full Perl script here.


# Import the job definitions into the EM database with the emdef utility.
# Backticks capture the command output; $? holds its exit status.
@EMDEFResp = `emdef deftable -u $emuser -p $empass -s $server -src_file $XMLoutput`;
$EMDEFResultCode = $?;
print "The result for DefTable is: $EMDEFResultCode\n";

if ($EMDEFResultCode > 0) {
    # emdef failed; echo its output for diagnosis
    for ($idx = 0; $idx < ($#EMDEFResp + 1); $idx += 1) {
        print "Response".$idx.": ".$EMDEFResp[$idx]."\n";
    }
}
else {
    # Upload the table (folder) to the Control-M/Server with the cli utility
    @CLIResp = `cli -u $emuser -p $empass -H $server -TABLE_UPLOAD $DC $Company`;
    $CLIResultCode = $?;
    print "The result for Table Upload is: $CLIResultCode\n";

    if ($CLIResultCode > 0) {
        # Upload failed; echo the cli output
        for ($idx = 0; $idx < ($#CLIResp + 1); $idx += 1) {
            print "Response".$idx.": ".$CLIResp[$idx]."\n";
        }
    }
    else {
        # Force the table into the active environment for the current order date
        @FORCEResp = `cli -u $emuser -p $empass -H $server -TABLE_FORCE $DC $Company ODAT`;
        $FORCEResultCode = $?;
        print "The result for Table Force is: $FORCEResultCode\n";
    }
}

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software

Perl script to create a BMC Control-M Enterprise Manager user



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.

 

You've installed BMC Control-M Self Service and run a pilot. All the users that saw it absolutely LOVE it! And now you are planning your production rollout. Or you've been running it for a while, a bunch of your business systems are being re-branded, and you want to change the service names or some other attributes. You can do these things manually, but perhaps you have a whole bunch of services and each one is a bit different, so there's no pattern that would apply (if there were, you might be able to use Service Rules and make your job even easier). Or you are making relatively few simple changes, but you need to push those updates together with other changes via your change management process, and manual updates don't fit that methodology well.

 

For all those situations, you can define or modify Service Definitions using XML documents and the emdef utility.

 

If you’re not familiar with the XML syntax and your DTD reading skills are a bit dusty, a quick and dirty way to learn is to build a service definition manually via the interactive wizard available from the Workload Automation client (“Service Definitions” in the Tools domain) and then export it. With that approach in mind, I mention export first.

 

Export Syntax


Execute the following command from a Command Prompt or terminal window (or read the doc):

 

emdef exportdefservice

The response gives you usage information as follows:

 

Description: exportdefservice

 

The exportdefservice utility exports services from the CONTROL-M/EM database to a text file, according to the criteria supplied in an XML file.

 

Usage:

emdef exportdefservice [-USERNAME <user> [-PASSWORD <password>] | -PASSWORD_FILE <password file>] -HOST <GUI Server Name> -ARG_FILE <args file name> -OUT_FILE <file name>

 

-or-

 

emdef exportdefservice [-u <user> [-p <password>] | -pf <password file>] -s <GUI Server Name> -arg <args file name> -out <file name>


Switches can be specified in any order.

 

OPTIONS:

 

/? Displays utility's description and available options.

/a Accept all. Directs the utility to automatically reset the Author parameter to the current CONTROL-M/EM user when these two values do not match.

 

If not specified, the utility skips every job definition whose Author does not match the currently logged in user.


So, to export service definitions, here is a possible command line:


emdef exportdefservice -u <em username> -p <password> -s hou-ctmwsvr-01 -arg <argfile> -out serv.out


Here is a sample "arg file". This XML document tells the utility which services you want to export. In the example below, all services are exported (SERVICE NAME="*").

 

Arg File

 

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE DEFSERVICE SYSTEM "defservice.dtd">
<DEFSERVICE>
  <SERVICE NAME="*" INSTANTIATION_TYPE="Filter">
    <FILTER>
      <INCLUDING_TERMS>
      </INCLUDING_TERMS>
      <EXCLUDING_TERMS>
      </EXCLUDING_TERMS>
    </FILTER>
  </SERVICE>
</DEFSERVICE>

 

Out File

The output is also an xml file. In my case, a portion of it looks like this:

<?xml version='1.0' encoding='ISO-8859-1' ?>
<!DOCTYPE DEFSERVICE SYSTEM "defservice.dtd">
<DEFSERVICE>
  <SERVICE CREATED_BY="JoeG" CREATE_TIME="20110706181040UTC" INSTANTIATION_TYPE="Filter" LAST_UPDATE_TIME="20110706181501UTC" NAME="BAO Jobs" ORDERABLE="0">
    <FILTER>
      <INCLUDING_TERMS>
        <TERM>
          <PARAM NAME="CONTROL-M Name" OP="LIKE" VALUE="CTMEBC"/>
          <PARAM NAME="APPLICATION" OP="LIKE" VALUE="BAO*"/>
          <PARAM NAME="SUB_APPLICATION" OP="LIKE" VALUE="*"/>
          <PARAM NAME="JOBNAME" OP="LIKE" VALUE="*"/>
        </TERM>
      </INCLUDING_TERMS>
    </FILTER>
  </SERVICE>
</DEFSERVICE>

 

Once you have either built or modified an xml document, you can then import service definitions with the same utility. This time, the syntax is similar to:

 

Import Syntax


emdef defservice [-USERNAME <user> [-PASSWORD <password>] | -PASSWORD_FILE <password file>] -HOST <GUI Server Name> -SRC_FILE <XML file name> [/a]

 

-or-


emdef defservice [-u <user> [-p <password>] | -pf <password file>] -s <GUI Server Name> -src <XML file name> [/a]


Audit Annotation:

        [-action_reason <reason for taking an audit action>] [-action_note <descriptive reason for audit action>]
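Putting export and import together, a hypothetical round trip for bulk-renaming services might look like the following; the user, password, server and file names are placeholders:

emdef exportdefservice -u joeg -p mypass -s hou-ctmwsvr-01 -arg allservices.xml -out serv.out

(edit serv.out, for example changing the SERVICE NAME attributes to the new branding)

emdef defservice -u joeg -p mypass -s hou-ctmwsvr-01 -src serv.out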

 

If you have any questions about the usage of this utility or how to administer Service Definitions, drop me a note or post a comment.

 

In the next volume of this blog, I'll discuss some other nifty utilities that don't get as much attention or use as they may warrant. I hope you drop by for a view.

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.


BMC Control-M is a very rich and comprehensive solution for enterprise workload automation. I have been privileged to work with this solution and watch it evolve for over two decades. During that time, I’ve encountered lots of interesting technical challenges and have either learned of, or been able to participate in finding, interesting solutions. I’ve decided to start this blog series to describe some of the goodies I’ve gathered and hopefully encourage some of you to share yours. If you have a challenge of your own, please start up a discussion and let’s put the collective power of this community to work on solving it.

 

For my first topic, I’d like to discuss the Control-M Workload Automation API. I decided to start with this topic because I have recently fielded some questions from very experienced users who clearly were not familiar with the existence of this capability.

 

There is a Java API that is a standard part of Control-M/Enterprise Manager. That very same API is also exposed in the Business Process Integration Suite (BPI) via Messaging and Web Services. The BPI component is included in the current pricing models but there may still be some organizations that have not licensed this component. Assuming BPI is available, choose the technology you are most comfortable or familiar with.

 

It seems most folks today are pretty comfortable with Web Services, and there are some great tools that let you very easily prototype Web Services requests to learn the request structure and examine the responses. One such tool is SoapUI, which is free and really simple to use. You can easily experiment without investing any programming time and then launch into coding once you know exactly what you need to do.

 

So what can you do with this API? There are a number of requests related to manipulating job definitions and folders. However, I believe the functions you are most likely to use are the operational actions normally performed interactively in the Monitoring domain of the Workload Automation client, which can be accomplished programmatically with this API. You can find all the information you need in the Control-M Workload Automation 8.0.00 API Guide. The publication name differed slightly in previous versions (Control-M/Enterprise Manager API Developer Guide).

 

  • Order – Inserts a job or folder into the active environment, either subject to scheduling criteria (order) or without considering scheduling criteria (force)
  • Job creation – Creates a job, SMART Folder or sub-folder in the Active Jobs database
  • Add condition – Adds conditions
  • Delete condition – Deletes conditions
  • Job actions in Active Jobs database – Performs actions on jobs in the active environment:
    • Hold
    • Free
    • Confirm
    • Kill
    • Set to OK
  • Job tracking – Polls Control-M/Enterprise Manager for job completion
  • Retrieve jobs – Retrieves information about jobs in the active environment
  • Change alert status – Changes the status of an alert
  • Retrieve BIM Services list – Gets a list of currently active services
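If you want to prototype outside of SoapUI, here is a minimal, hypothetical Perl sketch of what a call could look like using the CPAN SOAP::Lite module. The endpoint URL, namespace, operation name and parameter name below are placeholders, not the real BPI interface; take the actual values from the WSDL exposed by your Business Process Integration Suite installation:

#!/usr/bin/perl
use strict;
use warnings;
use SOAP::Lite;

# Placeholder endpoint and namespace -- use the real values from your
# BPI WSDL (the same WSDL you would load into SoapUI).
my $soap = SOAP::Lite
    ->proxy('http://em-host:9080/bpi/services/example')   # hypothetical URL
    ->uri('urn:example:controlm');                        # hypothetical namespace

# Hypothetical "retrieve jobs" style request; the operation and
# parameter names are illustrative only.
my $som = $soap->call('example_retrieve_jobs',
    SOAP::Data->name(job_name => 'PYPWCR12'));

die 'SOAP fault: ' . $som->faultstring . "\n" if $som->fault;
print $som->result, "\n";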

 

The following are a few use cases that I have implemented where I found this API indispensable:


Web Store – A typical online store allows consumers to purchase goods and services. At some point, either for each order or after some threshold has been reached, batch processing is required to complete the fulfillment of these orders. The Web Services API is used to "order" a job flow and to inject variable information collected from users as job variables.

 

Here is some sample java code from the Web Store implementation using the Web Services API.


Job Actions from an Event Management Tool – Job alerts are sent to an event management tool. The operations staff who use that tool are not familiar with Control-M, and the job types are a mix of z/OS and distributed jobs. Some actions need to be performed, sometimes automated by the tool and other times initiated by an operator, such as killing or rerunning a job. The Web Services API is used to retrieve job information to verify the target job, and then the desired job action is performed.

 

I would love to hear about your experience(s) and your use case(s) if you have used these APIs, or if you are just thinking about it and want to discuss it.

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.

 

Backups are a fundamental part of business continuity and storage management. Organizations and users rely on backups to protect their data.

 

But what if many of the backups taken are not really useful?

 

Why would that be, you may rightly ask? Because the state or status of data has a huge impact on the value of backups, and your backup solution can't determine that on its own. Yes, I will repeat that: your backup solution NEEDS HELP figuring out what to back up, and when.

 

It's pretty well known that active databases, for example, need special treatment. If data is being actively updated, you need some special processing to back it up reliably. But the very same applies to the "regular" files that are used by most applications. Submitted for your consideration is the following scenario: you have an application that updates some files. The update cycle runs for several hours, so you want a backup of the files at the end of each cycle.

 

If you schedule those backups with your backup solution, how do you know when to run the backup? If you use an arbitrary time and the application execution has changed for any reason (business requirements, increased volume, hardware errors), you wind up backing up useless data and perhaps even interfering with the application.

 

The simple answer is to let the same solution that schedules the execution of the business application also schedule the backup. That is the simple logic behind Control-M for Backup.

 

Even simple workloads present this challenge and lots of applications are far from simple. It can be difficult if not impossible to replicate the complex logic flows and dependencies of today’s applications within the crude and simple scheduler of your backup solution. And even if you succeed, it means you now have multiple schedules to maintain and synchronize. Not only is that approach likely to eventually fail and cause problems but it consumes lots of staff resources to administer and operate.

 

Control-M for Backup addresses these issues. If you add backups to your Control-M application flows, natural dependencies control if and when backups run. If some steps in the application fail, the backups won’t run until those errors have been corrected. If the application is delayed for any reason, again, the backups run only once the complete application has ended successfully. You can even ensure that a new cycle of the application does not start until backups have completed successfully. If any errors do occur or if you need to analyze the execution for any reason, you won’t need to bounce back and forth between different tools. Control-M for Backup helps you not only schedule and run backups but also captures the output and makes it available from all Control-M user interfaces including the Workload Automation Client, the Control-M Self Service web client and iOS and Android mobile apps.

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.

 

You’ve just been assigned to develop some new business application using Hadoop. You’re excited to learn this new technology or even if you are an experienced Hadoop veteran, each new Big Data application offers interesting challenges and the opportunity to expand your knowledge, never mind enhancing an already impressive resume.

 

Truth is, it doesn't matter what kind of apps you are building; eventually you're going to run them. That is kind of the point, after all.

 

What do you do today? You may initially run your code by hand, typing in command lines, piping the output to a file then using some “cat” or "more" or “grep” to read your log and figure out what went wrong.

 

Eventually, or maybe right from the start, you write a quickie script, which you then debug, then enhance, then debug again, and so on.

 

Once you are all done, you may be ready to go on to your next project. But wait; SOMEBODY is running that script, and every once in a while it fails. They call you, and then it's back to the debug/enhance cycle. You or someone may ask, "Can't this thing just send you an email when it fails?" Guess who's back in the enhance/debug loop.

 

Then someone may say, "Hey, when other stuff around here fails, we open an incident in our Help Desk system. Can you do that?" You guessed it: back to the cycle.

 

This can be an endless process, and instead of going on to the next project, guess what you wind up doing.

 

BMC Software can fix this. WE HAVE THE TECHNOLOGY! This has been done by thousands of companies and it's quick, easy, robust, reliable and best of all you don't have to code or debug a single thing. It's what we do for a living and it's called BMC Control-M.

 

You may be tempted to say, "Yeah, but it costs money!" Yep, it sure does; but a lot less than all the programming time you are spending, all the time your company is NOT getting to use the great application you are not writing because you are stuck in the enhance/debug loop, and all the time your competition IS using THEIR Hadoop applications to extend their competitive advantage. And that doesn't even begin to take into consideration operational delays, or the integration with BA/BI/ETL, file transfer, relational database, cloud and virtualization tools that somebody has to build.

 

Finally, Hadoop is not any different from lots of technologies that have come before it. You can choose to re-invent the wheel or you can leverage mature solutions to accelerate your development and shorten the time it takes to deliver services your business is desperately waiting for. BMC Control-M is that solution. Your company will thank you, your development manager will thank you and you will be able to spend more time honing your Hadoop skills instead of re-inventing the wheel.

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Solutions Marketing Consultant, BMC Software Inc.

 

This is a wrap-up of the questions that were asked in this webinar and the answers provided by BMC Software.

 

What is a good example of operational processing that you have moved to Hadoop?

 

MetaScale described how they analyzed some of their more complex mainframe flows and identified the most processing-intensive steps. They then moved those steps to Hadoop leaving the rest of the work on the mainframe. With BMC Control-M, that is a relatively minor task because the logical business flow remains the same and even the structure of the flow remains the same. All that has occurred is that some of the steps that previously ran on one platform (mainframe) are now running on a different platform (Hadoop) with the possible addition of some new steps to perform the movement of data (also managed by Control-M).

 

This approach has almost zero impact on Operations. There are no new tools to learn and no new flow requirements. The SLAs remain the same, and the notifications and all other operational tasks remain unchanged. If the development teams are also using Control-M, the impact on them is similarly minor.

 

How long does it take to create an enterprise data hub?

 

Of course this varies widely from organization to organization. One way to achieve such goals more quickly is to work with a service provider like MetaScale who has the experience to assist and guide you in the process.

 

In a Hadoop cluster, how many data nodes can you lose before it presents a problem?

 

That number is mostly a function of the replication factor you are using for HDFS. The default value is three, which means there are three copies of each HDFS data block on three different data nodes. In that situation, if you were to lose three data nodes at (roughly) the same time, you could experience data loss.
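For reference, the replication factor is controlled by the dfs.replication property. A minimal hdfs-site.xml excerpt (the value shown is the Hadoop default):

<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>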

 

Has the number of incidents, or failures that impact service to clients, been reduced by adding Hadoop to the environment?

 

The number of incidents has not necessarily been reduced, but the overall level of service has definitely improved. Using Hadoop, some processing that affected customer service and previously took a very long time is now performed more quickly, resulting in a higher level of customer satisfaction. Another byproduct that has been observed is that product pricing and specialized offers have become either more competitive or better suited to individual customers. This too leads to greater satisfaction for the customer and higher-value sales for the seller.

 

What Hadoop distribution does MetaScale use / recommend?

 

MetaScale is most familiar with Cloudera but has deployed others such as Hortonworks, Intel and MapR. When choosing a distribution for your business, you really need to look at your requirements and then select the distribution and vendor that would best address those needs.

 

As a company that is just getting started with a POC for Hadoop/Big Data, what recommendations do you have? Any gotchas that we should be aware of?

 

Since MetaScale is a service provider that is in the business of helping organizations get started with and derive business value from Big Data, an easy recommendation is to engage with a trusted service provider that has the experience, like MetaScale.

 

However, a more general answer is to approach Hadoop like any other technology project. Identify the project requirements and what you hope to achieve. Do not get overwhelmed by the hype. Make sure you understand that Hadoop is wonderful for some things but is not a solution to every problem in your organization. And finally, be prepared for a steep learning curve and the challenges of hiring the talent you need, because there's huge demand for experienced Big Data practitioners but the available pool is relatively small.

 

What distributions of Hadoop does Control-M for Hadoop work with?

 

Control-M for Hadoop has currently been verified with MapR and Cloudera. However, beta customers are running Greenplum successfully, so expect official support soon. Additionally, we have used various other distributions internally, including ASF 0.2x and 1.1.x.

 

How can we integrate Hadoop with workload automation job scheduling tools?

 

We believe the very best way to accomplish this is by using BMC Control-M together with its support for Hadoop. If that is not feasible for any reason (contact BMC if you feel that is the case because we believe we can change your mind) you can install Control-M for Hadoop to manage the Hadoop environment and then use a variety of utilities and best practices BMC can recommend to integrate with third-party tools you may have already installed.

 

What are some good sources of training for Hadoop?

 

I do not want to recommend any specific resources or providers; however, if you use your favorite search engine and enter "Hadoop training" you will find thousands of providers, various instructional YouTube videos, and blogs and sites with free instruction. Additionally, all major Hadoop distribution providers offer both free and more in-depth paid training.

 

Will there be a link to listen again to the presentation?

 

You can access the recorded webinar here.

 

How do we replace ETL with Hadoop? Don't we still need to extract the data (as files uploaded into HDFS), transform it (through MapReduce) and load it?

 

There are two points to consider.

  1. Yes, you still perform ETL but now using an Open Source solution that may be significantly cheaper. In this way, it is not the process of ETL that is being replaced but rather the ETL tool(s).
  2. Frequently the ETL processing is necessary to either “normalize” the data or to prepare it in some way for analysis. You may be able to obviate this requirement by processing the “raw” data directly in Hadoop.

 

What are the advantages of Control-M over Oozie (the batch scheduler in the Hadoop ecosystem)?

 

Here is a short list. Please contact BMC to arrange for a presentation or demo:

 

  1. Control-M for Hadoop reduces the time and effort required to get jobs running in Hadoop. This is accomplished with a simple, graphical user interface that enables programmers to build flows using drag-and-drop functionality, and with wizard-like definition forms that eliminate scripting.
  2. Control-M for Hadoop makes operating Hadoop applications simple by providing out-of-the-box capabilities that are mandatory for all business services in an enterprise IT environment. These features include notification via email, automatic incident management, SLA management with predictive analysis giving early warning and real-time simulation for SLA breach remediation, auditing, reporting, forecasting and the ability to easily review historical job executions.
  3. Control-M for Hadoop gives organizations the flexibility to design enterprise applications consisting of Hadoop and "traditional" applications, using the broad platform, application and technology support provided by Control-M. Business services can combine Hadoop with business analytics, ERPs, mainframe, Unix, Linux, Windows, Web Services, file transfers, virtualization and cloud technologies, enterprise backup applications, relational databases and many others.

 

Can you please give us the use case of Control-M in Hadoop environment from any of your clients?

 

Some of the examples provided by MetaScale, such as offloading processing-intensive workload from mainframes and ETL "re-hosting", are great use cases. Another interesting use case is from a company called ChipRewards, which you can see here.

 

Does Hadoop have presence in Latin America?

 

According to most analysts, such as Gartner and Forrester, North America frequently leads new technology adoption by a year or two. Although we have seen this trend with respect to the number of customers asking and/or talking about Big Data, it does seem to be a global phenomenon, and there is interest in all major geographies, including Latin America.

 

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software



-by Joe Goldberg, Solutions Marketing Manager, Control-M Solutions Marketing, BMC Software Inc.

 

This blog is a wrap-up of all the questions asked in the June 25, 2013 webinar titled "Migrate to Control-M V8 – Don't Hesitate".

 

Q: I just migrated to Control-M 7. How long will that be supported by BMC?

 

BMC Control-M adheres to the official BMC Software policy documented on the Support site. Read it here.

 

Since the release cycle for Control-M versions is approximately every two years, V8 is current and V7 is C-1. The next version is scheduled for the end of 2014, at which time V7 will become C-2, and end of support for V7 will likely occur at the end of 2016, when V10 should be released.

 

Q: Do you have any documents that compare the v.7 and v.8 functionality and maybe v.8 versus v.8.1?

 

A version comparison chart is available here.

 

Additional information is contained in the Release Notes for each release and fix pack.

 

Q: How many businesses are currently running v8 in prod?

 

As far as we know, there are approximately 40 customers in production with V8.  Additionally, we do know that about 250 downloads of V8 software have occurred.

 

Q: I have two Control-M servers that share databases. The two servers are used as one for primary and the other for manual failover. My question is: can I upgrade to version 8 on one server and not the other, since they share the same databases?

 

The database schema has changed, so it is not possible for two different versions of Control-M to access the same database.

 

Q: Can you send me the Control-M 8 Whitepaper?

 

You can get this Technical White Paper here.

 

Q: How do we convert Version 6.2.04 job definitions/drafts to V8?

 

No direct upgrade is supported from V6.2.04 to V8. However, you can build an interim V6.4 or V7 environment, upgrade your 6.2 environment to it, and then upgrade that environment to V8.

 

Q: Hi, is the webinar being recorded?

 

The webinar was recorded and the link will be posted on the recorded events page. You can find all recorded Control-M events here.

 

Q: Will Control-M 8 use any more/less/same network bandwidth than the older versions (6.3)?

 

A significant consideration for bandwidth usage is the size of each "job object" being transferred. The size of a "job" has increased due to additional features that are now available, so the amount of data will be larger by about 30%. Of course, job information accounts for only a percentage of all traffic (mainly during New Day upload).

 

Q: Hi... I did not notice. Did he have to turn on the "track changes" feature or is it automatic?

 

After installation of V8.0.00.100 (fix pack 1) “Track Changes” is activated automatically.

 

Q: Joe mentioned FP1 details webinar. What is link to that?

 

You can find this recorded event, which took place on June 19th, on the recorded events page linked above. The title is "Control-M Version 8: New interface with powerful new features".

 

Q: Does JCL Verify work with Pro/JCL?

 

BMC Control-M Workload Automation JCL Verify replaces Pro/JCL.

 

Q: Is the Backup CM/module available with R7 agents?

 

Yes. BMC Control-M for Backup is compatible with all currently supported versions of Control-M/Agents (V6.4, 7 and 8).

 

Q: Is there any plan for HP Data Protector backup tool support?

 

There is no current plan to add this support that I am aware of but that will be re-evaluated when BMC Control-M for Backup is considered for its next revision.

 

Q: Will you be distributing the PowerPoint for this session afterwards?


Yes. You can find the presentation here.

 

Q: In V8, will the EM utilities see the AJF on the mainframe?

After a direct conversation with the submitter of this question, it turns out he is looking for a facility similar to "ctmpsm" that runs on the Enterprise Manager and can display job information for Control-M for z/OS. Although there is no such utility today, I would suggest looking at:

  1. BMC Control-M Self Service, or
  2. the Enterprise Manager APIs, available either in Java or (with Business Process Integration Suite) in Web Services and Messaging flavors.

 

Q: Do you have any educational material that can help us to introduce current CTM users after migration to v8?

 

There is a lot of instructional content within the Control-M Workload Automation Client itself. This includes embedded videos, a completely revised and easier to navigate Help and a “V7 to V8 Locator” application that helps you find equivalent functions and fields between V8 and prior versions. Additionally, there is web based training available from BMC Education.

 

Q: Is Control-M for Backup provided out of the box with Control-M 8, or is it a paid add-on? Same question for WCM.

 

Control-M for Backup is available to all customers on BMC Simplified pricing. Workload Change Manager will likely be a paid add-on.

 

Q: Have you heard of any complaints running Control-M WLA version 8 on a Windows VM?  I have had it hang up on me several times and I'm rethinking moving to that.

 

We are not aware of any issues related specifically to Control-M V8. The configuration of any application on a virtual platform must be done with some care to ensure that the total workload assigned to all the virtual machines on a particular physical machine does not exceed the capacity of the physical hardware. We have seen situations where multiple VMs were placed on a single ESX Server, for example, and their total workload and/or their I/O and disk requirements far outstripped the physical hardware. Such an approach certainly can result in poor performance. On the other hand, when properly configured, there should be no issue with hosting Control-M on a virtual machine.

 

Q: From where can we download the trial version?

 

The Trial Installation is an option of the standard Control-M V8 installation. You can download the installation package here.

 

Q: With the many changes from version 7 to version 8 when is BMC planning on releasing version 9?

 

Control-M is upgraded roughly every two years so V9 is currently planned for the end of 2014.

 

Q: Is there a date scheduled to begin v8 Certification?

 

Please visit these BMC Education links:

 

Q: Is it possible to use the latest Conversion Tool with Control-M server / EM & Agent all at v6.4.01 ?

 

The Conversion Tool creates XML output for either V7 or V8. It is possible to manually edit the V7 XML so that it can be imported into a V6.4 environment but there is no out-of-the-box functionality to perform this.

 

Q: When is the next update/release for the Oracle module?

If the question is about Control-M for Oracle eBusiness Suite, the current version is V6.4.01. I do not have any information about when the next release is planned.

 

Q: Have there been any improvements made to Forecast? Specifically more detailed information on the "WHY" function.

 

As far as I know there have been no enhancements in this area in V8. Please send me a private note at joe_goldberg@bmc.com and let me know what additional information you are looking for.

 

Q: Will there be any upcoming webinars for Control-M for Backup?

 

Although I am not aware of one at this time, we are always looking for topics of interest. Everyone interested in a dedicated webinar on this topic is encouraged to contact me or post a comment on Communities.

 

Q: I have a question about the current version we have, 6.4. When does official support end for this version? This could be a good reason to upgrade.

 

End of support for V6.4 has not yet been announced but is expected around the end of 2014. Running on an unsupported release is certainly a great reason to upgrade.

 

Q: Is it fully supported to migrate to V8 from V6?

 

Direct migration to V8 is supported from V6.3, V6.4 and V7.

 

Q: We have about 500 agents on version 7 and about 300 agents still at 6.2. Do I have to upgrade all 6.2 agents before going to version 8?

 

Version 6.2 is no longer supported but the technical capability to run V6.2 agents is still available for V8. Just ensure you set the agent protocol correctly.

 

Q: What job definition changes are there from V7 to V8? Steps, FTP jobs, what considerations?

 

The changes are primarily in the external appearance and terminology. The internal database schema has changed relatively little for folders and jobs. However, the database as a whole has changed to support Workspaces, Checkin/Checkout, etc.

 

Q: Do you know if for v8 we need a license file for CTM Server fresh installations? If so, where can we get them?

 

There are no new license requirements that are specific to V8.

 

Q: Is Workload Change Manager an add-on?

 

Yes, this is expected to be an add-on.

 

Q: Can you have multiple users actively making changes to the same file?


If by "file" you mean Folder (previously called Table), no, there is no way to do that. In V8, multiple users can read/browse a folder, but once a user checks out a folder, no other user can update it until the folder has been checked in again.

Q: If we install the V8 server on a new machine, will it point to the existing EM and CTM databases from v7, or do we need to create new schemas for v8?

 

You will have to create a new database. This is done automatically as part of the install. An Upgrade Kit is also available to migrate the data.

 

 

Q: Hi Joe, is it recommended to migrate from 6.4.500 to 8.0.500 directly, without installing version 7?

 

Yes, absolutely. Direct upgrade to V8 is supported from V6.3, V6.4 and V7.

 

 

Q: Are there plans to include Quartz Scheduler jobs in the conversion tool?

 

There are no current plans for this support. Anyone interested in this, please send me an email (joe_goldberg@bmc.com) or start a discussion on BMC Communities.

 

Q: Is Control-M Agent 8.0 available as a native 64-bit application, or is it still only 32-bit like Agent 7.0?

 

 

In V8, all platforms are available in a native 64-bit flavor; however, the default installation is 32-bit even in a 64-bit environment, because the majority of Control Modules are still delivered in 32-bit mode. Installation instructions are provided describing how to perform a 64-bit install. Please note that the option to select 64-bit is available only for a "stand-alone" Agent installation. When the Agent is installed together with a Control-M/Server, no mechanism is provided to request a 64-bit Agent install for the same account as the Control-M/Server.

 

Q: Can you get a schema for version 8?

 

The schema will be published on the BMC FTP site within a few days. However, a DRAFT version, with only minor cosmetic changes still to come, is available here:

 

Control-M Enterprise Manager Database Schema

Control-M/Server Database Schema

 

Q: Is the work for migrating to Control-M V8 from various platforms such as AutoSys, Tivoli, etc. approximately the same, or does it vary appreciably depending on the starting platform?

 

 

I believe the work is very similar for environments of similar size and complexity.

 

 

The postings in this blog are my own and do not necessarily represent the opinions or positions of BMC Software
