
Control-M

227 posts

Have you experienced issues with a Gateway that can't connect, or had a Gateway download hang? Are you interested in learning more about Gateway architecture and how the Gateway works? Would you like to learn how you can identify and correct problems yourself?

 

To learn more about Gateway architecture and troubleshooting, please join us for a Connect with Control-M live webinar on Wednesday, May 31st, where Neil Blandford will demonstrate:

 

•     Gateway Architecture

•     Debugging & Troubleshooting

•     Running the Health Check Utility

 

Don’t miss a live demo of these capabilities. There will be a Q&A after the demo. Register Now!


Did you know that Control-M allows you to develop your own custom client applications?

Would you like to speed up your development lifecycle by enabling developers to test code in a Control-M environment using continuous integration techniques? The Automation API allows you to submit requests from your own applications to Control-M and receive the responses, for example to perform job actions or change an Alert's status.

 

To learn more about Control-M Automation API and how to use it, join us for a Connect with Control-M live webinar on Wednesday, April 26th, where Ruben Villa will demonstrate:

 

•     Installation & Configuration API

•     Job definitions as code

•     Managing active environment through custom DevOps tools

 

Here is the link to the recorded session on YouTube:

 

Connect with Control-M: Control-M Automation API - YouTube

 

Here is the Q&A for this webinar (Connect with Control-M: Control-M Automation API)

 

________________________________________________________________

 

Q: Is this Automation API an extra licensed product?

A: No. It is included starting with Enterprise Manager version 9 Fix Pack 2.

________________________________________________________________

 

Q: To access the API CLI from the command line, do you need to be located under a specific path?

A: No.

________________________________________________________________

 

Q: Under "Defaults", can you define any parameter from the job definition along with its value?

A: Yes.
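For illustration, a minimal sketch of a "Defaults" object in the AAPI JSON format (the folder, job, host, and user names here are hypothetical):

```json
{
  "Defaults": {
    "RunAs": "ctmuser",
    "Host": "app-host-1",
    "Job": {
      "When": {
        "WeekDays": ["MON", "TUE", "WED", "THU", "FRI"]
      }
    }
  },
  "DemoFolder": {
    "Type": "Folder",
    "CleanupJob": {
      "Type": "Job:Command",
      "Command": "cleanup.sh"
    }
  }
}
```

In this sketch, everything under "Defaults" is applied to the matching objects in the file, so CleanupJob inherits RunAs, Host, and the When clause without repeating them.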

________________________________________________________________

 

Q: Is there any table that shows the references of a field in the job definition so I can refer to that correct name into the code?

A: Yes, you can consult the help documentation with ctm doc get

________________________________________________________________

 

Q: Can the order of the structure in the code be changed without any impact, or does it have to keep a specific order?

A: Yes, you can change the order of the parameters.

________________________________________________________________

 

Q: Is the flow option in AAPI equivalent to creating job dependencies with conditions?

A: Yes

________________________________________________________________

 

Q: Is this feature integrated with the WLA?

A: Yes, since Enterprise Manager version 9 Fix Pack 2.

________________________________________________________________

 

Q: Does it matter what you name the flows?

A: No, you can use the name that you want/need.

________________________________________________________________

 

Q: What would a flow with 2 successors look like?  (2 successors simultaneously.)

A: You can define 2 flows in your JSON file to create 2 successors for the same Job.
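As a hedged sketch (job and flow names are hypothetical), two Flow objects that give the same job two simultaneous successors:

```json
{
  "DemoFolder": {
    "Type": "Folder",
    "ExtractJob":   {"Type": "Job:Command", "RunAs": "ctmuser", "Command": "extract.sh"},
    "LoadJob":      {"Type": "Job:Command", "RunAs": "ctmuser", "Command": "load.sh"},
    "ReportJob":    {"Type": "Job:Command", "RunAs": "ctmuser", "Command": "report.sh"},
    "FlowToLoad":   {"Type": "Flow", "Sequence": ["ExtractJob", "LoadJob"]},
    "FlowToReport": {"Type": "Flow", "Sequence": ["ExtractJob", "ReportJob"]}
  }
}
```

When ExtractJob ends OK, both LoadJob and ReportJob become eligible to run.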

________________________________________________________________

 

Q: When you create a job with AAPI, is that definition inserted into the database or just into the AJF?

A: Both; you can also modify this behavior.

________________________________________________________________

 

Q: Is it possible for a flow to continue even if a job ended NOTOK?

A: Yes, you can configure how your flow continues after a job ends NOTOK, e.g. rerun the job or set it to OK.
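A hedged sketch of what such a definition might look like in the JSON format (folder and job names are hypothetical; the If/Action object types follow the AAPI documentation):

```json
{
  "DemoFolder": {
    "Type": "Folder",
    "ProcessJob": {
      "Type": "Job:Command",
      "RunAs": "ctmuser",
      "Command": "process.sh",
      "ActionIfFailure": {
        "Type": "If",
        "CompletionStatus": "NOTOK",
        "RerunIt": {"Type": "Action:Rerun"}
      }
    }
  }
}
```

Other actions, such as Action:SetToOK or Action:Mail, can be attached to the same If block.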

________________________________________________________________

 

Q: How case-sensitive are these commands? (Windows is typically not case-sensitive.)

A: ctm commands are not case-sensitive.

________________________________________________________________

 

Q: Can we import/export existing definitions to JSON from the WLA GUI, or with exportdeftable or similar utilities? If not, is that planned?

A: This is not available at the moment, but it is planned for AAPI in the future.

________________________________________________________________

 

Q: Do all the API utilities have to be executed from the EM server machine?

A: No; the EM Server is the endpoint for all requests.

________________________________________________________________

 

Q: Can we call this JSON API job from a script?

A: Yes, you can call the JSON API from any other application.

________________________________________________________________

 

Q: What permissions does a developer need to access Control-M environments?

A: AAPI uses ctmcli and needs to obtain 2 tokens at login: one from the GUI Server (GSR) and a second from the Control-M Configuration Server (CMS). Because of this, it needs login privileges for both, as defined in the CCM under Authorizations for the user attempting to log in, specifically in the privileges for "Control-M Configuration Manager" and "Control-M Workload Automation, Utilities........"

________________________________________________________________

 

Q: The online documentation has a few job types, like Hadoop and databases. Are other job types planned? Filewatcher, mainframe, and PeopleSoft are some of the ones that come to mind. If not, can we specify the parameters for other types as autoedit variables?

A: Currently this feature/product is designed to work in the way explained during this session, but if you consider this a potential enhancement, please open a ticket so we can handle it and assist you properly.

________________________________________________________________

 

Q: Can you use this from a command line in a UNIX environment?

A: Yes, the developer toolkit can be used in a Unix Environment.

________________________________________________________________

 

Q: Can we use the API to execute an asynchronous callback to Control-M, to control a long-running process that does not return an explicit status?

A: You can use ctm run status to get the status of any job, or retrieve the full job log.

________________________________________________________________

 

Q: Can the AAPI be used to perform tasks, such as adding a condition?

A: Yes, you can add and delete conditions via AAPI.
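In the AAPI JSON format, Control-M conditions are represented as events. As a hedged sketch (folder, job, and event names are hypothetical), a job can wait on and delete an event like this:

```json
{
  "DemoFolder": {
    "Type": "Folder",
    "ConsumerJob": {
      "Type": "Job:Command",
      "RunAs": "ctmuser",
      "Command": "consume.sh",
      "WaitForData": {
        "Type": "WaitForEvents",
        "Events": [{"Event": "DATA_READY"}]
      },
      "ClearData": {
        "Type": "DeleteEvents",
        "Events": [{"Event": "DATA_READY"}]
      }
    }
  }
}
```

An AddEvents object of the same shape adds an event when the job completes.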

________________________________________________________________

 

Q: Will API code we develop have any dependency on future enhancements of the product's job schema? (The v7 and v9 job schemas have some differences.)

A: At this moment AAPI only supports WLA version 9, and it will support future releases.

________________________________________________________________

 

Q: Can I use the Run option to do an ad-hoc run for jobs that do not exist in the EM DB, or do I always have to Deploy first?

A: Yes, as long as you define the jobs in the JSON file that you are using.

________________________________________________________________

 

Q: How do you manage who can and cannot run jobs via AAPI?

A: You need to set up the EM Authorizations for these users; AAPI uses the login credentials and privileges from EM.

________________________________________________________________

 

Q: Can we ask for the "why" status of a job?

A: This option is not available at the moment, but you can run the status command instead.

________________________________________________________________

 

Q: Why does the user need to have access to the CCM?

A: Because of the configuration actions that AAPI can perform on Agents.

________________________________________________________________

 

Q: When you run jobs in the AJF using the API, do they count toward the Control-M job license?

A: Yes, like any other job in the AJF.

________________________________________________________________

 

Q: Will our own Application Integrator jobs types be made available as job types for the AAPI?

A: This option is not available for the moment.

________________________________________________________________


Join us at one of the many events below whether on the road or in your hometown. We look forward to seeing you!

 

If you are in London, be sure to visit us at booth #503 at Strata + Hadoop World, May 23-25! Also, join our session on May 25th at 12:05pm in Capital Suite 4: "Ingest. Process. Analyze: Automation and Integration through the Big Data Journey" by Alon Lebenthal, Sr. Manager, Solutions Marketing.

 

There are a couple of great DevOps events fast approaching in June! Join us in Washington D.C. at the Marriott Metro Center for the ATARC Federal DevOps Summit. This educational one-day summit will examine DevOps tools and techniques used by the Government to create an agile development cycle, providing greater efficiency and cost savings than traditional methods. We are also very excited to be a platinum sponsor of the DevOps Enterprise Summit (DOES) London in June!

 

May

May 8, 2017 | OpenStack Summit | Boston | Sheraton Boston Hotel
May 8-10, 2017 | Gartner ITOM | Orlando | Hilton Orlando
May 9, 2017 | BMC Day | Boston | Hotel Commonwealth
May 9, 2017 | Forrester Summit | Chicago | Sheraton Grand Chicago
May 23, 2017 | Big Data Lunch | Phoenix | *Details coming soon
May 23-25, 2017 | Strata + Hadoop World | London | ExCeL London
May 31, 2017 | BMC Showroom | Madrid | *Details coming soon

June

June 5, 2017 | BMC Day | Toronto | Shangri-La Hotel Toronto
June 5-6, 2017 | DevOps Enterprise Summit (DOES) | London | Queen Elizabeth II Conference Centre
June 7, 2017 | BMC Exchange - Federal | Washington D.C. | Marriott Wardman Park
June 13, 2017 | Control-M Seminar | Houston | *Details coming soon
June 13-15, 2017 | DataWorks Summit | San Jose | San Jose McEnery Convention Center
June 14, 2017 | Control-M Seminar | Cincinnati | *Details coming soon
June 20, 2017 | Control-M Seminar | Brasilia | *Details coming soon
June 21, 2017 | Customer Appreciation Boat Cruise | Chicago | Chicago's First Lady Cruise Ship
June 22, 2017 | Control-M Seminar | Rio de Janeiro | *Details coming soon
June 22, 2017 | Control-M Seminar | Milwaukee | *Details coming soon
June 28, 2017 | ATARC Federal DevOps Summit | Washington D.C. | Marriott Metro Center


We are excited to announce that a new version of the enterprise Control-M trial is available today. This version includes a walk-through guide to ensure you can explore the individual use cases within Control-M and have your questions answered. Please take a moment to explore the guide and provide feedback on this new trial experience to Control-M Trial Feedback.

 

We appreciate your time and thank you for being a part of our Control-M community!


For the last 10+ years, batch processing and job scheduling have been categorized as Workload Automation (WLA) – defined by Gartner as technology [that] will underpin digital businesses by managing the movement of workloads to the most appropriate location to optimize resources in a heterogeneous environment (Source: Gartner). Today, we are seeing the 4th wave of IT automation emerge – and for Workload Automation that means going beyond the traditional definition to one that supports today’s agile approach to technology – Digital Business Automation.

 

Taking a look back, batch processing at its roots began on the mainframe and has advanced over time to run on distributed and virtualized environments. Control-M began its life in the early days of the mainframe era.  As platforms evolved, so too did the requirements for scheduling and managing the delivery of application job services for every company.  Batch processing became job scheduling, which became workload automation and now, Digital Business Automation.

 

Digital Business Automation is enabling the transition for companies to compete in the world of digital disruption by automating the way business happens – delivering the IT services that run the business – at the intersection of infrastructure, data, and applications.  

 

(Image: the Louvre, Paris)

 

It is also important to note that the prior waves of automation didn't replace each other; they are built upon each other. Being in this 4th wave of automation doesn't mean companies can drop existing technology and replace it with new. It means that legacy structures, like the Louvre pictured here, must be modernized at the same time as new architecture is added.

 

Companies are forced to face digital transformation every day. They are forced to compete with new, digitally born companies and with 100-year-old companies that are jumping on the transformation train. We believe the companies that attack these challenges by rethinking IT automation will emerge as the digital business leaders in their industries.

 

Read the rest of our Digital Business Automation story here.


Get insider access to BMC solutions and executives at one of our 2 upcoming BMC Days! Join technical experts and peers and learn about the latest BMC products and solutions during this IT management event series.

 

Be sure to join the Control-M sessions led by Robby Dick, Lead Solutions Marketing Manager and Joe Goldberg, Innovation Evangelist.

 

Join us at BMC Day Boston

 

Join us at BMC Day Toronto

 

Join us at the Federal BMC Exchange in Washington D.C.

 

Be sure to join one of our 4 Control-M sessions: Workload Automation in the Agile Enterprise, Control-M in the DevOps Application Delivery Pipeline, and Introducing the All New Control-M Managed File Transfer.


Are you familiar with the Named Pool variables introduced in version 9? They give you an easy way to share information between jobs by extracting data from a job's output and passing it to other jobs. Using them can dramatically increase flexibility when creating jobs, with new options for your job flow.
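As a hedged illustration (the pool, variable, and job names are hypothetical), a Named Pool variable is referenced with the %%\pool_name\variable_name syntax; in an Automation API JSON definition the backslashes must be escaped, and the \pool\var form inside "Variables" is an assumption based on how Named Pool variables are addressed:

```json
{
  "DemoFolder": {
    "Type": "Folder",
    "SetRunId": {
      "Type": "Job:Command",
      "RunAs": "ctmuser",
      "Command": "echo publishing run id",
      "Variables": [{"\\ORDERS_POOL\\RUN_ID": "12345"}]
    },
    "UseRunId": {
      "Type": "Job:Command",
      "RunAs": "ctmuser",
      "Command": "process.sh --run-id %%\\ORDERS_POOL\\RUN_ID"
    },
    "SetThenUse": {"Type": "Flow", "Sequence": ["SetRunId", "UseRunId"]}
  }
}
```

Here SetRunId publishes RUN_ID into the ORDERS_POOL pool, and UseRunId reads it; pool variables can also be set and deleted with the ctmvar utility.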

 

To learn more about Named Pool variables and how to use them, join us for a Connect with Control-M live webinar on Wednesday, March 29th, where Martin De Castongrene will demonstrate:

 

•     Different Variable Types

•     Named Pool vs SMART Folder variables

•     Sharing information between jobs

 

Here is the link to the recorded session on YouTube:

 

Connect with Control-M: Named Pool Variables in Control-M 9 - YouTube

 

Here is the Q&A for this webinar (Connect With Control-M: Named Pool Variables in Control-M 9)

 

________________________________________________________________

 

Q: Are named pool variables available for the mainframe?

A: Yes, they are stored in IOAVAR.

________________________________________________________________

 

Q: What would variables in LIBMEMSYM be considered?

A: Variables defined with the LIBMEMSYM would be local variables to the job.

________________________________________________________________

 

Q: Is there a way to share variables across datacenters? What about MF and distributed?

A: LIBMEMSYM can be used for this.

________________________________________________________________

 

Q: Will variables of any type appear to the Agent as environment variables, or only the local job ones?

A: Variables can be passed to the Agent by using the VARIABLE_INC_SEC configuration parameter defined in the Control-M/Server config.dat.

________________________________________________________________

 

Q: Can Named Pool variables be accessed by all jobs, even if the jobs are located in different Datacenters?

A: No; Named Pool variables are contained within a single Datacenter.

________________________________________________________________

 

Q: Can these Named Pool variables be used in any Control-M version?

A: Support for Named Pool variables was added in version 9 of Control-M/Server and Control-M/Enterprise Manager.

________________________________________________________________

 

Q: Is there a limit with the variables that can be defined in a pool?

A: No.

________________________________________________________________

 

Q: Is there any performance impact based on the number of named pool variables defined?

A: If a large number of variables (Named Pool or otherwise) are sent to the Agents, there can be an impact.

________________________________________________________________

 

Q: Are Named Pool and Smart Folder Variables case sensitive?

A: Yes.

________________________________________________________________

 

Q: How do I clean up variables after defining them?

A: You can use the ctmvar command to delete them or set the value.

________________________________________________________________


It has been a very long road.

 

Control-M has come a very long way in the past 10 years. We started with unparalleled application integration through the work the BMC development team put into our control modules. That led to a full revamp of our user experience with a new user interface, and now, with features such as Self Service, Workload Archiving, Workload Change Manager, Managed File Transfer, and Big Data integration with Hadoop, we are trekking farther toward a Digital Business Automation solution that meets close to 100% of our customers' needs. One piece of the puzzle that was missing was the ability for developers to tie their job definitions to the code they promote, and this gap has been filled. “Jobs-as-code”, the new catchphrase, symbolizes a developer’s ability to code in a given language, JSON in this case, have that language translate into job definitions that reside in Control-M, and trigger their applications. This is so much more than just jobs as code: this new trend is really about enabling companies, ensuring a real “shift left” mentality is now possible, with job definitions that are part of code promotions.


Test early and test often.

 

We know that our customers need to deliver top-notch experiences to their own customers, but how can this be enabled? Digital businesses across the globe rely on the quality of their applications to fuel their sales cycles and to keep their customers coming back for more. How do you ensure quality? How do you minimize the risk of pushing that great new feature you’ve developed into production? One of the cornerstones of the new Control-M Automation API is exactly that: it serves as a mechanism by which developers can include their job definitions with their promoted code from the initial testing phases. Why does this matter? Just like with code, if bugs are stamped out early enough, those same bugs won't even make it past your development environment.

 

Swiss Army knife.

 

The Control-M Automation API isn't just about code promotion for jobs. There is a series of tools at your disposal: you can now monitor the status of any job currently running in your environment via REST services, or use the Automation API development kit, which allows command-line interface calls directly to Control-M. You can also provision components, such as Agents or control modules, automatically. Developers are now tying their end requirements directly into the code they write. This is very powerful.


In the real world.

 

Let's hammer out an example. Every time a developer wants to ensure that their code gets automatically tested, they need to have that code made part of the current build. When the code is committed to their repository, an automated build and testing application, such as Jenkins, might kick in to compile and run the application. What environment does it run in? What resources does it require? Another shift BMC has been seeing with our customers is toward micro-services: encapsulated services that run on tiny, compartmentalized sub-sections of host OSs, such as Docker containers. What if, instead of manually promoting and testing your Control-M job definitions, they were part of the promoted code, and that promoted code made its way into your development environment? What if your commit to your code repository triggered a build, which triggered a call to the Control-M Automation API, which in turn put the necessary job definitions into your Control-M Enterprise Manager and provisioned either a virtual machine or a Docker container with the components you need to trigger your code? This is all a reality.


Stay agile.

 

This is the real goal here. Our customers need to focus on delivering faster and with higher frequency, which may seem hectic but, with the right tools, can be made reality. Staying one step ahead of the competition is what this is all about, and having a continuous development life-cycle can make this happen. BMC is striving to ensure that developers who need to make use of an enterprise-grade automation solution as part of their application deployments have the tools at their disposal, because, after all, we are developers too.

 

What’s on the horizon?

 

Big things. While our development team is hard at work delivering the next cutting-edge tools for Control-M that will continue to foster IT innovation, our customers are already adopting the Control-M Automation API into their environments. A soon-to-be-available tool, the Control-M Automation API Workbench, is a totally self-contained virtual machine containing the entire tool-set of the Automation API, freely available to any company that would like to try it out and see how it would work in their environments.

 

With these tools at your disposal, your development teams can start making use of BMC Control-M in ways that were not previously possible, leading to what we hope will be a true transformation toward Digital Business Automation that is tangible, delivers value, and streamlines DevOps processes, setting you up for future success.


Ever wanted to be a part of something truly unique? Now’s your chance! Introducing the
Control-M Workbench Beta program!

 

What is Control-M Workbench?

Control-M Workbench is a complete, standalone development environment that allows developers to code, debug, and test workflows without requiring any additional services. Control-M Workbench provides developers with a Control-M sandbox, delivered as a virtual appliance that runs on macOS, Windows, and Linux.

Shorten the delivery cycle with Control-M Workbench and connect development and operations teams to help avoid delays. The environment will be familiar to developers, with JSON, REST APIs, and a Node.js CLI for creating workflows. Control-M Workbench can be used to create all types of jobs, including big data, database, and other enterprise workloads.

 

When will it GA?

Control-M Workbench is planned for general availability around the middle of 2017. The Workbench will be publicly available, and you will be able to download the appliance and run jobs in minutes.

 

How do I sign up?

Be among the first to join the beta program and help evolve Control-M Workbench! Get a free DevOps t-shirt when you participate!

Sign up here: https://www.surveymonkey.com/r/Master_Onprem_ControlM




Did you know that Control-M allows you to develop your own custom client applications?

Would you like to speed up your development lifecycle by enabling developers to test code in a Control-M environment using continuous integration techniques? The Automation API allows you to submit requests from your own applications to Control-M and receive the responses, for example to perform job actions or change an Alert's status.

 

To learn more about Control-M Automation API and how to use it, join us for a Connect with Control-M live webinar on Wednesday, April 26th, where Ruben Villa will demonstrate:

 

•     Installation & Configuration API

•     Job definitions as code

•     Managing active environment through custom DevOps tools

 

Don’t miss a live demo of these capabilities. There will be a Q&A after the demo. Register Now!


Earlier this month I joined the BMC team in Dubai at the Gartner Symposium/ITxpo 2017. March was probably the best time to hold this conference, as the weather is just about perfect and you get to enjoy Dubai and all it has to offer to the fullest. The BMC team, along with our partner in the region, MIS, sponsored this event, and we had some great conversations with other technology companies, our existing customers, and prospects.

 

The attendees were a mix of IT executives, CxOs, and IT practitioners from every industry vertical: financial services, oil and gas, airlines, retail, consumer goods, and many others. The keynote opened with Gartner’s CEO Gene Hall, who shared Gartner’s vision of how digitalization is impacting every aspect of business today and how it will completely re-shape society as a whole in the future. Gartner sees the need for a new type of infrastructure to meet the needs of the digital world, which they are calling the "civilization infrastructure". They outline five important components that the civilization infrastructure will need to take into account: existing core IT systems, the Internet of Things, customer experience, ecosystems that enable digital engagement with suppliers, partners, etc., and, in the middle of all this, intelligence, i.e., insights driven by data. I was particularly interested in what Gartner said about existing core IT systems:

 

“This is how CIOs run and scale operations. It’s building on what’s already been built. It’s taking high performing traditional IT systems (such as the data centers and networks) and modernizing them to be part of the digital platform.”

 

So the gist of the message is that progress means building upon, not leaving behind, and this is certainly true for Workload Automation. I have been part of the Workload Automation discipline for over a decade, and this discipline has gone by other names in the past, such as job scheduling and batch processing. As we embrace the digital world, infrastructure is becoming more software-defined, data is becoming more unstructured and volatile, and application delivery much more agile. Workload Automation sits today at the intersection of infrastructure, data, and applications. As companies look to build the civilization infrastructure, they will need to become more adaptive with their automation strategy. Digital experiences will be powered by complex workflows that span mobile applications and hybrid cloud all the way back to legacy systems and mainframes in the data center. As businesses and society in general adapt to the digital nature of life, Workload Automation as a discipline won’t stand still either; it will have to become more adaptive in the new digital wave. This was also the theme of my talk at the Gartner event, titled “How to automate across infrastructure, disparate data and applications in a digital economy.”

After my session I had the chance to mingle with many of the attendees. Oil and gas is a significant part of the region’s GDP, and given the drop in oil prices during the last couple of years, every industry vertical in the region is now dealing with a budget crunch. Almost every IT leader I interacted with had one thing clear in their mind: while budgets are going to be tight for the foreseeable future, the mandate for innovation is still the top priority. This is now causing many to take a hard look at the approach they have taken to automating complex application workflows, which in many cases has been a hodgepodge of point products and solutions, often leading to lackluster service delivery.

 

The most intriguing conversation I had was with the CIO of a luxury watch retailer. The company operates close to 60 stores across the UAE and has recently expanded into real estate as well. When I asked what digital means for a luxury watch retailer, they said they are anticipating an Uber or Netflix moment in the luxury watch industry. Customers are looking for flexibility and choices, and they foresee that a digital platform like Uber or Netflix could emerge in the world of luxury watches: customers would have a subscription service allowing them to select a watch at any time, based on the type of event they are going to and their current location, and then simply use an app to exchange it later for another watch for another event. Being able to deliver a digital service like this will mean designing a workflow that starts from a mobile app, probably includes a hop or two in the cloud, and then, at the very least, travels through inventory management, order management, accounting, and shipping systems. Of course, this will need to be an automated process, and one that can scale at the speed of business. This complex workflow will need to be something that can be built, run, and managed from a single point of control; otherwise there will be no visibility into how the process runs end-to-end. It is difficult to control what you cannot see, so end-to-end visibility is key to operationalizing new digital services.

 

While there are a lot of unknowns about the digital world we are looking to embrace, one thing is absolutely clear to IT leaders: modernizing existing IT systems while building the future digital platform is going to be a top priority. To wrap up, I have included some interesting things I learned at the event.

 

Close to half of the population in some leading Middle Eastern countries are shopping online, and the region has some of the world’s highest levels of GDP per capita.

 

Honda’s UNI-CUB is the hottest thing on wheels:

Honda is developing a motorized self-balancing vehicle that functions a lot like a unicycle. The UNI-CUB is 24.4 inches tall, weighs about 55 pounds, and travels at blazing speeds of up to 4 miles per hour. To operate it, you simply sit on it and lean in the direction you want to go.

 

Who is behind the wheel in Michigan?

No one! Companies can now test self-driving cars on Michigan public roads without a driver behind the wheel. The package of bills, signed into law in December 2016, comes with few specific state regulations and leaves many decisions up to automakers and companies like Google and Uber.

 

Data Data Everywhere

By 2020, there will be 26 billion devices permanently connected, 63 million connections per second, and 215 trillion connections in total. This will create a huge amount of data. Google has 1 exabyte of data, but IoT already has orders of magnitude more, and by 2020 IoT will create 27,445 exabytes of data.


 

The world is exploding with data. Just google the words “data explosion” and take a brief look at the number of results you get. I did, and it is probably no surprise that the first result was a Forbes article published last year under the title “Why Most Companies Can’t Deal with the Data Explosion”.

 

Earlier this month I attended the first large Big Data event of 2017 – Strata + Hadoop World in San Jose, California. This is probably my 5th or 6th Strata Hadoop event in the past few years, and it is interesting to see how fast the Big Data industry is growing – some would say as fast as the data itself; others may say not fast enough.

 

The number of open source technologies in the Big Data space continues to grow rapidly. While it’s great to see the rapid pace of innovation, it also presents a challenge for companies: how to make the best use of these technologies and absorb them into their own ecosystems. Also, as data volumes grow exponentially, Cloud is fast becoming a key component in Big Data implementations and a facilitator of large Big Data projects, with its flexibility and capacity to support more storage and compute power.

 

It was an analyst who said, a couple of years ago at one of these events, the words I have been quoting (never enough, apparently) ever since: “Hadoop is not an island”. Now, more than ever, there seems to be an across-the-board understanding that if you want your Big Data project to be successful, you need solutions in place that ensure you are enterprise-ready and can scale.

 

Let’s talk about scalability. Some variation of the words scale / scalable / scalability came up in most, if not all, of the sessions I attended, not to mention the various conversations I had in the exhibition hall. While many companies have seen success with a Big Data pilot, being able to scale and support growing business demands is still one of the major challenges facing organizations implementing Big Data, along with the need to deliver applications fast.

Clearly, time to value remains a hot topic on everyone’s agenda – the need to deliver value to the business, and to do it rapidly. This need is driving organizations to look into automation solutions or, more specifically, scheduling solutions. I remember attending my first Big Data event many years ago and getting puzzled looks when I asked people what they were using to schedule their Hadoop jobs. Well, not anymore.

 

Enterprise-grade workload automation solutions allow customers not only to schedule their Big Data processes but also provide the much-needed connectivity between the various platforms, applications, and technologies they use to support their business initiatives. They provide the “management layer” so Big Data developers can focus on how to obtain maximum value from the data. As Darren Chinen, Senior Director of Data Science and Engineering at Malwarebytes, said in his interview with theCUBE, “We had to evaluate where we wanted to spend our time” (Malwarebytes Cube Interview).

 

One last thought as I wandered the exhibition hall, speaking to various companies and vendors: Big Data is no longer just a “cool” initiative. It has an important role to play in the digital world and is relevant to each and every industry. Be there or be …


Control-M Brag Book #3

Posted by Kelsey Adams, Mar 22, 2017

BMC digital IT powers 82% of the Fortune 500. Our innovative software solutions enable businesses to transform into digital enterprises for the ultimate competitive advantage. Control-M is transforming digital enterprises across the globe, and customers rely on it every day to keep the critical systems driving their business running smoothly. Whether in financial services, transportation, or healthcare, Control-M is changing the lives and businesses of those who use it, and it is behind the scenes of happy customers all over the world.

 

First Citizens Bank (FCB), a large US wealth management firm with over $30 billion in assets, runs its digital banking with Control-M. FCB, family owned since 1892, specializes in both personal and business online banking. FCB has very tight windows for ACH processing, and if ACH isn’t delivered, there are strict penalties and fines. Before Control-M, they had no way to track the critical path. With Control-M, they are not only able to do that, but can also schedule across their entire enterprise. FCB automates across both distributed and mainframe environments from a single point of control. “All of our digital banking is run through Control-M.” – Phil Raihle, IT Team Manager, Production Control. The integration with PeopleSoft is also a big plus for FCB. It saves their developers a lot of time because they no longer have to create custom scripts to run jobs; they simply use the templates Control-M provides for these tasks. “Long term it is going to save FCB a lot of time.”

 

Navistar engineers gain 20% of their work time back thanks to the power of Control-M. Navistar, a leading manufacturer of commercial vehicles, uses big data to create new value-added services that help truck drivers improve vehicle uptime. OnCommand, Navistar’s remote diagnostics system, creates more than 20 million data records per day. Before Navistar implemented Control-M, engineers spent a lot of time moving data and manually running scripts. Now this is done automatically and immediately with Control-M, saving engineers 20% of their work time. With Control-M for Hadoop, the staff can manage job streams for big data projects using the same solution that supports their other critical business processes. “…We’ve just begun to tap the power of Control-M to help us use big data to enhance remote diagnostics, improve vehicle quality, and protect critical resources from unauthorized access, among other initiatives.” – Todd Klessner, Senior Data Operations Specialist, Batch Operations.

 

Florida Blue reduces batch failure rates by 50% using Control-M. Blue Cross Blue Shield of Florida is committed to helping Florida citizens achieve better health by providing insurance to both businesses and individuals. Maximizing efficiency and minimizing cost are among their top priorities, and these priorities require an advanced IT infrastructure. They run dozens of critical systems on the mainframe and across UNIX, Linux, and Intel platforms. The cross-application and cross-platform capabilities of Control-M help Florida Blue keep 50,000 jobs per day running smoothly. Any interruption in service could prevent employees from delivering high-quality service to their members. Control-M Batch Impact Manager is a major component in keeping their failure rates low, proactively determining the impact of potential delays and identifying the critical path. It also allows IT staff to fix problems before SLAs are missed, and automated forecasting saves up to 2 hours of IT staff time every day. “Control-M automated scheduling software is in our DNA and it is rock solid.” – Rick Zarlenga, IT Production Support Manager.

 

What will Control-M do for you?


Are you familiar with the Named Pool variables introduced in version 9? They give you an easy way to share information between jobs by extracting data from one job’s output and passing it to other jobs. Using them can dramatically increase flexibility when creating jobs, with new options for your job flow.

 

To learn more about Named Pool variables and how to use them, join us for a Connect with Control-M live webinar on Wednesday, March 29th  where Martin De Castongrene will demonstrate:

 

•     Different Variable Types

•     Named Pool vs SMART Folder variables

•     Sharing information between jobs

 

Don’t miss a live demo of these capabilities. There will be a Q&A after the demo. Register Now!! 


Ditch Oozie?

Posted by Kelsey Adams, Mar 7, 2017

Since Big Data is a relatively new and fast-changing field, tools can quickly become dated and ultimately be replaced by the next latest-and-greatest tool. Often, Big Data teams are too busy with their projects to keep up.

 

According to an InfoWorld blog, “7 big data tools to ditch in 2017”, Oozie lands one of the top spots as a technology to leave behind in 2016. The blog notes: “I’ve long hated on Oozie. It isn’t much of a workflow engine or much of a scheduler – yet it’s both and neither at the same time! It is, however, a collection of bugs for a piece of software that shouldn’t be that hard to write.”

 

A recent third-party test found that Hadoop workflows can be developed 40% faster using Control-M for Hadoop instead of Oozie (and other open source tools). Not only is Control-M faster than Oozie, it also covers and automates many more of the tasks required to develop, test, promote, deploy, schedule, manage, and secure workflows.

 

Our customers told us what their Big Data development challenges are, and we listened. If you want to get your Big Data work done more quickly and easily, consider adding Control-M to your toolbox. Read the blog “A Tool for the Times as Oozie is Past Its Prime” by Basil Faruqui to find out how you can develop your Hadoop workflows 40% faster.

 

 
