
Who needs information

When you're living in constant fear

Just give me confirmation

There's some way out of here

 

According to the Radio K.A.O.S. page on Wikipedia, the event that inspired Roger Waters to write the lyrics of “Who Needs Information” was the 1984–85 miners' strike in Britain, during which a striking worker threw a concrete block off a motorway bridge, killing a taxi driver who was taking a working miner to his job. It was an example of how far people will go to pursue their monetary goals.

 

You can blame me for using Roger Waters, Pink Floyd, Tom Petty, Queen, or even The Matrix movie for monetary reasons as well, being an employee of a commercial software vendor. But if you are reading this blog, there is a good chance that you, too, work for a commercial company with stakeholders, owners, or someone else who cares about revenue and profitability. You are expected to help your organization succeed and generate revenue.

 

Being successful means having an advantage over the competition, and the way to get that advantage these days is information. The more you know about your market, your competitors, and, most importantly, your customers, the greater your chances of success.

But there’s a catch: the more information you have, the greater the challenge of analyzing it. Traditional technologies such as relational databases and data warehouses can no longer process the volume of data generated by social channels, gathered from online websites, or produced by machines, at least not within timeframes that let you use the processed results to drive relevant business decisions.

 

Technologies such as Hadoop and in-memory databases are becoming more and more popular these days. Big Data is no longer just a buzzword. If you search for use cases on the websites of the major Hadoop distributors (for example Hortonworks, Cloudera, and MapR), you will find plenty of stories describing how companies take advantage of Big Data technologies, and Hadoop specifically, to become or remain competitive.

 

But Hadoop is not an island, and it will not replace the traditional database platforms that all your business applications connect to. Commonly, large amounts of data are processed by Hadoop and the results are then sent to a legacy data warehouse, or back to a mainframe when Hadoop is used to reduce mainframe licensing costs, for example. ERP systems such as SAP, Oracle E-Business Suite, or PeopleSoft might be involved in the process as well, along with file transfers, direct database access, ETL or data integration activities, and eventually the business intelligence or analytics tools that expose the information to business users.

 

So how do you make sure that all these systems are in sync?

How do you monitor the process from beginning to end and ensure that you are meeting your deadlines?

How do you manage changes across all these systems, making sure everything is audited and compliant with your company standards and policies?

How do you provide business users self-service access to their parts of the workflow?

 

You need a solution that allows you to manage everything from a single point of control; to configure automatic recovery from failures, so that manual corrective actions are kept to a minimum (including recovery from the point of failure, to reduce downtime when a problem occurs); and to define proactive SLA/deadline notifications, so that potential delays or failures in critical services are identified as early as possible, while you still have time to fix the problem before you miss the deadline.
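To make those last two capabilities concrete, here is a toy Python sketch. It is purely illustrative and not any vendor's API: every name in it is hypothetical. It shows what recovery from the point of failure (finished steps are never re-run) and proactive deadline alerts (raised while there is still time to react) mean in practice:

```python
import time

# Toy sketch only -- not any vendor's API. It illustrates two ideas from the
# text: recovery from the point of failure and proactive SLA/deadline alerts.

def run_workflow(steps, deadline, clock=time.time):
    """steps: list of (name, action, estimated_seconds) tuples."""
    completed, alerts = [], []
    for i, (name, action, estimate) in enumerate(steps):
        # Proactive check: if the remaining estimates already overshoot the
        # deadline, alert now, before the delay becomes a missed SLA.
        remaining = sum(est for _, _, est in steps[i:])
        if clock() + remaining > deadline:
            alerts.append(f"SLA risk: '{name}' may finish after the deadline")
        for _ in range(3):          # automatic retry instead of a manual fix
            try:
                action()
                completed.append(name)
                break
            except Exception:
                pass                # a real tool would log and back off here
        else:
            # Give up on this step; on rerun, resume from it, not from scratch.
            return completed, alerts, name
    return completed, alerts, None
```

On a rerun you would submit only the steps from the returned failure point onward; a real workload automation tool keeps that checkpoint, the retry policy, and the SLA definitions for you, declaratively.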

 

And how do you get all these capabilities and more? You select the leading workload automation solution in the market, the only one that can really handle all the applications and platforms you need. It even has an online open community that allows users to collaborate and share the custom integrations they have created. Specifically for Hadoop, it offers a native interface that can eliminate scripting, homegrown integration workarounds, and the use of limited Hadoop-only schedulers such as Oozie.

 

If you want to learn more about Hadoop workflow automation, or about a centralized approach to automating any application or platform, simply click the links above or below...

 

Now is also the time to register for BMC Engage, the largest BMC conference of the year, which will have a dedicated track for workload automation, while you can still get the early bird pricing.

Make sure to attend Joe Goldberg's session on Hadoop and the "Elephant in Your Computer Room" (Session #386).

 

If you are planning to attend Strata+Hadoop World in New York City this year, make sure to stop by the BMC booth and see a live demo from one of our experts.

 

There is plenty of information for you online (who needs it, anyway?), but if you have any questions or need more information, post them here as a comment.