by Steve Carl, Senior Technologist, R&D Support

The Cloud Computing Metaphor

It sometimes seems to me that the cleverest thing Gartner ever did was to create the concept of the hype curve, or Hype Cycle. I like the term Hype Curve better, because a hype cycle sounds like a song by Queen. [I want to ride my hype cycle, I want to ride my hype cycle!... great. I'm going to have that playing in my head the rest of the day.] For Cloud Computing you can argue the time frames for speed of adoption, because not everything moves at the same pace, but Gartner firmly places "Cloud Computing" in the initial "Technology Trigger" section, climbing the curve for all it is worth.

 

Yes: that is right. It has not even reached the tip-top height of the Hype yet. You are going to hear about it even more! And that is not even counting this post (though I prefer not to believe that this post will be hype).

Convergence

In my last post, I talked about how we have been here before, and that post was largely meant to step back a bit from the day-to-day execution of technology and look at the concepts behind it. When looked at conceptually, the technical side of the Cloud is not all that different from the old glass house.

 

The Cloud also reminds me of the same kind of concept that "Web 2.0" was. In and of itself it is not a discrete technology, but rather a critical mass of various other technologies assembled and used in a particular way. Even that is not 100% accurate though, because cloud computing is not just one thing. Maybe "set of ways"?

I rather like the idea, voiced in the comments of whurley's InfoWorld Cloud Computing Column, that the idea for Cloud Computing came from someone seeing a network diagram (perhaps for the first time) and suddenly having a vision of what the concept of the network was.

 

At its very core, Cloud Computing is a very old concept, one captured in Sun's saying that "The Network Is The Computer." For Cloud Computing this is more true than ever.

Another Day, Another Cycle

If you think about a computing cloud, and what it actually is under the covers, there is a data center someplace (or a set of places) with computers in it, running transactions and servicing requests from the network. Where the data center is located is masked by the network: as long as it is close enough to you that there are no significant speed-of-light delays, you have no idea exactly where it is, or what it is made from. Your transactions could be running on Linux or BSD or some other OS, on real hardware or as a virtual machine, and that Linux-or-other-OS could be on x86 or a mainframe. As an end user you do not know or care. You just want it to work when you access it.

 

What connects one to this data center is the network, and under that is the protocol of the Internet, TCP/IP. Where did that come from? Work done years and years ago for the Defense Department (DARPA), with the goal that the network should be resilient: it should survive a war taking out parts of the physical wires by routing dynamically around the damage, with protocols that knew the links were inherently unreliable and so dealt with re-queuing and retransmission.
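
TCP itself handles that retransmission down at the transport layer, but the same defensive instinct shows up one layer higher too. Here is a minimal sketch in Python of the application-level version of the idea; the URL is a hypothetical stand-in for any service reached over TCP/IP, and the retry policy is illustrative, not prescriptive:

```python
import time
import urllib.request

# Hypothetical endpoint; a stand-in for any service reached over TCP/IP.
STATUS_URL = "http://example.com/status"

def fetch_with_retries(url, attempts=5, base_delay=1.0):
    """Retry a request with exponential backoff: the old protocol-design
    assumption that links are unreliable, applied at the application level."""
    for attempt in range(1, attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                return response.read()
        except OSError as err:  # URLError and socket errors are OSError subclasses
            if attempt == attempts:
                raise RuntimeError(f"gave up on {url} after {attempts} attempts") from err
            delay = base_delay * 2 ** (attempt - 1)
            print(f"attempt {attempt} failed ({err}); retrying in {delay:.0f}s")
            time.sleep(delay)

if __name__ == "__main__":
    print(fetch_with_retries(STATUS_URL)[:200])
```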

 

Perfect for a Cloud.

 

Many point to the commodity, rent-a-cycle nature as a differentiator between the Cloud and other concepts, but wasn't that what "Utility Computing" was about, at least in part?

 

I can not help but think of the very first time I wrote a computer program: on a Telex machine, connected to a modem, accessing a shared IBM S/360 installed someplace where that modem went. I know the mainframe was shared, because one of the other people that used that computer had access to an account from a different school, and could see and print out their programs and data.

 

Our school rented time on that computer, and so did other schools. The mysterious computer on the other end of the network was chopped up into bits and rented by the TIP and the kilobyte. At the end of every school year, the student accounts were "re-provisioned", so that the next wave of students started with a clean slate. Yes, it was incredibly dated by today's standards, but it was conceptually not that different. A question of degree, not of concept.

Pendulum

We have been here before. The pendulum swings back and forth between centralized and decentralized computing, but here is where I think one of the major differences of the Cloud might be: It is potentially both centralized and decentralized.

 

One reason that people wanted to run screaming from the glass house was issues around span of control. The folks in the data center forgot who the customer was, and started acting like they were the reason the computer was there. This was reinforced by things like the BOFH series of comedy bits, wherein a computer operator torments the end users for his personal pleasure. I will freely admit that the problem is not exclusively IT people forgetting who their customer is: over my many years I have also been on the receiving end of customers who think IT's primary function is fall guy and punching bag. Let us just say that it is a troubled relationship that requires constant work from everyone.

 

Cloud computing gives people the option to not tolerate being treated poorly by their internal IT staff. Say "No" too many times to the end users' requests, and if they have a corporate card, they'll just do an end run around you by buying the computing resources they need elsewhere. This is not that different from when folks started buying PCs and departmental servers in order to be in control of their own destiny, and it points to the same possible set of problems. If people are sticking things in the cloud because they can, then at some point no one knows what is where or who owns it.

 

My first thought about data in the cloud is not that the data is insecure, although that is certainly possible, but that data put in the cloud is unique, and that all it would take is the corporate card funding it being canceled for that unique and possibly critical data to disappear back into the disk cloud from whence it came, to be provisioned over the top of. Yesterday's critical data location is re-provisioned, and is now occupied by today's shopping list.

 

Don't forget: Milk. Eggs. Backup data... Oops.

 

A truism in decentralized computing is that yesterday's experiment or escape from central support is today's crisis, because something critical got lost someplace. Looked at another way: just because it is in the cloud someplace does not mean that BSM and ITIL principles do not apply. If anything, it is the reverse.

Saving Money in the Cloud

It is very easy to see how a centralized large data center can be more cost efficient than anything a small group of people or a small business could build themselves. With scale comes leverage for discounts from the entire supply chain. By being able to choose the data center's location, it can be sited near inexpensive power. This is probably a bigger cost factor than most people think about, but it is super-critical to keeping costs down. A data center next to a hydroelectric dam is both cheaper and greener than one near a coal-fired power plant.

 

You can also make sure that there are qualified people nearby to support the data center. By using large commodity servers or even the mainframe, plus virtualization, provisioning, and so forth, computers can be run at 80%-plus utilization rather than the more common 2% of the end-user PC. That is not just good for the company providing the Cloud; it is less expensive for the end user, and less impact on the planet as a whole. Triple win.
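
The back-of-the-envelope math on those two utilization figures (the 80% and 2% come from the paragraph above; everything else below is just arithmetic) shows why consolidation pays:

```python
# Back-of-the-envelope consolidation math, using the utilization figures
# quoted above: a server run at 80% vs. an end-user PC idling at 2%.

server_utilization = 0.80
pc_utilization = 0.02

# Ratio of useful work extracted per unit of hardware, power, and cooling.
consolidation_ratio = server_utilization / pc_utilization
print(f"one busy server does roughly the work of {consolidation_ratio:.0f} idle PCs")
```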

 

None of this is conceptually new, or unique to cloud computing. It is just an old paradigm adapted to the current generation of computer gear.

 

IBM has had, for years, the capability to provide variable capacity on demand in many of their computers. Stick a mainframe on the data center floor, and enable (for example) 10 processors for the daily workload. Now quarter close hits, and suddenly you need 20 CPUs. No problem. They are already physically in the mainframe, and can be enabled on the fly, for a price. At the end of the quarter, disable them and you are back to 10 CPUs until you need them again.
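
As a toy model of that capacity-on-demand arrangement (the class, the CPU counts, and the per-CPU price below are all made up for illustration; real offerings price and meter this quite differently):

```python
# Toy model of capacity on demand: the CPUs are physically present,
# but only a licensed subset is enabled at any given time.

class CapacityOnDemand:
    def __init__(self, installed_cpus, enabled_cpus, price_per_cpu_day=100.0):
        self.installed = installed_cpus          # what is physically in the box
        self.enabled = enabled_cpus              # what you are paying to run
        self.price_per_cpu_day = price_per_cpu_day  # hypothetical burst pricing

    def enable(self, extra):
        """Turn on extra CPUs for a burst; returns the added daily cost."""
        if self.enabled + extra > self.installed:
            raise ValueError("cannot enable more CPUs than are installed")
        self.enabled += extra
        return extra * self.price_per_cpu_day

    def disable(self, extra):
        """Quarter over: drop back to the baseline."""
        self.enabled = max(0, self.enabled - extra)

box = CapacityOnDemand(installed_cpus=20, enabled_cpus=10)
burst_cost = box.enable(10)   # quarter close: light up the spare 10 CPUs
print(f"quarter-close burst: {box.enabled} CPUs, ${burst_cost:.2f}/day extra")
box.disable(10)
print(f"steady state: {box.enabled} CPUs")
```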

 

The Cloud can do this as well. As long as every single customer of a particular Cloud provider does not go computer-crazy all at once, a well-run Cloud provider should have reactive capacity... Here is the usual balancing act, scaled up: try to run at 80% capacity to make efficient use of resources, but keep enough in reserve to deal with the workload bubbles. How big that "enough" is is based on careful measurement, modeling, historical data, etc. Golly: sounds like the mainframe and Best/1 (OK: called BPA now...) all over again.
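
A crude sketch of that sizing exercise, with made-up demand numbers (real capacity planning tools model this far more carefully, with trends, seasonality, and confidence intervals):

```python
import math

# Hypothetical hourly demand samples, in arbitrary units of work.
hourly_demand = [620, 640, 700, 810, 760, 690, 900, 1150, 980, 720]

target_utilization = 0.80   # run hot enough to be efficient...
peak = max(hourly_demand)

# ...but keep enough capacity that the worst observed bubble
# still fits under the utilization target.
required_capacity = math.ceil(peak / target_utilization)
reserve = required_capacity - peak

print(f"observed peak: {peak}, provision: {required_capacity}, reserve: {reserve}")
```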

Not Saving Money in the Cloud

Fair warning though: most of the figures I have seen about saving money in the Cloud hover around 30%. That may seem like a lot, but it would be easy to lose that savings quickly without Cloud purchasing discipline.

 

Think for a moment about the many, many spam ads for free things like laptops. I looked into that a bit last night, and it turns out to be a matter of terminology: even when there was a real laptop or some other nifty thing to be had, getting it required accepting offers that were not free. There were registrations, shipping, and handling; you gave away personal information; and you spent a great deal of your own time sorting through all of it, trying to get to the not-so-free laptop for the least amount of money.

 

Now consider the distributed-systems swing of the pendulum: it was always less expensive than the mainframe. Except when it was more expensive. It had lower up-front costs, but it also had hidden costs all over the place. Support contracts, and the time spent tracking hundreds of computers rather than just a few. Virus outbreaks, and the time spent dealing with those. And lost data that was never on any backup: what exactly did that cost?

 

The Cloud has the same issues if it is not approached as just another computing resource. If all one can see is zero up-front costs, then how long until everyone with a corporate card is charging up some tasty Cloud? How long until that 30% is long gone because the right-hand department and the left-hand department are doing the same thing in two different clouds from two different vendors? Run the numbers: if (say) the in-house job costs $100K and the Cloud does it for $70K, two departments each buying that same $70K of capacity have turned a 30% savings into a 40% overspend.

 

The Cloud will save you money right up until it doesn't, and we have been there before.

Rapid Provisioning

It is pretty easy to see how one key to success in the Cloud would be the ability to rapidly provision operating systems and applications. If you look over the offerings from many of the Cloud vendors, that is exactly what they have. Rapid provisioning is a success factor for a Cloud offering, but it is not a defining feature of the Cloud.

 

Rapid provisioning is one of the holy grails of the in-house data center as well, and, full disclosure here, something we offer via our Server Configuration Automation (née BladeLogic) tool. If rapid provisioning is the key factor one is after with the Cloud, then it is possible, with provisioning tools, to create in-house computing clouds.
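
Conceptually it is little more than stamping out machines from golden images on request. A minimal sketch, with a hypothetical provision() helper and made-up image names (a real tool such as BladeLogic does the actual heavy lifting behind a call like this):

```python
import uuid

# Hypothetical golden images: hardened OS plus application stack.
GOLDEN_IMAGES = {
    "web": "golden-web-v42",
    "db":  "golden-db-v17",
}

def provision(role, count):
    """Stamp out `count` machines from the golden image for `role`."""
    machines = []
    for _ in range(count):
        name = f"{role}-{uuid.uuid4().hex[:8]}"
        image = GOLDEN_IMAGES[role]
        # In a real system this would call out to a provisioning engine;
        # here we just record what would be built.
        machines.append({"name": name, "image": image})
        print(f"provisioning {name} from {image}")
    return machines

# Quarter close: add web capacity in minutes, not weeks.
fleet = provision("web", 3)
```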

 

Taking the example from the previous section, with rapid provisioning there is a solution to the 80%/20% problem. Assuming that there are computers set up to create a progression to production, then there are probably Alpha, Beta, pre-GA, and maybe other levels or flavors of test / QA / customer support resources either in or near the data center. Don't want to take a chance on your critical data leaving your premises and being manipulated in the Cloud? Create your own cloud inside your own enterprise. Take the idea of Cloud Computing, and implement it internally. In fact, take the whole attitude of the Cloud: that the customer is the person paying, and that the computing resource is a tool: a means to an end, not the end itself. <Craig Ferguson> I knoowwww! </Craig Ferguson>... weird for a computer geek to think about it that way sometimes.

To API or not to API

It does not really take any special APIs to create a computing cloud. TCP/IP and all the things that run on it are enough: see Google Docs, or any other AJAX-based application, for details. When I wrote this entry, I mostly used Google Docs to create the HTML, and for me, it was off in the Cloud. I don't know where the actual HTML is being stored, or which of Google's many servers have it. For all I know, I am being shuffled back and forth between multiple computers, statelessly on my end and statefully on their end. This is not to say that one can not create APIs to enable the Cloud, like Microsoft's Azure. When Cloud-enabling legacy applications, APIs can be downright handy, it seems. Be that as it may, having or not having an API is not a defining feature of the Cloud, and in fact the APIs run over... TCP/IP.
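
To make the point concrete, here is a sketch of talking to a service "in the cloud" with nothing but a TCP socket and plain HTTP; example.com is a stand-in host, and any real Cloud service sits behind the same plumbing:

```python
import socket

# Open a plain TCP connection: no SDK, no special Cloud API.
with socket.create_connection(("example.com", 80), timeout=5) as sock:
    # Speak HTTP by hand over the socket.
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:
            break
        chunks.append(data)

reply = b"".join(chunks)
# Somewhere out in the cloud, some server answered; which one, we neither
# know nor care. Print the status line, e.g. "HTTP/1.1 200 OK".
print(reply.split(b"\r\n", 1)[0].decode())
```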

Cloud Defined

It appears to me that the definition of Cloud Computing is a very fuzzy, nebulous, cloud-like thing. At its simplest, all that is required is two network-enabled devices and a network between them. The network is the cloud, just like in the classic diagrams. The network still, after all these years, is the computer. Which is probably why Sun is big into Cloud Computing.

 

 

The postings in this blog are my own and don't necessarily represent BMC's opinion or position.