What is Cloud Computing?
Before I get into my own cloud-related thoughts in subsequent posts, let’s make sure we’re all on the same page about what cloud computing actually refers to! Here’s a quick primer from Andrew McAfee, who is a principal research scientist at MIT’s Center for Digital Business and the coauthor, with Erik Brynjolfsson, of the e-book Race Against the Machine (Harvard Business Review Press, 2011).
The cloud computing industry is growing and evolving rapidly–and also generating lots of jargon. As a result, it can be difficult to understand exactly what the cloud is and how its offerings differ. To oversimplify just a bit, those offerings can be divided into three categories: raw computing capacity, computers that are ready for software, and software itself.
The first of these, called INFRASTRUCTURE-AS-A-SERVICE (IaaS), is the most basic; it’s a server or servers out there in the cloud, or a bunch of storage capacity or bandwidth. IaaS customers, which often are tech companies, typically have a lot of IT expertise; they want access to computing power but don’t want to be responsible for installing or maintaining it.
The second tier is called PLATFORM-AS-A-SERVICE (PaaS). This is a cloud-based platform that companies can use to develop custom applications or write software that integrates with existing applications. PaaS environments come equipped with software development technologies like Java, .NET, Python, and Ruby on Rails and allow customers to start writing code quickly. Once the code is ready, the vendor hosts it and makes it widely available. PaaS is currently the smallest segment of the cloud computing market and is often used by established companies looking to outsource a piece of their infrastructure.
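To make the PaaS idea concrete: the customer writes only the application, and the platform supplies and operates the server that runs it. Here is a minimal sketch (my illustration, not from the article) of the kind of code a PaaS customer might deploy, written against Python’s standard WSGI interface, the contract that platforms such as Google App Engine historically used to host Python apps:

```python
# Minimal WSGI application -- roughly the unit of code a PaaS hosts.
# The vendor's infrastructure imports `application`, runs the web
# server, scales it, and routes traffic; the developer writes only
# the handler below.

def application(environ, start_response):
    """Respond to every request with a plain-text greeting."""
    body = b"Hello from the platform\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

Everything below this function (the HTTP server, the operating system, capacity planning) is the platform’s problem, which is precisely the PaaS value proposition described above.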
SOFTWARE-AS-A-SERVICE (SaaS), the third category, is the largest and most mature part of the cloud. It’s an application, or suite of applications, that resides in the cloud instead of on a user’s hard drive or in a data center. One of the earliest SaaS successes was Salesforce.com’s customer relationship management software, which provided an alternative to on-premises CRM systems when it was launched in 2000. More recently, productivity and collaboration software—spreadsheets, word processing programs, and so on—has moved into the cloud with Google Apps, Microsoft Office 365, and other similar offerings.
Offerings across these three categories share a few similarities. First, customers rent them instead of buying them, shifting IT from a capital expense to an operating expense. Second, vendors are responsible for everything “beneath the hood”—all the maintenance, administration, capacity planning, troubleshooting and backups. And finally, it’s usually fast and easy to get more from the cloud—more storage from an IaaS vendor, the ability to handle more PaaS projects, or more seats for users of a SaaS application. (Source for all of the above: Harvard Business Review, pgs 128-129, Nov. 2011).
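The capital-versus-operating-expense point lends itself to a quick back-of-the-envelope calculation. The numbers below are illustrative assumptions I’ve invented, not figures from the article or any vendor’s price list; the point is the shape of the arithmetic, not the specific dollars:

```python
# Back-of-the-envelope comparison of owning vs renting a server.
# All prices are illustrative assumptions, not vendor quotes.

PURCHASE_PRICE = 6000.0   # up-front cost of a physical server (capex)
USEFUL_LIFE_MONTHS = 36   # amortization period
ADMIN_PER_MONTH = 150.0   # power, space, maintenance when you own it

RENTAL_PER_MONTH = 300.0  # all-inclusive monthly rate from an IaaS vendor

def monthly_cost_owned():
    """Amortized capital expense plus ongoing overhead -- paid
    every month whether the machine is busy or idle."""
    return PURCHASE_PRICE / USEFUL_LIFE_MONTHS + ADMIN_PER_MONTH

def monthly_cost_rented(utilization=1.0):
    """Operating expense; elasticity means you can pay for less
    than full-time capacity when you only need it part-time."""
    return RENTAL_PER_MONTH * utilization
```

Under these assumed numbers, owning works out to about $316.67 a month regardless of use, while renting at half utilization costs $150 — which is the elasticity advantage the paragraph above describes. Change the assumptions and the comparison can easily flip the other way, which is exactly why the rent-vs-buy decision depends on workload.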
Next post: What are the benefits of cloud computing?
Comments
Saul
Much as I appreciate your diss in the second paragraph of your reply, the arithmetic for calculating the cost is quite simple, but depends on having knowledge that cloud customers are selected for not having.

Most anyone who has been writing PHP for more than two years can get up and running on a VPS in only a few minutes longer than it would take with GAE, but also be able to migrate to many other VPS providers without substantially more pain. The point of cloud computing isn’t having your trivial site up in 2-3 minutes, is it? I thought it was all about that scalability.

On FB, my point is exactly that they have the same infrastructure internally, because there’s nothing magical about the cloud; it’s just infrastructure. But they’re not renting space on Google or Amazon, because it’s not cost effective at the high end. Which leads back to one of my core points: the cloud is neither technologically interesting nor cost effective compared with VPSes or physical machines.

I would say you’ve hit the nail on the head about the last 30 years of technology. I think if there had been zero innovation apart from hardware continuing to become cheap and widely used, I would have trouble saying life would be worse than it is now. Definitely the last 20 years. We could be using Standard ML, the only language with a fully specified formal semantics, but instead we switched from C to C++ and Java, bringing with them different kinds of unbearably inhuman complexity. We could be using Plan 9, the successor to Unix with truly integrated networking and distributed computation, a true platform to build cloud-like services on, but instead we’re using the Unix clone written expressly to the lowest common denominator. We could be using message passing instead of threading, we could be using a truer relational calculus rather than SQL, and so forth. It’s as though we chose wrong every time.

We live in a world without a reliable networked filesystem, and instead we have so many logging frameworks in Java that we need a meta logging framework to abstract over them. Now that I think of it, I would miss ZFS, HTTP, and Haskell, but I can’t think of too many other successes we’ve had in the last 30 years.

Like most pessimists, I would say I’m merely a realist. The cloud never was your savior, so there’s no sense pining for the days when you thought it was. It never was anything more than marketing. If you want to do something good for technology, find a way to delete two lines of code for every line you write from now on. Avoid busy work, but accept that some code just has to be written by a human and not hidden behind an abstraction layer.

Thanks for listening to my crazy rant. I’ll return to irrelevance now.