By Alistair Croll
Conference Chair, and Principal Analyst at Bitcurrent
Cloud computing is an umbrella term for delivering computing platforms on a pay-as-you-go basis. It's widely used to describe everything from virtualized managed hosting to software-as-a-service providers, and everything in between.
Economically, cloud computing is simply standardization. Cloud providers get economies of scale they can pass on to users, and they can allocate resources to customers quickly. This means they can offer dramatically shorter contracts -- charging by CPU-hour or gigabyte instead of by month or cage.
Technically, cloud computing is abstraction. The cloud abstracts out certain functions -- storage, for example -- and the cloud's user doesn't care whether the data is stored on a hard drive, in memory, or in a box buried in the earth, provided that it behaves reliably. Recent advances in virtualization, combined with the broad adoption of a few programming languages and HTTP as a universal front end, make this achievable.

Beyond this definition, however, there's a lot of contention. Every vendor is trying to cloudwash its offerings. SOA companies are claiming clouds are SOA; management companies are claiming they manage clouds; and on-premises data center vendors are claiming their technology is a "private cloud."
One thing is certain, however -- once you cut through the hype, computing as a utility has its place in the CIO's toolbox, and it promises to change how we think about buying and using IT.
Do you agree with this definition? Join the conversation on Facebook.