John Underwood is a Technical Evangelist at ThreeWill. He has nearly thirty years of software development experience. He is an experienced technical instructor with superior presentation skills and is proficient in delivering standard curriculum as well as developing and delivering custom curriculum.
What’s the forecast for Cloud Computing?
As I sit here authoring this post I’m glancing out the window and looking at nearly 6 inches of snow on the ground. For someone who grew up in the Atlanta area and still lives there, that’s a statement I don’t get to make very often. (Disclaimer: those of you who grew up in colder climates are rolling your eyes and snickering right now; please humor me, I’m still enough of a kid inside to get excited about the rare snowfall here in the South.)
I can remember the feeling as a kid at the mere mention of snow in the forecast. I’d make a point to watch the 6 o’clock news to get the forecast from Johnny Beckman or Guy Sharpe on one of the local channels. I can remember the anticipation building up to the big day. I can also remember the feeling when the event failed to live up to the hype (which seemed to happen quite often). There were so many times when it “just missed us.”
My most vivid memories, however, were those times when the weather event exceeded the forecast hype. The winters of 1973 and 1982 come to mind as years where we had major snow or ice storms. While my parents may have hated it, as a kid I was in heaven. No school, sledding, throwing snowballs… I could go on and on.
As a computer geek I can see a similar pattern in the way new technological fronts roll in. I’ve always been enthusiastic about “new things.” I can remember early in my career I would pore over publications like InfoWorld or PC Week looking for a screenshot or blurb about the latest, coolest technologies. And, much like the snowy weather, a select few lived up to the hype while many did not.
So, obvious weather metaphors aside, what’s the forecast for Cloud Computing?
It seems right now that there’s a fair amount of misunderstanding about what Cloud Computing actually is. Many seem to equate it to any sort of internet-connected resource (“to the cloud!”). Others say that it’s just another term for outsourcing your IT infrastructure to a third party. While there are probably nuggets of truth in both of these statements, based on what I learned at a recent cloud computing conference these characterizations miss the true nature of “the cloud.” More than anything, it’s about being able to employ computing resources on demand.
Imagine a consumer-facing company that wishes to run a web-based promotion related to the Super Bowl. They anticipate high demand on their servers… but how much? …and for how long? Trying to stand up the necessary server capacity for such an event would involve a lot of educated guessing. Purchasing too many servers would waste money; not purchasing enough would result in a failed promotion, embarrassment, and damage to the company’s reputation. And, even if the planning were perfect, what happens to the servers once the promotion is over? Excess capacity that sits unused still generates utility costs, support costs, and so on.
Imagine this same scenario, however, where there was an available computing infrastructure that could provide additional resources on-demand. Additional CPUs, memory, and storage could be brought to bear as needed to meet the demand and, once the demand has subsided, the resources are no longer employed. The company pays only for what it needs. The provider of the cloud services takes on the burden of providing capacity and passes the savings from economy of scale on to its customers.
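To make the economics concrete, here is a toy back-of-the-envelope comparison in Python. All the numbers (server-hour rate, demand curve, peak capacity) are made-up illustrations, not real cloud pricing; the point is simply that paying per hour of actual use beats buying for the peak when demand is spiky.

```python
# Toy comparison: fixed provisioning vs. on-demand ("elastic") capacity.
# All figures below are illustrative assumptions, not real pricing.

HOURLY_RATE = 0.50    # assumed cost per server-hour
PEAK_SERVERS = 100    # capacity a fixed deployment must buy to survive the peak

def elastic_cost(servers_by_hour):
    """Pay only for the servers actually in use each hour."""
    return sum(servers * HOURLY_RATE for servers in servers_by_hour)

def fixed_cost(servers_by_hour):
    """Pay for peak capacity every hour, whether it is used or not."""
    return PEAK_SERVERS * HOURLY_RATE * len(servers_by_hour)

# A promotion-like demand curve: quiet, a short spike, quiet again.
demand = [5] * 20 + [100] * 4 + [5] * 20

print(f"fixed provisioning: ${fixed_cost(demand):,.2f}")
print(f"elastic capacity:   ${elastic_cost(demand):,.2f}")
```

With this hypothetical demand curve the fixed deployment pays for 100 servers around the clock, while the elastic one pays for the spike only during the four hours it lasts.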
So then, the forecast about Cloud Computing:
- It’s probably not going to completely replace existing infrastructure, but rather enhance it
- It’s especially well suited to handling temporary or seasonal spikes in demand
- It’s about paying just for the computing resources that are needed, and nothing more – CPU, memory, and storage become purchased commodities
Used properly, cloud computing has a real opportunity to exceed the hype. As a developer or IT professional you should consider adding cloud computing to the mix any time you have to meet a spike in demand for computing resources.
And, hopefully, this “forecast” will be a bit more accurate than the ones for snow in the South.