A talk by Tom Raftery of GreenMonk at a conference earlier this month in Dublin, Ireland poses our title question. And the answer isn't as clear-cut, or as green, as you may think.
The problem is doing the math: figuring out exactly what you are replacing with any cloud-based provider, where that provider's data center is located, and what efficiency and fuel profiles both the provider and your own data center run on - without all of that, the calculation is meaningless.
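That math can be sketched with two standard metrics: PUE (power usage effectiveness, the facility's total power divided by its IT power) and the carbon intensity of the local grid. The sketch below is purely illustrative - every number in it is hypothetical, not a measurement from any real provider or facility.

```python
# Back-of-the-envelope carbon comparison. All figures are hypothetical,
# chosen only to show how an efficient facility on a dirty grid can
# still come out worse.

def annual_co2_kg(it_load_kw: float, pue: float, grid_kg_per_kwh: float) -> float:
    """Yearly CO2 (kg): IT load scaled up by the facility's PUE,
    run for a year, multiplied by the grid's carbon intensity."""
    hours_per_year = 24 * 365
    return it_load_kw * pue * hours_per_year * grid_kg_per_kwh

# Hypothetical on-premises setup: inefficient facility, moderately clean grid.
on_prem = annual_co2_kg(it_load_kw=10, pue=2.0, grid_kg_per_kwh=0.4)

# Hypothetical cloud provider: very efficient facility, coal-heavy grid.
cloud = annual_co2_kg(it_load_kw=10, pue=1.2, grid_kg_per_kwh=0.9)

print(f"on-prem: {on_prem:,.0f} kg CO2/yr")  # 70,080 kg
print(f"cloud:   {cloud:,.0f} kg CO2/yr")    # 94,608 kg
```

With these made-up inputs, the "greener" cloud facility emits more carbon for the same IT load, because grid fuel mix outweighs facility efficiency - which is why the comparison can't be done without the providers' data.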
"As Cloud Computing providers are not publishing any data around Cloud Computing's energy consumption, then it is impossible to say just how energy efficient Cloud Computing is," he said during his talk.
Just because you moved your apps from your data center to a cloud provider and turned off your old and creaky servers doesn't mean you saved the planet. Yes, you cut your energy bill - if indeed you actually turned off those servers. But your provider's servers may be no more efficient than yours were. You assume they are, what with virtualization packing a bunch of VMs onto a single machine, but without published data there's no way to know.
Google, Microsoft, Facebook and Apple have gone to great lengths to build very energy-efficient data centers all over the world. But those data centers are sited where electricity is cheap per kilowatt hour and, more often than not, those are places burning coal to produce the juice. Raftery concludes his talk with the Jevons paradox, from the industrial revolution: more efficient steam engines didn't reduce coal consumption - by making coal cheaper to use, they drove up overall demand for it.
N.B. Raftery was our most recent Cloud Contest judge, but this was a volunteer position.