Cloud computing, it is often claimed, is a good way for companies to reduce their carbon footprint. The reality, as Tom Raftery explains on Greenmonk, is much more complicated than that.

Specifically, Raftery is writing about a pair of reports from the Carbon Disclosure Project (CDP) and Verdantix. He argues that "Cloud Computing – The IT Solution for the 21st Century" (PDF) and its addendum covering France and the UK (PDF) are fundamentally flawed.

Why? The first study assumes that reducing energy consumption automatically reduces carbon emissions. Raftery says the flaw is in assuming a direct relationship between the two: "If I have a company whose energy retailer is selling me power generated primarily by nuclear or renewable sources for example, and I move my applications to a cloud provider whose power comes mostly from coal, then the move to cloud computing will increase, not decrease, my carbon emissions."
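Raftery's point comes down to simple arithmetic: emissions depend on the carbon intensity of the grid powering the servers, not just on how much energy those servers use. A rough sketch, with purely hypothetical figures chosen only to make the math visible:

```python
import math

def emissions_kg(energy_kwh: float, intensity_kg_per_kwh: float) -> float:
    """Carbon emissions = energy consumed x grid carbon intensity."""
    return energy_kwh * intensity_kg_per_kwh

# On-premises: 100,000 kWh/yr on a low-carbon (e.g. nuclear-heavy) grid.
# The 0.08 kg CO2/kWh intensity is a hypothetical illustrative value.
on_prem = emissions_kg(100_000, 0.08)

# Cloud: 30% less energy consumed, but on a coal-heavy grid.
# The 0.90 kg CO2/kWh intensity is likewise hypothetical.
cloud = emissions_kg(70_000, 0.90)

print(f"on-prem: {on_prem:,.0f} kg CO2/yr")
print(f"cloud:   {cloud:,.0f} kg CO2/yr")
```

Even with a 30% cut in energy use, the cloud deployment in this toy example emits several times more carbon, which is exactly the scenario Raftery describes.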

The second report assumes that companies in France and the UK will move their applications to clouds hosted in France or the UK. Where are most cloud hosting providers actually located? That depends on the provider, but a great deal of cloud capacity is in the U.S., which, as Raftery notes, "has one of the most carbon intensive electrical grids in the world. France, on the other hand, with its high concentration of nuclear power (78%) has one of the least carbon intensive electricity grids in the world." The UK sits just above the world average, according to the same data.

Migrating a workload to the cloud can reduce emissions, of course, but without more data it's hard to tell. Raftery suggests that cloud providers need to do a much better job of being transparent about the locations of their data centers and the carbon footprints of each. Without that information, you don't really know how green the cloud is after all.