At 1:00 a.m. on Sunday morning I was doing routine maintenance on my personal Amazon Web Services account and instead found myself looking at something I had no right to be seeing: A database with 800,000 user accounts to the e-card site CardMaster.com. Along with that were the database passwords and back end of a major U.S. Public Broadcasting Service news show website (Gwen Ifill's Washington Week), including daily updates from panelists on the stories they cover.
Guest author Jonathan Siegel is a serial entrepreneur and founder of the cloud applications consultancy ELCTech.com as well as a handful of cloud startups. Jonathan's book, Electric Connections, is due out in June of this year.
I am an early adopter, business builder and owner of a cloud consultancy. On Sunday morning I went to clear out my personal Amazon Web Services account of excess files after seeing huge usage numbers from a report by CloudSplit. For those technically inclined, I was clearing out my S3 buckets and moving the few files that I wanted to save into an EBS disk instead.
My EBS disk ran out of space, so I went to use a feature called EBS Snapshots. Snapshots are like a tape backup of your EBS disk drive. That's when I noticed something odd: My EBS Snapshot account was filled with hundreds of snapshots, when I knew I had made only a handful. I wondered: Why do I have access to these backups? Were they made by my teammates? Shared snapshots from Amazon? Or something else...
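For the technically inclined, the distinction I stumbled over is visible in the snapshot listing itself: every snapshot carries an owner ID, so you can separate the ones you made from the ones merely shared with you. Here's a minimal sketch of that check - the records mimic the shape of EC2's DescribeSnapshots response, and the account and snapshot IDs are made up for illustration:

```python
# Separate snapshots you own from snapshots other accounts have shared
# with you. In a real audit these records would come from an EC2
# DescribeSnapshots API call; here they are hard-coded sample data.

MY_ACCOUNT_ID = "111122223333"  # hypothetical account ID

snapshots = [
    {"SnapshotId": "snap-1a2b3c4d", "OwnerId": "111122223333"},  # mine
    {"SnapshotId": "snap-9f8e7d6c", "OwnerId": "999988887777"},  # shared with me
]

def partition_snapshots(snaps, my_account_id):
    """Return (owned, shared) lists based on each record's OwnerId."""
    owned = [s for s in snaps if s["OwnerId"] == my_account_id]
    shared = [s for s in snaps if s["OwnerId"] != my_account_id]
    return owned, shared

owned, shared = partition_snapshots(snapshots, MY_ACCOUNT_ID)
print("owned:", [s["SnapshotId"] for s in owned])
print("shared:", [s["SnapshotId"] for s in shared])
```

Anything in the "shared" pile that you don't recognize is exactly the kind of surprise I was about to get.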
What I saw were backups of Enron emails, a genomics database and then two that made my stomach turn - a database of 800,000 user accounts for CardMaster.com and the database and site files for the Washington Week website. Yeah, the Enron emails are a non sequitur, and the genomics database was likely meant to be public. But the other two - there's no way they were intended for the public, yet here they were, marked as public and available to me or any other Amazon cloud user.
How Did This Happen?
Amazon is the largest and longest-running public cloud computing platform. It has pushed the boundaries of technology infrastructure for us users. In fact, it has given us tools that are more powerful than anything we previously had available in our own small datacenters. This is great, because before, we needed to hire trained Cisco or NetApp administrators just to do basic tasks as our websites scaled. That was expensive and added another step - a delay - to our deployments. Amazon's infrastructure commoditizes much of this technology into simple Web calls: post some XML to Amazon and your website gets a full incremental backup to live-networked NAS. But as Stan Lee has warned us: With great power comes great responsibility.
By giving programmers control of the network and storage, we've empowered developers to take on system administration chores. This power has come too quickly or is being digested too lightly - as my discovery has shown.
In the case of PBS's Washington Week there was quick acceptance of the issue. "It was human error and nothing personal was exposed," said Kevin Dando, PBS's Director of Digital Communications. "Although we weren't aware of the issue initially, it was easily corrected. Because of Amazon's strong audit capabilities we could pinpoint the error and fix it quickly."
Despite numerous attempts we were unable to reach CardMaster.com.
This highlights a deeper issue in the cloud today: Despite what you may think, cloud security is not sexy. We are seeing products that address the baseline needs of cloud functionality, like Amazon's dashboard and the support sites for the cloud. They focus on the sexy: deploying mobile apps, auto-scaling, grid processing and other buzzword-friendly features. But the dirty truth is that the cloud has a whole new user profile acting as administrator, and it needs a new set of tools and expectation management to ensure that little mistakes make little problems, not big ones.
Remember: This is not something that Amazon did wrong. This is a switch, thrown by Amazon's own users, that allowed their data to be public to any other Amazon user. The users did not mean to throw that switch, and it's unclear whether they would have found the issue without my notification.
The switch is a single permissions setting in Amazon's Web Console - and it can be even more subtle when buried deep within cloud-assisting tools.
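Under the hood, that switch is just a permission entry on the snapshot: when a snapshot is public, its create-volume permission list contains the special group "all"; a snapshot shared with specific accounts lists their account IDs instead. A minimal sketch of the check, using dictionaries in the shape of EC2's DescribeSnapshotAttribute response (the account ID is made up):

```python
# The "public" switch is a permission entry on the snapshot. Group "all"
# in the createVolumePermission list means any AWS account can create a
# volume from the snapshot. These records mimic the shape of EC2's
# DescribeSnapshotAttribute response.

def is_public(create_volume_permissions):
    """True if the permission list contains the special group 'all'."""
    return any(p.get("Group") == "all" for p in create_volume_permissions)

public_perms = [{"Group": "all"}]            # open to every AWS user
shared_perms = [{"UserId": "999988887777"}]  # shared with one (made-up) account
private_perms = []                           # not shared at all

print(is_public(public_perms), is_public(shared_perms), is_public(private_perms))
```

One entry is all it takes - which is why a single mis-click in a console or a tool's default can expose an entire backup.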
And Why Me?
A spokesperson for Amazon pointed out that snapshots were private by default and users must choose to share them. According to Amazon, "in general users understand this feature very well as this is no different than users explicitly choosing to share their data by any means." However, as we've seen, users are obviously making their data inadvertently public. Amazon said they were updating their documentation "to provide more explicit guidance on this feature," and that they would be "reaching out to the few who may be unknowingly sharing their snapshots."
The question, though, is: Is it too easy to accidentally make your data public - and whose role is it to play data cop?
This leads to me, at 1 a.m., finding security leakage among Amazon's cloud customers while doing unrelated housekeeping. Look, I'm anything but an IT Security guy; I've got enough on my plate to worry about. For god's sakes, I have 6 kids! Moreover, I'm an outspoken supporter of moving companies to the cloud - and I exclusively recommend Amazon's cloud because of its reliability and features. Why is it me that finds this security issue - one that has been open since January of this year, if the Snapshot dates are accurate?
This tells me that there is a pattern about to be replayed: That the users on the cloud today are a motley crew. That we need more supervision and hand-holding - whether we like it or not. That powerful services like CloudKick and CloudSplit need to be encouraged to add security as a top-priority feature. And we need to budget for their services and embrace their boring, yet hyper-important role as perimeter guard and security inspector.
If I were to try to keep this security problem in the bag - and avoid alerting the community - I would be fostering a sense of complacency that is antithetical to the marketplace's needs. The cloud is so young that when we find a problem we need to admit it and find real, workable solutions. Since the cloud represents new ways of doing things, it gives us new ways of getting into trouble, and we need a lively forum for nipping these issues in the bud and laying a framework for ongoing success.
If you are on Amazon's cloud, I can't stress enough that you need to go to your AWS Management Console immediately. Check at a minimum that your Snapshots, for every Region, are marked PUBLIC only if you mean them to be available to ALL other Amazon Web Services users. I've already checked mine. If you find data that you did not intend to make public, you need to engage your security team to remove the snapshots from public view and mitigate any data exposure.
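That audit can be scripted rather than clicked through. Here's a minimal sketch of the loop, assuming the permission records for each region have already been fetched (one DescribeSnapshotAttribute call per snapshot in a real run); the regions and snapshot IDs below are made up for illustration:

```python
# Flag snapshots whose createVolumePermission includes the group "all",
# region by region. The nested dict stands in for data fetched from the
# EC2 API; in a real audit each inner list would come from a
# DescribeSnapshotAttribute call for that snapshot.

permissions_by_region = {
    "us-east-1": {
        "snap-aaaa1111": [{"Group": "all"}],            # accidentally public
        "snap-bbbb2222": [],                            # private
    },
    "eu-west-1": {
        "snap-cccc3333": [{"UserId": "999988887777"}],  # shared, not public
    },
}

def find_public_snapshots(perms_by_region):
    """Return (region, snapshot_id) pairs visible to all AWS users."""
    flagged = []
    for region, snaps in perms_by_region.items():
        for snap_id, perms in snaps.items():
            if any(p.get("Group") == "all" for p in perms):
                flagged.append((region, snap_id))
    return flagged

print(find_public_snapshots(permissions_by_region))
```

The important part is the per-Region loop: a snapshot you've forgotten about in a Region you rarely use is exactly the one this kind of sweep catches.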
Hopefully this gets chalked on the wall as a lesson learned - and we continue our march to the cloud with a deeper appreciation of our security support needs. This isn't about calling people out. I work in the cloud and am passionate about its development. These mistakes could very well have been mine - or any other cloud user's. To move the cloud forward we need to encourage a dialog about our newfound power, new paradigms and new needs in the cloud.