The Cloud Can Save Us Billions…But Can We Afford It?

Yesterday the Oregon state treasurer’s office announced that power consumption in its data center has dropped 25% in the first month since the office adopted a virtualized infrastructure.

Examples like that make cloud computing and virtualization seem like viable options for leaders at the state and federal levels of government. The assumption is correct. But the reality on the ground is altogether different.

This week the Obama administration ordered a halt to upgrades of 30 major information technology projects, a decision that, according to The Washington Post, affects about $20 billion in government spending. The projects were designed to upgrade computer systems that manage financial information and transactions for federal agencies.

The news reflects a paradox for the Obama administration: it is a big proponent of cloud computing, yet it faces pressure from all sides to cut spending. Elections are coming up and the GOP has influence. In that respect, the spending cut is as much about politics as it is about managing technology infrastructure.

Thanks to the recession, anxiety about job loss is in full bloom. In this environment, moving IT assets to the cloud or adopting virtualization can be perceived as a potential threat.

That dynamic creates a challenge for IT. Ongoing advances in virtualization and cloud computing could help governments save billions of dollars. What exists instead is a patchwork of systems with little data exchange between government entities and even less public access to information about how government runs. It’s a problem with implications at the local, state and national levels.

Terror Threat: Data Failure?

The patchwork problem is apparent at the highest levels of government. Information sharing between federal agencies is impeded because it often requires extracting data from silos, then aggregating and analyzing it for collaboration and review. The data from each silo has to be examined almost in isolation, which is time-consuming, expensive and error-prone.

Umar Farouk Abdulmutallab, the Nigerian who tried to blow up a plane over Detroit, had been entered into the Terrorist Identities Datamart Environment (TIDE) system before he attempted his attack. TIDE contains the list of about 550,000 known or suspected terrorists. Abdulmutallab’s father, who believed his son had been radicalized, had reported him to the United States Embassy in Nigeria. But no one prevented Abdulmutallab from attempting to attack a Northwest Airlines flight as it began its descent into Detroit on Christmas Day last year.

From The Atlantic:

“Umar Farouk Abdulmutallab’s name was in the database, officials have said, along with biographical information and the warning provided by his father, who told the CIA’s chief of station in Nigeria that Abdulmutallab had fallen in with terrorists. Why Abdulmutallab’s name was not forwarded to the State Department or the FBI for further review, especially in light of warnings about Nigerians preparing to attack the United States, is the focus of an intense investigation. Datamarts like TIDES are only as good as the info that goes in and only as good as the common format it is compared against.”

And this from Wired:

“[A] Justice Department inspector general report earlier this year found that the FBI was mishandling the watch list and was failing to add legitimate suspects under terrorist investigation to the list while also failing to properly update and remove records from the list, subjecting U.S. citizens to unjustified scrutiny.”

It’s conceivable that better data storage, analysis and optimization could have surfaced the information about Abdulmutallab and gotten it to the people who needed to make quick decisions.

Passing along critical information relies on processes that require checking multiple sources. That the FBI is mishandling data reflects how poorly we have handled modernizing legacy systems and outmoded database environments.
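The “common format” point from The Atlantic is worth making concrete. Below is a minimal, purely illustrative sketch (the records, field names and normalization rule are hypothetical, not TIDE’s actual schema or any agency’s code) of why a name held in one silo can fail to match the same name held in another until both are normalized the same way:

```python
# Hypothetical illustration: two silos store the same person differently,
# so a watchlist match only works after normalizing to a common format.

def normalize(name: str) -> str:
    """Lowercase, collapse whitespace, and reorder 'Last, First' to 'first last'."""
    name = name.strip().lower()
    if "," in name:
        last, first = [part.strip() for part in name.split(",", 1)]
        name = f"{first} {last}"
    return " ".join(name.split())

# Invented records standing in for two separate agency feeds.
embassy_reports = [{"subject": "ABDULMUTALLAB, UMAR FAROUK", "note": "father's warning"}]
visa_records = [{"name": "Umar Farouk Abdulmutallab", "visa": "valid"}]

watchlist = {normalize(r["subject"]) for r in embassy_reports}

for record in visa_records:
    if normalize(record["name"]) in watchlist:
        print("Match found, flag for review:", record["name"])
    else:
        print("No match, record passes unnoticed:", record["name"])
```

Without the shared normalization step, the raw strings never match and the record slips through. The hard part in practice isn’t the string handling; it’s getting dozens of agencies and legacy systems to agree on, and keep feeding, that common format.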

A dearth of funding mires many government agencies. Even the antiterrorism database has been slated for budget cuts. Again, from The Atlantic:

“According to one official, who asked not to be identified because intelligence budget matters are classified, the administration and Congress slashed the budget for the National Counterterrorism Center by at least $25 million. Those affected, the official said, included employees responsible for maintaining the Terrorist Identities Datamart Environment (TIDE) system, which contains the list of about 550,000 known or suspected terrorists.”

Starving IT: Legacy Environments

For some agencies, it has almost come to the point of starvation. The IT environment becomes so antiquated that keeping it alive is all that matters. Virtualization is impossible because even accessing data requires changes to the entire system.

The state of New Jersey, for instance, has a payroll system that is 41 years old.

What’s striking is how poorly the New Jersey government is serving the public by not updating these aged systems. Information that should be freely available to the public is so locked up that it can’t be reached. Only the most important agencies get the funding needed to keep systems modernized.

According to The Press of Atlantic City, the problem became apparent when the newspaper’s requests for computerized records from two state agencies couldn’t be granted because of severe technical limitations.

“In one case, the records are not even kept on computers. In another, The Press was told agency operations would halt if it attempted to copy the computerized records requested.

Situations like these mean that independent – or even state – analysis of certain records to find trends or trouble spots is impossible. For example, The Press sought to analyze complaints about injuries inflicted on customers by nail salons. Without computer technology, that work would be overwhelming or cost-prohibitive. The newspaper also sought to analyze complaints against cable television providers. The analysis was not feasible given the outdated technology.”

In the article, a state official echoes the sentiment about the lack of funding:

“But Ebeid said funding has not allowed widespread modernizing of systems, only for maintenance of what’s installed now. The gulf between departments, where some run these “legacy” systems while others have been modernized, has depended on raising the funds themselves.

But most other departments rely on the general fund. Those, Ebeid said, have been left with the old technology from previous generations. Young information technology (IT) staff, fluent in current programming languages, have even been trained to work with those old machines.”

Legacy systems are often maintained by an older generation of IT professionals who were schooled in the complexities of heavyweight applications tied to central-server networks. Managing these networks is something of an art form. Technicians and engineers each wield their own unique set of skills, writing scripts to patch problems as they arise. Over time, it’s analogous to holding everything together with duct tape.

The Politics of IT Savings

And then there is – as we mentioned earlier – the issue of jobs. Chief information officers from the state and federal levels gathered earlier this month at the Government Technology Research Alliance conference. Their remarks highlighted the tension between protecting jobs and adopting modern technologies.

According to Government Computer News, there’s a push by the Office of Management and Budget to consolidate data centers. It makes sense: there are 100 federal agencies, and each has anywhere from five to 20 data centers, which works out to somewhere between 500 and 2,000 data centers across the federal government.

Ken Griffey is transition manager for NASA’s National Center for Critical Information Processing and Storage (NCCIPS), a federal shared-services data center. NCCIPS provides data center services for the Homeland Security and Transportation departments and hosts the Navy’s supercomputers.

Government Computer News wrote:

“Pure common sense says that we can save billions if we consolidate,” Griffey said. “I wonder how practical that is going to be. NASA has 10 fiefdoms. NASA appears to be an agency on the surface, but it is very politically driven for each state that has a NASA center.”

The challenge, he said, results from the fact that NASA data centers provide jobs. It’s unlikely that any state is going to volunteer to give up its data center and the 1,000 or so jobs the center provides.

“I see those as obstacles to data center consolidation. The obstacles are more in the political arena,” Griffey said.

Success Stories

While there are significant obstacles, success stories do exist. GCN points to the state of Utah, which consolidated from 35 data centers down to two. The state now has enough scale that it can provide data center capabilities to cities and counties. The cities and counties then pay for data center services out of their operational budgets.

Oregon’s treasury department, meanwhile, isn’t content with just reducing power consumption. It is also replacing its servers in two phases. In the first phase, 37 servers were virtualized, which the department says equates to about $46,000 in avoided hardware costs. In the second phase, the state expects to save about $373,000.
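As a rough back-of-envelope (our arithmetic, not the department’s, and it assumes the hardware savings spread evenly across the machines), the first phase works out to a bit over $1,200 in avoided hardware cost per virtualized server:

```python
# Back-of-envelope estimate from the figures above; assumes the $46,000 in
# avoided hardware spend is spread evenly across the 37 virtualized servers.
phase1_servers = 37
phase1_hardware_savings = 46_000  # dollars, per the Oregon treasury department

per_server = phase1_hardware_savings / phase1_servers
print(f"~${per_server:,.0f} avoided hardware cost per virtualized server")
# prints: ~$1,243 avoided hardware cost per virtualized server
```

Multiply numbers like that across 500 to 2,000 federal data centers and the “billions” in the headline stops sounding far-fetched.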
