The Obama Administration's 2012 budget includes $126 million for the development of exascale supercomputing, up from just $24 million in the previous budget.
Exascale computing systems would be capable of roughly 1,000 times the processing power of the fastest computer currently operational, China's Tianhe-1A supercomputer.
The Department of Energy's Office of Science would get $91 million and the National Nuclear Security Administration $36 million, if the budget is approved by Congress.
Advanced computing has a total DOE budget of about $465 million, an increase of 21% over 2010.
Supercomputers are used to model complex systems. The more powerful the supercomputer, the more accurate a model can be, whether of weather, war or global warming. Currently, supercomputer processing speeds are measured in petaflops; one petaflop is one quadrillion floating-point operations per second.
Exascale computing, which most experts believe will be achievable by 2021, will increase this a thousandfold.
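The scale jump is simple unit arithmetic, sketched below (the constant names are illustrative, not standard identifiers):

```python
# Unit arithmetic for supercomputer performance.
PETAFLOP = 10**15   # one quadrillion floating-point operations per second
EXAFLOP = 10**18    # the exascale threshold

# Exascale is a thousandfold increase over a one-petaflop machine.
print(EXAFLOP // PETAFLOP)  # prints 1000
```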
The ability to compute at exascale seems increasingly necessary as the amount of data available grows explosively. Eight years ago there were only about five exabytes of data online. Two years ago, that amount flowed over the Internet in a month. Recent estimates put the monthly Internet data flow at 21 exabytes.
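Taking the article's figures at face value, that growth can be put in rough numbers (a sketch using only the estimates quoted above):

```python
# Estimates quoted in the article, all in exabytes.
monthly_flow_two_years_ago = 5   # monthly Internet traffic two years ago
monthly_flow_now = 21            # recent estimate of monthly traffic

# Monthly traffic grew roughly fourfold in two years.
growth = monthly_flow_now / monthly_flow_two_years_ago
print(round(growth, 1))  # prints 4.2
```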
The obstacle to reaching this milestone is not so much computing development as power requirements. According to supercomputing specialist Peter Kogge, the development of exascale is liable to hit a "power wall."
"(S)uccess in assembling such a machine will demand a coordinated cross-disciplinary effort carried out over a decade or more...to find the right combination of processing circuitry, memory structures, and communications conduits -- something that can beat what are normally voracious power requirements down to manageable levels."
To get more on big data, download ReadWriteWeb's free report, "The Age of Exabytes: Tools & Approaches For Managing Big Data."