Clever headline in an SAP blog post by SAP's Chief Scientist Ike Nassi, who compares in-memory computing to Groundhog Day, the film starring Bill Murray. In the film, Murray plays a character caught in a time warp, living Groundhog Day over and over, like some existential trap.
Nassi says the advent of in-memory computing marks the same kind of inflection point seen in the days of VisiCalc and Lotus 1-2-3. He writes that many thought the spreadsheet concept was trivial and not worth a serious person's time. In-memory computing has been similarly dismissed, with people perceiving the real opportunity to be in analytics and big database tools.
But it turned out that the "what-if" analytics and the instant "end-user programmability of large data sets" that spreadsheets provided were actually what made the PC revolution happen. The whole point is that VisiCalc and then Lotus 1-2-3 could not have been built without fast access to "in-memory" data. Large spreadsheets single-handedly drove the need for more DRAM and in effect "sold DRAM" to PC users. Spreadsheets needed to have all of the data in DRAM in order to make real-time analytics, decision support and "what-if'ing" work.
The "Groundhog" point here is that in-memory is once again driving the need for business to get questions answered fast. In the early days of the PC era, the spreadsheet forced Apple, Microsoft and Lotus to make more memory available. Nassi says it is why Apple increased the memory capacity of the original 128K Macintosh to 512K. In part, the move was made to support Lotus.
He says that today "businesses are going to need systems that have far more addressable memory, holding entire analytic 'what-if' databases; and, more importantly, they need not be just off-line computations, but an integral part of the decision-making process."
It will become standard to do what-if questions and track the impact across the enterprise.
So, is Nassi correct? Is it Groundhog Day all over again?