Recorded Future is a startup technology company that describes itself as a "temporal analytics engine." It aims to uncover and analyze very faint signals in order to predict the future. It's backed by Google Ventures and the data-loving VC firm IA Ventures.

Today, Recorded Future articulated its vision of the future of news. By news they don't just mean what's broadcast on TV at 5 and 11; they mean current events of interest to people seeking actionable information. The gist of the company's argument is this: real-time web publishing, best exemplified by the news-breaking social network Twitter, is ultimately a race to the bottom. The delay between an event happening and its entry into the news cycle - the recycling of coverage that goes on for days or weeks - will eventually drop from the 10 or 20 minutes it stands at today to zero. That's a losing proposition for competitive news gatherers, the company says, and will be replaced by an endless competition to predict the news earlier and earlier, before it happens. It's a compelling argument, I think, and well worth considering.

Above, Recorded Future's map of what it calls "the extended news cycle."

Is it really possible to predict the future based on a giant index of digital information? Google has said it aims in the future to serve up what you want before you even ask for it. And people have always said that those who don't learn from the past are doomed to repeat it. (There's some very interesting discussion on this topic going on over on my Google Plus account right now.)

I'm inclined to believe that, with enough data analyzed smartly enough, many events are predictable with useful accuracy. Is that where the next arms race of analytics software will be fought? I wouldn't be surprised. I'm willing to bet that Google in particular will, in the not-so-distant future, offer prediction or recommendation technologies built on the massive swaths of data it is ingesting and analyzing: from the web and search, from web traffic, from spoken-word analysis, from sensors in self-driving cars and other signals. I was very surprised that the company shut down its PowerMeter energy-monitoring platform last month, but perhaps it's focusing on data collection in industries where it can own more of the technology stack than it can in energy.

Walmart is already fine-tuning how it stocks the shelves across its empire of stores based on what it learns from people's posts to Twitter. So is this stuff for real? I think the only question is how truly useful it ends up being.

Below: In the future, people will be impressed by impressive videos.

Compare this with the real-time web of today. Twitter is famous for breaking news of earthquakes, political scandals and developments in many industries (especially technology).

From News to Pre-News

From Recorded Future:

"The early nature of such signals obviously makes them very attractive. At the same time, these are subtle signals, and it will take judgment, statistical rigor, or the like, to take advantage of effectively and confidently.

"Tricky issues also remain in identifying prescient signals. These range from the technical (efficiently and accurately organizing references to time in news) to the psychological (how we go about researching and analyzing information that may indicate a future event).

"In summary, the nature of news continues to change, and the game of analyzing it for actionable information is shifting from news to pre-news to early event detection - that's where the future is and the value lies."


"There are opportunities to detect this event, whether from a Taiwanese blogger seeing the semiconductor factory exploding or a sudden co-occurrence of tweets or inferred information in a collection of option trades, and being early here can certainly capture value," Recorded Future writes. "However, we can expect the delay from event to news story to keep shrinking rapidly. This is frankly yet another race to the bottom."
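That "sudden co-occurrence of tweets" idea is easy to picture as a simple burst detector: count mentions of a term in each time window and flag windows that spike far above the trailing average. A minimal sketch in Python (the window size, threshold factor, and minimum count are my own illustrative assumptions, not Recorded Future's actual method):

```python
from collections import deque

def detect_bursts(counts, window=5, factor=3.0, min_count=5):
    """Return indices of windows whose mention count spikes above
    the trailing-window average. Purely illustrative parameters."""
    history = deque(maxlen=window)  # counts from the last `window` periods
    bursts = []
    for i, count in enumerate(counts):
        baseline = sum(history) / len(history) if history else 0.0
        spiking = count >= min_count and (baseline == 0 or count > factor * baseline)
        if spiking and history:  # need some baseline before calling it a burst
            bursts.append(i)
        history.append(count)
    return bursts

# A quiet stream of 1-2 mentions per window, then a sudden jump to 40:
print(detect_bursts([1, 2, 1, 2, 1, 2, 40, 3, 2]))  # -> [6]
```

The real problem is of course much harder - disambiguating terms, correlating across sources, and estimating how early a spike precedes the event itself - but the core signal is the same: an abrupt departure from a baseline.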

In my own experience using the real-time web to find news early, that's been the case. Years ago I was one of the first news writers to subscribe to company RSS feeds from key vendors on my beat through SMS and IM. I would get updates within minutes, automatically, and write up the news before anyone else. That strategy grew my early career fast, but now almost everyone I compete with does the same thing. That's why, when one of the major companies online puts up an important blog post, you'll see five blogs publish coverage of it within the next 15 minutes.
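That workflow boils down to polling feeds and diffing against what you've already seen. A minimal sketch using only Python's standard library (the feed content and GUIDs here are hypothetical; real feeds vary, and a production version would poll over HTTP on a timer):

```python
import xml.etree.ElementTree as ET

def new_items(rss_xml, seen_guids):
    """Parse an RSS 2.0 document and return titles of items not yet
    seen, recording their GUIDs in `seen_guids` (updated in place)."""
    root = ET.fromstring(rss_xml)
    fresh = []
    for item in root.iter("item"):
        # RSS items should carry a <guid>; fall back to <link> if not.
        guid = item.findtext("guid") or item.findtext("link")
        if guid and guid not in seen_guids:
            seen_guids.add(guid)
            fresh.append(item.findtext("title", default=""))
    return fresh

# Hypothetical vendor feed with two posts:
feed = """<rss version="2.0"><channel><title>Vendor Blog</title>
<item><title>Big launch</title><guid>post-1</guid></item>
<item><title>Minor fix</title><guid>post-2</guid></item>
</channel></rss>"""

seen = set()
print(new_items(feed, seen))  # -> ['Big launch', 'Minor fix']
print(new_items(feed, seen))  # -> [] (nothing new on the second poll)
```

Back then the "alert" step was just piping each fresh title to SMS or IM; the competitive edge was in which feeds you chose to watch, not in the polling itself.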

Speed in detecting news after it happens is, arguably, no longer a competitive advantage. Companies like Recorded Future believe they know where and how to look before events happen - to try to discern clues about what the future holds.

I would argue, though, that the same holds for after-the-fact real-time news discovery. There is still a competitive advantage in knowing where to watch for news updates, even if there's no longer any advantage in consuming widely watched sources faster.

Pre-cognition service providers might argue that they can show you where to go to "skate where the puck will be," but I'm not convinced there isn't still plenty of advantage to be found in strategically choosing where to watch for real-time events.

"It goes without saying," Recorded Future says nonetheless, "that the ability to capture value (be it economic, strategic, tactical) is directly proportional to how early one can detect and execute."

Of course that's true - but we'll see how well the predictors can execute their detection and thus provide opportunities for the rest of us to execute our responses. I'm not quite ready to write off real-time news as too slow, just yet.