The twentieth anniversary of the World Wide Web was celebrated worldwide a few days ago, which says something quite compelling about the state of the Web itself. As ReadWriteWeb Editor-in-Chief Richard MacManus personally verified two years ago by interviewing the guy who thought up the thing in the first place, the Web was established in 1989.
Historically, facts have been considered impediments to good stories. That is, until recently, when not even the literary equivalent of blinking neon “FAIL” signs has stopped entirely apocryphal stories from spreading like mercury. Just ask any Internet Explorer user with a high IQ: When the opportunity for a juicy headline arises, such trifling things as facts, math, and common sense don’t even amount to blips on the Web’s radar.
This is a serious problem. Facts, math, and common sense, not coincidentally, are all principal ingredients of the jobs we do. Jobs are the things that keep us grounded in reality and prevent us from becoming permanent residents of social networks.
In the real world (which has yet to receive a hashtag), there are some fairly serious matters to attend to. For the past quarter century, the majority of the intellectual work we do daily has taken place in the memory space of processors no farther from us than arm’s length. As we head into the next four years, let alone the next 25, this will no longer be a certainty. Not only data but also processing power is rapidly moving to centralized, flexible, self-portable constructs that can be resized and relocated as conditions warrant.
Tell me: When you can access all the applications and documents you use on your PC today on your iPad, how often will you be using your PC?
The Web is the workplace of the cloud. Thus, for many more people, “going to work” means logging on. And even for those in manufacturing, whose work is done with their hands as much as with their minds, the value of that work may be determined only in a market that is becoming less physical and more virtual by the day. As that happens, the integrity of the little bits of code that identify us, that certify our transactions, and that associate us with the things we say and publish will be threatened more than ever before.
This thing we call the Web – whose 20th anniversary is coming up on its third anniversary – is losing not only veracity but substance and definition as well. More studies and analyses trumpet the news that the world of the Web is moving to HTML5. Yet the people Web developers depend on to set the standards for them – to declare just what HTML5 is – can’t conclusively verify that such a standard exists. So nearly every recognized name in software – Google, Apple, Microsoft, and Adobe among them – is asserting itself as the HTML5 standard-bearer, as they all set sail in different directions. Meanwhile, the simple, fill-in-the-blank forms that the online retail industry uses for customer transactions may have to be subdivided into technology categories in order to work reliably across as many as five different, simultaneous browser implementations of the standard. Unless, of course, online forms steer clear of the confusion and stay just as they are – in which case, remind me again why we needed HTML5.
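The fragmentation problem above is exactly what client-side feature detection works around: rather than targeting one browser’s idea of HTML5, a form asks the browser what it supports and falls back gracefully. Here is a minimal sketch of that technique (popularized by libraries such as Modernizr), not any shipped library’s API. The `makeInput` parameter is a hypothetical injection point so the logic can run outside a browser; in a real page you would pass `() => document.createElement("input")`.

```javascript
// Sketch of HTML5 form-input feature detection. Browsers silently reset an
// unrecognized `type` value to "text", so setting the attribute and reading
// the property back reveals whether the type is actually supported.
function supportsInputType(type, makeInput) {
  const input = makeInput();
  input.setAttribute("type", type);
  return input.type === type;
}

// Browser usage (sketch):
// if (!supportsInputType("date", () => document.createElement("input"))) {
//   // fall back to a script-driven date picker for this form field
// }
```

One form can thus serve every browser: fields degrade to plain text inputs plus script where native HTML5 widgets are missing.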
As we move more of our businesses’ and our own personal data into virtual environments like the cloud, database size is ballooning. And as that size increases linearly, time-to-access increases exponentially. One solution architects are considering is the destructuralization of databases, replacing relational procedures with loose couplings. While that may improve accessibility in theory, in studies thus far it reduces security in direct proportion. The “fail whale” could soon become this decade’s emblem of efficiency, unless developers come up with truly revolutionary new concepts in storage maximization, and soon.
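The trade-off that paragraph describes, giving up relational joins for loosely coupled, denormalized records, can be sketched with plain objects. This is illustrative data with hypothetical names, not any particular database’s API:

```javascript
// Relational shape: each fact stored once, stitched together at read time.
const users = [{ id: 1, name: "Ada" }];
const orders = [{ id: 100, userId: 1, total: 25 }];

// The "join": every order looks up its user, so read cost grows with the
// size of both collections.
function ordersWithNames(users, orders) {
  return orders.map((o) => ({
    id: o.id,
    total: o.total,
    userName: users.find((u) => u.id === o.userId).name,
  }));
}

// Denormalized ("loosely coupled") shape: the user's name is copied into
// every order document. Reads need no join at all -- but each copy is one
// more place the same data must be secured and kept consistent, which is
// the proportional cost the studies above point to.
const orderDocs = [{ id: 100, userName: "Ada", total: 25 }];
```

The denormalized read is a single lookup regardless of how many tables the relational version would have joined; the price is paid at write time and in every extra copy that must be protected.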
Finally, it wasn’t anything that the architects of the Patriot Act could ever have considered. But the law that effectively compels U.S. service providers to open their customer databases to warranted federal inspections has compelled cloud service customers in Europe, the Middle East, and China (think about that last entry seriously for a moment) to build virtual firewalls around the United States. They’re writing clauses into their service contracts that prevent virtual image data from being transported onto servers inside U.S. borders, and in some cases from being shared with service providers anywhere in the world that are headquartered in the U.S. What had been considered a simple act of precaution in the post-9/11 era has evolved into what European legislators frame as no less than a human rights issue.
These are trends that require not just a journalistic, but a scientific, level of examination. The more we amuse ourselves with the IQ levels of certain browser users, and elevate the authenticity of folks named “Anonymous” to that of people with real jobs, the less we’re able to concentrate on the real problems ahead of us. The ability of the Web in recent years to characterize the precise nature and dynamics of these trends is best summarized by this quote from Dr. Seuss: “Oh, the noise! Oh, the noise! Noise! Noise! Noise!… The NOISE! NOISE! NOISE! NOISE!”
The dozens of you who have followed me over the last 27 years will recall that if there’s one thing I hate, it’s the noise. (That, along with the expectation that in person I really do sound like Boris Karloff.) You know that I’ve made it my mission to wade placidly amid the noise and waste, and to come out with clarity and insight that can be put to use in business and in your everyday work. Starting this week, I join my long-time colleagues David Strom and Joe Brockmeier in providing that same clarity and insight to the readers of ReadWriteWeb’s Enterprise channels. My hope is that a quarter century from now, when the world celebrates the 25th anniversary of the Web, one of its many milestones will be this one period in time – this brief, shining moment when, for at least a few thousand readers, the virtual world made at least as much sense as the real world.