Guest author Alan S. Cohen is a serial enterprise technology executive, most recently a Vice President at Nicira, which was acquired by VMware for $1.26 billion.
Years ago I was delivering a keynote at a conference when my Yahoo Messenger client popped up on my computer screen (and the screens to which I was projecting). In front of hundreds of people, I received this plea from my 11-year-old daughter:
"Daddy, I cannot get the printer to work. Can you help me?"
After pausing to type back, "I will call you in a half hour," and closing the application, I received chuckles from a sympathetic and knowing audience of IT people. Right then I realized something: everyone who owned a computer and its peripherals became the IT person when they went home. Moreover, because operating systems, applications and devices rarely worked flawlessly - despite the marketing hype - they required an enormous amount of human intervention to accomplish even basic tasks. It hit me: in the consumer IT revolution, I was nothing more than "human middleware."
The good news is that it got better. First, I cleaned out all the PCs and turned us into a Mac household. But as a purveyor of multiple technologies to thousands of companies over the past 20 years, I began to see a similar pattern across the entire enterprise technology food chain. The U.S. Bureau of Labor Statistics estimates that there are about 350,000 network and systems administrators in the U.S. alone. Trust me, once you start looking for IT human middleware, it's like looking for spiders - you are never more than 10 feet away from one.
Why does all this human intervention exist? You would think that if people spend almost $4 trillion a year on hardware and software, it would, ahem, work as promised out of the box. Let me posit a few potential explanations:
- The transition from mainframe computing to client-server 25 years ago transformed the landscape with thousands of new applications and players, but it also created a lot of marginally compatible technologies. "Standards" and "protocols" just did not get the job done.
- Industry hyper-competition forces vendors to push out products fast, frequently skipping the thorough testing of key system elements in production environments. The fast eat the slow.
- When trained experts handle system complexity, it shifts operating expenses from the vendor to the buyer. Instead of perfecting and automating hardware and software operations, they let customers pick up the tab for finishing the product in production. Imagine buying a flat screen TV and then having to pay an expert to come over every month to reconfigure it.
- More nefariously, certification and deep training programs create a loyal base of richly rewarded specialists.
While it is only natural that a certain amount of human expertise is required, especially in complex systems, there has to be a limit to how much extra time and effort it should take to use a piece of software or hardware. And to be fair, almost every IT person I know wants to spend their time building new applications and services, not babysitting infrastructure.
Software Eats The World: The Rise Of Developers And The Decline Of Operators
Traditionally, IT vendors sell to "operators," the people responsible for the administration and maintenance of systems. These are the people who go to the big annual vendor conferences, wear crazy hats and dance to aging rock bands like Blues Traveler at some Las Vegas hotel. The shift to cloud computing, built on the back of virtualization technologies, is turning this upside down. Indeed, all the new energy in the cloud computing communities - including OpenStack, CloudStack, Amazon Web Services, etc. - is about shifting the power to the developers.
As Marc Andreessen pointed out last year in a contribution to the Wall Street Journal, "Why Software Is Eating the World," the rise of the software economy is transforming and consuming entire industries. Software is also eating IT. Virtualization - which allows you to abstract and generalize compute, storage, and now networking technologies so that they can be consumed as software - is automating the very factory floor of IT. Just as robots took over the manufacturing process in industries like automobiles, a similar phenomenon is happening in computing.
This is a good thing.
Unlike carbon-based life forms, distributed software systems do not call in sick, are almost never distracted by Facebook and rarely make configuration mistakes. As more applications and computing move into virtualized clouds, the burden of operating computers drops dramatically - including the need for human middleware. Hence another argument for the cloud.
This means the new power jobs in IT will shift to developers: people who create the applications that drive businesses forward, or who build dead-simple IT infrastructure to make it happen. The new IT elite is directly tied to the bottom line and velocity of business.
That does not mean IT administration jobs are going away overnight - or that they are unnecessary. But do you want to be on the team that creates big binders of configuration files, or the team that crunches a Big Data application that opens up a new market? Peter Sondergaard of Gartner estimates that by 2015, there will be 4.4 million jobs worldwide devoted to the support of big data.
Are You Human Middleware?
If you work in IT, I have a quick three-question test to determine whether your career is headed toward permanent human middleware or you are on the positive side of history:
- Do you spend hours and hours with 200-page (even 2,000-page) manuals?
- Are there dozens of training companies clamoring to help you build your career?
- Do most of your casual shirts sport vendor logos?
Which side of the bed do you want to wake up on tomorrow?