Ask any power user, blogger, or journalist which PC web browser is the fastest or the most secure, and the answer will almost always be, "Google Chrome." That's because Google has done a masterful job positioning Chrome as the high-performance, secure browser.
But a bumper crop of poorly performing, quickly coded Google Plus extensions for Chrome threatens to damage the reputation Google engineers and marketers have worked so hard to establish.
Chrome's blazing speed is why Microsoft and Mozilla invested in performance optimization for Internet Explorer and Firefox, respectively. In the three years since, they've largely caught up. But in speed test after speed test, Chrome still places first, or a close second, against competing browsers in key performance categories.
Chrome also regularly emerges unscathed from honeypot hacking competitions. Google has gone so far as to offer a $20,000 bounty to any hacker who could crack Chrome's code.
Google's engineering and PR work is paying off. Chrome's position as "the fastest, safest browser" is the reason why Microsoft's Internet Explorer, Mozilla's Firefox, Apple's Safari and other browsers continue to cede market share to Google in the PC browser wars.
The Opposite of Mission-Critical
Chrome's success is due, in large part, to Google's positioning of the browser in consumers' minds as the best of the bunch for the central mission of browsing the Web. But that positioning is under threat by a growing crop of user-generated add-ons that extend the functionality of Chrome while also destabilizing it.
Developers from around the world rushed to create Chrome extensions that filled gaps or added functionality to Google Plus. They paid attention to Plus user complaints and in short order loaded up the Chrome Web Store with hundreds of new extensions (I counted 172 at the time of this writing).
Word of these extensions spread rapidly across the Google Plus social network, and several became "must have" tools. One of the first to go viral was Surplus. It was also one of the first to regularly slow down, destabilize or crash Chrome. Another, G+ Me, is now taking Google Plus by storm; it too has gained a reputation for slowing Chrome to a crawl or causing bizarre behavior while users browse Google Plus.
Watching the stream of posts on the Google Plus social network, it became clear that many users were unaware their extensions were bombing the browser. Some users gave up and switched to Firefox or another web browser. Others became annoyed and even angry, blaming Google Plus or Chrome for the instability. Some users, of course, deduced the root cause and either spread the word or contacted the developer of the errant extension.
With applications such as Google Plus, Google Apps and Google Docs, Chrome becomes more of a platform and less of a simple web browser. Google wants to give users a rich experience, with a variety of open source extensions developed by third parties. But there is growing frustration among users related to low quality or poor performance of these add-ons.
Extensions can have performance problems of their own (e.g., inefficient, quickly hacked together code) that affect the perceived user experience. Or they may have hard-to-anticipate dependencies on external data streams, web services or servers that indirectly affect Chrome's overall performance. All of these issues can and do result in substantially reduced browser performance and a degraded user experience. And that takes some of the shine off the Chrome brand.
A Call for Quality for Extension Developers
Here's my plea (on my bended knee) to the talented guys and gals who build extensions for Chrome: Please performance-test them before releasing them to the web-wide world. Don't force unsuspecting Chrome users to be your alpha-version crash test dummies.
Chrome extensions can be performance tested in at least two ways: for their own direct performance, and for their indirect impact on the browser.
- Direct performance covers the efficiency of the extension's own code. Profile the extension as it runs, watching for excessive CPU use, memory growth and slow event handlers, and fix hot spots before release.
- Indirect impact is the effect an extension has on the rendering of web pages otherwise unrelated to it. The end-user impact, or lack thereof, can be measured by recording a few use cases on a few representative web pages with a performance monitoring tool. Compare page load timings before and after installing the extension, or set thresholds, and replay the use cases to see whether performance changed. (This can also be thought of as performance regression testing.)
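The before-and-after comparison above can be sketched in a few lines. This is a minimal, illustrative example, not the workflow of any particular tool: the function names and the 20% threshold are assumptions, and in the browser each timing sample could come from the Navigation Timing API.

```javascript
// A timing sample can be taken in the browser with the Navigation Timing API:
//   const loadMs = performance.timing.loadEventEnd - performance.timing.navigationStart;
// The helpers below then compare samples taken before and after installing
// the extension. All names and the threshold are illustrative assumptions.

// Average several samples to smooth out network and rendering jitter.
function averageMs(samples) {
  return samples.reduce((sum, ms) => sum + ms, 0) / samples.length;
}

// Flag a regression if pages load more than `threshold` (e.g. 0.2 = 20%)
// slower with the extension installed than without it.
function isRegression(baselineSamples, withExtensionSamples, threshold) {
  const baseline = averageMs(baselineSamples);
  const withExt = averageMs(withExtensionSamples);
  return (withExt - baseline) / baseline > threshold;
}

// Example: baseline ~1000 ms, with extension ~1500 ms, i.e. 50% slower.
console.log(isRegression([980, 1010, 1010], [1480, 1500, 1520], 0.2)); // true
```

Running a check like this across a handful of representative pages, on every release, is enough to catch the kind of slowdowns users have been blaming on Chrome itself.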
There are several free and commercial tools available to extension developers for conducting direct and indirect impact testing, so there's no excuse for not testing code early and often.
Google built a better, faster browser, and the rest of the industry followed suit. The result: the web experience improved for everyone overall (and that, by the way, is Google's true objective: a faster web is better for people using SaaS apps, cloud solutions, and search).
As developers, let's show our appreciation for what Google has wrought for users everywhere. Let's ensure the extensions we build to improve the user experience actually do, in practice, make it better.