Blur Wars: Google’s Camera App vs. The iPhone 5S And A Real Camera

You know when a photo has that fuzzy background thing going on and the subject is sharp while the rest looks all dreamy? That technique, usually achieved by setting a camera’s controls to a wide aperture—the smaller the f-stop number, the bigger the aperture—remains one of the easiest shortcuts to a photo that makes people go ooh.
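(For the record, the f-number is simply the lens’s focal length divided by the diameter of the aperture: a 50mm lens at f/2 has a 25mm-wide opening, while at f/8 it narrows to about 6mm. Less glass open means less of that blur.)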

Not one to stand by tradition, Google just released a camera app for Android that does exactly the same thing—except you can adjust the focus and that dreamy blur effect after you take the picture.

Ooh.

Can Free Software Mimic Expensive Hardware? 

The feature is called “Lens Blur,” and it’s built into the new Google Camera app, available in the Play store for Android. Android’s impressive in-house photo software (though not always its optics) has been more robust than iOS’s for a while, likely thanks to Google’s 2012 acquisition of the excellent photo app Snapseed.

Lens Blur pulls off a qualitatively similar trick to the one that earned the Lytro light field camera so much buzz when it debuted. (Now Lytro is back with an even crazier light field camera, the $1,599 Illum, available for pre-order now.) Lytro builds dedicated hardware that allows focus and depth of field to be adjusted after the fact by letting the camera take in more data about a scene (the “light field” idea, detailed at thesis length here).

Unlike Lytro’s custom hardware, the Lens Blur feature in Google’s new camera app pulls it all off through depth mapping, which means the results come out less than optimal if you shoot as though you were snapping a flat 2D Instagram pic. More from Google’s Research Blog on the brains behind the blur:

Lens Blur replaces the need for a large optical system with algorithms that simulate a larger lens and aperture. Instead of capturing a single photo, you move the camera in an upward sweep to capture a whole series of frames. From these photos, Lens Blur uses computer vision algorithms to create a 3D model of the world, estimating the depth (distance) to every point in the scene.

Here’s an example — on the left is a raw input photo, in the middle is a “depth map” where darker things are close and lighter things are far away, and on the right is the result, blurred by distance.
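Google hasn’t published the rendering code behind Lens Blur, but the quoted description maps onto a simple recipe: estimate a depth value for every pixel, then blur each pixel in proportion to its distance from the plane you want sharp. Here’s a minimal Python sketch of that last step, assuming the depth map has already been estimated (the hard computer-vision part) and substituting a stack of Gaussian blurs for a true thin-lens blur. The lens_blur function and its parameters are illustrative, not Google’s actual API:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lens_blur(image, depth, focal_depth=0.2, max_sigma=8.0, levels=6):
    """Blur each pixel in proportion to its distance from the focal plane.

    image: H x W x 3 float array in [0, 1]
    depth: H x W float array in [0, 1], where 0 = near and 1 = far
    focal_depth: the depth value that should stay perfectly sharp
    """
    # Blur strength grows with distance from the chosen focal plane.
    sigma_map = max_sigma * np.abs(depth - focal_depth)

    # Precompute a small stack of uniformly blurred copies of the image;
    # sigma=(s, s, 0) blurs spatially but not across color channels.
    sigmas = np.linspace(0.0, max_sigma, num=levels)
    stack = [gaussian_filter(image, sigma=(s, s, 0)) for s in sigmas]

    # For each pixel, pull from the stack layer whose blur best matches
    # that pixel's sigma (nearest layer; a real renderer interpolates).
    idx = np.rint(sigma_map / max_sigma * (levels - 1)).astype(int)
    idx = np.clip(idx, 0, levels - 1)

    out = np.empty_like(image)
    for i, layer in enumerate(stack):
        out[idx == i] = layer[idx == i]
    return out
```

A real lens blurs out-of-focus points into discs sized by the thin-lens equation rather than Gaussians, which is why highlights from a real lens bloom into circles while a sketch like this just goes soft. The per-pixel depth lookup also hints at why the feature stumbles in some of the shots below: one bad depth estimate drops a pixel into the wrong blur layer, and angled or concave surfaces are exactly where depth estimates get shaky.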

Playing Around With Lens Blur

To see how it stacks up, we took a few comparison shots using the new Google Camera app, an iPhone 5S, and a Sony RX100 II. The comparison isn’t about image quality, which of course differs wildly among such dissimilar shooters.

Instead we’re looking at how (and if) a few different categories of device pull off that dreamy shallow depth of field effect—the blurred points known as “bokeh” in this style of shot. As any photographer knows, not all bokeh is created equal—the quality of the effect varies quite a lot among devices and lenses. Most of all, we just wanted to see what makes Google’s new trick tick.

The shots below were both taken with a Nexus 4 using Lens Blur. Note how things get a little dicey when the depth isn’t as simple as a single foreground object against a distant background.

Lens Blur vs. Other Cameras

[Sample shots, left to right: iPhone 5S · Sony RX100 II · Lens Blur on Google's Camera App]

While the iPhone 5S’s f/2.2 lens didn’t feel like doing much in the way of blur, the Sony RX100 II humored our test at f/1.8 in aperture priority mode. Google’s Lens Blur did a nice enough job blurring the background, but it didn’t like the angled depth of that tiny jaguar much.

[Sample shots, left to right: iPhone 5S · Sony RX100 II · Lens Blur on Google's Camera App]

Again, the iPhone 5S didn’t really give us a shallow depth of field—super close macro shots are where it really shows off—but Lens Blur did a pretty nice job here.

[Sample shots, left to right: iPhone 5S · Sony RX100 II · Lens Blur on Google's Camera App]

Lens Blur did not like the concave depth of this little bowl. Its effect is obviously the most successful when the depth mapping is a little less mind-bending. 

All told, Google’s new camera app is pretty cool, taking the selective focus feature so readily abused by Instagram users and ramping it up a few notches. It doesn’t work for every kind of shot—but when it does, it’s awfully dreamy, isn’t it?

Header image by Cee Webster; sample images by Taylor Hatmaker for ReadWrite
