The mobile augmented reality (AR) industry has seen a tremendous amount of growth over the last year. No longer are we simply holding our phones up looking around for nearby coffee shops – now our magical pocket computers can recognize images and augment them in real-time with 3D graphics. Mobile AR browsers like junaio and Layar have begun to venture into this realm, but a new player, Popcode, has a different spin on the mobile AR interface and how we interact with objects in the real world.
Rouli Nir at Games Alfresco brought Popcode to my attention today with his article introducing Extra Reality Ltd., the British company behind the technology. Available for Android only at the moment, the free app lets users activate AR experiences by scanning one of Popcode’s unique codes.
The codes consist of a series of dots and dashes (like Morse code) placed above and below the Popcode typographic logo. When the app scans one of these markers, it quickly downloads the necessary assets and loads a 3D experience built on markerless tracking of an actual real-world object. The examples in the video below include interactive business cards, maps, and t-shirts.
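As a rough mental model of that flow (Popcode's actual SDK and encoding aren't public, so every name, URL, and encoding choice below is an assumption), the marker acts as nothing more than a lookup key: decode the dot-dash pattern into an ID, fetch the tracking target and 3D assets for that ID, then hand off to markerless tracking.

```python
# Hypothetical sketch of the scan-to-experience flow. The endpoint, file
# format, and dot/dash-to-binary encoding are all invented for illustration;
# Popcode's real system almost certainly differs.
import urllib.request
import json

POPCODE_SERVER = "https://example.com/popcode"  # placeholder URL

def decode_marker(dot_dash_pattern: str) -> str:
    """Turn a scanned dot/dash pattern (e.g. '.-..-') into a code ID,
    reading dots as 0 and dashes as 1 (an assumed encoding)."""
    bits = "".join("1" if c == "-" else "0" for c in dot_dash_pattern)
    return str(int(bits, 2))

def fetch_experience(code_id: str) -> dict:
    """Download the tracking target and 3D asset descriptors for a code."""
    with urllib.request.urlopen(f"{POPCODE_SERVER}/{code_id}.json") as resp:
        return json.load(resp)

# Usage (would need a real server); once the assets are local, the app
# switches to markerless tracking of the real-world object itself:
# experience = fetch_experience(decode_marker(".-..-"))
```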
There is something I like about the idea of using a simple marker to navigate to a larger markerless experience. I didn't have to flip through menus or search for the t-shirt example in order to use it. Simply pointing my phone at the Popcode made jumping right into the experience much quicker and more natural.
There are still uses, of course, for traditional menus and searches, but for object-based experiences like these, scanning a code to launch them makes a lot of sense. And using a prettier typographic code (rather than a blocky QR code) makes these experiences more accessible and raises awareness the way the proposed standardized AR logo would.
Eventually, however, it would be nice to automatically recognize the markerless experience without needing the marker to unlock it – but this is a technological constraint at the moment. Phones and cellular networks aren't really fast enough to query a database with a live stream of video looking for a potentially complex image-based target. That's why it's so much easier to let the simple black-and-white markers act as the gatekeeper: they are smaller and can be stored and matched locally on the device.
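Some very rough back-of-envelope arithmetic shows why. All of the numbers below are my own assumptions about a modest camera preview and upload rate, not measurements of Popcode or any particular phone, but the gap in scale is the point:

```python
# Assumed, illustrative numbers for why a small local marker beats
# streaming live video to a cloud recognizer on today's hardware.
frame_w, frame_h = 320, 240        # modest camera preview resolution
bytes_per_pixel = 1.5              # rough YUV420 figure
fps = 15                           # frames per second sent for matching

upload_per_sec = frame_w * frame_h * bytes_per_pixel * fps  # bytes/second
marker_template = 2_000                                     # bytes, stored on device

print(f"Cloud lookup:  ~{upload_per_sec / 1e6:.1f} MB/s of video upstream")
print(f"Local marker:  ~{marker_template / 1e3:.0f} KB, matched on the phone")
```

With those guesses, continuous cloud matching means pushing roughly 1.7 MB every second over a cellular connection, while the marker template is a couple of kilobytes that never leaves the device.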
Of course there are projects like Google Goggles that are helping this kind of interface evolve, but the give and take between what is done on the phone versus in the cloud is still limiting. Either way, Popcode looks very interesting, but I wonder how many unique codes they can actually make with their system (without adding more rows of dashes and dots).
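For what it's worth, that capacity question is just combinatorics. If we assume, say, one row of twelve dot-or-dash positions above the logo and another below it (parameters I'm guessing at, and a real encoding would likely reserve some positions for error checking), the count works out like this:

```python
# Rough capacity estimate under assumed parameters; the real Popcode layout
# and any error-correction overhead are unknown, so these are guesses.
rows = 2                # one row above and one below the logo
positions_per_row = 12  # assumed number of dot/dash slots per row
symbols = 2             # dot or dash at each position

unique_codes = symbols ** (rows * positions_per_row)
print(f"{unique_codes:,} possible codes")  # 16,777,216 with these guesses
```

Millions of codes is plenty for a launch, but error correction and human-readable spacing would eat into that number quickly.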