When Steve Jobs and Apple announced some of the new features and APIs coming in the iPhone OS 4.0 upgrade, they managed to sneak in a feature that has gone largely unnoticed. On a slide showing a smattering of new APIs, “Full access to still and video camera data” sat quietly at the bottom of the screen, and when Jobs named off a few of the APIs, he left this one out. Mobile augmented reality (AR) developers, who have been champing at the bit for access to raw iPhone camera data, even going so far as to petition Apple for it, immediately took notice.
But wait, can’t apps already access the camera? Until OS 4.0, only partially. Yes, developers could access the camera and include either stills or video in their applications, but the ability to actually analyze a live video feed had been severely hampered. Previously, applications could only pull screenshots of the video feed at a rate of around 15 frames per second; now they will be able to analyze the raw video stream frame by frame.
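To give a rough sense of what frame-level access looks like in practice, here is a minimal sketch built on AVFoundation’s capture pipeline, written in modern Swift rather than the Objective-C of the iPhone OS 4.0 era; treat it as an illustration of the idea, not the exact API Apple announced, and note that the in-frame analysis step is left as a placeholder.

```swift
import Foundation
import AVFoundation

// Minimal sketch: receive each live camera frame as raw pixel data via
// AVCaptureVideoDataOutput. (Requires a camera-equipped device and the
// appropriate camera-usage permission.)
final class CameraFeedReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        // Drop late frames so per-frame analysis never falls behind the live feed.
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)

        session.startRunning()
    }

    // Called once per captured frame; the pixel buffer holds the raw image data.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Analyze pixelBuffer here, e.g. feature detection or tracking for AR.
        _ = pixelBuffer
    }
}
```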
Raw feed access will let applications that currently rely on analyzing video stills complete their tasks much faster, and it opens the door to new types of visually aware apps that process the live data directly. Until now, mobile AR developers have been forced to build “blind AR” apps that merely use the video feed as a backdrop on which to place geo-tagged markers (a simplified sketch of that approach follows below). With access to the raw video data, the environment the camera is capturing can play a much larger role in accurately placing those markers in 3D space.
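To make the contrast concrete, the following sketch shows, in deliberately simplified and hypothetical form, how a blind-AR app places a marker: the horizontal screen position comes entirely from GPS coordinates, the compass heading, and an assumed camera field of view; the video pixels themselves are never consulted. All names and the field-of-view value are illustrative assumptions.

```swift
import Foundation
import CoreLocation

// Blind-AR marker placement: position a geo-tagged point of interest on screen
// using only location and heading, ignoring the camera image entirely.
func horizontalScreenX(user: CLLocationCoordinate2D,
                       poi: CLLocationCoordinate2D,
                       deviceHeadingDegrees: Double,
                       screenWidth: Double,
                       horizontalFOVDegrees: Double = 60) -> Double? {
    // Initial bearing from the user to the point of interest, in degrees from north.
    let lat1 = user.latitude * .pi / 180, lat2 = poi.latitude * .pi / 180
    let dLon = (poi.longitude - user.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let bearing = atan2(y, x) * 180 / .pi

    // Angle between where the camera points and where the POI lies, in [-180, 180).
    var offset = bearing - deviceHeadingDegrees
    offset = (offset + 540).truncatingRemainder(dividingBy: 360) - 180

    // Outside the camera's field of view: the marker is off screen.
    guard abs(offset) <= horizontalFOVDegrees / 2 else { return nil }

    // Map the angular offset linearly across the screen's width.
    return screenWidth / 2 + (offset / horizontalFOVDegrees) * screenWidth
}
```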
While in Colorado for Boulder Startup Week, I’ve had the chance to chat about AR with a few developers and engineers who are excited about live video access in iPhone OS 4.0. Both Brendan O’Connor from SimpleGeo and Vikas Reddy from Occipital agree that the new API is a huge step forward for mobile AR applications – something both companies are looking to delve into further in the near future.
Layar CEO Raimo van der Klein, who chatted with me via email Thursday, says the advancements in the new iPhone OS are “a great opportunity to improve Augmented Reality experiences” across the mobile platform. “We are very excited about this additional API,” he added.
It will be very interesting to watch the advancements in mobile AR and other fields as developers discover new ways to make use of live video data. With a front-facing camera rumored for the next-generation iPhone, mobile AR could also take steps toward experiences much like those seen in desktop applications that use a webcam. For that to happen, though, developers needed access to live video data. Now they have just that.