Take two open source projects, add a little creative hacking and ingenuity, and what do you get? The Android-Kinect project. An engineer who goes by the name DDRBoxman hacked a Galaxy Nexus smartphone together with a projector, a PC and Microsoft’s Kinect API and was able to use “touch” gestures to control the user interface by interacting with the projection. Everybody has been waiting for the user experience brought to us by the film Minority Report. Well, this engineer might have brought us closer than any other hack before.
DDRBoxman works through something called Recursive Penguin, whose website does not make clear whether it is a personal project or some type of company. The Facebook link on Recursive Penguin leads to an Android developer by the name of Colin Edwards, who works for a mobile development studio called Ironclad Mobile (now called Uncodin), based in Austin, Texas. According to its Facebook page, Uncodin has funding from the Bill & Melinda Gates Foundation to create an app to boost math test scores for 9th graders, as well as funding from DARPA for a mobile training application.
DDRBoxman downloaded the Android 4.0 ICS source tree from the Android Open Source Project (AOSP) and created a custom ROM for his Galaxy Nexus. He then sends commands to the Nexus with TUIOForAndroid. TUIO is “an open framework that defines a common protocol and API for tangible multitouch surfaces,” according to TUIO.org. The PC is then configured to translate Kinect input into the touch interface through the open source Kinect API, and voila! We have a tangible user interface on the wall.
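To make the pipeline concrete: TUIO is built on top of the OSC (Open Sound Control) wire format, so the PC side essentially packs cursor positions into OSC bundles and fires them at the phone over UDP. The sketch below is a minimal, hand-rolled illustration of one TUIO 1.1 `/tuio/2Dcur` cursor frame, assuming the listener (such as TUIOForAndroid) is on the default TUIO port 3333; the IP address and session values are placeholders, and a real setup would use a proper TUIO library and the actual Kinect tracking data.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are NUL-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    # Build one OSC message: padded address, type-tag string, then arguments
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        else:
            tags += "s"
            payload += osc_pad(str(a).encode())
    return osc_pad(address.encode()) + osc_pad(tags.encode()) + payload

def osc_bundle(*messages: bytes) -> bytes:
    # OSC bundle: "#bundle", 8-byte time tag (1 = immediate), size-prefixed messages
    out = osc_pad(b"#bundle") + struct.pack(">Q", 1)
    for m in messages:
        out += struct.pack(">i", len(m)) + m
    return out

def tuio_cursor_frame(session_id: int, x: float, y: float, frame: int) -> bytes:
    # One TUIO 1.1 cursor update: alive list, set message, frame sequence number.
    # x and y are normalized to [0, 1]; velocity and acceleration left at zero.
    alive = osc_message("/tuio/2Dcur", "alive", session_id)
    set_msg = osc_message("/tuio/2Dcur", "set", session_id, x, y, 0.0, 0.0, 0.0)
    fseq = osc_message("/tuio/2Dcur", "fseq", frame)
    return osc_bundle(alive, set_msg, fseq)

if __name__ == "__main__":
    # Hypothetical phone address; 3333 is the conventional TUIO UDP port
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(tuio_cursor_frame(1, 0.5, 0.5, 1), ("192.168.1.42", 3333))
```

On the phone, TUIOForAndroid receives frames like this and injects them into the custom ROM as ordinary touch events, which is why stock apps respond to the Kinect gestures without modification.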
The Kinect API is fascinating. Some of the greatest innovations in motion-based input methods are being cooked up at the MIT Media Lab, where researchers use Kinects and their API as a cheap implementation of motion computing. That includes 3D interfaces, motion tracking and an array of other innovations.
Now, think of the potential of mixing Kinect with Android. One of the untapped potentials of Android is that it is not just a mobile platform. Android could run on set-top television boxes or control all of the electronics in your household. The concept of the “smart home” takes a step forward in the marriage between the two open source projects. It is all the more delicious that the sources come from two companies that have historically been at each other’s throats: Google and Microsoft.
Google announced a framework at I/O last year that can bring Android to all of your appliances and devices. Called Android@Home, it was the first signal from Google that Android could have far more uses and be more ubiquitous than most people originally believed. Now, add the Kinect API to Android@Home and you could walk into your kitchen, wave to turn on the lights, and program your microwave from across the room with just a few waves of your hand. Then, go into your living room, where an Android smartphone hooked to a projector runs the Netflix app, and stand in the middle of the room, swiping the air until you find your viewing material for the evening.
This all sounds like some crazy science fiction movie a la Minority Report. It is not. The fact of the matter is that right now, this technology exists. The hack by DDRBoxman is just the beginning. Within the decade, we will see this type of functionality in homes across the world.