Two German researchers, Simon Bergweiler and Matthieu Deru, have come up with a way to explain the heady concept of the semantic web, aka “Web 3.0,” to everyday people who aren’t as steeped in technology trends and lingo as perhaps we are. To do this, the researchers set up an experimental kiosk that lets you use semantic web capabilities with only an iPhone and a swish of your finger.
The kiosk, or “shared interaction space” as the researchers prefer to call it, uses MP3 files to demonstrate semantic web technologies. MP3 files were chosen because they are easy to understand as “things” that can have additional data attached to them (“artist,” “album,” “year,” etc.). That extra data is stored in “ID3 tags,” the portion of the file that describes the song to the computer. An MP3 file on an iPhone, then, is already a semantically annotated object that a computer can read.
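If you’re curious what that looks like in practice, here’s a minimal sketch (my own, not the researchers’ code) of reading those ID3 tags with the Python mutagen library; the file name is just a placeholder.

```python
# A minimal sketch of reading ID3 metadata from an MP3 file.
# Uses the Python "mutagen" library; "song.mp3" is a placeholder path.
from mutagen.easyid3 import EasyID3

tags = EasyID3("song.mp3")

# EasyID3 maps common ID3 frames to friendly names like "artist" and "album".
print("Artist:", tags.get("artist", ["unknown"])[0])
print("Album:", tags.get("album", ["unknown"])[0])
print("Year:", tags.get("date", ["unknown"])[0])
```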
Wait, What’s the Semantic Web Again?
Why is this considered semantic technology? Because the core idea behind the semantic web, the next big leap in computing technology, is a web where “things” (like MP3 files, for example) can be read and understood by computers. A semantic web would better understand our search queries and how objects are linked to one another. The understanding that comes from that network of linked information could even bring about a sort of artificial intelligence, as the web could then deliver information to us that the human mind wouldn’t have been able to surface on its own. That’s somewhat of a simplification of the semantic web, but it works for our purposes in understanding this experiment.
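To make that a little more concrete, the metadata on a single MP3 can be rewritten as RDF triples, the linked-data format at the heart of the semantic web. The sketch below is purely illustrative (the example.org URIs and property names are placeholders I’ve invented), using the Python rdflib library.

```python
# An illustrative sketch of song metadata expressed as RDF triples.
# The example.org URIs and property names are invented placeholders.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/music/")
g = Graph()

song = EX["track/42"]
artist = EX["artist/some-band"]

# Each triple is a (subject, predicate, object) statement a machine can follow.
g.add((song, EX.title, Literal("Some Song")))
g.add((song, EX.performedBy, artist))
g.add((artist, EX.name, Literal("Some Band")))

print(g.serialize(format="turtle"))
```

Because the artist here is a node of its own rather than just a text label, a machine can follow that link out to other songs, videos, or facts about the same band, which is exactly the kind of connection the kiosk takes advantage of.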
The Semantic iPhone Kiosks
To demonstrate semantics in action, participants in the experiment would place an iPhone on the kiosk’s surface and watch as a circle appeared around it. Next to the iPhone, a list of songs arranged by artist, title, or genre would then appear. Elsewhere on the screen were “spotlets,” intelligent information agents that performed actions when songs were dragged onto them. For example, one spotlet played the MP3 when a song was dragged there, while another played YouTube music videos from the same band.
To better understand them, you can check out this YouTube video of spotlets in action.
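To picture how a spotlet might respond to a dropped song, here is a purely hypothetical sketch in Python; the class names and the way a song is represented are my own assumptions, not details of the researchers’ system.

```python
# A hypothetical sketch of spotlets reacting to a dropped song.
# Class and method names are invented; they are not from the Comet system.
from dataclasses import dataclass


@dataclass
class Song:
    title: str
    artist: str


class PlayerSpotlet:
    """Plays the MP3 that gets dragged onto it."""
    def on_drop(self, song: Song) -> None:
        print(f"Playing '{song.title}' by {song.artist}")


class VideoSpotlet:
    """Looks up YouTube videos by the same band."""
    def on_drop(self, song: Song) -> None:
        print(f"Searching YouTube for videos by {song.artist}")


# Dragging a song onto a spotlet simply triggers that spotlet's action.
song = Song("Some Song", "Some Band")
for spotlet in (PlayerSpotlet(), VideoSpotlet()):
    spotlet.on_drop(song)
```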
The tabletop computer looks very much like a homegrown version of Microsoft’s Surface computer, except that the cameras detecting the action on the screen are strung up above the table instead of housed inside and underneath the screen as they are with Surface. The researchers call the system Comet, short for “Collaborative Media Exchange Terminal.”
The rest of the interactions that take place on the computer’s screen are the sort of natural user interface actions that we’ve come to expect from touch screen technology. You can actually touch and drag the MP3s from one spotlet to the next, playing music, watching videos, and getting recommendations for other songs you might like.
According to a CNN article about the technology, the researchers will also soon be launching a web site version of their system, where you’ll be able to drag icons with a cursor instead of your finger. You’ll also be able to use speech commands to interact with the system, which could be an interesting development in home entertainment systems.
This iPhone interactive kiosk is a great example of how the semantic web doesn’t have to be a dry concept that excites no one but technophiles and programmers. Instead, it shows the promise of what the semantic web could bring and how it could affect our everyday lives. Keep an eye on this team and their technology in the future; they are definitely a group to watch.