You've seen those eye tracking heat maps that show where most people look first when they land on a web page - why not turn eye tracking technology like that into a replacement for your mouse or your finger on a touchscreen?
That's what a Danish startup called Senseye claims to be doing; they say they've got software for Android that uses the front-facing camera to track a user's eye movements and then uses them to control what happens on the phone's screen. They're not alone in working on this kind of technology, either. Eye tracking could be a big new way for users to interact with their devices.
If the company can really pull this off, Senseye could join the ranks of Microsoft's Kinect and Surface and touchscreen mobile devices in what people are calling the Natural User Interface (NUI). A Swedish company called Tobii announced in-car eye tracking technology this week as well; these aren't isolated innovations.
First there was the command line for controlling computers, then there was the Graphical User Interface - now there's the NUI. A natural user interface is one so transparent as to seem invisible. It's exciting but just beginning to be explored. Eye tracking as NUI? Apple has been rumored for several years to have licensed Tobii technology for tablet use, but it's not clear that such a use would really work very well.
It seems a little hard to believe that the same level of detailed control over a cursor that you can get from your finger touching the screen can be captured by watching your eyes, but Senseye says it works and it's winning European startup awards. Blogger Martin Bryant profiled the company on The Next Web last night and posted the video you can also watch below.
Robert Stevens, CEO of THiNK Eye Tracking, wrote an in-depth post almost two years ago about the challenges eye tracking technology faces before it can enter consumer markets like this. The biggest issue is that eyes move around a lot - they need to in order to see - and they are far harder to track accurately than fingers touching a screen.
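To see why that's hard, consider what raw gaze data looks like: a jittery stream of points that has to be smoothed into stable "fixations" before it can drive a cursor. A common generic approach is a dispersion-threshold fixation filter - this sketch is an illustration of that standard idea, not Senseye's or Tobii's actual (unpublished) algorithm, and the thresholds are invented for the example.

```python
# Minimal dispersion-threshold fixation filter: collapse runs of jittery
# gaze samples into stable fixation centroids. Generic illustration only -
# not any vendor's real pipeline. Thresholds are arbitrary example values.

def fixations(samples, max_dispersion=30.0, min_samples=5):
    """samples: list of (x, y) gaze points in screen pixels.
    Returns centroids of runs whose bounding box stays small."""
    result = []
    window = []
    for point in samples:
        window.append(point)
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # Gaze jumped (a saccade): close out the previous window
            # if it lasted long enough to count as a fixation.
            if len(window) - 1 >= min_samples:
                done = window[:-1]
                cx = sum(p[0] for p in done) / len(done)
                cy = sum(p[1] for p in done) / len(done)
                result.append((cx, cy))
            window = [point]
    if len(window) >= min_samples:
        cx = sum(p[0] for p in window) / len(window)
        cy = sum(p[1] for p in window) / len(window)
        result.append((cx, cy))
    return result
```

Feed it a dozen samples that hover around one spot and then jump to another, and it returns two centroids instead of twelve noisy points - which is roughly the stability a gaze-driven cursor needs.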
Let's assume it really works, though. I don't know whether you'd really keep it running all the time so it can catch you peeking at your phone and turn it on, for example. I'm also not sure that eye movement will really be a faster system of input than the options we've already got. Maybe it's just different, and appropriate for different circumstances.
I never understood speech interfaces either until I was riding my bike down a street a few weeks ago, unsure of the exact address of the business I was headed to. I pulled out my iPhone, launched the Google App and said its name. The address popped right up and biking while talking to your phone is less risky, I suppose, than other types of input.
Most of the time speech input is slow, imprecise and socially awkward but it clearly has its place, like when you're in transit.
Tobii introduced its own eye-tracking video game last month, too. Asteroids, it seems, is a particularly good use case. Angry Birds? We'll see. Stevens of THiNK Eye Tracking called Tobii's game "Brilliant PR" but emphasized that "eye control is of little practical utility for most people in most situations."
In response to Stevens' critical comments, Tobii UX designer Joakim Isaksson offered the following in a LinkedIn thread last month.
"When eye tracking first starts making its way into the mainstream market, it will simply complement current methods of user input, such as mouse, keyboard, touch and voice. My favorite example of this so far is a nifty feature we have developed called MouseWarp, where the mouse cursor jumps to where you are looking when you start moving it towards your point of regard. It's extremely simple, but especially when used in conjunction with a trackpad on a laptop it is also very powerful. [See this prototype Tobii eye tracking system for controlling a Lenovo laptop which made an appearance this Spring.]
"As the market matures, we will start seeing computers and devices that anticipates what you want to do, for example pre-loading content when you look at something instead of when you click it. As for computer games, eye tracking makes it possible for the game world to react to your gaze; imagine walking in to a bar in GTA7 and when you accidentally gaze at the dodgy guy in the corner he flips out and pulls out his baseball bat. On top of that, graphics rendering can be made more intelligent; why render everything on screen with full Vertex count and Anti-Aliasing, when you're only really looking at 1 % of it at any given time?
"There are countless other possibilities that eye tracking enables, and most of them have yet to be discovered."
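The MouseWarp feature Isaksson describes is simple enough to sketch: warp the cursor to the gaze point only when the user is already moving the cursor, and moving it roughly toward where they're looking. The function name, thresholds and signature below are my own illustrative guesses, not Tobii's API.

```python
# Sketch of the "MouseWarp" idea: jump the cursor to the gaze point when
# cursor motion is deliberate and aimed at where the user is looking.
# All names and thresholds are hypothetical, not Tobii's implementation.

import math

def maybe_warp(cursor, velocity, gaze,
               min_speed=50.0, max_angle_deg=30.0):
    """cursor, gaze: (x, y) in pixels; velocity: (dx, dy) pixels/sec.
    Returns gaze if the motion is fast enough and roughly aimed at the
    gaze point, otherwise returns the cursor position unchanged."""
    speed = math.hypot(*velocity)
    if speed < min_speed:
        return cursor  # too slow to count as a deliberate movement
    to_gaze = (gaze[0] - cursor[0], gaze[1] - cursor[1])
    dist = math.hypot(*to_gaze)
    if dist == 0:
        return cursor  # already there
    # Cosine of the angle between movement direction and cursor->gaze.
    dot = velocity[0] * to_gaze[0] + velocity[1] * to_gaze[1]
    cos_angle = dot / (speed * dist)
    if cos_angle >= math.cos(math.radians(max_angle_deg)):
        return gaze  # moving toward the point of regard: warp
    return cursor
```

The design point here is the one Isaksson makes: gaze never acts alone. The warp only fires when the mouse supplies an explicit signal of intent, which sidesteps the noisy-gaze problem Stevens raises.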
The Read/Write Implications
There's certainly a lot of gee-whiz going on here, but there are also read/write implications to consider. If eye tracking proves a faster, more efficient way to do some things on computers, those things could get done more often. It's nice to think about the human eye being an active participant in media, not just a passive recipient of broadcast messages. If I could sort assets, navigate large maps or otherwise interact with large quantities of information with my eyes, I think I would do more of it.
Is sorting a form of writing? I don't know that it is, but I don't think eye tracking is likely to facilitate richer input than that.
But it's probably a mistake to think of eye tracking happening in isolation. Used in conjunction with other forms of input - touch, mouse, keyboards, sensors - it could be a great asset to a richer set of user interface options.