Juniper Research has released a new study with a bold prediction about the future of human interaction with technology. It found that “gesture and motion control will become vital for certain forms of human-computer interaction in the coming years.”
The study found that by the end of 2016 there will be roughly 168 million devices that utilize motion or gesture tracking. These devices include wearables, virtual reality, and more.
With its current adoption and growth rate, the study suggests there could be as many as 492 million motion and gesture-tracking devices by 2020, nearly triple the 2016 figure. That is an extraordinary adoption rate for such a fundamental change in the way users interact with computers.
See also: Gesture control goes mainstream
This widespread adoption isn't going to be automatic. The study notes that more traditional devices such as smartphones and PCs will be a pain point for adoption. As few as 5% of these devices are expected to ship with some form of gesture or motion-sensing interface built in.
For products like virtual reality headsets, motion and gesture tracking are essential methods of interaction. Tracking the user's hands, head, and body movement is what makes an immersive experience possible.
Wearables such as smartwatches and glasses are also a big focal point for the technology. Tiny screens and small touch-sensitive surfaces limit a wearable's usability. By improving motion tracking and adding gesture control, these devices will become inherently more useful and easier to adopt.
The author of the research paper noted, “VR and wearables have shown the way that gesture and haptics can provide fresh ways to interact with technology.”
The path to wider adoption of these interfaces requires companies to take risks and rethink how humans interact with computers.
Companies like Leap Motion are working on this very problem, using a small external sensor to provide a form of gesture tracking, letting users execute commands with a movement rather than with a keyboard and mouse.
Kinect paved the way for gesture tracking
Years ago, Microsoft's inclusion of the Kinect with its Xbox 360 console opened up a new way to interact with computers. Companies began producing gesture-tracking devices for the PC, and developers started working on ways to put the Kinect to work in Windows.
The problem here is that while this creates a new way to execute commands on the PC, it doesn’t actually change the user interface. Juniper notes: “at the moment this will simply extend current functionalities, holding back adoption across devices as a whole, unless the UI (user interface) paradigm changes.”
For now, the keyboard and mouse will remain the primary way we interact with our PCs, and the touchscreen the primary way we interact with our phones. The difference between what we have now and Minority Report depends on companies being brave enough to make gesture and motion tracking an integral part of the user experience, not just an add-on.