Computers that simply do the same things faster and faster are becoming boring. Been there, done that. But a device that can detect and interpret your emotions? Or intelligently organize a meeting, knowing that one of the participants is jogging at the time? That’s a more interesting proposition. Intel, perhaps surprisingly, is working on both.
At the Intel Developer Forum this week, the chip giant has serious business on tap, presenting the latest iteration of its Core microprocessor line and laying out its software initiatives, including updates from its McAfee security division.
On Monday, however, Intel debuted a book of science fiction stories. Dubbed Imagining the Future and Building It, the book includes a number of stories: some from professional authors like Madeline Ashby and Karl Schroeder, plus more pedestrian efforts from analysts like Rob Enderle. But the most interesting bits come in the introduction, where Intel lays out its vision of the future.
Over the last few years, Intel futurist Brian David Johnson explains, Intel has been running a “futurecasting lab,” where the company whiteboards what the future will look like. These effects-based models help guide Intel’s product development; the company is working on its 2019 model right now.
Everything Changes In 2020
In 2020, however, “something remarkable happens,” Johnson writes. “As we pass 2020, the size of meaningful computational power approaches zero.” In other words, with a microprocessor that small, you can put a computer in just about anything.
“When you get intelligence that small, you can turn anything into a computer,” Johnson writes. “You could turn a table into a computer. All of a sudden, it’s possible to turn your shirt, your chair, even your own body into a computer.”
And in some sense, that’s what Intel showed off in a series of demonstrations on Monday: intelligent interactions between various devices, some containing their own electronic eyes and ears. The goal was to use technology as a bridge between man and machine, giving devices a sense of the context in which they’re used.
If this sounds like the sort of blue-sky forecasting you might hear at an academic conference, you’re not far off. For years, Intel has employed a small team of anthropologists and other social scientists to translate what the company manufactures in its fabs into real-world technology. And this year it pushed into art.
Dishing Out A Smart Bowl
Take, for example, the showcase exhibit, which Intel called “Display without Boundaries”: essentially a smart bowl. The exhibit combined a video projector with Microsoft’s Kinect for Windows to create a projected image that interacted with the surface of the bowl, which served as both a display and a controller. The bowl not only sensed the user’s fingers; photos could also be “swiped” from the bowl to a more traditional wall-mounted display.
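It’s easy to imagine how that swipe handoff might work under the hood. The Python sketch below is purely illustrative, not Intel’s code: it assumes a depth camera that reports tracked hand positions each frame, and the Display class, photo names, and speed threshold are all invented stand-ins.

```python
# Illustrative sketch, not Intel's code: infer a "swipe" from a depth
# camera's tracked hand positions and hand a photo off from the bowl's
# projected surface to the wall display.

SWIPE_SPEED = 0.8    # m/s: minimum horizontal hand speed to count as a swipe
FRAME_DT = 1 / 30    # seconds between consecutive depth-camera frames

class Display:
    """Stand-in for a projected surface (the bowl) or the wall display."""
    def __init__(self, name):
        self.name, self.photos = name, []
    def show(self, photo):
        self.photos.append(photo)
        print(f"{photo} now on {self.name}")

def is_swipe(prev_pos, cur_pos):
    """True when the tracked hand moved horizontally faster than SWIPE_SPEED."""
    (x0, _, _), (x1, _, _) = prev_pos, cur_pos
    return abs(x1 - x0) / FRAME_DT >= SWIPE_SPEED

bowl, wall = Display("bowl"), Display("photo wall")
bowl.show("vacation.jpg")

# Two consecutive hand positions (x, y, z in meters) from the depth camera:
if is_swipe((0.10, 0.2, 0.5), (0.14, 0.2, 0.5)):
    wall.show(bowl.photos.pop())  # hand the photo off from bowl to wall
```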
“This could be my photo album, you know. I could touch those, bring them up on the wall, interact with them, enlarge them, this could be my photo wall,” said Carl Marshall, a graphics software architect and research scientist at Intel. The idea, he said, was not only to establish a display wherever one could be used, but also to create an emotional connection between data and a physical object.
That was the same theme evoked by Margie Morris, a clinical psychologist employed by Intel, whose “Mood Map” projected a collection of images drawn from Instagram onto a similar photo wall and attempted to sense the mood of each.
Traditional sentiment analysis works fairly well for text; describing a person as “awesome,” for example, is almost always a positive statement. But photos can be a much tougher nut for a computer to crack, since it may be unaware of a photo’s context. Morris’ project used easy clues, such as hashtags, to try to sense the mood, while the color of a photo’s filter provided others.
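As a rough illustration of that kind of clue-based inference, here is a hypothetical Python sketch; the hashtag and filter lexicons, and the function name, are invented for illustration and are not Morris’ actual model.

```python
# Hypothetical sketch of clue-based mood inference: explicit hashtags
# give the strongest signal, and the photo's filter adds a weaker one.
# Both lookup tables below are invented examples.

HASHTAG_MOODS = {"blessed": "joy", "tbt": "nostalgia", "ugh": "frustration"}
FILTER_MOODS = {"valencia": "warm/upbeat", "inkwell": "somber", "lofi": "dramatic"}

def infer_mood(hashtags, filter_name):
    """Return a best-guess mood label, or None if no clue matches."""
    for tag in hashtags:
        if tag.lower() in HASHTAG_MOODS:           # an explicit tag wins
            return HASHTAG_MOODS[tag.lower()]
    return FILTER_MOODS.get(filter_name.lower())   # fall back to the filter

print(infer_mood(["beach", "blessed"], "Valencia"))  # -> joy
print(infer_mood(["sunset"], "Inkwell"))             # -> somber
```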
“Instead of a thumbs up / thumbs down, you’re trying to establish an emotional connection,” Morris said.
The Mood Map does two things, Morris explained. It provides an emotional “map” of the photos, outlining each in a particular color according to its assessed mood. It also allows users to track the emotional path of a particular image depending on their own mood, and to assign their own “mood” tag to the image itself.
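A minimal sketch of those two behaviors, again in Python, might look like the following; the colors, data structures, and image names are all invented for illustration.

```python
# Invented example of the Mood Map's two behaviors: outline each photo
# in a color keyed to its assessed mood, and let users attach their own
# mood tags, building up an image's emotional path over time.

MOOD_COLORS = {"joy": "#ffd700", "somber": "#4b5d8a", "nostalgia": "#c97b4a"}

photo_moods = {"vacation.jpg": "joy", "rainy_day.jpg": "somber"}
user_tags = {}   # image -> list of (user, mood) tags, in order

def outline_color(image):
    """Color used to outline the image on the projected Mood Map."""
    return MOOD_COLORS.get(photo_moods.get(image), "#999999")

def tag_mood(image, user, mood):
    """Record a user's own mood tag for an image."""
    user_tags.setdefault(image, []).append((user, mood))

tag_mood("vacation.jpg", "alice", "nostalgia")
print(outline_color("vacation.jpg"), user_tags["vacation.jpg"])
```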
Context Is King
A number of other demos also attempted to provide context, typically in a more generic way. A handful of Intel researchers are working on projects that sense context from a device’s sensors, most often a phone’s. For example, if the Intel research framework senses that a user’s phone is moving (because its owner is out jogging, say), it won’t ring the user’s desk phone for a scheduled conference call. A related social framework would “sniff” the phone’s microphone to determine whether the user was in a car, then send a voice message instead of a text so as not to distract the driver. All of this would require the user’s permission, of course.
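In code, that routing decision could be as simple as the hedged sketch below. The boolean inputs stand in for real sensor classifiers (accelerometer-based motion detection, cabin-noise recognition) that such a framework would compute from live data; none of this is Intel’s actual implementation.

```python
# Invented sketch of context-aware notification routing. The booleans
# stand in for classifiers a real framework would run on sensor data.

def route_notification(moving, in_car):
    """Decide how (or whether) to reach the user for a scheduled call."""
    if in_car:
        return "voice message"  # don't tempt the driver to read a text
    if moving:
        return "mobile only"    # owner is out jogging: skip the desk phone
    return "desk phone"

print(route_notification(moving=True, in_car=False))  # -> mobile only
print(route_notification(moving=False, in_car=True))  # -> voice message
```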
The overall goal, Intel said, was to eliminate what author Madeline Ashby called our “hermit crab” relationship with technology, where our digital history is defined by the devices that we have used, used up and discarded.
This has implications not only for conservationists, but for consumers and manufacturers alike: forging an emotional connection between consumer and technology means that users will likely value, and hold on to, an electronic device far longer than they otherwise would.
Photos by Mark Hachman.