I’ve seen several tweets this morning calling Microsoft’s plan to offer emotion-based advertisements through Kinect “creepy.” Indeed, only the most hardened technologist would think that having their body movements, speech patterns and reactions analyzed through Kinect’s camera is anything but unsettling. But how is this much different from what we already reveal every day through the click signals we send to Google, Facebook and dozens of other websites?
Microsoft filed a patent application on June 7 that indicates the company is working on advertising targeted to emotions picked up by Kinect. The company has already suggested it would incorporate emotion detection into Xbox games.
Now, however, those emotions may be used to sell you products. Feeling blue and in need of some comfort food? Perhaps you see an ad for a special on mac and cheese at the local grocery store. Celebrating a promotion? The liquor store is offering a deal on a case of champagne. Chronically depressed? Here come the ads for Lexapro.
“An advertisement engine selects advertisements that are emotionally compatible based on the assigned emotional states and the desired emotional states provided by the advertisers,” Microsoft wrote in the application.
Revealing the Truth, Even When You Don’t Want to Tell It
Google and Facebook are both getting better and better at figuring out who you are from the clicks you make (or don’t make) while on their sites. One of the biggest problems for Facebook, however, is that the signals you send often reveal the person you aspire to be, or at least the person you want your friends to think you are. How many times have you seen a status update from someone proclaiming love for a husband/wife/significant other, only to see that person change their relationship status a few days or weeks later?
Likewise, we search for a lot of different products on Google; some of those searches are clearly consumer research, while others are idle curiosity, work-related queries or searches made out of plain boredom. Google and Facebook are locked in a race to see which can do a better job of targeting ads to you, but both rely largely on information you consciously input into their systems.
That is changing: The next generation of analytics will pay as much attention to what you don’t click, and to what your friends click and don’t click, as to what you personally click. There are already sophisticated language recognition programs that can tell, with surprising accuracy, whether the person who wrote a status update or tweet was in a good mood or a bad mood.
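To make that claim concrete, here is a toy sketch of the crudest form of text-based mood detection: counting positive and negative words. The word lists and the classify_mood helper below are hypothetical illustrations, not anything Facebook, Google or Microsoft actually uses; real systems rely on trained statistical models and far richer signals.

```python
# Toy illustration only: a lexicon-based mood scorer for short status updates.
# The word lists and classify_mood() are hypothetical; production systems use
# trained models, context and metadata rather than hand-made word lists.

POSITIVE = {"love", "great", "happy", "excited", "awesome", "celebrating"}
NEGATIVE = {"sad", "tired", "angry", "lonely", "depressed", "hate"}

def classify_mood(text: str) -> str:
    """Return 'good mood', 'bad mood' or 'unclear' based on simple word counts."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "good mood"
    if score < 0:
        return "bad mood"
    return "unclear"

print(classify_mood("Celebrating a promotion, so happy!"))  # good mood
print(classify_mood("Tired and lonely tonight."))           # bad mood
```

Even this crude approach hints at why advertisers are interested: a single status update can leak an emotional state the writer never intended to broadcast.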
Now imagine if that insight could be coupled with a full body scan that could reveal all of your emotions, even the ones you may be trying to hide.
The Future Is Closing In
This isn’t as nefarious as it may sound. There are, of course, useful applications for such emotion-sensing devices. A nutrition and personal coaching app that knows you binge on food when you’re depressed might suggest an endorphin-boosting walk or jog before you start craving cookies. A productivity app might suggest you shift focus and work on something else, or even call it a day, when it senses you’re getting too easily distracted from the task at hand.
The ethical question is whether emotion-detecting ads are fair when they essentially manipulate us by playing on feelings we may not even be aware we’re experiencing. There are dozens of pending lawsuits against casinos accused of reeling in despondent, addicted gamblers at their most desperate, knowing those gamblers were looking for one last chance to win back their losses.
The same question applies to social networks and search engines. Most marginally tech-savvy users understand that the ads they see on Facebook are there because they have indicated a preference for one thing over another. But do those ads put us at a disadvantage when we can no longer tell why we’re seeing them, as may be the case when they’re linked to an emotion detected through our word choice, our Kinect camera or some other method?
Photo courtesy of Barone Firenze / Shutterstock.com.