The European Commission is giving financial backing to a company that
claims its technology can read your emotional state by just having you
look into a webcam. There is some sceptical reporting of this story here.
Highlights:
"Realeyes is a London based start-up company that tracks people's facial reactions through webcams and smartphones in order to analyse their emotions. ...
Realeyes has just received a 3,6 million euro funding from the European Commission to further develop emotion measurement technology. ...
The technology is based on six basic emotional states that, according to the research of Dr Paul Ekman, a research psychologist, are universal across cultures, ages and geographic locations. The automated facial coding platform records and then analyses these universal emotions: happiness, surprise, fear, sadness, disgust and confusion. ...
[T]his technological development could be a very powerful tool not only for advertising agencies, but as well for improving classroom learning, increasing drivers’ safety, or to be used as a type of lie detector test by the police."
Of course, this is utterly stupid. For one thing, it treats emotions as if they are real, tangible things that everyone agrees upon, whereas emotions research is a messy field full of competing theories and models. I don't know what Ekman's research says, or what predictions it makes, but if it really suggests that everything a person is feeling at any given moment can be reduced to one of six (or nine, or twelve) choices on a scale, then I don't think I live in that world (and I certainly don't want to). For another, without some form of baseline record of a person's face, it's going to be close to impossible to tell which distortions of that face are actually being caused by emotion. Think of people you know whose "neutral" expression is basically a smile, and others who walk round with a permanent scowl on their faces.
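To make the baseline point concrete, here is a toy sketch in Python. The "smile score" scale, the thresholds and the numbers are all invented for illustration, not anything Realeyes actually uses; the point is only that the same raw reading gets classified as "happy" for the person whose resting face is half a smile, but not once you compare it against that person's own baseline.

    # Toy illustration of the baseline problem. The 0-100 "smile score"
    # and all thresholds below are invented for illustration only.

    def classify_without_baseline(smile_score, threshold=60):
        """Population-wide rule: anything above the threshold counts as 'happy'."""
        return "happy" if smile_score > threshold else "neutral"

    def classify_with_baseline(smile_score, resting_score, delta=15):
        """Per-person rule: only a clear rise above that person's resting face counts."""
        return "happy" if smile_score - resting_score > delta else "neutral"

    # Person A's resting face is basically a smile; Person B's is a permanent scowl.
    resting_faces = {"A (resting half-smile)": 65, "B (resting scowl)": 20}

    for name, resting in resting_faces.items():
        current = resting + 5   # both are actually feeling nothing in particular
        print(name,
              "| no baseline:", classify_without_baseline(current),
              "| with baseline:", classify_with_baseline(current, resting))

Without knowing what each face looks like at rest, the system has no way to separate expression from physiognomy.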
Now, I don't really care much if this kind of thing is sold to gullible "brand-led" companies who are told that it will help them sell more upmarket branded crap to people. If those companies want to waste their marketing and advertising dollars, they're welcome. (After all, many of them are currently spraying those same dollars more or less uselessly in advertising on Twitter and Facebook.) But I do care when public money is involved, or public policy is likely to be influenced.
Actually, it seems to me that the major problem here is not, as some seem to think, the "big brother" implications of technology actually telling purveyors of high-end perfumes or watches, or the authorities, how we're really feeling, although of course that would be intensely problematic in its own right. A far bigger problem is how to deal with all of the false positives, because this stuff just won't work - whatever "work" might even mean in this context. At least if a "traditional" (i.e., post-2011 or so) camera wrongly claims to have located you in a given place at a given time, it's plausible that you might be able to produce an alibi (for example, another facial recognition camera placing you in another city at exactly the same time, ha ha). But when an "Emocam" says that you're looking fearful as you, say, enter the airport terminal, and therefore you must be planning to blow yourself up, there is literally nothing you can do to prove the contrary. Dr. Ekman's "perfect" research, combined with XYZ defence contractor's "infallible" software, has spoken.
- You are fearful. What are you about to do? Maybe we'd better shoot you before you deploy that suicide vest.
- The computer says you are disgusted. I am a member of a different ethnic group. Are you disgusted at me? Are you some kind of racist?
- Welcome to this job interview. Hmm, the computer says you are confused. We don't want confused people working for us.
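To put rough numbers on the false-positive worry, here is a back-of-the-envelope calculation in Python. The passenger count, detection rate and false-positive rate are all invented for illustration; nobody has published real figures for an "Emocam". Even a system that sounds impressively accurate buries the one genuine alarm under thousands of nervous flyers.

    # Back-of-the-envelope base-rate calculation for an "Emocam" at an airport.
    # Every number below is an assumption chosen for illustration.

    passengers_per_day = 100_000     # assumed daily throughput
    actual_attackers = 1             # assumed genuine threats among them
    sensitivity = 0.90               # assumed chance the camera flags a real attacker
    false_positive_rate = 0.05       # assumed chance it flags an innocent, nervous flyer

    true_alarms = actual_attackers * sensitivity
    false_alarms = (passengers_per_day - actual_attackers) * false_positive_rate

    print(f"True alarms per day:  {true_alarms:.1f}")
    print(f"False alarms per day: {false_alarms:.0f}")
    print(f"Chance that a flagged passenger is an actual threat: "
          f"{true_alarms / (true_alarms + false_alarms):.4%}")

With these (generous) assumptions, practically everyone the system flags is innocent, and the operators have no way of telling which one isn't.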
In a previous life, but still on this blog, I was a "computer guy". In a blog post from that previous life, I recommended the remarkable book, "Digital Woes: Why We Should Not Depend on Software" by Lauren Ruth Wiener. Everything that is wrong with this "emotion tracking" project is covered in that book, despite its publication date of 1993 and the fact that, as far as I have been able to determine, the word "Internet" doesn't appear anywhere in it. I strongly recommend it to anyone who is concerned about the degree to which not only politicians, but also other decision-makers including those in private-sector organisations, so readily fall prey to the "Shiny infallible machine" narrative of the peddlers of imperfect technology.
Not to put too fine a point on it, but "Positivity Lady" did a post-doc in Ekman's lab.