Tuesday, January 4, 2011

"When Computers Keep Watch"

This article from the New York Times (January 1, 2011) describes advances in, and uses of, computerized analysis of visual images of people, including face recognition. The first example is a system that monitored a prison yard during an annual training exercise for correctional officers.
Perched above the prison yard, five cameras tracked the play-acting prisoners, and artificial-intelligence software analyzed the images to recognize faces, gestures and patterns of group behavior. When two groups of inmates moved toward each other, the experimental computer system sent an alert - a text message - to a corrections officer that warned of a potential incident and gave the location.
Other examples include a computer-vision system that reminds hospital personnel to wash their hands when they are supposed to; another, mounted behind a mirror, that can "read a man's face to detect his heart rate and other vital signs;" and a third that can "analyze a woman's expressions as she watches a movie trailer or shops online, and help marketers tailor their offerings accordingly."

Like most pervasive technologies, these computer-vision systems clearly have the potential to be beneficial in many ways, but they could also easily be misused to violate privacy and cause other kinds of harm. As I read the article, the possibility of abuse by employers occurred to me before I reached this passage:
At work or school, the technology opens the door to a computerized supervisor that is always watching. Are you paying attention, goofing off or daydreaming?
Some people will argue that such a use would be justified because it would lead to greater productivity and a thriving economy. Others, like me, would call it tyrannical; and I'd go on to say that there may be problems with ever-growing economies.

The examples above of the mirror that reads vital signs and the computer that monitors the reactions of shoppers or movie watchers are made possible by the research of Rosalind W. Picard and Rana el-Kaliouby at M.I.T. They have worked "for years" to apply "facial-expression analysis software to help young people with autism better recognize the emotional signals from others that they have such a hard time understanding" and co-founded a company, Affectiva, to market the software.

I am most alarmed by the use of these technologies to improve marketing and advertising, the practical science(s) of behavior control. Big business has the money and the incentive to propel the use of this software far and fast. What if the marketers actually perfect their art? Perfect marketing is perfect behavior control, and it might be reached under the flag of economic development with the blessing of our dominant paradigm. I find it bitterly ironic that this may be made possible by the work of researchers who set out to help people with autism.

Ken Pimple, PAIT Project Director