All of that was triggered by these two stories in the Times. The first is just about a handful of Google Glass wannabes. Irritating, invasive, privacy-decomposing - but we're getting braced for the flood.
Cyborgs stalking us all - the actual zombie invasion? - "Seeking a Staredown With Google Glass" | @NYTimes http://t.co/S9QaDAb78c
— Kenneth D. Pimple (@Ethical_PICT) October 13, 2013
But along with it comes this:
Computational voice analysis and diagnosis of mood, personality - "In a Mood? Call Center Agents Can Tell" | @NYTimes http://t.co/1qITVvLKn6
— Kenneth D. Pimple (@Ethical_PICT) October 13, 2013
Somebody give me a "Come on, now!"
Here are my two favorite passages:
The more invasive audio mining also has the potential to unnerve some consumers, who might squirm at the idea of an unknown operator getting an instant entree into their psyche.
That's an understatement.
“It seems to me that the biggest risk of this technology is not that it violates people’s privacy, but that companies might believe in it and use it to make judgments about customers or potential employees,” says George Loewenstein, a professor of economics and psychology at Carnegie Mellon University. “That could end up being used to make arbitrary and potentially discriminatory decisions.”
I don't know about you, but to me this is a no-win proposition. If the software works as advertised, it's the most severe invasion of privacy we're likely to see until Isaac Asimov's psycho-probe comes around. If it doesn't work, but people believe in it, it'll be another source of confusion and another tool in the power-abuser's kit.