Monday, August 24, 2009

Case study: The Presence Clock

Comments on and discussion of this case study are welcome; please use the Comment function. - Ken Pimple

Sensing Presence and Privacy: The Presence Clock

Kalpana Shankar, Ph.D.,
Assistant Professor of Informatics and Computing,
Indiana University Bloomington

Oliver McGraw, B.S.

Tom lives about two hours away from his 89-year-old mother, Judy. He has been quite worried about her since she fell last year. Since she is otherwise healthy, she insists that she does not need a medical alert bracelet. So Tom purchased a pair of Presence Clocks for her birthday.

The two analog Presence Clocks (see picture) are equipped with motion sensors and lights to record motion and presence. The clocks are connected to each other via the Internet. In this way, a family member does not need to be present at the time of remote activity in order to “see” it. Each of the owners of the clocks can sense at a glance the remote activity of the other.

Tom put one of the clocks in Judy’s kitchen and the other in his own. When the clock at Judy’s house detects movement in her kitchen, a green light (near the 3) begins blinking on Tom’s clock. Tom’s clock also shows when Judy was last in her kitchen: the intensity of the blue light at each hour marker on the clock face indicates how much time someone spent near the clock during that hour. For example, a bright light at 4 and a dull light at 12 mean that someone spent quite a bit of time near the clock in the 4:00 hour, and not as much in the 12:00 hour. Similarly, Judy’s clock lets her know when Tom is in his kitchen, or when he was last there. Through these clocks, Judy and Tom can both feel as though they have had contact with each other during the day, and Tom can be reassured daily that his mother has not had an accident.
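The behavior just described, in which motion events on one clock drive the light display on its paired remote clock, can be sketched in a few lines of Python. This is a hypothetical illustration only; the class, its method names, and the full-hour brightness scaling are assumptions for clarity, not the actual product’s design.

```python
from collections import defaultdict

class PresenceClock:
    """Illustrative model of one Presence Clock (hypothetical, not the
    real product's firmware). Tracks seconds of detected motion per hour
    and maps the paired clock's activity to a 0.0-1.0 light intensity."""

    def __init__(self):
        # seconds of detected motion, keyed by hour of day (0-23)
        self.motion_seconds = defaultdict(float)
        self.peer = None  # the remote clock this one is paired with

    def pair_with(self, other):
        self.peer, other.peer = other, self

    def detect_motion(self, hour, seconds):
        """Record local motion; the paired clock 'sees' it remotely."""
        self.motion_seconds[hour] += seconds

    def light_intensity(self, hour):
        """Brightness of this clock's hour marker, reflecting the peer's
        activity: a full hour of presence gives intensity 1.0."""
        if self.peer is None:
            return 0.0
        return min(self.peer.motion_seconds[hour] / 3600.0, 1.0)

judy, tom = PresenceClock(), PresenceClock()
judy.pair_with(tom)
judy.detect_motion(hour=16, seconds=2700)  # 45 min near the clock at 4 pm
judy.detect_motion(hour=12, seconds=300)   # 5 min near the clock at noon
print(tom.light_intensity(16))  # bright marker at 4
print(tom.light_intensity(12))  # dull marker at 12
```

Note that each clock displays only its peer’s activity, which is why Tom’s marker at 4 glows brightly when Judy lingers in her kitchen at that hour.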

After a week, Judy began to feel uncomfortable, as though the Presence Clock were an invasion of her privacy. Tom has started to ask her specific questions: Where did you go this afternoon? Why did you not get up until nine? She knows that she is not as healthy as she used to be, but she wants to stay independent. It was all she could do after her fall last year to persuade him that she did not need to move into an assisted living facility. So she keeps the clock and does not complain. If it makes Tom feel better, maybe he won’t make her move. She decides the clock is better than having to move, but she worries about what Tom will think of next.

Although Tom certainly seems to be a loving, caring son who means well by his purchase of the Presence Clocks, what is clearly troubling in this scenario is the way in which the clock, combined with the use Tom makes of it, invades his mother’s privacy, treats her paternalistically, and ultimately disrespects her autonomy (i.e., her ability to run her affairs as she chooses).


Sandra Shapshay, Ph.D.,
Assistant Professor of Philosophy,
Indiana University-Bloomington

Upon first hearing about this new technology, and having young children, my first thought was: that’s kind of like a baby monitor, but for checking in on seniors. How similar are these technologies?

Well, unlike the baby monitor, the Presence Clock does not pick up actual sounds (or, in fancier models, images and sound). Another dissimilarity is that the Presence Clocks pick up and transmit presence on both ends, whereas the baby monitor is a one-way surveillance device. The original intention behind these devices is also different: the designers of the Presence Clock saw it largely as a kind of gentle “social networking device,” meant to allow people to feel more present in each other’s lives, whereas the baby monitor obviously has no real social networking side. So there are significant differences between the Presence Clock and the baby monitor; but in the use that Tom is making of the clocks in this case, something similar is going on, and it is unsettling.

Like almost any technology, the Presence Clock can be used in an ethically responsible or troubling manner. A hammer can be used by a carpenter to build a chair, or by a robber to break into someone’s house. I’m sure the Presence Clock could be used in a completely innocuous fashion, to help people feel better connected to each other, to make seniors and their children all feel safer. But in the case at hand, we see one way in which the technology can be abused.

Judy is an apparently competent 89-year-old woman. But the case suggests that since Judy’s fall, Tom has been treating her as less than competent to make her own decisions – note “It was all she could do after her fall last year to persuade him that she did not need to move into an assisted living facility.” Indeed, one can detect a subtle threat behind the use of the Presence Clock: keep it, or else you’ll have to move out of your home. Judy is understandably disturbed. Imagine if you had a bout of depression, recovered, but then were told by a paternalizing adult child: you need to use this device or I’ll do what I can to have you committed to a psychiatric facility. Imagine, further, that your son had a good rapport with your psychologist, so that he could probably make good on his threat to have you committed. It is not much of a stretch to see that Judy is in an analogous position. If she refuses to use the Presence Clocks, Tom might take more aggressive measures to have her moved to an assisted living facility, something she really doesn’t want.

Now in both the case of the formerly depressed parent and the case of Judy, it might be that the clock could actually be a benefit to all parties. Say, if I should have a relapse of my depression, and stay in bed for the entire day, while my clock is in the kitchen, then my son might have an early warning about the situation. If Judy should have another fall and would not be able to get to her kitchen all day, she might very well be thankful that Tom was minding his Presence Clock.

Nonetheless, if this expected benefit for me did not outweigh the burdensome invasion of my privacy, then I think in such a situation I would object that my autonomy – that is, my ability to run my affairs as I see fit – as a currently competent adult was being disrespected. Being pressured to use the Presence Clock, and thereby to give up some of my privacy, which I cherish, treats me as less than fully autonomous. It is the hallmark of a liberal society, such as ours, that competent adults be allowed to make decisions for themselves, so long as they don’t hurt anyone else in the process. Judy may be putting herself at greater risk without the Presence Clock, but, as a competent adult, that is her prerogative.

So this leads to my first ethical worry about this technology: It may be used even by well-meaning relatives and friends to treat adults with less respect for their autonomy than they deserve, especially if adults are vulnerable in some way. It is worrisome that their vulnerability may be used to leverage the relinquishing of some personal privacy with this device.

Does this ethical worry about this potential use of the Presence Clock mean that its sale or use should be restricted by law? Not really; a baby monitor might be used in a similar fashion as a surveillance device on adults, but I don’t think that the possibility of a technology’s abuse is a bona fide reason to restrict its sale, singling it out among the many other technologies that may be thus abused, e.g., hammers, baseball bats, household bleach, etc. Nevertheless, it would be a good idea to raise awareness among seniors about this technology, and to empower them to say “no” to its use if it should represent an unwelcome invasion of their privacy, one that is not outweighed by potential benefits, all things considered.

There is another ethical worry I have with this technology, which is much more diffuse and part of a wider cultural phenomenon in the United States: the substitution of technological connectedness for the actual presence of adult children in their parents’ lives. It is not uncommon for grown children to move rather far away from their parents. A recent New York Times article reported that the average distance between adult children and their parents in the U.S. today is two hours’ driving time. For better or for worse, aging parents are being cared for less and less by their adult children and more and more in assisted living facilities and nursing homes. It seems inherent in the logic of the Presence Clock, as applied to relations between seniors and their adult children, to substitute virtual presence for the actual presence of adult children in their aging parents’ lives.

Insofar as this is the case, does the Presence Clock merely rationalize a troubling situation, put a circular band-aid on the problem? Does the Presence Clock make it easier for adult children to feel a bit better about not living up to their filial obligations? Or is the Presence Clock a technological support to seniors and their adult children – making a potentially bad situation a lot better given these sociological realities? Do adult children fulfill their obligations better with technologies such as the Presence Clock? Do seniors find the Presence Clock comforting or a second-rate form of connectedness? Obviously, there are large and difficult questions here dealing with the nature of filial obligations, and whether the current state of care for seniors is right or good.

It is likely that some seniors prefer lesser involvement of their children in their daily lives and that some would prefer more. Each family dynamic is individual and highly complex. But I would like to voice this more ephemeral worry that the Presence Clock may be part of a larger, worrisome trend in relationships between adult children and seniors that does not conduce to the flourishing of seniors or the flourishing of the relationship between seniors and their adult children.

In sum, the Presence Clock might very well fill a safety and connectedness need for seniors and their adult children. However, like any technology, it may be deployed even with the best of intentions in a manner that diminishes the privacy of seniors and disrespects their status as competent adults.

This case and commentary were prepared for and presented at the annual meeting of the Association for Practical and Professional Ethics, Cincinnati, Ohio, March 2009.

Copyright © 2009, Kalpana Shankar, Oliver McGraw, and Sandra Shapshay. All rights reserved.

Permission is hereby granted to reproduce and distribute copies of this work for nonprofit educational purposes, provided that copies are distributed at or below cost, and that the authors, source, and copyright notice are included on each copy. This permission is in addition to rights of reproduction granted under Sections 107, 108, and other provisions of the U.S. Copyright Act.

Thursday, August 20, 2009

“Engineering Towards a More Just and Sustainable World”

As if you needed any additional reasons to attend the PAIT workshop, you might also be interested in attending a mini conference at the APPE annual meeting. The following is from the APPE Web site:

A mini conference, “Engineering Towards a More Just and Sustainable World” will be held Saturday afternoon, March 6 through Sunday Noon March 7, 2010. Registration is $40 for those registered for the preconference workshop, “Ethical Guidance for Research and Application of Pervasive and Autonomous Information Technology (PAIT)” or for the Annual Meeting. Registration for the Mini Conference alone is $70.

Ken Pimple, PAIT Project Director

Marc Rotenberg in Cincinnati March 5, 2010

Here's another good reason to participate in the PAIT workshop: Marc Rotenberg, Executive Director of the Electronic Privacy Information Center (EPIC), will deliver the keynote address at the APPE annual meeting the day after and in the same hotel as the PAIT workshop. The following is from the APPE Web site:

The keynote speaker for the Nineteenth Annual Meeting will be Marc Rotenberg, the Executive Director of the Electronic Privacy Information Center (EPIC) in Washington, DC.

Marc Rotenberg teaches information privacy law at Georgetown University Law Center and has testified before Congress on many issues, including access to information, encryption policy, consumer protection, computer security, and communications privacy. He testified before the 9-11 Commission on “Security and Liberty: Protecting Privacy, Preventing Terrorism.”

Dr. Rotenberg has served on several national and international advisory panels, including the expert panels on Cryptography Policy and Computer Security for the OECD, the Legal Experts on Cyberspace Law for UNESCO, and the Countering Spam program of the ITU. He currently chairs the ABA Committee on Privacy and Information Protection and is the former Chair of the Public Interest Registry, which manages the .ORG domain. He is editor of Privacy and Human Rights and The Privacy Law Sourcebook, and co-editor (with Daniel J. Solove and Paul Schwartz) of Information Privacy Law (Aspen Publishing 2007).

He is a graduate of Harvard College and Stanford Law School and served as Counsel to Senator Patrick J. Leahy on the Senate Judiciary Committee after graduation from law school. He is a Fellow of the American Bar Foundation and the recipient of several awards including the World Technology Award in Law.
Ken Pimple, PAIT Project Director

Wednesday, August 12, 2009

Call for Participation

We've issued a Call for Participation for the PAIT workshop. It's available from the PAIT home page, as are our registration form, travel subsidy policy and application, and working definition. Stay tuned for more to come.

Ken Pimple, PAIT Project Director

Friday, August 7, 2009

Ethics and Assistive Technology Survey

Researchers at the University of British Columbia recently posted a survey on Ethics and Assistive Technology. The group's earlier survey on robot ethics appears still to be open.

From their Web site: "The purpose of the surveys is to facilitate well-informed discussion and explore attitudes about complex issues related to ethics (including animal welfare) and scientific and technological developments."

Thanks to Colin Allen (a member of the PAIT Planning Committee) for bringing these surveys to my attention.

Ken Pimple, PAIT Project Director

Monday, August 3, 2009

Working definition

Comments on and discussion of this working definition are welcome; please use the Comment function. Slight changes made 10/30/2009. - Ken Pimple

The PAIT Planning Committee developed these working definitions to help guide our efforts. They are intended to be useful rather than conclusive.

For the purposes of this workshop, we consider terms such as “pervasive computing,” “ubiquitous computing,” “ubicomp,” “everyware,” “ambient intelligence,” and “ambient computing” to be roughly synonymous. We use the term “information technology” to highlight the important role of hardware not usually associated with computers, such as advanced sensing and communication devices, involved in most pervasive IT. Our shorthand for these technologies and their application is PAIT.

Definition: Pervasive IT devices are small and/or unobtrusive (compared to a desktop computer, for example) and can be embedded in everyday objects (e.g., carpets, clothing, doorways, toys) to collect and/or act upon data generated by or important to human activity. Often the data collected can be wirelessly transmitted, stored, and shared on the Internet. In some instances, several devices will share data and work together toward a common goal. Some will be unobtrusive and generally unnoticed while others will interact perceptibly with people (asking questions, giving reminders).

Some pervasive technologies are also autonomous, or self-directing.

Definition: Autonomous systems are typically computer-based devices augmented with sensing devices beyond those found on a typical desktop computer, including analogues to vision and hearing. An autonomous system can operate for extended periods of time without direct human intervention and alter the way it performs by learning from its own experience. Some autonomous systems can also adapt to particular environments (e.g., by moving safely through a particular house) and some can perform based on non-linear calculations (e.g., Bayesian inference) such that performance cannot be completely predicted or characterized from the system’s programming. Many autonomous systems act only on and through data (as do most desktop computers), but others also act on the physical world (e.g., by welding joints). The latter are considered robots without regard to their physical shape or mobility status (they need not be humanoid and they can be bolted to a factory floor).
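As a rough illustration of the kind of non-linear calculation mentioned above, the sketch below applies Bayesian updating to noisy motion-sensor readings to maintain a belief that a room is occupied. The scenario and the sensor error rates are hypothetical, chosen only to show how such a system’s conclusions emerge from inference over its experience rather than directly from its programming.

```python
def bayes_update(prior, p_fire_if_present, p_fire_if_absent, reading):
    """One Bayesian update step: revise the belief that someone is
    present, given a noisy motion-sensor reading (True = sensor fired)."""
    if reading:
        numerator = p_fire_if_present * prior
        denominator = numerator + p_fire_if_absent * (1 - prior)
    else:
        numerator = (1 - p_fire_if_present) * prior
        denominator = numerator + (1 - p_fire_if_absent) * (1 - prior)
    return numerator / denominator

# Hypothetical error rates: the sensor fires 90% of the time when
# someone is present, and 10% of the time spuriously when no one is.
belief = 0.5  # start with no information either way
for reading in [True, True, False, True]:
    belief = bayes_update(belief, 0.9, 0.1, reading)

print(belief)  # high confidence that someone is present
```

Because each reading shifts the belief multiplicatively rather than linearly, the system’s final state depends on its whole history of observations, which is one simple sense in which an autonomous system’s behavior cannot be read straight off its source code.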