Wednesday, December 12, 2012

Call for papers: IEEE International Symposium on Technology and Society

Call for Papers

2013 IEEE International Symposium on Technology and Society

Social Implications of Living in a Smart World

June 27-29, 2013
Toronto, Ontario, Canada

ISTAS'13 invites original and unpublished work from individuals active in the broad theme of the conference. Authors should submit their papers online. All papers that conform to the submission guidelines will be peer reviewed and evaluated based on originality, technical and/or research content and depth, correctness, relevance to the conference, contribution, and readability. Acceptance of papers will be communicated to authors by email.

We welcome submissions for ISTAS's Technical Program in areas including, but not limited to, the following:

Wearable Computing: cell-phone view, webcams, necklacedome, wearcams, wearcomp, wristwatch computer, point-of-view technologies, mobile CCTV, EyeTap, Google Glasses, sensors, nanotechnology, biomedical devices, implantables, skinput, affective computing, body area networks, cyborgs, interaction-design

Augmediated Reality: vision systems, graphical-based systems, multimedia, interactive systems, location-based services, geolocation mapping, Web 3.0, 4G, sensory inputs, biofeedback, augmented reality, diminished reality, mediated reality, reality mining, remembrance agents, humanistic intelligence, artificial intelligence

The Veillances: surveillance, counter surveillance, dataveillance, sousveillance, equiveillance, inequiveillance, mcveillance, uberveillance, veillance studies, obtrusive, unobtrusive, secret, covert, priveillance, uber analytics, big data

Everyday life: data logging, moblogging, glogging, lifelogging, 24x7 ubiquitous audio-visual recording, state/condition monitoring, permission (opt-out/opt-in), private space, public space, transparent society

Social concerns: privacy, social sorting, data-driven analytics, security, fair use/availability, equity, consent, complexity, rates, power/control, government authority, RF transmissions, resilience, regulatory issues, legislation, sustainability, autonomy, living off the grid, transhumanism

Additional papers on other traditional fields of interest to SSIT are also welcome.

ISTAS '13 will be a transdisciplinary event for engineers, designers, scientists, artists, researchers in the social sciences, law and humanities, decision makers, entrepreneurs, inventors, commercializers, etc., as well as polymaths, and anyone who is a DASSTEMist (Designer, Artist, Sustainist, Scientist, Technologist, Engineer, and Mathematician).

Important Dates

Papers (up to 5,000 words): January 31, 2013
Abstract only (presentation only): February 28, 2013
Author notification: March 31, 2013
Final camera-ready copy: April 29, 2013
Conference: June 27-29, 2013

A selection of extended papers will be published by invitation only, in a special issue of IEEE Technology and Society Magazine in March 2014 guest edited by Steve Mann and Katina Michael. The author guidelines for the special issue in IEEE T&S Magazine can be found here.

Sunday, September 30, 2012

Call for abstracts: Geographic Information Ethics and GIScience

Call for AAG 2013 Abstracts: 

Geographic Information Ethics and GIScience

http://dusk.geo.orst.edu/aag_ethics13.html

Following successful sessions at the 2009-2011 AAG meetings on ethics and GIScience, we are organizing sessions to continue and expand discussions of this important topic. Ethical engagements with the multitude of GIS applications and uses, whether surreptitious or overt, have marked recent developments in the field. Indeed, the variety of applications of geographic information science & technology (GIS&T) has led the U.S. Department of Labor to highlight geospatial/geographic technologies as the third largest high-growth job field for the 21st century. While the potential benefits and risks of geographic technologies are becoming well known, these sessions provide a forum to engage with ethical issues. For instance:

  • Geographic technologies are surveillance technologies. The data they produce may be used to invade the privacy, and even the autonomy, of individuals and groups.
  • Data gathered using geographic technologies are used to make policy decisions. Erroneous, inadequately documented, or inappropriate data can have grave consequences for individuals and the environment.
  • Geographic technologies have the potential to exacerbate inequities in society, insofar as large organizations enjoy greater access to technology, data, and technological expertise than smaller organizations and individuals.
  • Georeferenced photos, tweets and volunteered (and unvolunteered) geographic information can reveal private information. These data, increasingly publicly available and used to study societal phenomena, raise significant privacy concerns.

Papers in these sessions will again engage with the above issues in relation to GIScience, including such topics as:

  • case studies, curriculum development, or the pedagogy of teaching GIS ethical issues; 
  • issues of privacy, surveillance, inequity, erroneous or inappropriate data concerning geographic technologies;
  • codes of ethics and conduct of professional organizations;
  • GIS professional development;
  • reflections on the changing nature of ethical issues in GIS&T.

These sessions are co-sponsored by the AAG GI Systems & Science and Ethics, Justice, and Human Rights Specialty Groups.

To participate: 

  1. Please register for the AAG 2013 meeting and submit your abstract online following the AAG Guidelines (http://www.aag.org/cs/annualmeeting/call_for_papers).
  2. Please send your paper title, PIN, and abstract no later than Wednesday, October 20 to Rodolphe Devillers (rdeville@mun.ca), Francis Harvey (fharvey@umn.edu), or Dawn Wright (dwright@esri.com).

Sunday, July 15, 2012

"Imagining Tomorrow's Computers Today"

ScienceNOW's Jop de Vrieze has published an interesting interview (July 15, 2012) with Brian David Johnson, "principal engineer and futurist" at Intel, the world's largest chip manufacturer, about how people will interact with computers in the near future. Not surprisingly, the interview is upbeat, with a focus on how Intel can anticipate what people will want from more-advanced computers. Here's the passage I find most suggestive about ethical issues.
Q: You study the interaction between humans and computers. What do you foresee for the next ten to fifteen years?
B.D.J: Looking at the past, technology has been about command and control. In the future it will be about relationships. Our technologies will get to know us and we'll become more tightly connected. That has an impact on what we do productivity-wise, but even more it connects us to the things and people we love. Siri, the personal assistant built into your iPhone, is an early example of that. You literally talk with your phone and it can talk back to you. 
Q: In what way does the development of chips play a role in this?
B.D.J: As we move closer to 2020, the size of computational chips is becoming so small that it is approaching zero. This means we could literally turn anything into a computer. Your tea glass, the table, you name it. There is a switch coming, where we do not have to ask: “Can we turn that into a computer,” but we know we can and we wonder: Is there a use to do it? That is what we have the social scientists for. We do not study markets, we study people.
"In the future [technology] will be about relationships." The computers in our every possession, including tables, beds, and clothing, will be an extension of ourselves and will know us well. As will anyone who has access to the personal, "private" data stored and transmitted by our computerized objects.

Intel doesn't study markets; it studies people and their interactions with computers. Is Intel interested at all in the interactions of people, corporations, and governments, with other people's computers?

Ken Pimple, PAIT Project Director

Friday, June 29, 2012

"Will Google's Personal Assistant Be Creepy or Cool?"

This piece by Jenna Wortham (New York Times, June 28, 2012) asks an astute question.

I hadn't heard about Google Now before. It appears to be a kind of pocket butler that will come with the next version of Android. It will connect disparate information about your location, your calendar, and your preferences to give you advice on how to get through the day. Google has a Flash video that makes the thing easier to understand than any merely verbal explanation.

This e-butler (my coinage), according to Wortham, has the potential to "feel like a menacing stalker."
Google Now may also cause people to realize exactly how much data and information Google actually has about their routines and daily lives. And that might cause some people to be very, very uncomfortable, regardless of how useful the service is.
Wortham ends the essay with a particularly nicely written passage on the world of tomorrow (or five minutes from now):
We’re at the beginning of an era, the adolescence, of just beginning to understand what information we want to share and keep private, and when we don’t have a say in the matter. But we’re learning that our data exhaust, the small particulate matter that we deposit around the Web and world through our browsers and mobile devices, is becoming a very powerful tool in aggregate, and that large companies are hoping to use it to their advantage.

Ken Pimple, PAIT Project Director

Monday, June 25, 2012

"A Weapon We Can’t Control"

I've written on Stuxnet five times and mentioned it once on this blog. I was appalled at the first news of Stuxnet I came across, not simply because of the virus' power, but because it appeared that Stuxnet was created by the United States or Israel or both. Most importantly, when a computer virus is let loose on the world, it becomes available to bad actors who can modify it for their own purposes. It's almost as if the bombing of Hiroshima and Nagasaki gave would-be bomb builders 98% of everything they needed to build their own a-bombs.

The author of this op-ed piece, Misha Glenny (New York Times, June 24, 2012), observes:
There is no international treaty or agreement restricting the use of cyberweapons, which can do anything from controlling an individual laptop to disrupting an entire country's critical telecommunications or banking infrastructure. It is in the United States' interest to push for one before the monster it has unleashed comes home to roost.        
We might be headed toward a new Cold War in which mutual destruction can be triggered by any one of thousands of sophisticated programmers. To me, this is just as scary as the previous Cold War, in part because it's much more complicated. A treaty would not be a complete solution, but it would be a valuable tool.

Ken Pimple, PAIT Project Director

Thursday, June 21, 2012

Katina Michael on microchips in humans

In this 15-minute video, Associate Professor Katina Michael of the University of Wollongong (Australia) gives a quick overview of the history of computing, from ENIAC to implantable microchips, then presents three scenarios of likely (or at least possible) uses of microchips in humans.

I very rarely watch anything on YouTube that's over 2-3 minutes. This one is well worth the time. It's informative, gripping, and eye-opening.


Ken Pimple, PAIT Project Director

Friday, June 15, 2012

Call for papers: Ethics of Social Networks for Special Needs Users

Call for Papers
 Ethics and Information Technology
Special Issue on Ethics of Social Networks for Special Needs Users

Millions of persons all around the world are regular users of social networking sites, and their number is still increasing. Online social networking practices often raise unforeseen problems with regard to the rights, needs, and interests of the vulnerable, e.g. children, the elderly, and persons with disabilities. These categories represent what we call “special needs users,” and their social networking practices raise specific challenges. Understanding, supporting, or helping special needs users poses problems of e-inclusion and access to social networks, of protecting them from harm and exploitation, and of accommodating their special needs, supporting their emancipation and political participation, and encouraging solidarity with and among these groups.

This special issue invites submissions of original research exploring the interplay between ethics, online social networks, and special needs users. We are particularly interested in contributions that identify ethical issues and address them by devising policies and proposing design solutions. The social sciences and interdisciplinary studies have seen an increasing number of papers related to Facebook, Google+, and LinkedIn, but most of the literature reflecting on the ethical questions associated with these technologies does not go beyond the consideration of individuals’ privacy. In this special issue, we wish to explore a broader range of ethical issues raised by social networks, with a specific focus on special needs users, including children, the elderly, and persons with disabilities.

Values that come to mind in this context are wellbeing, voice, equality, autonomy and freedom, and usability. Researchers are invited to propose papers addressing the key question of this special issue: what are the specific ethical considerations that need to be addressed in the design, deployment, and governance of social network use by special needs persons? Original articles on, for example, the following themes are welcome:
  1. minimum age and protection of minors;
  2. effects of daily social network use on children's development, including school performance;
  3. cyber-bullying, harassment and violence arising from SN usage amongst children;
  4. accessibility of SN for elderly or disabled persons;
  5. digital divide and e-inclusion; 
  6. ethical issues such as: identity, agency and autonomy for special needs users; 
  7. generational gaps and solidarities arising from SN usage; 
  8. types of solidarities arising from SN usage.
The editors at Ethics and Information Technology are seeking articles for a special issue in these areas. Submissions will be double-blind refereed for relevance to the theme as well as academic rigor and originality. High quality articles not deemed to be sufficiently relevant to the special issue may be considered for publication in a subsequent non-themed issue of Ethics and Information Technology.

Closing date for submissions: 30 September 2012

To submit your paper, please use the online submission system, to be found at www.editorialmanager.com/etin

Please contact the special guest editors for more information: Caroline Rizza and Ângela Guimarães Pereira.

Or the managing editor, Noëmi Manders-Huits.

Ethics and Information Technology (ETIN) is the major journal in the field of moral and political reflection on Information Technology. Its aim is to advance the dialogue between moral philosophy and the field of information technology in a broad sense, and to foster and promote reflection and analysis concerning the ethical, social and political questions associated with the adoption, use, and development of IT.

Friday, June 1, 2012

"Obama order sped up wave of cyberattacks against Iran"

An alarming article in the New York Times (Obama Order Sped Up Wave of Cyberattacks Against Iran, David E. Sanger, June 1, 2012) claims
From his first months in office, President Obama secretly ordered increasingly sophisticated attacks on the computer systems that run Iran’s main nuclear enrichment facilities, significantly expanding America’s first sustained use of cyberweapons, according to participants in the program.
The effort was begun in the Bush administration.

The article is
based on interviews over the past 18 months with current and former American, European and Israeli officials involved in the program, as well as a range of outside experts. None would allow their names to be used because the effort remains highly classified, and parts of it continue to this day.
It's a long, detailed, and sobering article. My basic reaction to this news isn't much changed from what I express in an earlier post.

Ken Pimple, PAIT Project Director

Friday, May 18, 2012

Facebook is your friend. Or not.

About a month ago, the New York Times told us "Facebook Offers More Disclosure to Users" (Kevin J. O'Brien, April 12, 2012). Facebook, eager to "address concerns about the personal information it collects on its users," said it "would provide any user with more about the data it tracks and stores." Facebook did this before, in 2010, by giving users "a copy of their photos, posts, messages, list of friends and chat conversations."

The new version goes beyond that: it
includes previous user names, friend requests and the Internet protocol addresses of the computers that users have logged in from. More categories of information will be made available in the future, Facebook said.
So Facebook is going to let us get a better idea of how much and what kind of data it collects from us. That's mighty big of them.

But, you know, after Facebook's initial public offering (IPO) today, Facebook will have stock holders to keep happy. Which means that Facebook will have to make more money, and then still more money, and then much more money. How will they do it?

As Fred Cate, my colleague at Indiana University, puts it:
When Facebook investors and founders rake in billions of dollars on Friday, they are making that money by selling little bits of each of us.... What Facebook is selling is us. [IU law professor: Facebook IPO will 'sell little bits of each of us', May 16, 2012]
Cate, Distinguished Professor and C. Ben Dutton Professor of Law and director of IU's Center for Applied Cybersecurity Research, knows what he's talking about.

Shouldn't Facebook users at least get a cut of this action?

Ken Pimple, PAIT Project Director

Monday, April 2, 2012

"Police Are Using Phone Tracking as a Routine Tool"

A recent article published in the New York Times ("Police Are Using Phone Tracking as a Routine Tool" by Eric Lichtblau, March 31, 2012) reveals that local police forces - not just the FBI and CIA - are now using cellphone tracking as "a powerful and widely used surveillance tool ... with hundreds of departments, large and small, often using it aggressively with little or no court oversight."

If that's not bad enough, some carriers are profiting at their customers' expense.
The practice has become big business for cellphone companies, too, with a handful of carriers marketing a catalog of “surveillance fees” to police departments to determine a suspect’s location, trace phone calls and texts or provide other services. Some departments log dozens of traces a month for both emergencies and routine investigations.
The police, naturally, claim that the practice saves lives. "Law enforcement officials" contacted by the Times "said the legal questions were outweighed by real-life benefits."
The police in Grand Rapids, Mich., for instance, used a cell locator in February to find a stabbing victim who was in a basement hiding from his attacker.
I assume that the police are mostly tempted to use tools that they find helpful, which is good and proper. But to disregard the legality of such use - that's another matter.

The law in this area is not yet clear, so perhaps the police and the cellphone companies should be given the benefit of the doubt. But I, for one, am unwilling to do so until they convince me that warrants aren't necessary.

Special thanks to the American Civil Liberties Union, which obtained "5,500 pages of internal records ... from 205 police departments nationwide" and provided them to the Times.

Ken Pimple, PAIT Project Director

Wednesday, March 21, 2012

"The Snails of War"

Future headline: "PETA protests robotics facility."

Or has that already happened?

It isn't surprising that researchers with an eye on funding from DARPA are trying to create "tiny, self-powered animal/machine hybrids as an alternative to tiny robots" ("The Snails of War" by James Gorman, New York Times, March 20, 2012). Insects and other small critters already know how to move and sense things and whatnot. All you have to do to make them into weapons or surveillance tools is create some kind of controlling mechanism, a power supply for the electronics, and suitable hardware (weapons, microphones, whatever). Eventually the technology will be commercialized (probably for surveillance, not weapons) and everyone will want an iSlug.

It's a strange world where the potential for military applications can make the strangest of dreams come true. Or maybe I'm just strange to think it is.

Ken Pimple, PAIT Project Director

Thursday, February 23, 2012

"Cars that drive themselves"

Always at the cutting edge of risky behavior, Nevada has "released draft rules to govern self-driving cars, which it has approved for testing on its public roads" (Cars that drive themselves, MSN Money, February 22, 2012). The car will have to have a sober, licensed driver behind the wheel. But it'll be legal to text while driving.

Thanks to Donald Searing for bringing this to my attention.

Ken Pimple, PAIT Project Director

Wednesday, February 22, 2012

"Google to Sell Heads-Up Display Glasses by Year’s End"

The New York Times brings us news of a product whose time has come: Google to Sell Heads-Up Display Glasses by Year’s End by Nick Bilton (February 21, 2012). Augmented reality is on its way.

Read the comments, too.

Ken Pimple, PAIT Project Director

Tuesday, February 21, 2012

Privacy in your car, online, in your pocket, in your back yard

A cluster of articles in the New York Times highlights a number of technologies, practices, and laws that threaten our privacy. Taken in chronological order:
  1. Private Snoops Find GPS Trail Legal to Follow by Erik Eckholm (January 28, 2012). The good news is that on January 23, 2012, "the Supreme Court held that under the Fourth Amendment of the Constitution, placing a GPS tracker on a vehicle is a search," meaning that the police will have to get a warrant to do so. The bad news is that a mere $300 can let you buy "a device no bigger than a cigarette pack, attach it to a car without the driver’s knowledge and watch the vehicle’s travels — and stops — at home on your laptop." You can easily keep track of your teenager, your spouse, your business rival, a potential victim. In some of these cases it would be legal, in others not; but it would be so easy to do and so hard to get caught. (See also Privacy, Technology and Law by Barry Friedman (January 28, 2012) for more on recent Supreme Court decisions.)

  2. Should Personal Data Be Personal? by Somini Sengupta (February 4, 2012). Whether or not it should be, it isn't; rather, it's a commodity so valuable that "In the United States alone, companies spend up to $2 billion a year to collect" data we routinely share on the Internet. Differing cultural attitudes toward privacy - splitting Europe and the U.S., for example - make it difficult or impossible to regulate the ways that international enterprises can collect and use personal data online.

  3. Mobile Apps Take Data Without Permission by Nicole Perlroth and Nick Bilton (February 15, 2012). "Companies that make many of the most popular smartphone apps for Apple and Android devices — Twitter, Foursquare and Instagram among them - routinely gather the information in personal address books on the phone and in some cases store it on their own computers." That we now know about this is a step forward.

  4. Drones Set Sights on U.S. Skies by Nick Wingfield and Somini Sengupta (February 17, 2012). But in a step backward, a new federal law requires the Federal Aviation Administration (FAA) to allow commercial use of drones - small, remote-controlled, inexpensive aerial surveillance devices.
    “As privacy law stands today, you don’t have a reasonable expectation of privacy while out in public, nor almost anywhere visible from a public vantage,” said Ryan Calo, director of privacy and robotics at the Center for Internet and Society at Stanford University. “I don’t think this doctrine makes sense, and I think the widespread availability of drones will drive home why to lawmakers, courts and the public.”
It's a brave new world out there. Or will be soon.

Ken Pimple, PAIT Project Director

Wednesday, February 8, 2012

Call for abstracts: 5th World Congress on Social Media, Mobile Apps, Internet and Web 2.0 in Medicine, Health, Biomedical Research

Call for Abstracts
Medicine 2.0'12 @ Harvard, Boston 
Sept 15-16, 2012 

Medicine 2.0 is the annual academic World Congress on Social Media in Medicine and Health, this year with a new focus on mobile health applications. Last year's 2011 World Congress at Stanford was attended by 450 participants from 26 countries, featured over 100 speakers, and sold out.

Medicine 2.0 is about the future of medicine using emerging technologies such as mobile apps, social media, and Internet-based approaches for public health and medicine, and it looks beyond the hype of other Health 2.0 conferences by focusing on an evidence-based, data-driven approach. Medicine 2.0 is the only conference in this area with a research focus (although we also have a practice track and a business track).

Full papers of conference presentations can be PubMed-indexed and may be eligible for publication in the leading ehealth/mhealth journal, the Journal of Medical Internet Research (Impact Factor: 4.7).

If you are a published researcher or potential speaker in this field, we hope that you will submit an abstract in one of the following areas:
  • Blogs and Twitter in Health 
  • Building virtual communities and social networking applications for health professionals
  • Building virtual communities and social networking applications for patients and consumers
  • Business models in a Health 2.0 / Web 2.0 environment
  • Science 2.0, collaborative biomedical research, academic / scholarly communication, publishing and peer review
  • Consumer empowerment, patient-physician relationship, and sociotechnical issues
  • Ethical & legal issues, confidentiality and privacy
  • Health information on the web: Supply and Demand
  • Innovative RSS/XML applications and Mashups
  • Personal health records and Patient portals
  • Public (e-)health, population health technologies, behavior change applications
  • Digital Disease Detection and Biosurveillance using Twitter and other social media/mhealth/Internet sources
  • The Quantified Self: tracking behavior and health
  • Search, Collaborative Filtering and Recommender Technologies
  • Semantic Web ("Web 3.0") applications
  • The nature and dynamics of social networks in health
  • Usability and human factors on the web
  • Virtual (3D) environments, Second Life
  • Web 2.0 approaches for behaviour change and public health
  • Web 2.0 approaches for clinical practice, clinical research, quality monitoring
  • Web2.0-based medical education and learning
  • Wikis
  • Business modelling in eHealth
  • Communities in health care
  • Digital Learning
  • e-Coaching
  • Health disparities
  • Human-Computer Interface (HCI) Design
  • Online decision technology
  • Participatory health care
  • Persuasive communication and technology
  • mHealth Applications
  • Ubiquitous, pervasive ehealth; domotics; Internet of things
  • New and emerging Technologies
For more information: http://www.medicine20congress.com/ocs/index.php/med/med2012/schedConf/cfp

Sunday, January 15, 2012

"What Fueled Twitter’s Success?"

How did Twitter become so popular? The simple answer is that nothing succeeds like success - the same reason we still have the QWERTY keyboard.

A more sophisticated answer can be found in this article by Paul Hyman (ACM News, January 12, 2012), which is a report on a technical article (Modeling the adoption of innovations in the presence of geographic and media influences by Jameson L. Toole, Meeyoung Cha, and Marta C. González), but from me you're going to get a very minimal, two-part explanation.
  1. Lots of people who lived near each other adopted Twitter early on. The service became available in late March of 2006, and by early August of 2009 "nearly 3.5 million people signed up for Twitter, mainly in cities with high concentrations of young, tech-savvy early adopters like San Francisco and Boston."
  2. Celebrity praise of Twitter - Ashton Kutcher and Oprah Winfrey are mentioned - also boosted the number of users dramatically.
Together, these two forces pushed Twitter users to "a tipping point of 13.5% of the population." And the rest is history.
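The paper by Toole and colleagues builds a far richer agent-based model than anything I can reproduce here, but a toy Bass-style diffusion sketch (my own illustration, with invented parameter values that are not fitted to the Twitter data) shows how a small external "media" push combined with word-of-mouth imitation carries adoption past a tipping point:

    # Toy Bass-style diffusion sketch (illustrative only).
    # p models external/media influence (e.g., celebrity endorsements);
    # q models imitation/word of mouth (e.g., geographic clusters of early adopters).
    M = 1.0              # total potential market, normalized to 1
    p, q = 0.002, 0.4    # hypothetical innovation and imitation coefficients
    dt = 0.1             # time step, arbitrary units

    adopted, t = 0.0, 0.0
    while adopted < 0.135 * M:   # the ~13.5% "tipping point" cited above
        rate = (p + q * adopted / M) * (M - adopted)
        adopted += rate * dt
        t += dt

    print(f"Adoption crosses 13.5% of the population at t = {t:.1f}")

With the external push p set near zero, the same sketch takes far longer to reach the threshold, which is the intuition behind crediting Kutcher and Winfrey with a real share of Twitter's growth.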

So if you want your product to "go viral," release it first in locations with lots of likely early adopters and induce celebrities to plug it. It would probably help to use Twitter or Facebook.

Ken Pimple, PAIT Project Director

Thursday, January 5, 2012

Call for papers: Personal and Ubiquitous Computing

Personal and Ubiquitous Computing
Special Issue on Security and Trust in Context-Aware Systems

Important dates
  • Manuscript submission:   28.02.2012
  • First round of reviews:    31.03.2012
  • Submission of revisions:  21.04.2012
  • Acceptance notification:  21.05.2012
  • Final manuscript due:      18.06.2012
  • Publication date:             Summer 2012
Scope
Over the last several years, the study of security and trust in personal and ubiquitous computing has become an independent research field within the pervasive computing area. One strand has concentrated on using context data to establish security or authentication, while a second strand considers trust in context and in services provided by remote devices. A third strand, motivated by corporate applications, focuses on the resilience of an instrumentation of distributed sources.

Research also considers the question of how much information can be obfuscated to protect the privacy of a user without preventing the correct operation of a given application. Methodologically, new models for specific attack scenarios, security threats, and counter-effects in wireless sensor networks and context-aware mobile systems need to be developed.
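To make the obfuscation question concrete, here is a minimal sketch of one common approach: perturbing a reported location with Laplace noise so that a service still works approximately while the exact position is hidden. The function name, parameter values, and use of NumPy are my own illustrative choices, not anything specified in this call.

    import numpy as np

    def obfuscate_location(lat, lon, scale_m=200.0):
        """Return a perturbed (lat, lon): Laplace noise with the given
        scale (in metres) is added to each coordinate. A larger scale
        hides the user better but degrades location-based services more."""
        metres_per_deg_lat = 111_320.0  # rough conversion near the equator
        metres_per_deg_lon = metres_per_deg_lat * np.cos(np.radians(lat))
        return (lat + np.random.laplace(0.0, scale_m) / metres_per_deg_lat,
                lon + np.random.laplace(0.0, scale_m) / metres_per_deg_lon)

    # Example: report a coarsened position instead of the exact one.
    print(obfuscate_location(52.5200, 13.4050))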

Clearly, these strands intersect varied disciplines, including acquisition and classification of context, cryptography and fuzzy authentication, sensor networks, information theory and interface design.

The objective of this special issue is to provide a platform to bring together the above strands and other emerging paradigms of research in this area and thereby provide further impetus to research on this class of problems.

We solicit original papers and tutorial surveys on the following list of indicative topics.
  • Context-based mobile wireless authentication
  • Context-based device pairing
  • Securing context-aware applications
  • Sensor-, context-, and location-based authentication
  • Spontaneous secure context-based device interactions
  • Autonomic and dependable computing
  • Methods and techniques for self-configuration, self-healing, self-protecting systems
  • Flexible and secure orchestration of ICT services
  • Establishing and managing trust in cyber-physical systems
  • Anonymous/pseudonymous context aware mobile computation
  • Legal and social issues of security and privacy for mobile devices
  • Perception of security and privacy in mobile computing
  • Resilient cryptography
  • Entropy of context based keys
  • Fuzzy cryptography
  • Security with noisy data
  • Usability aspects of secure and privacy-preserving context-aware systems
  • Mechanisms that improve a user's awareness of, and control over,  privacy and security
  • Agent-based methods and architectures for trust and security in  Ubiquitous Computing
  • Contextual reasoning methods for privacy and security in Ubiquitous Computing
  • Ontology-based and knowledge-based methods and architectures
As usual, the above is not an exhaustive list but an indicative one.

Submission Process
Prospective authors should submit a pdf of their manuscript via EasyChair at
https://www.easychair.org/account/signin.cgi?conf=stpuc2012. Formatting should follow the PUC-guidelines (see http://www.springer.com/computer/hci/journal/779 for more details). Submissions should not exceed 8000 words. 

Prior to submitting their papers for review, authors should make sure that they understand and agree to adhere to the over-length page charge policy presented in the PUC guidelines.

Guest editors
René Mayrhofer, Hedda R. Schmidtke, Stephan Sigg

Contact 
SecurityAndTrust2012@easychair.org

(With thanks to Colin Allen for sharing this with me.)
 
Ken Pimple, PAIT Project Director

Wednesday, January 4, 2012

"Robotics: Morals and machines"

This book review by Braden Allenby (Nature 481:26-27, January 5, 2012) covers Robot Ethics: The Ethical and Social Implications of Robotics, ed. Patrick Lin, Keith Abney, and George A. Bekey (MIT Press, 2011). Allenby characterizes the book as "a timely round-up of sensible ethical and policy responses to advances in robot technology" and notes (among other good qualities) that the book "succeeds as a stand-alone text, with its varied contributors striving for objectivity and avoiding hyperbole."

I'll be ordering my copy soon.

Ken Pimple, PAIT Project Director

"The Future of Moral Machines"

This article by Indiana University professor Colin Allen (New York Times, December 25, 2011) is a clearly written and engaging short introduction to the controversies surrounding "moral" machines.

I made the mistake of reading a few of the comments written by Times readers, the majority of whom seem to have either misunderstood the article or not bothered to read beyond the title. One wonders why people so disdainful of the New York Times and/or philosophy would torment themselves by reading the Times' "forum for contemporary philosophers on issues both timely and timeless" known as "The Stone."

At any rate, I do recommend that you read and enjoy the article itself.

Ken Pimple, PAIT Project Director