Tuesday, September 21, 2010

"Code Known as Flash Cookies Raises Privacy Concerns"

This article in today's New York Times concerns the perennial problem of online privacy. The cookies used by Adobe's Flash player are different from those with which most of us are familiar, and are not deleted by the methods available in most browsers to manage cookies. The purpose of Flash cookies is also worrisome:
“The core function of the cookie is to link what you do on Web site A to what you do on Web site B,” said Peter Eckersley, a technologist at the Electronic Frontier Foundation. “The Flash cookie makes it harder for people to stop that from happening.”
Many people appreciate the same-site services that ordinary cookies can provide - showing content based on what I have already read on that site, for example. But sharing that information with a different site is another issue altogether. No doubt it can be used to great advantage for consumers, but it is also an order of magnitude more intrusive than ordinary cookies.

The article touches on several issues that arise across many pervasive technologies. Flash cookies, being unfamiliar and "transparent to the user," are covert. They compile information without our consent or knowledge. And although they can be useful and welcome to users, in the end they are implemented in the pursuit of profit, whether by making our Web experience better (the benign purpose) or by manipulating us - or worse - by exploiting our personal interests.
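Part of what makes Flash cookies covert is that they live outside the browser's own cookie store, as "local shared object" (.sol) files in a Flash Player directory on disk (on Linux, typically under ~/.macromedia/Flash_Player/#SharedObjects; the exact path varies by platform and player version). A minimal sketch of how one might audit them, run here against a throwaway directory standing in for the real storage path:

```python
import os
import tempfile

def find_flash_cookies(root):
    """Walk a directory tree and collect Flash local shared objects (.sol files)."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".sol"):
                hits.append(os.path.join(dirpath, name))
    return sorted(hits)

# Demonstration against a temporary directory that mimics the layout of the
# real Flash storage folder (a random per-profile directory, then one
# subdirectory per site). The site name here is purely illustrative.
with tempfile.TemporaryDirectory() as root:
    site_dir = os.path.join(root, "RANDOM8", "example.com")
    os.makedirs(site_dir)
    open(os.path.join(site_dir, "settings.sol"), "w").close()
    cookies = find_flash_cookies(root)
    print([os.path.basename(p) for p in cookies])  # ['settings.sol']
```

Nothing here is hidden in any technical sense, but because ordinary browser tools never surface these files, most users have no idea they exist - which is exactly the point the article makes.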

Giving businesses and organizations who use these technologies the benefit of the doubt, I have to say that this practice is rude at best. Imagine shopping malls studying their security camera footage to analyze which stores individual consumers visited and using the information to send them targeted advertising. Wouldn't that be obviously intrusive? Why can't Internet-based businesses understand that we don't like to be stalked? Why do so many people assume profit automatically justifies a practice? Shouldn't we decide first whether a practice is acceptable at all, and then determine whether it is suitable to be used for profit?

Ken Pimple, PAIT Project Director

Monday, September 13, 2010

"How to Train Your Robot (to Lie)"

In an earlier entry I expressed my discontent with our current terminology about robots - specifically, that we use the word "robot" to cover too many distinct types of artifacts. This article from ScienceNow illustrates another problem with how we talk about robots, somewhat reminiscent of Jaron Lanier's concerns over The First Church of Robotics.

The claim in the article's title is that a robot has been taught how to lie, but it has only learned how to lay a false trail. This can be seen as deception, or misdirection, or perhaps even camouflage - all of which can be seen in non-human animals and even plants - but hardly lying. No doubt the work represents a breakthrough, but is the hyped-up language necessary or productive? Am I the only one who thinks longingly of the days when science news was a bit more sober and restrained?

Ken Pimple, PAIT Project Director

Wednesday, September 8, 2010

Surveillance for preschoolers; when algos take over

Two tidbits today.

(1) An editorial in today's New York Times, Keeping Track of the Kids, expresses a "worry that we are all becoming a little too blasé about our scrutinized lives" in which we accept security cameras, mobile telephones that allow us to be tracked minute-by-minute, and RFID chips on preschool children. The editorial's final sentence: "Though it may seem innocuous to attach a chip to our preschoolers’ clothes, do we really want to raise a generation of kids that are accustomed to being tracked, like cattle or warehouse inventory?"

It's a nicely phrased question. There are, of course, other ways to put it, like, "do we really want to raise a generation of kids who are accustomed to being protected from predation, not to mention simply being lost?"

What, indeed, would be lost - or gained - by this generation's getting accustomed to being tracked? Their elders apparently became so accustomed with very little fanfare.

(2) Colin Allen, to whom my thanks, brought this ABC program, The flash crash (August 29), to my attention. The program is summarized as follows:
A few months ago the US share market plunged 1,000 points in a few minutes, and trillions were traded both up and down. What caused it, and can it happen again? Tiny high frequency computer algorithms - or algos - roam the markets, buying and selling in a parallel universe more or less uncontrolled by anyone. Did they go feral, or was it the fat finger of a coked out trader? In September US regulators bring out their findings.
One has to be more interested in stock markets than I am to listen to the entire program, or read the entire transcript, but you can find the good stuff easily enough at the very end - about the last 2 minutes in the audio. That's when Colin Allen himself sums up the underlying problems. Enjoy.

Ken Pimple, PAIT Project Director

Tuesday, September 7, 2010

"The Boss Is Robotic, and Rolling Up Behind You"

This article from the New York Times (September 4) is worth a look, if only for the graphic of 5 similar robots and a couple of useful videos of the robots in action. These robots are essentially video conferencing tools on wheels, remotely operated by one person (whose face typically appears on the robot's screen) from an ordinary desktop computer.

The appeal of this technology in certain settings is obvious: An entrepreneur with offices and staff in two or more cities can meet with his staff daily without having to travel; a physician can assist in a health clinic hundreds of miles away. I don't think these robots will become ubiquitous, though.

The terminology bothers me a bit. I wish that we had widely used terms that could distinguish between, for example, factory robots that are bolted in place and perform well-defined and highly patterned movements; mobile robots with some choice-making capacity but very few functions (the Roomba® is a good example); remote-controlled mobile devices like the ones described in this article; and fully mobile machines with a good deal of choice-making capacity. There might be other categories of which I am unaware (or am not bringing to mind just now). I also think it would be useful to distinguish between human-shaped (humanoid) robots, robots so human-like that they could be mistaken for a person (think C-3PO vs. Data), and non-humanoid robots (R2-D2).

If you know of such terminology, please post a comment sharing it with me and the readers of this blog.

Ken Pimple, PAIT Project Director

Monday, September 6, 2010

2nd World Congress on Computer Science and Information Engineering (CSIE 2011)

Call for Papers and Exhibits

17-19 June 2011, Changchun, China

CSIE 2011 intends to be a global forum for researchers and engineers to present and discuss recent advances and new techniques in computer science and information engineering. Topics of interest include, but are not limited to, data mining & data engineering, intelligent systems, software engineering, computer applications, communications & networking, computer hardware, VLSI, & embedded systems, multimedia & signal processing, computer control, robotics, and automation.

All papers in the CSIE 2011 conference proceedings will be indexed in Ei Compendex and ISTP, as well as included in the IEEE Xplore (The previous conference CSIE 2009 has already been indexed in Ei Compendex and included in the IEEE Xplore). IEEE Catalog Number: CFP1160F-PRT. ISBN: 978-1-4244-8361-7.

Changchun is the capital city of Jilin province, situated in the central section of China's northeast region. There are many natural attractions to entertain residents and visitors around Changchun. The grand Changbai Mountain, renowned for its spectacular landscape, charming scenery, glamorous legends, and rich resources and products, has been praised as the first mountain in the northeast and stands as one of China's top ten famous mountains. Other attractions in or around Changchun include Songhua Lake (Songhuahu), Jingyue Lake (Jingyuetan), Changchun Movie Wonderland, Changchun Puppet Palace (Weihuanggong), Changchun World Sculpture Park, and Changchun World Landscape Park.

Important Dates
  • Paper Submission Deadline: 20 September 2010
  • Review Notification: 15 November 2010
  • Final Paper and Author Registration Deadline: 6 January 2011
Contact Information
If you have any inquiries, please email us at CSIE2011@cust.edu.cn.

"A Strong Password Isn’t the Strongest Security"

This article from the New York Times (September 4, 2010) points out that strong passwords, on their own, are useless against keylogging software (the kind that captures your keystrokes and sends them to a bad guy somewhere who then has your username and password, no matter how strong). Cormac Herley, a Microsoft security expert, is quoted as saying, “Keeping a keylogger off your machine is about a trillion times more important than the strength of any one of your passwords."

Furthermore, requiring strong passwords encourages users to write them down, usually in places they can easily be found. So much for security.
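The arithmetic behind Herley's point is easy to sketch. Password "strength" is usually measured in bits of entropy - length times the log of the character-set size - and the illustrative numbers below show how dramatically a longer, richer password raises that figure. But a keylogger captures every keystroke verbatim, so both passwords fall equally fast; the entropy figure is irrelevant to that attack. (The specific charset sizes and lengths here are my own illustration, not from the article.)

```python
import math

def entropy_bits(charset_size, length):
    """Bits of entropy for a password drawn uniformly at random
    from a charset of the given size."""
    return length * math.log2(charset_size)

weak = entropy_bits(26, 8)     # 8 lowercase letters
strong = entropy_bits(95, 16)  # 16 printable-ASCII characters
print(round(weak, 1))    # 37.6
print(round(strong, 1))  # 105.1
# Against brute-force guessing, the second password is astronomically
# harder to crack; against a keylogger, the difference is exactly zero.
```

The extra ~67 bits buy nothing against the threat the article describes, which is why pushing complexity rules onto end users is such a poor trade.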

What I find most interesting about this article, though, is the emphasis on the responsibility of system administrators for security. All the rules for strong passwords shift an unreasonable - and possibly counter-productive - burden onto end users, who might be able to guard against keyloggers, but certainly not as well as sysadmins. As Mr. Herley says, "It is not users who need to be better educated on the risks of various attacks, but the security community. ... Security advice simply offers a bad cost-benefit tradeoff to users."

We're all in this together, to be sure, but let's delegate responsibility appropriately.

Ken Pimple, PAIT Project Director

Friday, September 3, 2010

"Technology Leads More Park Visitors Into Trouble"

This article, published on August 21 in the New York Times, describes some of the ways in which technologies - cell phones, cell phone cameras, satellite location devices - have been used to dangerous or costly effect in U.S. national parks. The stories recounted all strike me as caused by a lack of common sense, or by simple stupidity, abetted by technology. A more generous interpretation is that, at least in some cases, the technology leads some people to take greater risks, confident that they can be saved by dialing 911.

Do such events indict technology, individual boneheads, a society of boneheads, or the nanny state?

Thanks to Don Searing for bringing this article to my attention.

Ken Pimple, PAIT Project Director