April 21, 2004

Workshop on "Privacy and Civil Liberties Issues in Computer Science Research"

I've blogged Barbara Simons's full-day workshop entitled "Privacy and Civil Liberties Issues in Computer Science Research" here. I'll have pictures up soon, and I'll post the entry as it exists now (2004-04-21 10:04:25) in the extended entry below.

(I'll have pictures once they make their way through email!)

I spent Tuesday in the full-day workshop at CFP 2004 led by Barbara Simons entitled "Privacy and Civil Liberties Issues in Computer Science Research". It was simultaneously a blast and exhausting as we all realized how dark the future of privacy is. Here's a brief recap of the workshop (panelist Terry Winograd of Stanford University didn't present but did chime in quite a bit):

  • The first speaker was Susan Landau (Senior Staff Engineer, Sun Microsystems Inc.) with a talk, "Science -- and Thinking About Ethical Solutions". She spoke about a Polish nuclear scientist, Joseph Rotblat, who was involved with the development of the atomic bomb during the 1939-1943 period. His goal in the development process was to make sure that the U.S. got the bomb before Germany. When Germany surrendered, he went through quite an ordeal to sever himself from the atomic bomb development. This prompted other members of the development team to question their goal of building the bomb.

    The discussion after Ms. Landau's talk was quite interesting. It started with a discussion of how the atomic bomb, a bright-line case, is quite distinct from the issues we're dealing with in modern-day technology. That's what makes current ethical dilemmas so hard... no one's necessarily going to die. The other part of the discussion centered around what the proper place for ethics education in computer science and engineering curricula is. Some pointed out that people like Dave Farber, Terry Winograd, and Prof. Kastenberg (Berkeley) have been teaching well-attended "ethics in computer science research" classes for years. Ms. Bajcsy and Barbara Simons pointed out that ethical education should be infused throughout engineering education. I personally feel that a combination of the two is best.


  • Next up was [Ruzena Bajcsy][], the director of CITRIS at the University of California, Berkeley, who pointed out how difficult it is to get research professors to think about the privacy implications of their research. A talkative guy from [CryptoRights][] named del Tordo asked how hard it would be to require PIs to include a privacy-implications statement in their proposals. Ms. Bajcsy seemed to think that this is something the NSF could do and would be responsive to.


  • Next was Andrew Grosso of Andrew Grosso & Associates. Mr. Grosso told a story of a typical day in the life of a modern American and how much we are tracked. His thesis seemed to be that further regulation of privacy rights beyond HIPAA and the Privacy Act wouldn't do much.

  • Philippe Golle from Xerox PARC described some really interesting research that he and his team at PARC were involved with until the TIA funding was cut by Congress. They were looking at a way of making sure that government analysts who don't have a warrant cannot compromise an individual's privacy by requesting certain sets of information. That is, no one piece of data can pinpoint an individual, but a few certain pieces together can. Their system (all on paper) would not allow analysts to request suites of data that would, in aggregate, pinpoint an individual uniquely. To do that, they would need a warrant. Unfortunately (and fortunately), the funding disappeared.


  • Marcia Hofmann from EPIC schooled us about CAPPS II, the new system proposed to screen American plane passengers. Two key takeaways: 1) CAPPS II has taken 30 months to get 2/9 phases of development complete... at this pace it will be 2014 before it is done, although the Transportation Security Administration says it will be "fully operational" (read: Death Star) in 2004; and 2) Congress is very skeptical about CAPPS II and is disappointed that the GAO found that only 1/8 of their concerns were addressed. What was the one concern that was adequately addressed? They set up an oversight committee.


  • Finally, David Culler from the University of California, Berkeley came to talk about privacy in sensor networks. This was the scariest and most interesting talk of the day, in my opinion. What does privacy mean when the world is blanketed with networked sensors? None of us knew... and there were many smart people in the room. Maybe confinement is the way to go... maybe some basic regulation is the way. It's totally unclear. The one clear thing is that this is set to become a whole new field: Sensor Network Policy.
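Golle's query-denial idea above can be sketched in a few lines. To be clear, this is my own toy illustration, not PARC's actual design: the records, attribute names, and the uniqueness test are all made up, but they show how a data broker might refuse warrantless requests for attribute combinations that single somebody out.

```python
# Toy sketch of the query-denial idea (my invention, not PARC's design):
# refuse any attribute combination that, in aggregate, matches exactly
# one person, while still allowing the individually "safe" attributes.
from collections import Counter
from itertools import combinations

# Hypothetical records an analyst might query against.
RECORDS = [
    {"zip": "94704", "age": 33},
    {"zip": "94704", "age": 41},
    {"zip": "10001", "age": 33},
    {"zip": "10001", "age": 41},
]

def is_identifying(attrs, records=RECORDS):
    """True if some value combination over `attrs` matches exactly one record."""
    counts = Counter(tuple(r[a] for a in attrs) for r in records)
    return any(n == 1 for n in counts.values())

def allowed_requests(all_attrs=("zip", "age")):
    """Attribute sets an analyst may request without a warrant."""
    return [
        attrs
        for k in range(1, len(all_attrs) + 1)
        for attrs in combinations(all_attrs, k)
        if not is_identifying(attrs)
    ]
```

In this toy data, each attribute alone matches two people, so `("zip",)` and `("age",)` are allowed, but the pair `("zip", "age")` pins down exactly one record and would be denied without a warrant.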


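The "it will be 2014" projection in the CAPPS II item above is just rate arithmetic. A quick sanity check, under the (admittedly crude) assumption that every phase takes equally long:

```python
# Sanity-check the CAPPS II pace projection: 2 of 9 phases in 30 months.
months_elapsed = 30
phases_done, phases_total = 2, 9
months_per_phase = months_elapsed / phases_done                     # 15.0
months_remaining = (phases_total - phases_done) * months_per_phase  # 105.0
years_remaining = months_remaining / 12                             # 8.75
# Roughly nine more years from 2004 -- on the order of the post's
# "it will be 2014" estimate, versus the TSA's promised 2004.
```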
Posted by joehall at April 21, 2004 10:05 AM