msr faculty summit: information access, privacy, and confidentiality: challenges and opportunities


Came in a little late, and Eric Horvitz is giving examples of how data mining your own activity (driving patterns, for example) can provide useful predictive services. ("Going to the airport? There's a backup on I-5 South.")

Users should control their own data mines. There should be a shroud of privacy between your local private data and other systems. Context and additional content can come from outside, but personal data and the resulting predictive models should stay inside.

Notes that what's offensive and intrusive changes over time. In the 1800s, "rapid photography" was intrusive, and in the 1920s a ringing phone intruded into the private sanctity of the home. Shows some research they've done on people's preferences about sharing what with whom. What kinds of information are treated similarly? Currently working on privacy preferences and tradeoffs with web services. What's the biggest "bang" you can get for the personal data "buck"?

Value follows a diminishing-returns (submodular) curve, while cost follows an accelerating (supermodular) curve. Combining the two lets you find the "sweet spot"--the point where you should stop asking for information because it makes people uncomfortable without giving you much of an increase in value.
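[A toy sketch of the idea, entirely my own and not from the talk: pick any concave (diminishing-returns) value curve and convex (accelerating) cost curve, and the sweet spot is just the point that maximizes net benefit. The specific curves below are made-up illustrations.]

```python
import math

def value(n):
    # Diminishing returns: each additional piece of personal data
    # adds less value than the last (concave, submodular-style).
    return 10 * math.log(1 + n)

def cost(n):
    # Accelerating discomfort: each additional request costs more
    # than the last (convex, supermodular-style).
    return 0.05 * n ** 2

def sweet_spot(max_items):
    # The sweet spot maximizes net benefit, value(n) - cost(n),
    # over the number of data items requested.
    return max(range(max_items + 1), key=lambda n: value(n) - cost(n))

print(sweet_spot(100))
```

Past the sweet spot, the marginal cost of asking for one more item exceeds its marginal value, so net benefit starts falling.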

Shows the survey they used about how much people are willing to share--I actually took this survey about 3 weeks ago. I wonder about the generalizability of this data, though, since the target survey population was very tech-savvy folks, many of whom are well aware of how much information is already stored about them. Notes that there is a rise in "preference and intention machines" that balance risk and benefit.

Next up, Tadayoshi Kohno from the University of Washington. Talks about what privacy "actually means." Starts with dictionary definitions. Argh. I hate it when my students do this. It's too much of a clichéd presentation opening.

One response people have is "privacy is dead, deal with it." Also "I've got nothing to hide." (I hear this a lot from my students.) And users often choose improved functionality over privacy (for example, customer loyalty cards).

On the other hand, some people say that privacy is critical. When people hear about privacy breaches, this can (temporarily) change their views. For example, the AOL search log controversy, or the rollout of Facebook's News Feed.

Shows a news article reporting that loyalty card details have been used in some court cases (e.g., divorce cases). [That would be a nice example to use in my class.]

Who's responsible for protecting private info? The data collector? The user?

Ends with "privacy is not dead, just complicated."

[There are more speakers, but I'm tired of transcribing...]

Oh, great line from an MSR researcher whose name I didn't catch (will fill it in later): "Privacy is a non-renewable resource."



This page contains a single entry published on July 16, 2007 4:23 PM.
