The Implications of Privacy-Aware Choice
Thursday, February 9, 2017, 11:00 am-12:00 pm
Abstract

Privacy concerns are becoming a major obstacle to using data in the way that we want. It's often unclear how current regulations should translate into technology, and the changing legal landscape surrounding privacy can cause valuable data to go unused. In addition, when people know that their current choices may have future consequences, they might modify their behavior to ensure that their data reveal less, or perhaps more favorable, information about themselves. Given these concerns, how can we continue to make use of potentially sensitive data, while providing satisfactory privacy guarantees to the people whose data we are using? Answering this question requires an understanding of how people reason about their privacy and how privacy concerns affect behavior.
 
In this talk, we will see how strategic and human aspects of privacy interact with existing tools for data collection and analysis. I will begin by adapting the standard model of consumer choice theory to a setting where consumers are aware of, and have preferences over, the information revealed by their choices. In this model of privacy-aware choice, I will show that little can be inferred about an individual's preferences once we introduce the possibility that she has concerns about privacy, even when her preferences are assumed to satisfy relatively strong structural properties. Next, I will analyze how privacy technologies affect behavior in a simple economic model of data-driven decision making. Intuition suggests that strengthening privacy protections will both increase utility for the individuals providing data and decrease the usefulness of the computation. I will demonstrate that this intuition can fail when strategic concerns affect consumer behavior. Finally, I will discuss ongoing behavioral experiments designed to empirically measure how people trade off privacy for money, and to test whether human behavior is consistent with theoretical models for the value of privacy.
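
As a purely illustrative toy, and not the model analyzed in the talk, the short Python sketch below assumes that each individual reports a sensitive value truthfully only when the privacy parameter epsilon is below their personal tolerance, while the analyst releases a Laplace-noised average; all thresholds, values, and function names here are invented for illustration. In this toy, weakening privacy (larger epsilon) reduces the added noise but drives misreporting, so the estimate can get worse, mirroring the claim that the usual privacy-accuracy intuition can fail once behavior is strategic.

    import numpy as np

    rng = np.random.default_rng(0)

    def mean_estimation_error(epsilon, n=10_000):
        """Toy: estimate a population mean when individuals misreport a sensitive
        value unless the privacy parameter epsilon is small enough for their taste."""
        true_values = rng.normal(loc=50.0, scale=10.0, size=n)  # sensitive attribute, roughly in [20, 80]
        tolerances = rng.uniform(0.1, 2.0, size=n)              # hypothetical per-person tolerance for epsilon
        truthful = epsilon <= tolerances                        # stronger privacy (smaller epsilon) -> more truthful reports
        reports = np.where(truthful, true_values, 30.0)         # misreporters submit an arbitrary "safe" value
        noise = rng.laplace(scale=100.0 / (n * epsilon))        # Laplace noise scaled as (nominal range)/(n * epsilon)
        return abs(reports.mean() + noise - true_values.mean())

    for eps in [0.1, 0.5, 1.0, 2.0]:
        errors = [mean_estimation_error(eps) for _ in range(200)]
        print(f"epsilon={eps:4.1f}: mean absolute error = {np.mean(errors):.2f}")

Under these made-up numbers, the smallest epsilon (the strongest privacy guarantee) yields the most accurate estimate, because the bias from strategic misreporting at large epsilon dwarfs the extra noise added at small epsilon.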

Bio

Rachel Cummings is a Ph.D. Candidate in Computing and Mathematical Sciences at the California Institute of Technology. Her research interests lie primarily in data privacy, with connections to machine learning, algorithmic economics, optimization, statistics, and information theory. Her work has focused on problems such as strategic aspects of data generation, incentivizing truthful reporting of data, privacy-preserving algorithm design, impacts of privacy policy, and human decision-making. She received her B.A. in Mathematics and Economics from the University of Southern California and her M.S. in Computer Science from Northwestern University. She received the Best Paper Award at the 2014 International Symposium on Distributed Computing, serves on the ACM U.S. Public Policy Council's Privacy Committee, and is the recipient of a Simons Award for Graduate Students in Theoretical Computer Science.

This talk is organized by Adelaide Findlay.