- Changing norms and laws around privacy across time and cultures, including how people balance privacy vs. other goals
- Data aggregation, matching, and de-anonymization strategies
- Facial recognition technology (used by public and private actors)
- Consent for different types and uses of data
Aristotle: public sphere (politics and commerce 🚹) differs from private sphere (domestic life, family and friends 🚺)
- Individuals have interests in shielding the public from what happens in private
Social context is key, regardless of being in public or private
- We want to be able to control what is known about us + how we present ourselves
- Privacy is central to individual autonomy or self-determination
Modern, philosophical definition of privacy:
- “The claim of individuals to determine for themselves when, how, and to what extent information about them is shared with or communicated to others” (Alan F. Westin, Privacy and Freedom, Atheneum, 1970, p. 7)
- Violations of privacy impose harms
- Privacy is about protecting intimacy, freedom, and control (Solove, “‘I’ve Got Nothing to Hide’”)
Solove’s taxonomy of privacy harms (Solove, Daniel J. “A Taxonomy of Privacy.” University of Pennsylvania Law Review, vol. 154, no. 3, 2006, p. 477):
- Information collection (surveillance)
- Information processing (aggregation and inferences from big data)
- Information dissemination (breach of confidentiality, disclosure to third parties, blackmail)
- Invasion of privacy (intrusion, decision interference)
What other values/rights/interests might be in tension with privacy?
- National security (terrorism)
- Public safety (crime)
- Innovation
- Convenience
- U.S. approach to a company’s data: transparency and choice
- U.S. entities inform individuals and provide a choice to consent or not
- …This is appealing because
- our idea of privacy is to control information about ourselves
- we are committed to the idea of a free market
- But that’s only true if…
- Individuals must be able to make informed, rational choices about the costs and benefits of different privacy policies
- The market must be able to deliver a diversity of products with different privacy settings
- We must be able to achieve the societal balance that we want between privacy and other values via a set of decentralized decisions
- …Are those true in practice?
- Are individuals today actually up to the challenge of navigating privacy?
- Social scientists are skeptical (Acquisti 2015). Lawyers are concerned (Solove 2013). Information scientists doubt it (Nissenbaum 2011).
- People are uncertain about their preferences
- Preferences are context-dependent
- Privacy preferences can be manipulated
- Privacy self-management does not scale well
- People cannot factor in aggregation
- People cannot anticipate harm
- Comprehensive privacy regulation
- …denies people the freedom to make choices
- …is not always clear about the trade-off between privacy and data use
- …limits the social benefits of data aggregation
- Improving privacy self-management through:
- Opt-in rather than opt-out consent
- Global rather than local (per-service) management
- Focus on downstream uses of data
- Acceptability of basic privacy norms
- Data privacy often involves a balance of competing interests
- Making data available for meaningful analysis
- for public good: medical research and healthcare improvement, protecting national security
- for private good: personalized advertising
- Deleting direct identifiers doesn’t make PII unidentifiable: quasi-identifiers (e.g., ZIP code, birth date, sex) can still be linked to public records
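The point above can be sketched as a toy linkage attack in the spirit of Sweeney’s classic re-identification result: a “de-identified” dataset is joined to a public record on shared quasi-identifiers. All records, names, and field names below are invented for illustration.

```python
# Toy linkage attack: re-identify stripped-of-names medical records by
# joining them to a public voter roll on (ZIP, birth date, sex).
# All data here is hypothetical.

medical = [  # names deleted, but quasi-identifiers retained
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1962-03-12", "sex": "M", "diagnosis": "asthma"},
]

voter_roll = [  # public records that still carry names
    {"name": "Jane Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "02139", "dob": "1962-03-12", "sex": "M"},
]

def reidentify(medical_rows, public_rows):
    """Join the two datasets on (zip, dob, sex) to attach names to diagnoses."""
    index = {(p["zip"], p["dob"], p["sex"]): p["name"] for p in public_rows}
    matches = []
    for m in medical_rows:
        key = (m["zip"], m["dob"], m["sex"])
        if key in index:
            matches.append((index[key], m["diagnosis"]))
    return matches

# Each "anonymous" medical record is linked back to a named individual.
print(reidentify(medical, voter_roll))
```

When the combination of quasi-identifiers is rare in the population, a single match suffices; deleting the name column alone provides essentially no protection.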
- I’m an outspoken woman on the Internet and that comes with a cost. I need to protect myself and my family from any potential abuse.
- I don’t want to be discriminated against based on criteria I don’t know about.
- I’m going to live for a while — I want to be in charge of what information about me exists in the future for myself and my family.
- I want to be able to protest against my country and my government without fear for my safety.
- I don’t want to be treated differently because of lifestyle choices I intend to keep private.
- I want to be able to make my own decisions without the influence of microtargeting.
- I work for a visible, prominent company, and bad actors could weaponize that against me based on information I intend to keep private.
- Solove, Daniel. “‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy.” San Diego Law Review, vol. 44, 12 Jul 2007, p. 745.
- Acquisti, Alessandro; Brandimarte, Laura; Loewenstein, George. “Privacy and Human Behavior in the Age of Information.” Science, vol. 347, no. 6221, 30 Jan. 2015, pp. 509–514.
- Hill, Kashmir. “The Secretive Company That Might End Privacy as We Know It.” The New York Times, 18 Jan. 2020.
- Green, Matthew. “What is Differential Privacy?” A Few Thoughts on Cryptographic Engineering, 15 June 2016.
- Sarah Igo, The Known Citizen, Introduction, pp. 1-16 (Harvard University Press, 2018)
- Michel Foucault, Discipline and Punish, ch. 3 “Panopticism”
- Stanford Administrative Guide, 6.1.1 Privacy and Access to Electronic Information
- “A Contextual Approach to Privacy Online” by Helen Nissenbaum (Daedalus 140 (4), Fall 2011)
- “The Class Differential in Privacy Law” by Michele Estrin Gilman (Brooklyn Law Review, 2012)
- “Limitless Worker Surveillance” by Ifeoma Ajunwa, Kate Crawford, and Jason Schultz (California Law Review, 2016)
- “Facebook and the ‘Dead Body’ Problem” by Gideon Lewis-Kraus (New York Times, 2018)
- Shoshana Zuboff, The Age of Surveillance Capitalism, Chapter 18 (PublicAffairs, 2019)
- “Privacy and Information Sharing” by Lee Rainie and Maeve Duggan, pp. 1-8 (skim the rest), (Pew Research Center, 2016)
- “Americans feel the tensions between privacy and security concerns” by Shiva Maniam (Pew Research Center, 2016)
- “Privacy and Data Protection in an International Perspective” by Lee A. Bygrave, sections 3-5 (Scandinavian Studies in Law, 2010)
- “Privacy Self-Management and the Consent Dilemma” by Daniel Solove (Harvard Law Review, 2012)
- “Nudging Privacy: The Behavioral Economics of Personal Information” by Alessandro Acquisti (IEEE Security & Privacy, 2009)
- “Private traits and attributes are predictable from digital records of human behavior” by Michal Kosinski, David Stillwell, and Thore Graepel (PNAS, 2013)
- “Differential Privacy: A Primer for a Non-technical Audience” by Alexandra Wood et al. (Vanderbilt Journal of Entertainment & Technology Law, 2018), pp. 211-214 (Executive Summary) and pp. 225-246 (Sections III and IV)
- “Why ‘Anonymous’ Data Sometimes Isn’t” by Bruce Schneier (WIRED, December 2007)
- “The Promise of Differential Privacy: A Tutorial on Algorithmic Techniques” by Cynthia Dwork (Microsoft Research, 2011)
- “Comparing Privacy Laws: GDPR v. CCPA” by DataGuidance and Future of Privacy Forum (2018)
- GDPR, Art. 5 “Principles relating to processing of personal data”
- Eric Glen Weyl and Eric Posner, Radical Markets, ch. 5 “Data as Labor” (Princeton University Press, 2018)
- “A Design for Public Trustee and Privacy Protection Regulation” by Priscilla M. Regan (Stanford Working Paper, 2019)
- “Jaron Lanier Fixes the Internet” by Jaron Lanier and Adam Westbrook (video series) (New York Times, 2019)
- “Information Fiduciaries and the First Amendment” by Jack Balkin (UC Davis Law Review, 2016)
- “A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI” by Sandra Wachter and Brent Mittelstadt, pp. 1-18, 78-85 (Columbia Business Law Review, Forthcoming)
- “We May Own Our Data, but Facebook Has a Duty to Protect It” by Nathan Heller (New Yorker, 2018)
- “State Privacy Legislation Stalls Despite High Hopes” by Ashley Gold (The Information, 2019)
- Stanford Ethics, Technology & Public Policy case study: Facial Recognition
- “Civil Society Letter to Amazon on Facial Recognition” (Human Rights Watch, 2019)
- “The End of Trust” from McSweeney’s and Electronic Frontier Foundation
- “The Perpetual Line-Up: Unregulated Police Face Recognition In America” from Center on Privacy and Technology at Georgetown Law
- Podesta report, “Big Data: Seizing Opportunities, Preserving Values” (especially pp. 58-68)
- “Report on the Telephone Records Program Conducted under Section 215” (Privacy and Civil Liberties Oversight Board, 2014)
- “Facial recognition technology: The need for public regulation and corporate responsibility” by Brad Smith (Microsoft, 2018)