“The commons is the cultural and natural resources accessible to all members of a society,” quoth Wikipedia, “held in common, not owned privately.” We live in an era of surveillance capitalism in a symbiotic relationship with advertising technology, quoth me. And I put it to you that privacy is not just a virtue, or a value, or a commodity: it is a commons.
You may well wonder: isn’t privacy pretty much definitionally “owned privately”? What does it matter to you, or to me, much less to society as a whole, if some 13-year-old somewhere (and her legal guardians) decide to sell her privacy to Facebook for $20 a month? OK, maybe you think rootcerting a teenager is sketchy — but if an adult chooses to sell their privacy, isn’t that entirely their own business?
The answer is: no, actually, not necessarily; not if there are enough of them; not if the commodification of privacy begins to affect us all. Privacy is like voting. An individual’s privacy, like an individual’s vote, is usually largely irrelevant to anyone but themselves … but the accumulation of individual privacy or lack thereof, like the accumulation of individual votes, is enormously consequential.
As I’ve written before, “This accumulation of data is, in and of itself, not a ‘personal privacy’ issue, but a massive public security problem. At least three problems, in fact.” Those are:
- The absence of privacy has a chilling effect on dissidence and individual thought. Private spaces are the experimental petri dishes for societies. No privacy means no experimentation with anything of which society disapproves, especially if it’s illegal. (Which, please note, in recent memory includes things like marijuana and homosexuality; there is a long, long history of “illegal today” becoming “acceptable tomorrow” as societies become less authoritarian.)
- If privacy becomes a commodity, one that only the rich can afford, then the rich can and will use this information asymmetry to threaten and persecute people who challenge the status quo, thereby perpetuating it.
- Accumulated private data can and probably will increasingly be used to manipulate public opinion on a massive scale. Sure, Cambridge Analytica were bullshit artists, but in the not-too-distant future, what they promised their clients could conceivably become reality. No less an authority than François Chollet has written “I’d like to raise awareness about what really worries me when it comes to AI: the highly effective, highly scalable manipulation of human behavior that AI enables, and its malicious use by corporations and governments.”