The Limitations of Privacy Reform Rooted in Interest Convergence by Margaret Foster
A year and a half ago, Governor Jerry Brown signed into law the California Consumer Privacy Act (CCPA), which went into effect on January 1, 2020. Though the bill was simultaneously hailed as a “groundbreaking,” “extremely powerful,” “landmark law,” and criticized as a “punitive . . . mistake,” there’s no dispute that it is currently the strongest privacy law in America, garnering comparisons to the European Union’s sweeping General Data Protection Regulation (GDPR). Kari Paul, California’s Groundbreaking Privacy Law Takes Effect in January. What Does It Do?, The Guardian (Dec. 30, 2019); Zack Whittaker, Silicon Valley Is Terrified of California’s Privacy Law. Good., TechCrunch (Sept. 19, 2019); Natasha Singer, Group Behind California Privacy Law Aims to Strengthen It, The New York Times (Sept. 24, 2019).
But that accolade is somewhat misleading: the CCPA is the strongest privacy regulation in the country because it’s the first privacy regulation in the country. Of course, most people would argue that some privacy protections are better than none, but as the federal and state governments look to California in their race towards the next major piece of privacy legislation, it’s important to understand who the CCPA was made by, who it was made for, and how it ended up treating privacy as a compromise for the market, rather than a human right.
What eventually became the CCPA actually began as a ballot initiative spearheaded by California real estate developer Alastair Mactaggart and his friend, finance executive Rick Arney. Nicholas Confessore, The Unlikely Activists Who Took on Silicon Valley – and Won, The New York Times Magazine (Aug. 14, 2018). After a lobbying group funded by Facebook, Google, and other tech giants launched a campaign to kill his initiative, Mactaggart struck a deal with industry players and California lawmakers to work together on a bill that would aim to satisfy privacy advocates while still allowing Silicon Valley to thrive. Id. The result? What Electronic Privacy Information Center (EPIC) associate director Mary Stone Ross calls a “watered down” and “toothless” law. Mary Stone Ross, I Helped Draft California’s New Privacy Law. Here’s Why It Doesn’t Go Far Enough, Fast Company (Jan. 3, 2020).
Indeed, though the law allows consumers to review the personal information companies collect and for what purposes, opt out of their information being sold to third parties, request that their information be deleted, and sue companies when they fail to prevent data breaches, Cal. Civil Code § 1798.100, the CCPA is largely reactive and thus immensely burdensome to those it’s meant to protect. In other words, instead of requiring companies to proactively restrict their data collection to only that data which is necessary to provide services, or to have consumers opt in to having certain personal information collected and sold, the law allows companies to count on its tremendous information costs to consumers to continue collecting and selling heaps of information.
To benefit from the CCPA’s protections, consumers must not only know the law exists; they also need the time and the Internet know-how to find the links and instructions for requesting to review, delete, and opt out of the sale of their data for each company with which they transact. And because the law does not stipulate a uniform information collection method, the file of personal data that a company must provide upon a consumer’s request could be massive, with a format and size that are extremely difficult to parse and understand. Although the law includes a private right of action that provides for statutory damages for data breaches in violation of the law, § 1798.150, the action is limited to personal information defined much more narrowly than throughout the rest of the statute and cannot be pursued until the consumer has notified the company of the specific statutory violation and given the company 30 days to cure it. Id.
That the CCPA’s primary beneficiary is a consumer with enough resources to pursue a meticulous inventory of their own digital footprint, that its aim is to regulate the private sector while ignoring the public sector, and that its protections and remedies are individualistic rather than collective in nature should not be surprising, given the law’s chief architects. Though the media has declared Alastair Mactaggart an “activist,” see, e.g., Tony Romm, Privacy activist in California launches new ballot initiative for 2020 election, The Washington Post (Sept. 24, 2019), his interests can hardly be disentangled from those of his foes. A self-described capitalist, Mactaggart is a wealthy man whose passion for privacy was ignited only a few years ago, over dinner with a Google engineer. Confessore, supra. While activists, immigrants, and formerly incarcerated individuals have long been subjects of corporate and government surveillance, see Alvaro M. Bedoya, The Color of Surveillance, Slate (Jan. 18, 2016); Barton Gellman & Sam Adler-Bell, The Disparate Impact of Surveillance, The Century Foundation (Dec. 17, 2017), for Mactaggart, “[t]he vast pools of data [that Google and Facebook] collected and monetized were abstractions, something he knew existed, but, as with plane crashes, rarely dwelt on.” Confessore, supra.
But then the Edward Snowden and Cambridge Analytica scandals broke and everyone was exposed as vulnerable to privacy violations. Id. All at once, a wealthy man’s secrets, Big Tech’s business model, and government’s credibility were at stake. Only then did the privacy interests of these disparate parties converge with those most often harmed by unregulated surveillance, and only then did privacy legislation become inevitable. But as Derrick Bell’s original articulation of interest convergence theory revealed, such a point of convergence is both fleeting and finite. Derrick A. Bell, Jr., Brown v. Board of Education and the Interest-Convergence Dilemma, 93 Harv. L. Rev. 518 (1980). For to base legislative reform on interest convergence is to inhibit creative and expansive problem solving from the beginning, and to instead guarantee myopia.
The danger, then, of modeling future privacy legislation on the CCPA is that it will likely be similarly narrow in scope and lacking in forethought, and will thus fail to account for what happens when the interests of consumer, business, and government diverge. By framing personal information solely in the context of business-consumer transactions, the CCPA plays into the notion of privacy as a commodity, thereby both leaving the door open to the disparate impact of a power-imbalanced, anti-poor market and shifting focus away from the primary transgressor of privacy violations: the U.S. government. See, e.g., Barton Gellman, NSA Broke Privacy Rules Thousands of Times Per Year, Audit Finds, The Washington Post (Aug. 15, 2013).
The CCPA has, importantly, moved privacy to the forefront of law and policy in the U.S., but it should nonetheless serve as a warning to privacy advocates that a law rooted in interest convergence will necessarily sidestep the weakest party’s most pressing needs and do little more than superficially and marginally disrupt the status quo. Rather than relying on a team of millionaires receiving input from Facebook and Google, lawmakers currently working on privacy legislation must also seek the perspective of individuals from the communities that have been harmed by targeted mass surveillance and data collection not just financially, but in the context of social welfare services, criminal justice, housing, employment, education, healthcare, and public safety. Such perspective will help ensure that future privacy legislation does not perpetuate or worsen the discriminatory effects of predatory or seemingly benign data mining and monetizing practices, and will help begin reframing privacy not just as a consumer protection issue, but as a human rights issue.
Bio: Meg Foster is a first-year student at Northeastern University School of Law and holds a master’s degree in Information Science from the University of North Carolina at Chapel Hill. She is interested in the disparate impact of surveillance and artificial intelligence and will be clerking at the Electronic Privacy Information Center this summer.