In Defense of Cambridge Analytica: We Really Should be Blaming Surveillance Capitalism

Christie Dougherty

The Federal Trade Commission (“FTC”) issued Cambridge Analytica’s epitaph in late November 2019, when it published its settlement opinion. So, it came as a surprise to many Twitter users on January 1, 2020, when they scrolled through their feeds and read: “Data analytics firm #SCLGroup shut down amidst scandal when extensive data work in the shadows of elections globally was called into question via subsidiary #CambridgeAnalytica. To avoid document confiscation, SCL went bankrupt. Its [sic] time to release the files. #Hindsightis2020.” @HindsightFiles, Twitter (Jan. 1, 2020).  Brittany Kaiser, former Cambridge Analytica business development director, blew the whistle on the company back in 2018 and has now begun leaking internal documents on Twitter under the username @HindsightFiles, stating that “democracy has been hacked.”  Id.  As Cambridge Analytica resurfaces from the dead, consumers, regulators, and the media owe the company some respect in how they frame conversations about it—because Cambridge Analytica, alone, is not the villain.

Over the past three decades, society has had to adapt to the fast-paced evolution of technology.  Rapid technological change has spurred a new set of cultural norms, and things that were once taboo—such as sharing personal information with virtual strangers—are now the status quo.  That said, it is still human nature to want to place blame when violations of these new norms occur.  But while Cambridge Analytica was rightfully charged with deceptive practices under the Federal Trade Commission Act, its actions are not unique.  Blinded by the capabilities and speed of technological advancement, society has allowed these new norms to create the real villain we should all fear: surveillance capitalism.  Cambridge Analytica is but one example of this new specter.

To explain why Cambridge Analytica deserves defending, it is important to understand precisely what happened with the company back in 2014.  SCL Group Ltd. was the parent company of Cambridge Analytica and SCL Elections Limited (“SCL Elections”).  In re Cambridge Analytica, LLC, FTC 1, 3 (Nov. 25, 2019).  At that time, Cambridge Analytica was a data analytics and consulting company “that provide[d] voter-profiling and marketing services.”  Id.  Cambridge Analytica and SCL Elections’ businesses were intertwined through “common business functions, ownership, officers, and employees.”  Id.

In 2013 and early 2014, Cambridge Analytica and SCL Elections became interested in research being conducted by Aleksandr Kogan at the Psychometrics Centre at the University of Cambridge.  His work would allow the company to expand its services to voter-profiling, microtargeting, and similar marketing support for political campaigns and other clients based in the United States.  Id. at 4.  The research suggested that individual personality traits could be predicted from Facebook profile information using “the ‘OCEAN’ scale, a psychometric model that measures an individual’s openness to experiences, conscientiousness, extraversion, agreeableness, and neuroticism.”  Id.  The researchers developed an algorithm that they applied to an individual’s “likes” of public Facebook pages.  According to the researchers, the algorithm “could potentially predict an individual’s personality better than the person’s co-workers, friends, family, and even spouse.”  Id.  Kogan created a corporation, Global Science Research, Ltd. (“GSR”), to enter into an agreement with SCL Elections to carry out the project and separate it from his role at the University of Cambridge.  Id. at 5.
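The FTC opinion does not describe the algorithm’s internals, but the underlying technique can be sketched in a few lines: regress survey-derived trait scores on a matrix of page “likes,” then apply the fitted model to people who never took the survey.  The sketch below is a hypothetical illustration; the model class (ridge regression), the data, and every variable name are assumptions, not a reconstruction of GSR’s code.

```python
# A minimal, hypothetical sketch of like-based trait prediction. Kogan's
# actual model and data are not public; every name and number here
# (likes_matrix, ocean_scores, the choice of ridge regression) is invented.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_users, n_pages = 1_000, 500

# Rows: survey respondents; columns: 1 if the user "liked" a public page.
likes_matrix = rng.integers(0, 2, size=(n_users, n_pages))
# Self-reported OCEAN scores gathered through the in-app survey (0-100).
ocean_scores = rng.uniform(0, 100, size=(n_users, 5))

# Fit one regularized linear model per trait: page likes -> trait score.
model = Ridge(alpha=1.0).fit(likes_matrix, ocean_scores)

# Score a "friend" who never took the survey, using only their page likes.
friend_likes = rng.integers(0, 2, size=(1, n_pages))
predicted = model.predict(friend_likes)[0]
traits = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]
for trait, score in zip(traits, predicted):
    print(f"{trait}: {score:.1f}")
```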

SCL Elections entered into this agreement with GSR while acting for and on behalf of Cambridge Analytica.  Id. at 5.  Cambridge Analytica and SCL Elections entered into a Services Agreement “whereby SCL Elections agreed, among other things, to (a) acquire, for and on behalf of Cambridge Analytica, demographic, transactional, lifestyle, and behavioral data about consumers in target populations; (b) identify and build target voter lists; (c) apply research techniques to understand better the habits and daily lives of target voter groups; and (d) apply psychological profiles to target groups of voters.”  Id.

Together, Cambridge Analytica and GSR launched a Facebook application (more commonly known as an “app”) called GSRApp, designed to capture Facebook users’ profile data.  In total, the app obtained Facebook profile data from approximately 250,000 users who directly interacted with it, as well as data from more than 50 million additional users who were “friends” with those who did.  Id. at 1.  The app generated survey questions for users to complete and requested consent to collect the users’ Facebook profile data, including public Facebook page “likes.”  Id. at 6.  This process allowed Cambridge Analytica to ascertain personality scores for the “Affected Friends, from whom [the app] collected Facebook data but had no survey responses.”  It also enabled the company to match this data to United States voter records, as sketched below, and roll the app out on a wider scale.  Id.
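The opinion does not disclose how inferred profiles were matched to voter records.  As a purely hypothetical illustration, the sketch below joins a personality profile to a voter file on name, birthdate, and city; the records, field names, and matching key are all invented.

```python
# Hypothetical sketch of matching inferred profiles to a voter file.
# The FTC opinion does not disclose the actual matching keys; the
# records, fields, and join logic below are invented for illustration.
profiles = [
    {"name": "Jane Doe", "birthdate": "1980-04-02", "city": "Akron",
     "ocean": {"openness": 71, "neuroticism": 44}},
]
voter_file = [
    {"name": "Jane Doe", "birthdate": "1980-04-02", "city": "Akron",
     "voter_id": "OH-000123"},
]

def match_key(record: dict) -> tuple:
    """Normalize the fields used to link a profile to a voter record."""
    return (record["name"].lower(), record["birthdate"],
            record["city"].lower())

voters_by_key = {match_key(v): v for v in voter_file}
for profile in profiles:
    voter = voters_by_key.get(match_key(profile))
    if voter:
        # The end product: an individually identified voter annotated
        # with inferred personality scores, ready for microtargeting.
        print(voter["voter_id"], profile["ocean"])
```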

While the app requested consent to collect Facebook user data, it stated that it did not download the user’s name or any other personal information, even though “the GSRApp collected the Facebook User ID of those App Users who authorized it.”  Id. at 7.  Ultimately, Cambridge Analytica collected all of the following Facebook profile data from app users: “Facebook User ID; gender; birthdate; location (‘current city’); friends list; and ‘likes’ of public Facebook pages.”  Id.  Additionally, the company collected the “Facebook User ID; name; gender; birthdate; location (‘current city’); and ‘likes’ of public Facebook pages” of “Affected Friends.”  Id.
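To make the asymmetry between these two data inventories concrete, the schema sketch below restates them as record types.  The field names paraphrase the opinion; the actual format in which the GSRApp stored this data is not public.

```python
# Schema sketch of the two record types described in the FTC opinion.
# Field names paraphrase the opinion; GSRApp's real format is not public.
from dataclasses import dataclass, field

@dataclass
class AppUserRecord:
    """A user who installed the GSRApp and saw its consent screen."""
    facebook_user_id: str   # collected despite the app's contrary claim
    gender: str
    birthdate: str
    current_city: str
    friends_list: list[str] = field(default_factory=list)
    page_likes: list[str] = field(default_factory=list)
    survey_responses: dict[str, int] = field(default_factory=dict)

@dataclass
class AffectedFriendRecord:
    """A friend of an app user, profiled without ever using the app."""
    facebook_user_id: str
    name: str               # names were collected for friends
    gender: str
    birthdate: str
    current_city: str
    page_likes: list[str] = field(default_factory=list)
    # No survey responses: traits must be inferred from page likes alone.
```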

Fast forward to 2018, when headlines broke about the data collection and news sources vilified Cambridge Analytica.  Such damning media coverage continues to this day.  But what these news reports have consistently failed to recognize is that this type of psychographic profiling was, and still is, considered an industry standard in advertising technology.  See, e.g., Garett Sloane, Facebook Exec Says Cambridge Analytica Sold ‘Snake Oil, and We Knew It’ in Leaked Memo, AdAge (Jan. 7, 2020); The Great Hack (Netflix 2019).  So why is Cambridge Analytica the only villain in the eyes of the media?

In 2014, when Cambridge Analytica and GSR collected Facebook user data, “Facebook allowed third party apps to collect not only the data of the people who consented to giving it up, but also their friends’ data.”  Lorenzo Franceschi-Bicchierai, Why We’re Not Calling the Cambridge Analytica Story a ‘Data Breach,’ Vice (Mar. 19, 2018).  Further, this type of consent and data collection is similar to that currently used by ad targeting companies in a process called Real-Time Bidding (“RTB”).  Peering into the Future of Digital Advertising, Cognizant 4 (2014).  RTB allows companies to purchase, within 200 milliseconds, varying combinations of consumer data points from ad targeting companies.  Real-Time Bidding (RTB): The Complete Guide, SMAATO (last visited Jan. 8, 2020).  Companies use these data points, which may include users’ “friends” networks, to “capture, analyze and determine the ‘audience’ arriving on the Web site and serve targeted advertisements and communication” in real time.  Peering into the Future of Digital Advertising, supra, at 4.  Ultimately, RTB allows companies to go deeper into the minds of consumers than ever before.  Id.
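The defining constraint of RTB is that deadline: bids arriving after roughly 200 milliseconds are simply discarded.  The toy auction below illustrates the flow under stated assumptions; real exchanges use the OpenRTB protocol over HTTP, and the bidder names, data points, and pricing here are invented.

```python
# A toy real-time bidding auction, for illustration only. Real exchanges
# use the OpenRTB protocol over HTTP; the bidder names, data points, and
# pricing below are invented.
import asyncio
import random

TIMEOUT_S = 0.2  # bids due within ~200 milliseconds

async def bidder(name: str, bid_request: dict) -> tuple[str, float]:
    """One demand-side platform pricing a user from the request's data."""
    await asyncio.sleep(random.uniform(0.01, 0.3))  # network + scoring
    # Toy heuristic: the more data points exposed, the richer the bid.
    price = 0.10 * len(bid_request["data_points"]) * random.uniform(0.5, 1.5)
    return name, round(price, 2)

async def auction(bid_request: dict) -> None:
    tasks = [asyncio.create_task(bidder(name, bid_request))
             for name in ("dsp_a", "dsp_b", "dsp_c")]
    done, pending = await asyncio.wait(tasks, timeout=TIMEOUT_S)
    for task in pending:   # bids that miss the deadline are discarded
        task.cancel()
    bids = [task.result() for task in done]
    if bids:
        winner, price = max(bids, key=lambda bid: bid[1])
        print(f"{winner} wins the impression at ${price:.2f}")
    else:
        print("no bids arrived before the deadline")

asyncio.run(auction({
    "user_id": "anon-123",
    "data_points": ["location", "page_likes", "friends_network", "device"],
}))
```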

This process of psychographic profiling is problematic across all industries. Shoshana Zuboff, a professor at the Harvard Business School, coined a term for this new marketplace: surveillance capitalism. Zuboff defines this market concept as “the unilateral claiming of private human experience as free raw material for translation into behavioral data.” John Laidler, High Tech is Watching You, Harv. Gazette (Mar. 4, 2019). Companies then take the data, package it as “prediction products,” and sell these products in “behavioral futures markets—business customers with a commercial interest in knowing what we will do now, soon, and later.” Id.

In a nutshell, Cambridge Analytica was engaging in surveillance capitalism—but so are countless other well-known advertising technology companies, like Google and Facebook.  The United States has spent three decades declining to regulate the Internet, allowing the online ecosystem to spiral out of control and keeping consumers in the dark about the deceptive and misleading practices of many companies, of which Cambridge Analytica is only one example.  Id.

Surveillance capitalism is an as-yet-unregulated market that drives companies to amass large amounts of data.  Such personal data is the key to producing better behavior-predicting products that provide a competitive advantage in this new marketplace.  Id.  These predictive products, however, ultimately intervene in and may even alter consumer behavior on- and offline.  This is already occurring through the use of subliminal cues (“dark patterns”) and heuristics.  Id.  And when products and companies begin to interfere with human behavior, they erode consumers’ autonomy and agency.  As Zuboff warns, “surveillance capitalism’s ‘means of behavioral modification’ at scale erodes democracy from within because, without autonomy in action and in thought, we have little capacity for the moral judgement and critical thinking necessary for a democratic society.”  Id.

So, if this practice is so common, why did Cambridge Analytica bear the brunt of public outrage?  Likely because Americans hold tightly to their First Amendment rights, even if they have not quite grasped their privacy rights.  Cambridge Analytica manipulated the democratic process and the ultimate form of American speech—the right to vote.  While other companies’ tactics seem to have less dramatic consequences, such as inducing a consumer to purchase a car, surveillance capitalism is slowly eroding our concepts of democracy and free will.  And while the severity of its practices should not be minimized, this is not a burden Cambridge Analytica should carry alone—the data collection the company engaged in was, and still is, considered an industry standard.  Franceschi-Bicchierai, supra.  Hopefully, the re-opening of Cambridge Analytica’s tomb will help shift the discussion from one of blame to one of remediation and targeted regulation.