Facebook created the blueprint for Cambridge Analytica
Type: #Essay
Re: #Politech #Technology
If you’re like me, you may have assumed that the Cambridge Analytica (C.A.) scandal was an HBO-Original-style hack.
Watching Zuckerberg describe what happened, I pictured a shadowy man under a black cotton hood. Nineties techno blared over the deft clacking of a mechanical keyboard.
I could almost taste the Monster Energy drink, lukewarm and long since stale. “We’re in,” he whispered as a waterfall of green gibberish cascaded down his dark Oakley sunglasses. If you pictured something similar, you’d be wrong too. They can’t steal what was handed to them.
Looking back, it almost feels intentional. From congressional hearings to a Netflix documentary, we heard language that implied our data was stolen. It wasn’t.
Facebook allowed a foreign company to steal private information. They allowed a foreign company to steal sensitive information from tens of millions of Americans.
—U.S. Senator Jon Tester (D) at the 2018 congressional hearing on Facebook's role in the Cambridge Analytica scandal.
In reality, Cambridge Analytica used Facebook’s open and available tools to harvest the personal data of 87 million people — door open, welcome sign lit. C.A. then used that data against us and exploited our most vulnerable neuroses without our knowledge or consent. Worse still, Facebook not only knew this manipulation was possible, they literally wrote the book on it several years prior.
In this article, I will explain how Facebook paved the way for Cambridge Analytica to successfully execute one of the most aggressive psychological operations in modern history. To my knowledge, these connections have not been made by the media or Congress. Why? Perhaps short memories and a poor understanding of the technology that runs our lives are to blame.
Behavioral Psychology + Big Data + Targeted Engagement = Behavior Change
— Cambridge Analytica pitch deck
Image Memory Glasses
There’s a scene towards the end of the first act of Donnie Darko that popped into my head while writing this article. Perhaps you remember it.
Donnie Darko (Jake Gyllenhaal) and his girlfriend Gretchen (Jena Malone) stand in front of the classroom to present their imaginary invention called the Infant Memory Generator. In the scene, they describe a pair of glasses that could, in theory, display a slideshow of pleasant images to a sleeping baby.
You can instantly feel the tension in the room. The teacher (Noah Wyle) is visibly upset by the idea. He asks his students whether they considered that a baby needs darkness to sleep. The two school bullies (Alex Greenwald and Seth Rogen) immediately raise their hands. “What if the parents put in pictures of Satan?” one asks. “Or, like, dead people? Crap like that.”
The implication here, realized by everyone in the room except Donnie Darko and Gretchen, is that their invention could have the power to affect a baby’s mood and behavior in unpredictable ways. In the wrong hands, such a device could be dangerous.
Gretchen then replies to the bully, “Is that what you’d show your kids?”
The dawn of emotional engineering
In 2010, during the U.S. midterm elections, Facebook ran a mass experiment (the results were published as a public study in 2012) that showed off its ability to affect voter turnout. It was a brazen admission considering how easy it was to pull off. It wasn’t anything that a graphic designer with access to our newsfeeds couldn’t achieve.
Facebook injected a banner into the newsfeeds of three subsets of Facebook users. The first group saw a banner with a pro-voting message, a link to find the nearest polling location, and the profile pictures of friends who had already voted. The second group also saw a pro-voting banner, this time without the polling link or the social encouragement. The third group (everyone else) saw no banner at all, just their normal newsfeeds.
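To make the design concrete, here is a minimal sketch of how a three-arm experiment like this could be wired up. Everything here (the bucket sizes, the function names, the banner copy) is my own illustration, not Facebook’s code:

```python
# Illustrative three-arm assignment for the 2010 voter-banner experiment.
# Bucket sizes and names are invented; only the three-group design is real.
import hashlib

def assign_arm(user_id: str) -> str:
    """Deterministically place a user into one of the three groups."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    if bucket < 45:
        return "social"         # banner + polling-place link + friends' photos
    if bucket < 90:
        return "informational"  # banner only, no social proof
    return "control"            # untouched newsfeed

def render_banner(arm: str, voting_friends: list[str]) -> str | None:
    """Return the banner text an arm should see, or None for the control."""
    if arm == "social":
        return f"Today is Election Day. Find your polling place. {len(voting_friends)} friends have voted."
    if arm == "informational":
        return "Today is Election Day."
    return None
```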
Here are the results, as reported by the New Statesman in its 2014 article, Facebook could decide an election without anyone ever finding out:
The researchers concluded that their Facebook graphic directly mobilized 60,000 voters and, thanks to the ripple effect, ultimately caused a total of 340,000 additional votes to be cast that day.
For context, George W. Bush won Florida in 2000, and thus the presidency, by a little over 500 votes. Donald Trump won the 2016 election by roughly 80,000 votes spread across three states. Razor-thin margins win presidential elections in this country. Facebook, it seems, has the power to sway close democratic elections in whichever direction it chooses, at the flip of a digital switch.
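For the curious, the back-of-the-envelope arithmetic, using the figures cited above, looks like this:

```python
# Rough comparison of the study's mobilization estimate against the
# deciding margins mentioned above. Figures are as cited in this article.
mobilized_votes = 340_000    # extra votes attributed to the banner
florida_2000_margin = 537    # Bush's official Florida margin
trump_2016_margin = 80_000   # approximate combined margin, three states

print(mobilized_votes / florida_2000_margin)  # ~633x the Florida margin
print(mobilized_votes / trump_2016_margin)    # ~4.25x the 2016 margin
```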
In 2012, Facebook set out to answer another question — can we alter people’s moods by changing what they see on their newsfeeds? The answer, revealed in a study released publicly two years later to major ethical concerns and mild internet outrage, was, yeah, you can. In fact, not only can you change a person’s mood, but that person can, through their own posts, affect the moods of their unwitting Facebook friends.
The study also showed that users wrote longer posts after negative or positive content was injected into their newsfeeds. The opposite was true when their feeds became closer to neutral — they wrote fewer words and were less likely to affect the moods of their friends. In short, the study determined that if you can change a person’s mood, you can also change their behavior.
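To make the mechanics concrete, here is a minimal sketch of the experiment’s core idea. This is not Facebook’s actual pipeline (the real study scored sentiment with word-counting methods at massive scale); it only shows the shape of the manipulation and the outcome metric:

```python
# Sketch of the 2012 emotional-contagion design: filter one emotional
# polarity out of a feed, then measure how much users write afterwards.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    sentiment: float  # < 0 negative, > 0 positive, 0 neutral

def build_feed(posts: list[Post], suppress: str) -> list[Post]:
    """Return a feed with one emotional polarity filtered out."""
    if suppress == "negative":
        return [p for p in posts if p.sentiment >= 0]
    if suppress == "positive":
        return [p for p in posts if p.sentiment <= 0]
    return posts  # control feed, left untouched

def mean_word_count(posts: list[Post]) -> float:
    """Outcome metric: how much users wrote after exposure."""
    return sum(len(p.text.split()) for p in posts) / max(len(posts), 1)
```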
The holy grail of communications is when you can start to change behavior.
— Cambridge Analytica
Facebook and Cambridge Analytica
The two studies mentioned above acted as blueprints, of sorts, for manipulating Facebook’s own user base on a large scale, without any of the users’ consent. What’s worse, Mark Zuckerberg shared this information with the press, potential advertisers, and the public at large. Not as a warning, but as an accomplishment to be revered. Perhaps his boasting would not have been such a big deal if Facebook had guarded our personal data from outside organizations. But it didn’t. Instead, Facebook did the opposite and provided app developers with a large, injudicious flow of precisely the type of data needed to recreate the scenarios outlined in the published studies.
In 2010, Facebook announced a new API called Open Graph. (An API is a structured way for one application to request data or functionality from another.) Facebook pitched Open Graph as a way for developers to implement Facebook features — things like commenting ability, the Facebook Like button, and Facebook Login — into third-party apps. Open Graph also gave app developers generous access to the treasure trove of user data Facebook had amassed over the years.
One popular method for opening the data spigot was to use Facebook Login as the primary or exclusive method for signing into a third-party app. Cambridge Analytica used this exact method. It’s worth noting that the term “Open Graph” was not uttered once by a U.S. senator during the Facebook congressional hearings a few years ago.
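To show how short the path from one login to a mountain of data was, here is a hedged sketch of an Open Graph call from that era. The endpoint shapes mirror the Graph API v1.0 era (retired in 2014 and 2015); the token and field list are placeholders, and friends’ data like this is no longer obtainable:

```python
# Sketch of the pre-2015 data flow behind Facebook Login. Illustrative only.
import requests

GRAPH = "https://graph.facebook.com/v1.0"
TOKEN = "USER_ACCESS_TOKEN_FROM_FACEBOOK_LOGIN"  # granted at app sign-in

# One consenting login yields the participant's own profile data...
me = requests.get(f"{GRAPH}/me", params={
    "access_token": TOKEN,
    "fields": "id,name,location,likes",
}).json()

# ...and, under v1.0-era friend permissions, their friends' data too.
friends = requests.get(f"{GRAPH}/me/friends", params={
    "access_token": TOKEN,
    "fields": "id,name,location,likes",
}).json()
```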
In 2018, when the Cambridge Analytica story broke, Zuckerberg called the incident a “breach of trust.” Still, he gave no indication he cared about the collection of data itself or how Cambridge Analytica used that data to develop psychological profiles of Americans. None of that was against Facebook’s terms of service. The issue was that, technically, C.A. was an outside party: an academic built and ran the quiz app, was paid by C.A. for his efforts, and then handed the data over. Passing user data to a third party is against the rules. But Zuckerberg doesn’t like to hold grudges. According to Christopher Wylie, the famous whistleblower and former Cambridge Analytica employee, Facebook’s ad team, led by COO Sheryl Sandberg, helped C.A. develop their advertising campaigns a full year after Facebook knew of this “breach of trust.”
In order to group users by various psychological traits (and later serve ads that exploited those traits), Cambridge Analytica used a psychological model called OCEAN — Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism.
C.A. claims that these models were at the heart of how they profiled you — your neuroses and other exploitable traits.
— New York Times
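Here is a toy sketch of the basic idea: every page ‘like’ nudges the five trait scores up or down. The pages and weights below are invented for illustration; the real weights were learned from people who took a quiz, which matters in a moment:

```python
# Toy OCEAN scoring from 'like' history. Pages and weights are invented.
OCEAN = ["openness", "conscientiousness", "extraversion",
         "agreeableness", "neuroticism"]

# Hypothetical per-page trait weights, as if learned from quiz takers.
LIKE_WEIGHTS = {
    "PhilosophyDaily": {"openness": 0.8},
    "PlannerProTips":  {"conscientiousness": 0.7},
    "AllNightParties": {"extraversion": 0.9, "neuroticism": -0.2},
}

def score_user(likes: list[str]) -> dict[str, float]:
    """Sum the trait weights of every page the user has liked."""
    scores = dict.fromkeys(OCEAN, 0.0)
    for page in likes:
        for trait, weight in LIKE_WEIGHTS.get(page, {}).items():
            scores[trait] += weight
    return scores

print(score_user(["PhilosophyDaily", "AllNightParties"]))
```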
In 2015, Cambridge Analytica created a psychological quiz and paid 300,000 Facebook users roughly $5 each to log into the app and complete the quiz. Since those users signed in using Facebook Login, C.A. was able to obtain not only the names, locations, and ‘like’ histories of the more than a quarter-million participants, but their friends’ names, locations, and ‘like’ histories as well. Eighty-seven million profiles, to be exact. Where did Cambridge Analytica get the idea for this psychological operation? Facebook.
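The economics of that fan-out, using the article’s own numbers, are worth a pause:

```python
# Fan-out arithmetic for the 2015 quiz, using the figures above.
participants = 300_000        # paid quiz takers
total_profiles = 87_000_000   # profiles ultimately harvested
payout = 5                    # dollars per participant

print(total_profiles // participants)  # ~290 profiles per paid login
print(participants * payout)           # ~$1,500,000 for 87M profiles
```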
In 2013, a third study was published: Private traits and attributes are predictable from digital records of human behavior. Built on Facebook data, it detailed a method for applying the OCEAN psychological model to Facebook users based on their ‘like’ histories and, yes, results from a quiz.
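As I read that paper, the method is straightforward machine learning: flatten everyone’s ‘like’ history into a big user-by-page matrix, compress it, then regress quiz-derived trait scores on the result. A sketch with stand-in random data (not the study’s code or data):

```python
# Sketch of the study's approach: SVD over a users x pages 'like' matrix,
# then regression against quiz-derived trait scores. Data is random filler.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
likes = rng.integers(0, 2, size=(1_000, 5_000))  # 1 = user liked the page
quiz_openness = rng.normal(size=1_000)           # ground truth from a quiz

components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)
model = LinearRegression().fit(components, quiz_openness)

# Once fit, the model estimates openness for anyone from likes alone,
# quiz or no quiz. That is the whole trick.
predicted = model.predict(components)
```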
By its own account, Cambridge Analytica was founded on the ability to harvest and profile user data. The studies conducted and shared publicly by Facebook seem to line up perfectly with Cambridge Analytica’s strategy for the 2016 presidential election. Without those studies, and without access to all that user data via Open Graph, Cambridge Analytica simply does not exist.
Our newsfeeds operate in darkness
There’s a missing piece to this puzzle: you’d be hard-pressed to find a single advertisement made by Cambridge Analytica. How can that be? No one has seen the ads post-scandal because they ran as something called dark ads, or dark posts.
Here’s a quote by Carole Cadwalladr from her 2019 TED Talk titled Facebook’s role in Brexit — and the threat to democracy:
This entire referendum took place in darkness because it took place on Facebook. And what happens on Facebook stays on Facebook because only you see your news feed, and then it vanishes, so it’s impossible to research anything.
Facebook has since created an open ad library in which anyone can see what ads are currently running. However, absent government pressure, there is very little anyone can do to research what people saw on their newsfeeds in the months leading up to the 2016 election. I can’t stress enough how inept the U.S. Congress has been through all this and how little it has managed to hold Facebook accountable.
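If you want to poke at the ad library yourself, it is exposed as the ads_archive edge on Facebook’s Graph API. A hedged sketch (the token is a placeholder, and exact field names vary by API version):

```python
# Querying Facebook's Ad Library API for currently archived political ads.
# Access token is a placeholder; field names may differ by API version.
import requests

resp = requests.get("https://graph.facebook.com/v14.0/ads_archive", params={
    "access_token": "YOUR_ACCESS_TOKEN",
    "search_terms": "election",
    "ad_type": "POLITICAL_AND_ISSUE_ADS",
    "ad_reached_countries": "['US']",
    "fields": "page_name,ad_creative_body,ad_delivery_start_time",
})
for ad in resp.json().get("data", []):
    print(ad.get("page_name"), "|", str(ad.get("ad_creative_body", ""))[:80])
```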
Thankfully, the U.K. Parliament was able to subpoena a few of the Brexit ads developed by C.A. Another misconception is that C.A. meddled only in the 2016 presidential election; in fact, the company has been accused of orchestrating disinformation campaigns in over 150 democratic elections worldwide.
Among the subpoenaed ads were several claiming that Turkey was on the verge of joining the E.U. In reality, there was never any indication that anyone even so much as considered the possibility. Cambridge Analytica identified users prone to xenophobia, then targeted them with ads that incited fear. No one knew this was happening at the time because only Facebook ultimately knows what’s happening on our newsfeeds; we, as individuals, see only what Facebook decides to put in front of our own eyeballs.
Are we really connected in the dark?
Mark Zuckerberg and Sheryl Sandberg have long pushed the narrative of connecting Facebook users to the world, but they’ve done the opposite. Facebook has stripped us of the very thing that unites us as humans — our shared experiences and understanding of reality. We’ve been separated into psychological silos. Our worst fears, biases, and neuroses are collected and categorized, then fed back to us via our newsfeeds, closed groups, and dark posts.
Our governments demand fact-checking of our newsfeeds from the same force amplifying the lies. And lost in all this is what we arguably need the most: the best fact-checkers we have — our peers. Facebook has encouraged us to quietly unfollow any friction in our social circles, and in that process, we may lose a trusted friend’s voice of reason. We no longer see posts from known experts willing to contextualize a claim because they’ve been muted. Facebook provided us with tools not to connect but to isolate. And we chip away at the people around us, the ones who matter, until all we have left are messages designed to exploit our most vulnerable traits. All the while, we are oblivious to, and cannot opt out of, the psychological manipulation.
What Facebook allowed Cambridge Analytica to do was weaponize its tools, to the point that we no longer agree on basic truths — vaccinations prevent disease, the world is a sphere, Hillary Clinton is not drinking the blood of children as a fountain of youth.
We lie in our beds before we sleep, and we stare at a screen, at our very own personal slideshow that prioritizes negativity and disinformation. There is no one around to see what we see. No one to help us reason away our fears.
Mark Zuckerberg and Sheryl Sandberg continue to stand in front of the classroom, pitching us on new products they themselves don’t fully understand. If Facebook is willing and able to control us from a glowing rectangle one foot from our faces, just imagine what they can accomplish with a pair of Image Memory Glasses wrapped around our heads as we enter the metaverse.
After the scandal broke in 2018, Cambridge Analytica closed its doors, only to reappear under the name Emerdata. Facebook has since changed its name to Meta, and Jake Gyllenhaal finds himself in the crosshairs of Taylor Swift fans.
Created: October 21, 2021
Last Evolved: November 2, 2022