Photo credit: Phil Pasquini / Shutterstock.com
_____
On July 16, a reporter writing a story about Covid-19 vaccine disinformation asked Joe Biden: “What’s your message to platforms like Facebook?” The President answered, “I mean, they’re killing people. Look, the only pandemic we have is among the unvaccinated. And they’re killing people.”
I want to point out that although the reporter asked a general question (“platforms like Facebook”), Facebook immediately and strenuously denied any culpability for spreading fake news about the Covid vaccine. The denial is part of a larger strategy of corporate disinformation: Mark Zuckerberg’s signature platform only does good things, not bad things, and when bad things happen, it is because Facebook upholds larger principles, like freedom of speech or viewpoint diversity.
Thus, the following day, on the Facebook blog, Vice President of Integrity (yes, that is really his title) Guy Rosen blasted back. “At a time when COVID-19 cases are rising in America,” Rosen wrote, “the Biden administration has chosen to blame a handful of American social media companies. The fact is that vaccine acceptance among Facebook users in the US has increased.” To support this claim, Rosen cited research collaborations with major universities and an 85% rate of vaccine acceptance among the platform’s American users.
But there are competing truths. Even if only 15% of Facebook users are anti-vaxxers, that says little about the platform’s influence as a channel for spreading misinformation. A week after Rosen glossed Facebook’s research, New York Times reporter Sheera Frenkel noted that Facebook has routinely distributed fake news generated by one of the most influential anti-vaxxers, Florida doctor Joseph Mercola. Unsurprisingly, Dr. Mercola has also been selling quack remedies for Covid-19. Using the platform’s own tools, Frenkel determined that one posted article reached 400,000 readers, some of whom republished it on other platforms and blogs.
As importantly, for our purposes, Frenkel notes that
Dr. Mercola’s official English-language Facebook page has over 1.7 million followers, while his Spanish-language page has 1 million followers. The Times also found 17 other Facebook pages that appeared to be run by him or were closely connected to his businesses.
That same day, New York Times reporters Jim Tankersley and Cecilia Kang announced that the federal government is beginning to mobilize against a corporate internet that functions as the Wild West did after the Civil War: lawless, murderous, and immensely profitable for a very few. Biden’s Department of Justice, they write, has “assembled the most aggressive antitrust team in decades,” a team that will make policy from the perspective that “Facebook, Google, and Amazon have monopoly power and have used their dominant positions in social media, search and online retail to squash competitors, leaving consumers with fewer options, even if that doesn’t result in higher costs.”
Facebook has plenty to worry about because its track record is poor. Its business has been built around aggressive user acquisition and data collection, not any concern for public safety or accurate information. Tabloid newspapers used to say, “If it bleeds, it leads”—that is doubly true of Facebook’s social media algorithm, which generates profits from content that motivates sharing through outrage. And outrage is currently the only strategy that Republicans have for holding onto power.
Yet, after 15 years of Facebook currying favor with both parties, both sides of the aisle now have grievances against the tech industry’s social and economic power. Frenkel and Kang, veteran tech reporters, make it clear in their new bestselling book, The Ugly Truth: Inside Facebook’s Battle for Domination (HarperCollins, 2021), why that is: Mark Zuckerberg knew from the beginning that he wanted a monopoly over the world’s eyeballs, and that data gathered from users could, like soybeans, be sold in numerous forms.
But Frenkel and Kang add a new wrinkle to the story: the magic of Zuckerberg’s partnership with Sheryl Sandberg. Ever since they met at a Christmas party in 2007, both Zuckerberg and Sandberg (who was working for Google at the time but knew her road to the top there was blocked) understood that the then-young company could become “the global power that it is today.”
Together, they “methodically built a business model that is unstoppable in its growth…and entirely deliberate in its design.” As importantly, because Sandberg handles the advertising end and Zuckerberg the technology, each has plausible deniability when the evils of the platform make themselves evident: anti-Muslim riots in Myanmar and India, elections undermined by foreign interference in the United States and the United Kingdom, and half-hearted efforts to prevent underage users from establishing accounts.
The Facebook design is not about maintaining safety for you, your children, or the democracy you want your family to live in. This is something we know. But Frenkel and Kang establish a firm and persuasive timeline that demonstrates how Facebook became a digital wolf in sheep’s clothing, one that sucked users in under the pretense of friendship and then cannibalized them.
Facebook was only ever about three things: collecting your data, selling your data, and keeping users “engaged” so that they can be shown targeted advertising and encouraged to give up more information that allows those ads to be narrowcast even more precisely. Everything the company has done has led to a moment in which half the world’s governments are fighting Facebook over truth. But Frenkel and Kang have a new story to tell: every turning point in Facebook’s history has been a step toward perfecting this data empire, and Mark Zuckerberg’s oft-repeated raison d’être for the platform—creating community—has been the biggest disinformation strategy of all.
Facebook was originally designed as a parasite, its technology mimicking earlier failed projects. Zuckerberg and his founding team took advantage of pre-existing higher education communities to create Facebook’s first user base. For its first two years, the platform, built at Harvard University, was open only to those with a .edu email address and to employees at select tech companies. Computer use was widespread on college campuses by 2004, and personal information about one another was a form of news that students craved. Through them, the Facebook team quickly learned how much personal information human beings would share in their quest to become the stars of their own lives.
On September 26, 2006, the platform opened to anyone over 13 with a valid email address—without, of course, Facebook having any way to ascertain who met this arbitrary age criterion. This problem emerged later, when Facebook’s technology enabled a wave of teenage cyberbullying and sometimes grisly suicides.
But only three weeks before the general public launch, Zuckerberg faced his first major public relations challenge when he introduced News Feed. Before News Feed, users had to deliberately navigate to friends’ pages to see what they were up to; afterward, each user was presented with a scroll of “updates” from friends. It is hard to describe how controversial this was at the time. Many users saw the change as a significant erosion of their privacy, although in reality, many were “just now coming face-to-face with everything Facebook knew about them.”
Although many users were initially distressed, and Facebook responded by creating the layers of privacy controls we are used to today, most users missed the naked reveal: Facebook had increased its ambitions for hoovering up data. And that was only partly because of Zuckerberg’s blandishments about his love for the community. A demographic shift was underway, in which internet-savvy Web 1.0 users were giving way to a vast new population that knew, and cared, little about what lurked behind Facebook’s bland, blue exterior.
News Feed increased the number of users and the amount of time each spent on the platform. It intensified FOMO, pushing negative interactions that captured eyeballs to larger audiences. “Sessions,” as Facebook calls them, became longer and more engaged. Every like, every photograph, every crowd-sourced question (“I’m buying a new dishwasher. Suggestions?”) enriched user data profiles. Facebook, as Sandberg told the major brands with which she did business, was “the biggest word-of-mouth platform in the world…users themselves would persuade their friends to buy products.” And the beauty of it was that most of Facebook’s users, now possibly the biggest focus group in the history of the world, experienced what they were doing as pleasurable and voluntary, not compulsive and technology-driven.
Facebook became what Zuckerberg and Sandberg had always intended it to be: a giant marketing machine posing as a real-life community. Is it any surprise, then, that the algorithms created to power this giant money-maker have also been a potent channel for marketing conspiracy theories, quack remedies, black-market data sales, and right-wing disinformation campaigns?
And Zuckerberg is his own disinformation machine, never acknowledging Facebook as the source of a problem, only as the solution to it. For example, in 2019, the company launched an initiative to combat an escalation in teen suicide without acknowledging that its platforms had consistently facilitated a contagion of talk about self-harm among teens in the first place.
As Frenkel and Kang argue, each innovation at Facebook was touted as an improvement when, in fact, its real purpose was to advance Facebook’s monopoly on users and thus accelerate profits. For example, in 2010, the company launched Groups. Marketed as reducing conflict and enhancing the user experience, this shift replicated the original design: put pre-existing communities online, grow them, encourage data creation through conversation, and sell the whole package to advertisers.
More importantly, a community of like-minded people would create more and better data: gardening groups produce data on gardeners, gun groups on gun owners, trans people on lifestyle and fashion hacks, and so on. But Groups also facilitated siloing and the mobilization of fringe groups for political profit. It is no accident, for example, that 2010 was the year the Tea Party movement, initiated after Barack Obama’s victory in 2008, became a major player, not just in national politics but in the right-wing populist fundraising that ultimately produced the MAGA movement.
Frenkel and Kang also do a great job of showing the conflict behind the scenes at Facebook: none of this was inevitable, and many of the people who worked there knew it. Even as Zuckerberg and Sandberg, in their different ways, were driving the company towards world domination, there was significant pushback from both founding employees and new experts brought in to help address the chaos Facebook was creating.
Yet leadership was unresponsive to concerns coming from staffers, isolating them and shutting department heads out of the chain of command. Reports were generated and apparently never read. Importantly, although they never say it, Frenkel and Kang portray a Zuckerberg who was also stuck in a Web 1.0 mindset: information, spread as widely as possible, was always good, and data harvesting created consumer opportunities, not exploitation.
Thus, in 2012, a platform operations manager informed Facebook’s senior leadership that the Open Graph program, launched for advertisers without safeguards on how purchasers exploited harvested data, exposed users to foreign state actors and black-market data sales. But Zuckerberg and Sandberg ignored him. The ultimate consequence of this intentional lapse was the Cambridge Analytica scandal, which helped power Donald Trump to the presidency by targeting voters with wild conspiracy theories and disinformation about Hillary Clinton.
Frenkel and Kang are persuasive that Zuckerberg was genuinely shocked at the possibility that Facebook had played an integral role in throwing the election to Trump. But it was common knowledge among executives that foreign and domestic actors were exploiting the platform during the 2016 campaign, and their hands were tied by a corporate culture that failed to differentiate between truth and lies, had no capacity to screen out malicious activity, and didn’t want to acknowledge that disinformation had a real-world impact.
When confronted with Facebook’s complicity in Trump’s victory in the immediate aftermath of the 2016 election, Zuckerberg had two responses, both of which are relevant to the company’s current denialism about anti-vaxxer content. One was to insist that the company had not assisted Trump, and that Facebook should muster “hard data” showing that very little Facebook content was “fake news.” The other was to quickly reach out to Trump’s people to secure Facebook’s position with a new administration hostile to major technology companies, whose employees had generally been sympathetic to liberal and left-wing causes.
But—and this speaks to the question of anti-vaxxer conspiracy theories circulating today—“executives quickly realized that if even only 1-2 percent of the content on Facebook qualified as false news, they were talking about millions of stories targeting American voters ahead of the election.” And those millions of stories often jumped off Facebook and onto other digital platforms, where millions of other eyeballs saw them.
The 2016 election, disastrous as it was, was not the worst outcome of Zuckerberg and Sandberg’s plan for global dominance, a plan whose outcomes included killing people. Facebook made aggressive moves in South and Southeast Asia: one program facilitated the distribution of smartphones pre-loaded with the Facebook app.
However, the company failed to create more than the most token content moderation in areas of the world with a long history of ethnic violence based on rumors. By 2014, for example, Facebook had become the virtual organizing tool for the anti-Rohingya riots in Myanmar, a phenomenon the company failed to take seriously until the government of Myanmar took Facebook offline at the height of the violence. By 2018, at least 43,000 Rohingya were listed as missing and presumed dead; tens of thousands of others were refugees. And to this day, Facebook remains a central spreader of anti-Muslim hate content by Hindu extremists in India.
Frenkel and Kang’s case against Facebook is perhaps the clearest look at the intentionality of what Facebook has done and how many socially destructive political dynamics it has fueled. While Sheryl Sandberg comes off as a brilliant, ambitious, and ultimately amoral person, Mark Zuckerberg emerges as arrogant and closed-minded, cocooned in money, and bent on pushing technology as far as it can go, regardless of the consequences.
This makes it no surprise that Zuckerberg is currently minimizing the role Facebook has played over the last eighteen months in distributing anti-vaccine conspiracies. As Gizmodo’s Whitney Kimball pointed out, the number of anti-vaccine posts on Facebook is less relevant than Facebook’s towering role in news consumption: a “massive disinformation mill” that extends engagement with lurid stories reinforcing false beliefs. As Kimball writes,
Researchers from numerous universities, specializing in various public health and political science-related fields, surveyed 20,669 people from all 50 states and D.C., between June and July 2021. They found that 25% of people who only got news from Facebook in the previous 24 hours say they won’t get vaccinated, putting it below only Newsmax (41%) and slightly above Fox (23%).
About a third of the group had accessed news on Facebook in the last 24 hours, making the platform second only to CNN as a superspreader of news.
So Joe Biden was right the first time around: Facebook is killing people, and it was killing people long before the Covid-19 pandemic. Killing people is part of Facebook’s DNA.
The question is: what are he and the Democratic Congress going to do about this ugly truth?
_____
Claire Bond Potter is Professor of Historical Studies at The New School for Social Research and co-Executive Editor of Public Seminar. Her most recent book is Political Junkies: From Talk Radio to Twitter, How Alternative Media Hooked Us on Politics and Broke Our Democracy (Basic Books, 2020).