
Whistleblower confirms that Facebook intentionally profits from sowing social chaos, violence

Brexiter

Active member
For a while, many of us have known from external studies and leaked documents that Facebook played a significant role in the Jan. 6 Capitol insurrection as the chief engine of radicalization and organization for the anti-democratic extremists who attacked Congress, the company’s dismissive denials notwithstanding. But on Sunday, a woman who once worked inside Facebook’s corporate bubble definitively blew the whistle on the corporation’s content-moderation policies and algorithm-fueled social-chaos-for-profit model on CBS’ 60 Minutes.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Frances Haugen, a data scientist hired by the social-media giant in 2019 to work on its misinformation problem, told reporter Scott Pelley. “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

Haugen copied documents containing Facebook’s internal research, which she says includes ample evidence that the company is lying when it claims to have made significant inroads in combating the spread of misinformation, hate speech, and extremist conspiracism on its platform. She said one of these internal reports estimated that the company takes action on “as little as 3-5% of hate and about 6-tenths of 1%” of violence and incitement on its website.

As many critics have previously noted, the primary driver of this problem is Facebook’s recommendation algorithms, which suggest reading and viewing to users based on what they’ve read previously and what posts have driven similar “engagement” elsewhere on the platform. The result is that controversial material containing misinformation and incendiary rhetoric winds up being recommended heavily to ordinary readers, who then fall down ideological rabbit holes.

It’s all about keeping people glued to their site. “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money,” Haugen said.
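
To make that dynamic concrete, here is a minimal, purely hypothetical sketch (in Python) of what an engagement-optimized ranker looks like in the abstract. Every name, signal, and weight below is invented for illustration and none of it reflects Facebook’s actual systems:

```python
# Purely illustrative sketch of an engagement-optimized ranker.
# All names, signals, and weights here are invented for illustration;
# this is not Facebook's actual code or algorithm.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float    # estimated probability the user clicks
    predicted_comments: float  # divisive content tends to drive comments
    predicted_shares: float
    predicted_harm: float      # estimated chance of misinfo/incitement

def engagement_score(post: Post) -> float:
    # Optimizing purely for time on site: signals that keep users engaged
    # are weighted heavily, and potential harm plays no role at all.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_comments
            + 5.0 * post.predicted_shares)

def safer_score(post: Post) -> float:
    # A "safer" variant that discounts likely-harmful content; as Haugen
    # describes, a change like this tends to reduce total engagement.
    return engagement_score(post) * (1.0 - post.predicted_harm)

def rank_feed(posts: list[Post], safer: bool = False) -> list[Post]:
    score = safer_score if safer else engagement_score
    return sorted(posts, key=score, reverse=True)
```

The point of the sketch is simply that when the scoring function rewards only clicks, comments, and shares, the most provocative posts rise to the top by construction; adding a harm penalty is exactly the kind of change that, per Haugen, costs engagement.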

Facebook’s executives have previously dismissed this criticism. “I think these events were largely organized on platforms that don’t have our abilities to stop hate, don’t have our standards and don’t have our transparency,” Facebook Chief Operating Officer Sheryl Sandberg told an interviewer. When asked by members of Congress about the company’s culpability in the events of Jan. 6, CEO Mark Zuckerberg replied: “I think that the responsibility here lies with the people who took the actions to break the law and do the insurrection.”

Those denials continued after the 60 Minutes report aired. Facebook issued an angry statement, saying that “Facebook has taken extraordinary steps to address harmful content and we'll continue to do our part,” and reiterating that “the responsibility resides with those who broke the law, and the leaders who incited them.” It also claimed that the problem isn’t caused by its policies, but rather is just a reflection of longstanding divisions in society—which it then suggested that social media might actually resolve rather than exacerbate:

Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook even existed, and that it is decreasing in other countries where Internet and Facebook use has increased. We have our role to play and will continue to make changes consistent with the goal of making people's experience more meaningful, but blaming Facebook ignores the deeper causes of these issues—and the research.

Facebook hired Haugen and others into its new Civic Integrity section prior to the election as part of the public campaign to blunt criticism over how it handles political extremism and misinformation. “It understood the danger to the 2020 Election,” Haugen said.

But promptly after the election, it disbanded the section and abandoned that work. “As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me,” Haugen said.

"They basically said, 'Oh good, we made it through the election, there weren't riots, we can get rid of civic integrity now,'" she said. "Fast forward a couple of months, and we had the insurrection. When they got rid of Civic Integrity, it was the moment where I was like, 'I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.'"

As The Washington Post’s Elizabeth Dwoskin previously reported, the evidence of Facebook’s role in the planning for the Capitol takeover is significant. Right up until the day before the attack, Eric Feinberg of the Coalition for a Safer Web reported, a search found that 128,000 people were still talking about the #StopTheSteal hashtag, under which a number of the groups organized and many users coordinated their actions with one another.

Dozens of Republican Party-affiliated groups around the nation used Facebook to organize bus trips to the Jan. 6 protest at which Trump spoke, which shortly afterward turned into the insurrection. As Media Matters reported, some two dozen GOP officials and organizations in at least 12 states used Facebook as a platform to organize bus trips to the rally. The posts advertising the buses were unsparing in their use of incendiary rhetoric, too.

“This is a call to ALL patriots from Donald J Trump for a BIG protest in Washington DC! TAKE AMERICA BACK! BE THERE, WILL BE WILD!” wrote the New Hanover County GOP of North Carolina in a Facebook post advertising bus seats. (The phrase “be there, will be wild!” was a rallying cry by Trump to his followers for that day.)

“BUS TRIP to DC .... #StoptheSteal. If your passions are running hot and you're intending to respond to the President's call for his supporters to descend on DC on Jan 6, LISTEN UP!” wrote the Polk County Republican Party of North Carolina in a Facebook post.

The New York Times reviewed dozens of post histories of Facebook users active in these far-right pages and found that the same algorithmic pull that draws readers and viewers down conspiracist rabbit holes by constantly recommending increasingly extreme content as a means of “engagement” has a similarly powerful effect on the content providers. Social media’s perverse incentives for exaggeration, falsehood, and deception become overwhelming for people constantly competing for eyeballs and readers, and in the process can transform them into raging extremists.

A study published in January by the Tech Transparency Project (TTP) documented how the social media giant became a place for far-right extremists peddling misinformation about the presidential election to spread their conspiracy theories, to recruit and radicalize the pro-Donald Trump army, and finally to organize and prepare their assault. The research documented the activities of various stripes of domestic extremists, including their discussions of weapons and tactics, their calls to overthrow the government, and their organizing activities—all on Facebook. These activities eventually included the insurgent attack on the Capitol, much of it occurring in private Facebook groups that enable extremists to organize out of the public eye but with access to a mass following.

Its major takeaways:

  • Militant groups had planned a nationwide effort to “back up” police on Election Day against supposed antifa and Black Lives Matter protests. The event carried the logos of the Proud Boys and anti-government militias and was circulated in private far-right Facebook groups with thousands of members.
  • Self-declared “patriot” groups on Facebook have ramped up their recruiting efforts tied to the election. Some of these groups promoted the Jan. 6 event at the Capitol.
  • Talk of overthrowing the U.S. government increased dramatically in Facebook groups monitored by TTP following the declaration of Biden as the winner of the 2020 vote.
  • A pro-Trump Facebook group required prospective members to declare they would be willing to die for their country in order to join, in what may be a sign of growing extremism.
  • Calls to “occupy Congress” were rampant on Facebook in the weeks leading up to the deadly Capitol riot, making no secret of the event's aims. Two different “occupy” event listings were written in a Nazi-style font and began circulating on Facebook in December.
  • Since the insurrection, new posts promoting violence, including on Inauguration Day, have popped up on Facebook.

The report observed: “[Facebook] has spent the past year failing to remove extremist activity and election-related conspiracy theories stoked by President Trump that have radicalized a broad swath of the population and led many down a dangerous path.”

There was little mistaking the rhetoric on the private Facebook pages devoted to organizing the resistance to Trump’s election loss, which was nothing short of violent and openly seditious—and notably, though such posts clearly violated Facebook’s terms of service, none were removed until after the violence.

“Patriots heading to DC, raise holly hell, its the only thing that Democrats understand,” wrote a member of a 9,600-member group for “Patriots” on the day before the insurrection. “If you want your country back, show them!!!”

The violent fanaticism was also self-evident: “Are [you] willing to fight or maybe even Die for YOUR COUNTRY?” administrators of a Facebook page for the Ohio Minutemen Militia asked applicants.

Facebook offered ads that seemed timed to coincide with the insurrectionists’ sudden demand for body armor, gun holsters, and a variety of military equipment, placed on pages promoting election disinformation and, subsequently, news about the storming of the Capitol. It did so in the face of concerned employees’ internal warnings.

On the Facebook group pages that cater to extremist content and organizing, ads featuring a range of defense equipment had been appearing. These included body armor plates, rifle enhancements, and shooting targets, all sold as a New Year’s special.

"Facebook has spent years facilitating fringe voices who use the platform to organize and amplify calls for violence,” said TTP Director Katie Paul. “As if that weren't enough, Facebook's advertising microtargeting is directing domestic extremists toward weapons accessories and armor that can make their militarized efforts more effective, all while Facebook profits."

BuzzFeed also later reported that an internal document, assembled by a company task force studying harmful networks, acknowledged the role of Facebook activity by “Stop the Steal” activists, as well as pro-Trump groups associated with the brief attempt to organize a “Patriot Party” split from the GOP, in the violent events of Jan. 6. It also observed that insisting on an “inauthentic behavior” standard—rather than one based on the spread of misinformation and violent speech—hindered the company’s attempts to take appropriate preemptive steps.

“Hindsight is 20/20, at the time, it was very difficult to know whether what we were seeing was a coordinated effort to delegitimize the election, or whether it was free expression by users who were afraid and confused and deserved our empathy,” read the report. “But hindsight being 20/20 makes it all the more important to look back to learn what we can about the growth of the election delegitimizing movements that grew, spread conspiracy, and helped incite the Capitol insurrection.”

Another study by the public-interest group Avaaz, released in April, found that over the eight months leading up to the election, there were an estimated 10 billion views on key top-performing Facebook pages that regularly and repeatedly shared false information about the election. There was also a marked lack of moderation on those pages, allowing the “false or misleading information with the potential to cause public harm” to flourish. Those pages, the study found, saw a nearly threefold increase in interactions from October 2019—when they had 97 million—to a year later, when they had 277.9 million. It also found that nearly 100 million voters saw false voter fraud content on Facebook.

“A poll conducted in October 2020 found that 44% of registered voters reported seeing misinformation about mail-in voter fraud on Facebook (that equates to approximately 91 million registered voters),” the report states. “The polling suggests that 35% of registered voters (approximately 72 million people) believed this false claim.”

This growth particularly benefited pages backing the authoritarian QAnon conspiracy cult and, later, the Stop The Steal movement. The Avaaz study found 267 groups championing violence around the election with a combined following of 32 million—nearly 70% of which had Boogaloo, QAnon, or militia-themed names and content.

Facebook’s reliance on algorithmic detection played a large role in its failure to act on these pages, Avaaz noted, since the company’s policies also allow misinformation to remain on its platform if politicians are spreading it. The group noted that political ads for the Georgia election run by Republican candidates featured misinformation that fact checkers had already debunked—and that this was permissible under Facebook policy.

“The scary thing is that this is just for the top 100 pages—this is not the whole universe of misinformation,” Fadi Quran, a campaign director at Avaaz, told Time. “This doesn’t even include Facebook Groups, so the number is likely much bigger. We took a very, very conservative estimate in this case.”

The problem, of course, is not confined to the Jan. 6 insurrection, but reflects how Facebook has enabled the spread of conspiracist extremism across broad swaths of democratic society, poisoning the well of public discourse and encouraging violence and hatred. Facebook has been deeply complicit in the spread of QAnon conspiracism, particularly during the COVID-19 pandemic; in the growth of the “Boogaloo” movement that wants to inspire an apocalyptic, anti-government civil war; and in the spread of COVID denialism, particularly the strain that has been organizing mini-insurrections outside of local school boards and health districts. It has also been a major engine in spreading extremist ideologies within the ranks of America’s elite military forces.

As Amanda Marcotte observes at Salon, all this has had a profoundly toxic effect on American discourse and on society generally—all for the sake of the company’s profits. Allowing the disinformation to spread makes them money—lots of it:

Disinfo is everywhere because of old-fashioned market demand. Ordinary people, especially conservatives, crave lies and actively seek out and reward those who will feed them the lies they so dearly desire. The consumers of disinformation are not innocent victims being exploited for their naivete. They are complicit actors, sharing and driving up demand for lies, because doing so helps them further their goal of undermining American democracy.

This is something that journalist Brooke Binkowski of the website Truth or Fiction is constantly trying to hammer home: Disinformation is permission, not persuasion.

And in the view of Facebook executives, it is very, very profitable.
 