How Facebook could help plunge our democracy into chaos on Nov. 4
| Greg Sargent
| Paul Waldman
In 2016, a key component of Russia’s efforts to help Donald Trump win the White House involved moving disinformation through social media, especially Facebook. While it’s still uncertain what the Kremlin is up to this time, Facebook remains the world’s most powerful delivery system for lies, propaganda, and conspiracy theories.
As we head toward an election that could well be contested after Nov. 3, there’s a new reason to fear that Facebook could again play a toxic role, sowing chaos that badly debilitates our democratic system.
The threat this time resides in a combustible combination of two factors. The first is what’s known as Facebook’s “group recommendation engine,” which drives people to private Facebook groups. Experts have long warned that these private groups are festering grounds for disinformation and extremist activity, from QAnon to anti-vaxxers, and that the recommendation engine drives people unwittingly into them.
The second is President Trump himself — or, more specifically, Trump’s ongoing claim that the election must be decided on or just after Election Day, and that the millions and millions of still-uncounted ballots will inevitably be fraudulent.
A coalition of progressive groups — led by Accountable Tech — is sounding the alarm about the possibility that Trump’s disinformation along these lines risks getting amplified a thousandfold by this Facebook group function, and is calling on the tech giant to turn it off “until the U.S. election results are certified.”
The potential of these groups to spread “disinformation, suppress voters, organize intimidation efforts, or incite violence will only grow,” says the coalition, which includes groups like Mozilla, Common Cause, NARAL Pro-Choice America and Let America Vote.
Insiders at Facebook have already been warning that this group recommendation engine poses serious perils.
Facebook groups allow people with a similar interest — classic cars or Godzilla movies or particle physics or white supremacy — to exchange messages, links, videos and all kinds of other information. But while Facebook has tried to crack down on groups pushing extremist content or conspiracy theories, those efforts have mostly failed, as the continued spread of QAnon demonstrates.
Meanwhile, the recommendation algorithm appears to be exacerbating the problem. It recruits unwitting people to join groups, driving them into what are in effect walled-off communities where moderation functions are slow to penetrate.
For instance, as the Verge recently reported, researchers have found that Facebook helps build the anti-vaccine movement by recommending groups that traffic in this stuff to new mothers. As the Verge’s reporter put it:
To the outside world, Facebook says it’s working hard to prevent harmful content from multiplying. But internally, even Facebook employees are disturbed as they watch forces like QAnon easily outwit the measures meant to stop them while the algorithm keeps making the problem worse.
Now throw into this mix Trump’s insistence that the election must be decided on or around Nov. 3. He plainly believes he’ll be ahead and plans to declare victory on Election Day while depicting untold numbers of uncounted ballots as fraudulent.
“This is a bunch of tinder just waiting for a match,” Karen Kornbluh, the director of the Digital Innovation and Democracy Initiative, told us.
Kornbluh noted that these types of claims risk spreading like “wildfire” in these groups, helped along by the algorithm that steers people unwittingly into them. The threat, she said, is that this happens “before the platform can put a label on it” declaring it false.
We already have a glimpse of what this looks like. Donald Trump Jr. recently posted a video to Facebook claiming that Democrats are plotting “to add millions of fraudulent ballots that can cancel your vote,” while calling for an “army” to stop the fraud from happening.
“You can see people organizing to respond to fears that there’s fraud or that the election is going to be taken away,” Kornbluh told us. “The fear is that the groups are going to become a place where folks gin each other up and organize for violence.”
Kornbluh added that the recommendation engine may already be radicalizing people by steering them right into a “rabbit hole” of these radical groups, and on election night those groups will tell them “that the election is being stolen.”
All this could get a lot worse in the election’s immediate aftermath. While battleground states undertake the long process of counting unprecedented numbers of mail ballots, the Trump campaign will mount legal challenges contesting the count across the country.
The president himself will make all manner of insane charges, and he’ll be backed up by conservative media. Indeed, as Ben Smith recently suggested, even a major network like Fox News might come under intense pressure to declare Trump the winner prematurely.
Meanwhile, on Facebook, untold numbers and varieties of these right-wing groups will have become conduits for misinformation, disinformation, and the stoking of fear and anger, encouraging Trump supporters to reject official results when they arrive.
Facebook recently announced that it would ban any political advertisements that seek to prematurely declare someone the winner before the results are official. But that doesn’t do anything about the organic content that’s already spreading, or about the recommendation engine that’s dumping gasoline on those flames.
In a different world, the American public might wait for the final certified results with some modicum of patience and a commitment to respect them. In our world, Facebook could make that outcome a lot less likely.
The Washington Post