Salon: Despite Parler backlash, Facebook played huge role in fueling Capitol riot, watchdogs say

The far-right social media platform Parler has shouldered much of the blame for last week’s Capitol riot, and it may now be permanently defunct. But watchdog groups say much larger companies like Facebook bear more of the responsibility for the lead-up to the pro-Trump siege.

Amazon Web Services, which hosted Parler, took the platform offline last week after Apple and Google removed it from their app stores, arguing that Parler was not doing enough to moderate content that could incite violence. In court documents, Amazon detailed extensive violent threats on Parler that the company “systemically failed” to remove. Hacked GPS metadata analyzed by Gizmodo shows that “at least several” Parler users managed to penetrate deep inside the Capitol. …

Larger companies were eager to single out Parler to avoid the “potential legal implications” from “associating yourself with an app or platform that is encouraging and inviting actions that will lead to violence,” said Yosef Getachew, director of the media and democracy program at the watchdog group Common Cause.

Parler played a role in the “organizing” of the siege and amplified calls to violence, but “it wasn’t just Parler, it was social media platforms across the board,” Getachew said. Facebook in particular has “done a poor job of consistently enforcing their content moderation policies,” he added.

This isn’t just a case of “one platform is a bad actor,” Getachew said. “All platforms have not done what they need to do to prohibit this type of disinformation and incitement of violence.” …

These groups didn’t just spread misinformation but actively “encouraged people to attend the riot last week and to potentially arm themselves and to potentially engage in other violent acts,” Getachew said. “These are the types of things from a public interest side that make it harder to monitor because the groups are closed, right? You need permission to enter and Facebook isn’t doing a good enough job of actually facilitating or moderating these groups to prohibit this type of content, or to ban these groups altogether.” …

Facebook previously came under fire for failing to crack down on extremist content ahead of the deadly 2017 Charlottesville white nationalist rally. The platform was also used to organize numerous protests against coronavirus restrictions in 2020, including an armed invasion of the Michigan state capitol. Facebook later removed certain pages linked to the Charlottesville rally and announced plans to remove thousands of QAnon-related accounts. These actions have all been “too little, too late,” Getachew said. …

Getachew said that Facebook and others need to enforce their policies more consistently, and to expand them to combat disinformation and online voter suppression more effectively. …

Despite YouTube’s more proactive approach to dangerous material in recent months, it still needs greater “algorithmic transparency,” Getachew said.

“These are systems that are being developed in a black box. Oftentimes the individuals who are developing these algorithms are homogeneous in that they are white men,” he said. “They aren’t even diverse in terms of other perspectives, to actually create algorithms where they won’t lead you down these rabbit holes. We need diversity in developing these algorithms, but also we need transparency in how these algorithms are being developed, audits and other tests. … The company shouldn’t be looking for ways to maximize engagement by sending you more and more extreme content through algorithms.” …

The social network crackdowns and the takedown of Parler have led to an explosion of new users on encrypted messaging apps like Signal and Telegram, sparking concern that extremists will now be able to hatch plots out of sight.

“Encrypted apps have their purpose in terms of protecting the privacy of users,” Getachew said. “But that should not absolve companies from taking steps that prohibit the spread of disinformation, or at the very least taking steps so their platforms aren’t being used to facilitate disinformation and other content that could lead to offline violence.”