Days before Germany’s federal elections, Facebook took an unprecedented step: removing a series of accounts working together to spread COVID-19 misinformation and encourage violent reactions to COVID restrictions.
The action, announced on September 16, was the first use of Facebook’s new “coordinated social harm” policy, which targets not state-sponsored propaganda campaigns but authentic users who work together to spread hate speech or misinformation, in increasingly sophisticated efforts to circumvent the platform’s rules.
In the case of the German network, about 150 accounts, pages and groups were linked to the so-called Querdenken movement, a loose coalition that has opposed lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right extremists.
Facebook touted the move as an innovative response to potentially harmful content; far-right commentators denounced it as censorship. But a review of the deleted content – as well as the many more Querdenken posts that remain available – suggests that Facebook’s action was modest at best. At worst, critics say, it could have been a ploy to counter complaints that it’s not doing enough to stop harmful content.
“This action appears to be driven by Facebook’s desire to demonstrate action to policymakers in the days before the election, not a broader effort to serve the public,” concluded researchers from Reset, a UK-based non-profit that has been critical of social media’s role in democratic discourse.
Facebook regularly updates journalists about accounts it removes under policies banning “coordinated inauthentic behavior,” a term it coined in 2018 to describe groups or people who work together to mislead others. Since then, it has removed thousands of accounts, most linked to bad actors attempting to interfere in elections and politics in countries around the world.
But there were obstacles, because not all harmful behavior on Facebook is “inauthentic”: plenty of completely authentic groups use social media to incite violence, spread misinformation and stoke hatred. So the policy limited what the company could take down.
But even with the new rule, a problem with takedowns remains: they don’t reveal how much similar harmful content stays up on Facebook, which makes it difficult to determine exactly what the social network is achieving.
Case in point: the Querdenken network. Reset had been monitoring the accounts before Facebook removed them and issued a report concluding that only a small portion of Querdenken-related content had been taken down, while many similar posts were allowed to remain.
The dangers of COVID-19 extremism were underlined days after Facebook’s announcement, when a young German gas station worker was shot dead by a man who had refused to wear a mask. The suspect had followed several far-right users on Twitter and had expressed negative views about immigrants and the government.
Facebook initially declined to provide examples of the removed Querdenken content, but eventually provided four posts to The Associated Press that were not unlike content still available on Facebook. They included a post falsely claiming that vaccines create new viral variants and another that wished death on police who broke up violent protests against COVID restrictions.
Reset’s analysis of comments deleted by Facebook found that many were actually written by people trying to refute Querdenken’s arguments and did not contain misinformation.
Facebook defended its action, saying the account removals were never meant to be a broader ban on Querdenken, but rather a carefully measured response to users who were working together to violate its rules and spread harmful content.
According to David Agranovich, Facebook’s director of global threat disruption, Facebook plans to refine and expand the use of the new policy.
“It’s a start,” he told The Associated Press on Monday. “This is how we are expanding our network disruption model to address new and emerging threats.”
The approach attempts to strike a balance, Agranovich said, between permitting differing views and preventing harmful content from spreading.
According to Cliff Lampe, a professor of information at the University of Michigan who studies social media, the new policy could represent a significant change in the platform’s ability to counter harmful speech.
“In the past they have tried to squash cockroaches, but there are always more,” he said. “You can spend all day stomping your feet and you’ll get nowhere. Going after the network is a sensible effort.”
Simon Hegelich, a political scientist at the Technical University of Munich, said that although removing the Querdenken network may be justified, it raises questions about Facebook’s role in democratic debate.
Hegelich said it appears Facebook is using Germany as a “test case” for the new policy.
“Facebook is really interfering in German politics,” Hegelich said. “The COVID situation is one of the biggest issues in the election. They’re probably right that there’s a lot of misinformation on these sites, but it’s a highly political issue nonetheless, and Facebook is interfering.”
Members of the Querdenken movement reacted angrily to Facebook’s decision, but many also expressed a lack of surprise.
“The big deletion continues,” one supporter posted in a still-active Querdenken Facebook group. “See you on the street.”
Additional reporting by The Associated Press
Credit: www.independent.co.uk