Removing algorithms will result in ‘more, not less, hate speech’, says VP
Facebook’s controversial algorithms protect its users from exposure to extreme content, hate speech and misinformation, the company’s vice president for policy and global affairs claimed in an interview on Sunday.
Nick Clegg defended Facebook against whistleblower Frances Haugen’s allegations that its algorithms push clickbait and extreme content, but insisted the company will never be able to completely eliminate misinformation and hate speech from its platforms.
“If you take away the algorithm… the first thing people will see is more, not less, hate speech – more, not less, misinformation,” Clegg told Dana Bash on CNN’s “State of the Union.” “These algorithms are designed to act almost like giant spam filters to identify and remove bad content.”
“For every ten thousand bits of content, you will see only five bits of hate speech,” he said. “I wish we could get it down to zero. Our platform is home to a third of the world’s population. Of course, we see the good, the bad, and the ugliness of human nature on our platform.”
Clegg insisted to Bash that Facebook’s algorithms played no particular role in the lead-up to the Capitol riot on January 6. On NBC’s “Meet the Press,” he told Chuck Todd that Haugen’s claim that Facebook rolled back measures intended to safeguard user feeds after the 2020 presidential election was “simply not true.”
“We actually kept most of them in place right through Inauguration, and we kept some permanently,” Clegg told Todd.
He said the company withdrew “blunt tools” – such as reducing the circulation of videos, opportunities for civic engagement, and political ads – which were inadvertently filtering out “a lot of completely innocent, legitimate, legal, playful, entertaining material.”
“We did that very deliberately,” Clegg said. “We simply showed completely normal content less on our platform. This is something we did because of exceptional circumstances.”
Clegg told Todd that the responsibility falls on Congress to “create a digital regulator” and set rules for data privacy and content moderation.
“I don’t think anyone wants a private company to decide on these tough trade-offs – you know, free expression on the one hand, and moderating or removing content on the other,” he said. “There is fundamental political disagreement. The Right thinks we censor … too much content, the Left thinks we don’t take down enough.”
Clegg told ABC’s George Stephanopoulos that it was “extremely misleading” to compare Facebook’s knowledge of its products’ harms to tobacco companies’ awareness of the dangers their cigarettes posed to children and society.
“Similar comparisons were made in the ’80s and ’90s – that watching too much television was like alcohol abuse, or that arcade games like Pac-Man, you know, were like drug abuse,” he said. “We can’t change human nature. You will always see bad things online.”