- Facebook quietly changed its algorithm to prioritize reshared content in 2018
- But leaked internal documents suggest it fueled the spread of misinformation
- Toxicity and violent material also became ‘highly prevalent’ as a result
- CEO Mark Zuckerberg was warned about the issue but was reluctant to make changes
Facebook quietly changed its algorithm to prioritize reshared content in 2018, only for it to backfire and cause misinformation, toxicity and violent content to become ‘highly prevalent’ on the platform, leaked internal documents revealed.
The company’s CEO, Mark Zuckerberg, said the change had been made to strengthen and improve relationships between users – especially family and friends.
But the opposite happened, the documents show, with Facebook becoming an angrier place because the revised algorithm rewarded outrage and sensationalism.
Company researchers found that publishers and political parties were intentionally posting negative or divisive content because it garnered likes and shares and spread to more users’ news feeds, according to The Wall Street Journal.
The newspaper has seen a series of internal documents revealing that Zuckerberg was warned about the problem in April 2020 but was reluctant to act on it.
How to change who can comment on your posts
By default, everyone can comment on your public posts, even people who don’t follow you.
To change who can comment:
Go to the public post you want to change on your profile.
Click on the three dots on the top right of the post.
Click Who can comment on your post?
Choose who is allowed to comment on your public posts:
- Public
- Friends
- Profiles and Pages you mention
If a profile or Page that wants to comment on your post isn’t in your selected comment audience, they won’t be given the option to comment.
However, they will see that you have limited who can comment on your posts.
Encouraging more ‘meaningful social interactions’ was precisely the reason the 2018 algorithmic change was made, as people within the company were concerned about declining user engagement in the form of commenting or sharing posts.
This matters to Facebook because many inside the tech firm see engagement as an important barometer of the health of the platform – fearing that people may eventually stop using it altogether if engagement stays low.
Comments, likes and shares declined throughout 2017, but by August 2018, following the algorithm change, the free fall had stopped and the number of ‘daily active people’ using Facebook had improved substantially.
The problem, however, was that when the tech firm’s data scientists surveyed users they found that many thought their feeds had decreased in quality.
Not only that, but the change made political debate on the platform in Poland more hostile, the documents show.
One Polish political party, which did not wish to be named, is said to have told the company that its social media management team had shifted its mix of posts from 50/50 positive/negative to 80 percent negative as a result of the algorithm change.
“Many parties, including those shifted to the negative, worry about the long-term effects on democracy,” according to an internal Facebook report, which did not name those parties.
It also affected online publishers.
BuzzFeed CEO Jonah Peretti emailed a top Facebook executive saying that the most divisive content created by publishers was going viral on the platform.
This, he said, was creating an incentive to produce more of it, according to the documents.
Mr Peretti’s complaints were highlighted by a team of Facebook data scientists, who wrote: ‘Our approach has had an unhealthy side effect on important slices of public content, such as politics and news.’
One of them added in a later memo: ‘This is an increasing liability.’
They speculated that the new algorithm was fueling the rise in angry voices because it gave more weight to reshared content, which was often divisive.
‘Misinformation, toxicity and violent content are inordinately prevalent among reshares,’ the researchers wrote in an internal memo.
Facebook CEO Mark Zuckerberg said the algorithm change was made to strengthen the bond between users – especially family and friends – and to improve their well-being.
Facebook wanted its users to interact more with their family and friends rather than passively consuming professionally produced content, because research suggested that kind of passive consumption was harmful to their mental health.
According to the documents, to encourage engagement and original posting, the company decided that its algorithm would reward posts with more comments and emotion emoji, which were seen as more meaningful than likes.
An internal point system was used to measure the change’s success: a ‘like’ was worth one point; a reaction five points; and a significant comment, reshare or RSVP 30 points. Multipliers were also added depending on whether interactions were between friends or strangers.
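To make that weighting concrete, here is a minimal sketch of how a point system like the one described could be tallied, assuming the values reported above; the function name, interaction labels and multiplier figures are hypothetical illustrations, not Facebook’s actual code.

```python
# Minimal sketch of an MSI-style point system, using the weights reported
# in the leaked documents. All names and the multiplier values are
# hypothetical illustrations, not Facebook's actual implementation.

POINTS = {
    "like": 1,                  # a 'like' was worth one point
    "reaction": 5,              # an emotion emoji reaction: five points
    "significant_comment": 30,  # significant comments, reshares and
    "reshare": 30,              # RSVPs: 30 points each
    "rsvp": 30,
}

# Assumed multipliers for who is interacting; the documents say multipliers
# existed, but the exact figures here are invented for illustration.
MULTIPLIER = {
    "friend": 1.5,
    "stranger": 1.0,
}

def msi_score(interactions: list[tuple[str, str]]) -> float:
    """Sum the weighted points for one post's interactions."""
    return sum(POINTS[kind] * MULTIPLIER[who] for kind, who in interactions)

# A post with one like and two reactions from friends plus one reshare
# from a stranger: 1*1.5 + 2*(5*1.5) + 30*1.0 = 46.5 points.
print(msi_score([
    ("like", "friend"),
    ("reaction", "friend"),
    ("reaction", "friend"),
    ("reshare", "stranger"),
]))
```

Under weights like these, a single reshare counts for as much as 30 likes – which helps explain why highly shareable, and often divisive, posts rose to the top of news feeds.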
But after concerns were raised about potential issues with the algorithm, Zuckerberg was presented with a number of proposed changes that would counteract the spread of false and divisive content on the platform, an internal memo from April 2020 shows.
One of the suggestions was to remove the boost the algorithm gave to content reshared by long chains of users, but Zuckerberg was reportedly cool on the idea.
‘Mark doesn’t think we can move forward with the change,’ an employee wrote to co-workers after the meeting.
Zuckerberg was ready to test it, the employee added, but ‘if there was a material trade-off we wouldn’t launch …