Misinformation and extremism are spreading uncontrollably. Hate speech has fueled conflict and violence in the US and abroad. Human traffickers share the same platform as pictures of children and engagement announcements.
Despite its mission to bring people closer together, internal documents obtained by USA Today show that Facebook knew its users were being exposed to a wide range of dangerous and divisive content on its platforms.
The documents were part of disclosures made to the Securities and Exchange Commission by Facebook whistleblower Frances Haugen. A consortium of news organizations, including USA Today, reviewed the redacted versions received by Congress.
The documents provide a rare glimpse into the internal decisions made at Facebook that affect nearly 3 billion users worldwide.
Worried that Facebook was prioritizing profit over the well-being of its users, Haugen reviewed thousands of documents over several weeks before leaving the company in May.
The documents, some of which have been the subject of extensive reporting by The Wall Street Journal and The New York Times, include extensive company research showing that toxic and divisive content is prevalent in posts promoted by Facebook and shared widely by users.
Concerns about how Facebook operates and its effect on adolescents have united congressional leaders.
More political repercussions could come as Haugen testifies before the British Parliament on Monday.
During her testimony, Haugen said she is concerned with a number of aspects of how Facebook functions, such as ranking posts based on engagement, the lack of safety support for languages beyond English, and the “wrong choices” that reduce the discussion to a battle between transparency and privacy.
“Now is the most important time to take action,” Haugen said, comparing Facebook’s situation to an oil spill. “Facebook’s failures right now are making it difficult for us to regulate Facebook.”
Haugen also discussed the effect of “clusters” on the spread of misinformation and polarizing material.
“Undoubtedly, it’s making the hate worse,” she said.
Haugen suggested solutions that would help curb the spread of misinformation and move away from engagement-based ranking, such as returning users’ news feeds to a chronological order.
However, Facebook has pushed back on changes that could affect its bottom line, she said.
“They don’t want to lose that growth,” Haugen said. “They don’t want 1% shorter sessions because that means 1% less revenue.”
Haugen also addressed Facebook’s Oversight Board, which decides on content moderation for the platform. Haugen urged the board to bring more transparency to its relationship with Facebook.
Haugen said that if Facebook can actively mislead its board, “I don’t know what the purpose of the oversight board is.”
Haugen, who spoke for more than an hour, said she is “deeply concerned” about Facebook’s ability to make its social app Instagram safer for children.
Facebook had planned to release a version of the app for children under the age of 13 but postponed the launch in September to do more work addressing concerns from parents and lawmakers.
During the testimony, Haugen said that unlike other platforms, Instagram is built for “social comparison,” which can be worse for kids.
Haugen disputed Facebook’s claim that it needs to launch a children’s version of Instagram because many users under the age of 13 lie about their age. She suggested that Facebook publish how it detects users under the age of 13.
Asked why Facebook hasn’t done more to make Instagram safer for kids, she said the company knows that “young users are the future of the platform and the sooner they get to them, the more likely they are” to add them.
Facebook is now facing the most intense scrutiny since its launch in 2004.
CEO Mark Zuckerberg has defended the company and its practices, sharing in an internal staff memo that “it’s very important to me that everything we make is safe and good for kids.”
Company spokesman Andy Stone said in a statement to USA Today: “At the heart of these stories is a premise that is false. Yes, we’re a business and we make a profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own business interests lie. The truth is that we’ve invested $13 billion and have over 40,000 people to do one thing: keep people safe on Facebook.”
Facebook’s vice president of global affairs, Nick Clegg, echoed a similar sentiment in a broad memo to employees obtained by Granthshala on Saturday. Clegg told employees the company “should not be surprised to find itself under such intense scrutiny.”
“I think most reasonable people would accept that social media is being held responsible for many issues that run very deep in society – from climate change to polarization, from juvenile mental health to organized crime,” Clegg said. “That is why we need the help of lawmakers. Private companies should not be left alone to decide where the line is drawn on social issues.”
On Sunday, Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee before which Haugen testified, told CNN that Facebook “should come clean and reveal everything.”
Dissemination of misinformation
The documents reveal the internal discussion and scientific experimentation surrounding misinformation and harmful content being spread on Facebook.
A 2018 change to the algorithm that prioritizes what users see in their News Feed was supposed to encourage “meaningful social interaction” and strengthen bonds with friends and family.
Facebook researchers found that the algorithm change was fueling the spread of misinformation and harmful content, and the company was actively experimenting with ways to demote and contain that content, the documents show.
News feeds filled with violence and nudity
Facebook research found that users with low digital literacy skills were significantly more likely to see graphic violence and borderline nudity in their News Feeds.
The research found that Black, elderly and low-income people were most at risk from the influx of disturbing posts. Facebook conducted several in-depth interviews and in-home visits with 18 of these users over several months. Researchers found that exposure to disturbing content in their feeds made them less likely to use Facebook and compounded the trauma and hardships they were already experiencing.
Among the researchers’ findings:
- A 44-year-old in a precarious financial situation who followed Facebook pages that posted coupons and savings deals was bombarded with posts from anonymous users promoting financial scams.
- A man who used a Facebook group for Narcotics Anonymous and had totaled his car was shown ads for alcoholic beverages and posts about cars for sale.
- Black users were constantly shown images of physical violence and police brutality.
In contrast, borderline hate posts appeared more frequently in the feeds of users with high digital literacy. While low-digital-literacy users were unable to avoid nudity and graphic violence in their feeds, the research suggested that people with better digital skills were more effective at seeking out hate-filled content.
Curbing harmful material
The documents show that the company’s researchers tested a variety of methods to reduce the amount of misinformation and harmful content passed on to Facebook users.
The tests involved engineering changes that would demote viral content that was negative, sensationalized or meant to provoke outrage.
In April 2019, company executives debated reducing the virality of misinformation by demoting “deep reshares” – content where the poster is not a friend or follower of the original poster.
Facebook found that users encountering posts more than two reshares away from the original post were four times as likely to see misinformation.
One employee wrote that demoting that content would be “easily scalable and could catch loads of misinformation.” “While we don’t think it is a substitute for other methods of dealing with misinformation, it is comparatively simple to apply across languages and countries at scale.”
Other documents show that Facebook implemented this change in 2019 in several countries including India, Ethiopia and Myanmar, but it is unclear whether Facebook stuck with this approach in these instances.
Moderating content in at-risk countries
The documents show that Facebook is aware of the potential harm from content on its platform in at-risk countries but lacks effective moderation – whether from its artificial intelligence screening or from the employees who review reports of potentially violating content.
Another document based on 2020 data offered proposals to change the moderation of Arabic-language content “to improve our ability to get ahead of dangerous incidents, PR fires, and integrity issues in high-priority …