Monica Bickert says Frances Haugen misrepresented information from stolen documents
Facebook’s vice president of content policy, Monica Bickert, hit back Wednesday against whistleblower Frances Haugen, saying she was “not an expert” and did not work on child safety issues. Haugen testified on Capitol Hill that the platform is more concerned with making money than with protecting its users.
Bickert said on “America’s Newsroom” that Haugen “misrepresented” the information she disclosed from the platform.
“I can say that she did not work on these issues, and testifying about and misrepresenting some of the documents she stole is as if a journalist were reading another reporter’s story, a colleague’s story, and saying, ‘Oh, I’m an expert on this,’” Bickert said.
“She’s not an expert in these areas.”
Haugen, who worked in product management on civic misinformation at Facebook, testified before lawmakers on Tuesday that the company prioritizes profit over implementing safety measures to protect its users, which she said could also be considered a “national security issue.”
Haugen last month leaked internal Facebook research documents to the Wall Street Journal revealing that Instagram may be harmful to some young users, especially teenage girls.
Bickert pushed back on the report, arguing that the platform prioritizes safety and works with outside experts, including teachers, counselors and legal experts.
“These are the experts we bring to Facebook because they care so deeply about these issues and research,” Bickert said.
Bickert claimed that the platform could also be beneficial to the mental health of some young teens who use Instagram.
“Research has largely shown that teens dealing with mental health issues, as many teens do, have positive experiences,” Bickert said. “Instagram is a place that helps them find support.”
During her testimony, Haugen also suggested that Facebook’s algorithm is flawed and has played a primary role in exposing teens and other users to harmful content because it rewards posts that elicit polarizing responses. Bickert disputed those claims, emphasizing the company’s push for transparency.
“We have published a set of content distribution guidelines at our Transparency Center that explain, for example, what types of content we devalue. We reduce engagement bait, clickbait, sensationalist content,” Bickert said.
“And as Mark said in his memo, it’s designed to give people a better experience. It’s in our business interest to make this a place people will want to return to for years.”
Haugen urged lawmakers to consider reforming Section 230 of the Communications Decency Act, which has historically garnered bipartisan support, to eventually hold Facebook accountable for third-party content.