Facebook is halting development of a children’s version of Instagram geared towards children under the age of 13, to address concerns about the vulnerability of younger users.
“I still strongly believe that it’s a good thing to have a version of Instagram that’s designed to be safe for tweens, but we want to take the time to talk to parents and researchers and safety experts and get to more consensus about how to move forward,” Instagram head Adam Mosseri said Monday in an interview on NBC’s “Today” show.
The announcement comes after an investigative series by The Wall Street Journal reported that Facebook was aware that the use of Instagram by some teenage girls caused mental health problems and anxiety.
Yet the development of Instagram for a younger audience was met with widespread opposition almost immediately.
Facebook announced the development of an Instagram Kids app in March, saying it was “exploring a parental-controlled experience.” Two months later, a bipartisan group of 44 attorneys general wrote to Facebook CEO Mark Zuckerberg urging him to abandon the project, citing the well-being of children.
They cited an increase in cyberbullying, a potential vulnerability to online predators, and Facebook’s “checkered record” in protecting children on its platform. Facebook faced similar criticism in 2017 when it launched the Messenger Kids app, which was touted as a way for kids to chat with parent-approved family members and friends.
Josh Golin, executive director of children’s digital advocacy group Fairplay, on Monday urged the company to pull the plug on the app permanently. So did a group of Democratic members of Congress.
“Facebook is heeding our call to stop moving forward with plans to launch a version of Instagram for kids,” Massachusetts Sen. Ed Markey tweeted. “But a ‘pause’ is insufficient. Facebook should abandon this project altogether.”
The Senate had planned a hearing on Thursday with Facebook’s global head of safety, Antigone Davis, to question the company about what it knows about how Instagram affects the mental health of young users.
Mosseri said Monday that the company believes it is better to have a specific platform with age-appropriate content for children under the age of 13, and noted that other companies such as TikTok and YouTube already have app versions for that age group.
He said in a blog post that it is better to have a version of Instagram where parents can supervise and control their children’s experience than to rely on the company’s ability to verify whether kids are old enough to use the app.
Mosseri said Instagram Kids was intended for tweens ages 10 to 12, not younger children. It would require parental permission to join, would be ad-free, and would include age-appropriate content and features. Parents would be able to monitor how much time their kids spend on the app and to oversee who can message them, who can follow them and who they can follow.
While work on Instagram Kids is paused, the company will expand its opt-in parental supervision tools to accounts of teens 13 and older. Mosseri said more details about these tools will be revealed in the coming months.
This isn’t the first time Facebook has received backlash for a product aimed at children. Child development experts urged the company to shut down its Messenger Kids app in 2018, saying it was not responding to a “need” as Facebook insisted, but was instead creating one.
In that case, Facebook pressed ahead with the app anyway.