Facebook Is Delaying ‘Instagram Kids’ Amid Criticism
The social media giant said it still wanted to build a child-focused Instagram product but would postpone the plans in the face of criticism.
Facebook is delaying an Instagram app for users under the age of 13.
Facebook said on Monday that it had paused development of an “Instagram Kids” service that would be tailored for children under 13 years old, as the social network increasingly faces questions about the app’s effect on young people’s mental health.
The pullback comes ahead of a congressional hearing this week about internal research conducted by Facebook, and reported in The Wall Street Journal, that showed the company knew of the harmful mental health effects that Instagram was having on teenage girls. The revelations have set off a public relations crisis for the Silicon Valley company and led to a fresh round of calls for new regulation.
Facebook said it still wanted to build an Instagram product intended for children that would have a more “age appropriate experience,” but was postponing the plans in the face of criticism.
“This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today,” Adam Mosseri, the head of Instagram, wrote in a blog post.
The decision to halt the app’s development represents a rare reversal for Facebook. In recent years, the social network has become perhaps the world’s most heavily scrutinized corporation, grappling with privacy accusations, hate speech, misinformation and allegations of anti-competitive business practices. Regulators, lawmakers, journalists and civil society groups around the world have criticized the company for the effects it is having on society.
With Instagram Kids, Facebook had argued that young people were using the photo-sharing app anyway, despite age-requirement rules, so it would be better to develop a version more suitable for them. Facebook said the “kids” app was intended for those age 10 to 12 and would require parental permission to join, forgo ads and carry more age-appropriate content and features. Parents would be able to control what accounts their child followed. YouTube, which is owned by Google, has released a children’s version of its app.
But since BuzzFeed broke the news earlier this year that Facebook was working on the app, the company has faced scrutiny. Policymakers, regulators, child safety groups and consumer rights groups have argued that the app would hook children at a younger age rather than protect them from problems with the service, including grooming by predators, bullying and body shaming.
Mr. Mosseri said on Monday that “the project leaked way before we knew what it would be” and that the company had “few answers” for the public at the time.
Opposition to Facebook’s plans gained momentum this month when The Journal published articles based on leaked internal documents that showed Facebook knew about many of the harms it was causing. Facebook’s internal research showed that Instagram, in particular, had caused teen girls to feel worse about their bodies and led to increased rates of anxiety and depression, even while company executives publicly tried to minimize the app’s downsides.
On Thursday, Facebook’s global head of safety, Antigone Davis, is scheduled to testify at a Senate Commerce Committee hearing titled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms.”
Simply pausing Instagram Kids was insufficient, said lawmakers, including Senator Richard Blumenthal, Democrat of Connecticut and the chairman of the subcommittee holding Thursday’s hearing. In a statement, he and others said Facebook had “completely forfeited the benefit of the doubt when it comes to protecting young people online and it must completely abandon this project.”
The lawmakers added that stronger regulation was needed. “Time and time again, Facebook has demonstrated the failures of self-regulation, and we know that Congress must step in,” they said.
A children’s version of Instagram would not fix more systemic problems, said Al Mik, a spokesman for 5Rights Foundation, a London group focused on digital rights issues for children. The group published a report in July showing that children as young as 13 were targeted with harmful content within 24 hours of creating an account, including material related to eating disorders, extreme diets, sexualized imagery, body shaming, self-harm and suicide.
“At some point, we have to ask whether Facebook is simply too big to police their own products and services,” Mr. Mik said. “Because unless and until they can provide the service they promise, they are not fit to be trusted with our kids — until then, neither Instagram nor Instagram for kids should be considered a good idea.”
American policymakers should pass tougher laws to restrict how tech platforms target children, said Josh Golin, executive director of Fairplay, a Boston-based group that was part of an international coalition of children’s and consumer groups opposed to the new app. Britain adopted an “Age Appropriate Design Code” last year that requires added privacy protections for digital services used by people under the age of 18.
Mr. Golin also called on Facebook to conduct a major public education campaign to tell parents to get their children under the age of 13 off Instagram.
The Instagram revelations have also set off discontent inside Facebook. Last Thursday, during a companywide meeting led by Mark Zuckerberg, Facebook’s chief executive, employees demanded to see the Instagram research for themselves and asked what executives planned to do about the findings, according to one attendee, who was not authorized to speak publicly.
“Teen suicide rate has increased 20 percent in the last 4 years,” read one of the top-voted employee questions to Mr. Zuckerberg. “It’s proven that Instagram is toxic for teen girls. What is Facebook doing to address this?”
During the meeting, Mr. Zuckerberg passed the question to Mr. Mosseri, who said the research actually showed that Instagram mostly improved image issues for teens, according to the attendee. Those points were later publicly reiterated in a company blog post on Sunday.
It’s unclear what will happen to the team developing the youth version of Instagram. Facebook last year hired Pavni Diwanji, who previously oversaw the development of YouTube Kids, to build a similar experience for Instagram.
Instagram didn’t immediately have a comment on the team.
On Monday, Mr. Mosseri defended the company. He said that Facebook’s internal research was used to help guide product decisions, including new features that allow people to pause their account or block certain words that could be used for bullying or harassment.
He pledged to introduce new parental control features for the existing Instagram app in the coming months while the company continues to consider a version for children. “Critics of ‘Instagram Kids’ will see this as an acknowledgment that the project is a bad idea,” Mr. Mosseri said. “That’s not the case.”