“We believe building ‘Instagram Kids’ is the right thing to do, but we’re pausing the work,” Facebook announced on September 27 amidst the intense scrutiny the company has been facing following damning revelations made by the Wall Street Journal as part of its Facebook Files investigation. “We’ll use this time to work with parents, experts and policymakers to demonstrate the value and need for this product,” the company said.
Facebook’s plan to build a version of its app for children under the age of 13, who are currently not allowed on the main app, was first reported by BuzzFeed News in March this year, but it has not been well received by public health experts.
Instagram’s impact on the mental health of teenagers: WSJ report
Internal research conducted by Facebook showed that a large number of teenagers, particularly teenage girls, trace a significant amount of anxiety and mental health problems to Instagram, WSJ reported on September 14. “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” a slide in an internal presentation on the results of Facebook’s research said.
These findings, finalised in March 2020, were not made public by the company. In a letter written to US lawmakers after the research was conducted, Facebook refused to provide details of the research it had conducted on the impact of its social media platforms on young people. This reticence drew comparisons to the tobacco industry of the 20th century from two people cited in the WSJ report: Senator Richard Blumenthal and psychology professor Jean Twenge said that Facebook’s behaviour was reminiscent of how the tobacco industry knew cigarettes were carcinogenic but publicly denied it for decades.
What the findings say: Facebook’s internal findings, as reported by the WSJ, boil down to these observations, apart from the one mentioned above:
- One in five teenagers said Instagram makes them feel bad about themselves.
- Teenagers blame Instagram for increased anxiety and depression, “unprompted”.
- Among teens who reported suicidal thoughts, 6% of American users and 13% of British users traced those thoughts to Instagram.
- An experiment to hide “likes” from posts didn’t significantly improve mental health outcomes.
Upcoming Senate testimony: In light of these revelations, Facebook has been asked to testify at a Senate hearing entitled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms” on Thursday, September 30.
Facebook’s rebuttal: Facebook on September 26 published a point-by-point rebuttal saying that the WSJ mischaracterised its internal research and that the research actually “shows that on 11 of 12 well-being issues, teenage girls who said they struggled with those difficult issues also said that Instagram made them better rather than worse.” However, Facebook has refused to release the full underlying research data for independent analysis.
Is Instagram Kids a good or bad idea?
Do we need an app for under-13 users? “Critics of ‘Instagram Kids’ will see this as an acknowledgement that the project is a bad idea. That’s not the case,” Facebook said in its announcement, arguing that “kids are getting phones younger and younger, misrepresenting their age, and downloading apps that are meant for those 13 or older” and that “it’s better for parents to have the option to give their children access to a version of Instagram that is designed for them.” But as a group of US attorneys general opposing the idea wrote in May: “Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account.” The attorneys general argued that “social media can be harmful to the physical, emotional, and mental well-being of children” and asked Facebook to shelve the plan entirely.
Can we trust Facebook with child safety features? Another point that Facebook made is that Instagram Kids “was never meant for younger kids, but for tweens (aged 10-12)” and that “it will require parental permission to join, it won’t have ads, and it will have age-appropriate content and features,” including parental supervision of who can message and follow a kid’s account and who the kid can follow. But Facebook’s track record when it comes to protecting children has been spotty. “Facebook has completely forfeited the benefit of the doubt when it comes to protecting young people online and it must completely abandon this project,” US lawmakers said in a statement on Monday, asking Facebook to drop the plans entirely rather than just pause them. The attorneys general’s letter from May also points out multiple instances where Facebook failed to protect the safety and privacy of children despite claiming that its products have strict privacy controls. For example, Facebook’s Messenger Kids app, intended for children between the ages of 6 and 12, was found in 2019 to contain a flaw that let children circumvent restrictions set by their parents and join group chats with strangers their parents had not approved.
Haven’t other platforms launched similar initiatives? Instagram is not the first platform to build a specialised version of its app for kids. YouTube and TikTok also have versions of their apps for those under 13, but these have received their fair share of criticism. For example, YouTube Kids has been criticised for its never-ending autoplay feature, and TikTok for allowing users to bypass age restrictions by simply entering a false birthdate.
In the context of India’s Personal Data Protection (PDP) Bill
Chapter IV of the PDP bill deals with the personal data of children (any person below the age of 18) and says that data fiduciaries must process personal data of children in a manner that protects the child’s rights and is in the best interests of the child. These data fiduciaries must implement appropriate mechanisms to verify the age of the child, and ensure that consent is obtained from the parent or guardian. The bill, however, is yet to provide specific regulations that will govern how this is done.
Furthermore, platforms that operate commercial websites or online services directed at children (Instagram Kids would fall under this), or that process large volumes of personal data of children, will be classified as ‘guardian data fiduciaries’. Guardian data fiduciaries are not permitted to engage in profiling, tracking, or behavioural monitoring of children, or to direct targeted advertising at children. They are also barred from undertaking any other activities that may cause significant harm to a child. Facebook, in a statement to CNBC, had said that it will not show ads in the Instagram app it develops for kids.
- Instagram Announces Three New Safety Measures For Young Users, Including Limiting Advertisers’ Reach
- Forty US Attorneys General Urge Facebook To Not Build Instagram For Kids
- Facebook Launches Messenger Kids In India
- Google Announces A Slew Of New Measures To Protect The Safety Of Children On Its Platforms