A bipartisan group of attorneys general from forty US states wrote to Facebook CEO Mark Zuckerberg asking Facebook to abandon plans to develop a version of Instagram for children under the age of 13. “Use of social media can be detrimental to the health and well-being of children, who are not equipped to navigate the challenges of having a social media account,” the letter argued. Furthermore, Facebook does not have a good track record when it comes to protecting the welfare of children, the letter added. Facebook’s plans to build Instagram for the under-13 age group were reported by BuzzFeed News in March.
“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account. In short, an Instagram platform for young children is harmful for myriad reasons. The attorneys general urge Facebook to abandon its plans to launch this new platform.”
“Social media can be harmful to the physical, emotional, and mental well-being of children”
The attorneys general cited multiple pieces of research showing the harmful effects that social media can have on children. One study showed a link between young people’s social media use and increased mental distress, self-injurious behaviour, and suicidality. Another, using data collected by an online monitoring company that tracked the activity of 5.4 million children, found that Instagram was frequently a cause of suicidal ideation, depression, and body image concerns.
“Instagram…exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers[,]” and “[t]he platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing.” – Organisations and experts cited in the letter
The attorneys general also criticized statements made by Facebook in the March 2021 Congressional hearing in which the platform claimed using social apps to connect to other people can have health benefits. “This overly simplified statement conflates the benefits of social connection (of which there are many) with purported benefits of using social media to enable that connection, which as outlined above, carry distinct harms to young children,” the attorneys general argued.
“Young children not equipped to handle the range of challenges that come with having an Instagram account”
Not only do children lack a mature understanding of privacy, but they also do not fully grasp what content is appropriate to share online, the permanency of content once shared, and who can see it, the attorneys general pointed out. The target population is too young to deal with what they encounter online, especially when it concerns inappropriate content and online relationships with other users, the letter added.
“One report found an increase of 200% in recorded instances in the use of Instagram to target and abuse children over a six-month period in 2018, and UK police reports documented more cases of sexual grooming on Instagram than any other platform. In 2020 alone, Facebook and Instagram reported 20 million child sexual abuse images.”
A new Instagram platform would also increase cyberbullying, which already occurs at alarming rates. Citing a 2017 survey, the letter highlighted that 42% of young Instagram users had experienced cyberbullying, the highest of any platform measured. The situation online is worse because “the internet often leads cyberbullies to say and do crueller things than a schoolyard bully,” the letter stated. The issue has been further exacerbated by the pandemic as children spend more time online.
“Facebook has a record of failing to protect the safety and privacy of children”
The attorneys general pointed out multiple instances where Facebook failed to protect the safety and privacy of children despite claiming that its products have strict privacy controls. Facebook’s Messenger Kids app, intended for children between the ages of 6 and 12, was found in 2019 to contain flaws that allowed children to circumvent restrictions set by parents and join group chats with strangers their parents had not approved. Another instance the letter highlighted was Instagram promoting diet content to young users with eating disorders and to people who were at risk of relapsing. “These alarming failures cast doubt on Facebook’s ability to protect children on their proposed Instagram platform and comply with relevant privacy laws such as the Children’s Online Privacy Protection Act,” the attorneys general stated.
Not the first platform to face scrutiny
Other social media platforms have recently faced increased scrutiny over how they deal with children. In 2019, the US government conducted an investigation into YouTube for violating children’s privacy after receiving numerous complaints from consumer groups and privacy advocates. Following the investigation, YouTube was fined $170 million in September 2019. Earlier this year, YouTube announced a slew of changes to protect children’s privacy, including new data collection practices for children’s content, more prominently identifying content for children, and working with creators who make kids’ content.
TikTok, after being fined $5.7 million in 2019 for violating kids’ privacy, also announced more controls and safety measures for children earlier this year. In addition to setting the accounts of users aged between 13 and 15 to “private” by default, the short-form video platform removed the feature that let anyone comment on kids’ videos.
Facebook itself had to make changes to Messenger Kids in February 2020 to address various privacy concerns.
In the context of India’s Personal Data Protection (PDP) Bill
Chapter IV of the PDP Bill deals with the personal data of children (any person below the age of 18) and says that data fiduciaries must process children’s personal data in a manner that protects the child’s rights and is in the best interests of the child. These data fiduciaries must implement appropriate mechanisms to verify the age of the child and ensure that consent is obtained from the parent or guardian. The bill, however, leaves the specific mechanisms for doing this to regulations yet to be framed.
Furthermore, platforms that operate commercial websites or online services directed at children (Instagram for Kids would fall under this), or that process large volumes of children’s personal data, will be classified as ‘guardian data fiduciaries’. Guardian data fiduciaries are not permitted to engage in profiling, tracking, or behavioural monitoring of children, or to direct targeted advertising at children. They are also barred from undertaking any other activities that may cause significant harm to a child. Facebook, in a statement to CNBC, said that it will not show ads in the Instagram app it develops for kids. “We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it,” Facebook added.
Any violation of the rules under Chapter IV could result in a penalty of up to Rs 15 crore or 4% of the data fiduciary’s worldwide turnover for the preceding financial year, whichever is higher.
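The “whichever is higher” clause amounts to a simple maximum of the two amounts. A minimal sketch in Python illustrating how the cap scales with turnover (the turnover figure used is purely illustrative, not any company’s actual number):

```python
# Illustrative sketch of the Chapter IV penalty cap under India's PDP Bill:
# the higher of a flat Rs 15 crore or 4% of worldwide turnover.
# 1 crore = 10 million rupees.

FLAT_CAP_INR = 15 * 10_000_000  # Rs 15 crore

def max_penalty_inr(worldwide_turnover_inr: float) -> float:
    """Return the penalty ceiling: the higher of the flat cap or 4% of turnover."""
    return max(FLAT_CAP_INR, 0.04 * worldwide_turnover_inr)

# Hypothetical fiduciary with Rs 1,000 crore worldwide turnover:
# 4% of Rs 1,000 crore is Rs 40 crore, which exceeds the Rs 15 crore floor.
print(max_penalty_inr(1_000 * 10_000_000))  # 400000000.0 (Rs 40 crore)
```

For small fiduciaries whose 4% figure falls below Rs 15 crore, the flat amount is the binding ceiling; for large platforms, the turnover-linked figure dominates.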