“When social media platform companies create, design, implement, or maintain features for users, including child users, on their social media platforms that the company knows or should know are addictive to children, they should be held liable for the harms that result,” reads a new bill passed by the California State Assembly to tackle social media addiction among children.
Passed in June, the bill proposes to allow parents to sue social media companies if their children become addicted to their platforms, levy fines of up to $25,000 for each violation, oblige platforms to conduct periodic audits to remove features that lead to addiction, and more. Called the Social Media Platform Duty to Children Act, it is now due to be discussed in the California Senate and, if passed there, will be sent to the state's governor to be signed into law.
Apart from the aforementioned provisions, the bill says that addiction to social media platforms has been linked to increased suicides and depression among teenagers, impeded development of judgment, attention, and memory in children's brains, and greater loneliness. At several points, the bill also refers to revelations from internal research at Meta (earlier known as Facebook), disclosed by product manager-turned-whistleblower Frances Haugen.
Last year, Haugen's disclosures revealed that the company's internal research showed that 5-6% of 14-year-old users admitted to identifying signs of addiction to the platform in their behaviour. Further, she revealed that 32% of teenage girls said that Instagram made them feel worse about their bodies, and some teenagers also traced their suicidal feelings back to the platform.
Why it matters: Following the revelations, which came forth in September 2021, there has been an increase in efforts to check the impact of social media on children, especially that of Meta's platforms. For example, in November a group of US state attorneys general launched an investigation into Facebook for promoting Instagram to children and young adults despite knowing that the platform harms their physical and mental health.
The new bill is yet another effort in this direction. We have summarised its provisions below.
Which platforms will be covered by the bill and how
Companies with $100mn in revenue to be covered
Internet-based services or applications which meet the following criteria will be covered by the bill:
i) Has users in California
ii) The primary purpose should be to connect users and allow users to interact with each other within the service or application
Thus, it should provide the following functionalities:
“(ia)Construct a public or semi-public profile within a bounded system created by the service or application.
(ib)Populate a list of other users with whom an individual shares a connection within the system.
(ic)View and navigate a list of connections made by other individuals within the system
(id)Create or post content viewable by other users”
iii) Should be controlled by an entity that generated at least $100 million in revenue in the last year
However, the bill also mentions services that will not be included in this description:
i) Email, SMS, or MMS services that exclusively transmit user-generated email, SMS, or MMS messages, respectively
ii) Streaming sites that exclusively allow people to transmit licensed media
iii) Internal messaging services for businesses that are not otherwise available to children or the general public
iv) “A service offering only one-to-one live aural (audio-based) communications,” the bill says; and
v) Services like comments sections of news sites and product review sections of e-commerce sites which allow communication between people in the following ways:
(ia) “Posting comments or reviews relating to content produced and published by the provider of the service or by a person acting on behalf of the provider of the service.
(ib)Sharing comments or reviews described in sub-subclause (ia) on a different internet service.
(ic) Expressing a view on comments or reviews described in sub-subclause (ia), or on content mentioned in clause (i), by means of any of the following:
(Ia) Applying a “like” or “dislike” button or other button of that nature.
(Ib) Applying an emoji or symbol of any kind.
(Ic) Engaging in yes or no voting.
(Id) Rating or scoring the content, or the comments or reviews, in any way”
A child or their parent can sue a platform for its role in causing addiction
The bill allows a child user, i.e., a user of the platform under 18 years of age, or their parent or guardian, to sue a social media company if it uses a 'design, feature, or affordance' that causes the child to become addicted to the platform. For the suit to succeed, the bill lays out that the following points will have to be established:
(A) “The design, feature, or affordance was a substantial factor in causing the child user’s addiction and harm.
(B) It was reasonably foreseeable that the use of that design, feature, or affordance would addict and harm child users.
(C) The child user in such a suit became addicted and was therefore harmed”
Further, if a platform maintained a feature that causes addiction in a child before January 1, 2023, it shall also be liable to pay damages. However, the bill says that such violations could be excused if the offending features are removed by a given date.
Conduct audits, remove addictive features to avoid penalties
“A social media platform shall not be held liable for a violation under this subdivision if, by _____, the platform ceases development, design, implementation, or maintenance of features that were known, or should have been known, by the platform to be addictive to child users,” the bill says.
The blank signifies that the assembly may give a date for this compliance later.
It further specifies that an ‘operator’ of a platform will not be subject to penalties if it does the following:
“(A) Institute and maintained a program of at least quarterly audits of its practices, designs, features, and affordances to detect practices or features that have the potential to cause or contribute to the addiction of child users.
(B) Corrected, within _____ days of the completion of an audit described in subparagraph (A), any practice, design, feature, or affordance discovered by the audit to present more than a de minimis risk of violating this subdivision,” – Social Media Platform Duty to Children Act.
How the bill defines ‘addiction’ and (to) ‘addict’
The bill lays out the following definitions for 'addiction' and to 'addict', both of which are important grounds for penalties under its provisions.
Addiction:
” ‘Addiction’ means use of one or more social media platforms that does both of the following:
(i) Indicates preoccupation or obsession with, or withdrawal or difficulty to cease or reduce use of, a social media platform despite the user’s desire to cease or reduce that use.
(ii) Causes physical, mental, emotional, developmental, or material harms to the user,” the bill says.
Addict:
'Addict' is defined in the bill as 'knowingly or negligently causing addiction through any act or omission or any combination of acts or omissions.'