The California state government passed its Age-Appropriate Design Code Act (AADC) on September 15. Modelled after the UK’s Age-Appropriate Design Code, the AADC was introduced with the stated goal of protecting children from online harms. However, while experts credited the UK version with improving the state of child rights online, the California law has been criticised for its ambiguity, which some say might end up endangering children further.
The Code, which takes effect on July 1, 2024, essentially puts the onus of child privacy and safety on businesses and states that children should be afforded protections “not only by online products and services specifically directed at them but by all online products and services they are likely to access” [emphasis added].
Some experts and rights groups warned that the lack of clarity in the Code will in fact threaten child safety and put children at a greater risk of online harms.
He won't do the right thing on the atrocity called AB 2273, either. Newsom and the overwhelmingly Democratic legislature are making terrible laws that threaten freedom of expression in major ways. Crickets, or support, from Calif's major media companies. https://t.co/V8NDO9KZTt
— Dan Gillmor (@dangillmor) September 15, 2022
Why it matters: Countries take varying approaches to child protection. The UK focuses on restricting children’s access on a risk-assessed basis, while California in the U.S. holds businesses accountable for children’s safety. In India, earlier versions of the Data Protection Bill included ‘guardian data fiduciaries’ to regulate online services directed at children. That category has since been done away with, but experts told MediaNama that online harms relating to children may still be addressed in the Digital Media Act. Regardless, the Indian government needs to pay attention to the measures adopted by other countries to ensure effective provisions on online safety and privacy while protecting children’s free speech.
Critics say stress on DPIA will affect innovation
The Californian law asks companies to complete Data Protection Impact Assessment (DPIA) reports before releasing any service, product, or feature likely to be accessed by children. The report must detail the purpose of the offering, its use of children’s personal data, and the ‘risks of material detriment to children from the business’s data management practices.’
However, according to Eric Goldman, Law Professor at Santa Clara University School of Law, the DPIA will compel businesses to self-censor to please the government. Since the report will be freely available to government enforcers on request, regulators and judges become the real audience for the DPIA.
In his blog posted on TechDirt, Goldman argued that the DPIA “puts the government enforcer’s concerns squarely in the room during the innovation development (usually as voiced by the lawyers).”
He warned that such hurdles will suppress innovation and nudge businesses to restrict their features from children, ‘shrinking the internet’ for Californian children.
AADC generalises age requirements and concerns
Another criticism of the Code is its one-size-fits-all approach when considering the best interests of “children.” As per the AADC, “child” or “children” means “a consumer or consumers who are under 18 years of age.” However, it is obvious that the best interests of an 11-year-old will be vastly different from those of a 17-year-old.
Goldman stated that this puts “fiduciary obligations” on companies towards millions of individuals. Even setting aside age differences, children can be categorised through many other social lenses such as class, caste, race, geographic location, and economic status. In fact, when the law was passed, Fight for the Future, a digital rights group, said the Code would threaten human rights and free expression online.
AADC puts minorities at risk
Evan Greer, Director of Fight for the Future, said in a statement that AB 2273 will make children “less safe, not more safe, online.” Greer said that the ‘vague and broad’ manner in which the Bill is written will allow widespread use of invasive age verification, leading to more surveillance. LGBTQ+ youth and other vulnerable groups, for whom the internet has served “as a lifeline at times,” will face the repercussions of this move most acutely.
Further, Greer noted that some have proposed biometric face scans for accessing websites or online services under this Bill. Such age verification requirements make it nearly impossible to use online services anonymously, threatening the freedom of expression of human rights activists, whistleblowers, journalists, and others.
“It’s immoral and dangerous for lawmakers to continue using children as pawns to advance poorly-drafted legislation that does more harm than good,” said Greer.
She pointed out that there are other issues for the government to focus on, such as cracking down on child-surveillance software like e-proctoring and facial recognition in schools. Incidentally, critics like Goldman have pointed out that AB 2273 itself increases the threat of facial verification for children.
AB 2273’s age authentication requirements conflict with privacy safeguards
Privacy experts have long opposed the widespread use of face scanning, whether for children or adults, owing to its long-term privacy and security risks. In countries like India, where no data or privacy protection law is in place, privacy advocates are doubly wary of such norms. Yet California’s Code demands that companies verify whether a user is a child. It appears to encourage face scanning or the uploading of age-authenticating documents whenever a person accesses a new website.
The leaning towards facial recognition has drawn considerable attention in recent years. Interestingly, Dr. Sonia Livingstone, OBE and Professor at the London School of Economics and Political Science, offered a distinctive take on the issue at a recent PrivacyNama event.
Should age verification be applied for all?
During the ‘Privacy, Children and Access of services’ session on October 7, Livingstone aired the idea that every person, not just children, should have their age verified.
“I think people have been a little slow to grasp that [everyone’s age should be verified]. You cannot age verify a child and then say, okay, no pornography or get parental consent. If we want children dealt with appropriately, we’re talking about some kind of age verification for everybody,” said Livingstone.
She recommended that this be done on a risk-assessed basis, as under the EU’s recently passed Digital Services Act. For example, a person need not verify their age just to check the weather, but they should have to when signing up for a social media platform or accessing pornography.
Taking a child rights approach, she said that age verification should be used to protect children without limiting children’s, or anyone’s, civil rights, freedoms, and opportunities.
“I think that has to be the demand on companies. You must find a way of protecting that does not limit civil rights and freedoms. It cannot be that you protect children just by putting them in a narrow box and they cannot be agents and citizens in this digital world. So that has to be wrong,” she said.
At the same time, Livingstone said that the data used for age verification must not be used for any other purpose, and that the authorities involved must remain ‘privacy respecting.’ This requires strong communication between regulators and data protection authorities. According to the Professor, it also means that trusted intermediaries, rather than the platforms or companies themselves, should handle age verification.
Who calls the shots on child safety and privacy policies?
Yet another criticism of the California Code is that it puts powers and responsibilities in the hands of companies while leaving parents with practically nothing. Even though the Act mentions the Parent’s Accountability and Child Protection Act, it does not provide any powers or a particular role to parents. Indeed, the law’s definitions do not even include the terms “parents” or “guardians.”
Incidentally, the bill cites a 2019 statistic as justification for the Code: “81 percent of voters said they wanted to prohibit companies from collecting personal information about children without parental consent.” Yet the law takes parents out of the equation and focuses solely on a child’s interaction with a business. Worse still, it also fails to consider the opinions of youth leaders.
No youth representation in the Working Group: The law calls for a California Children’s Data Protection Working Group to deliver a report on best practices for implementing the Code. The group’s 10 members will include:
“(1) Two appointees by the Governor.
(2) Two appointees by the President Pro Tempore of the Senate.
(3) Two appointees by the Speaker of the Assembly.
(4) Two appointees by the Attorney General.
(5) Two appointees by the California Privacy Protection Agency.”
The members are expected to have expertise in at least two of the following areas:
“(1) Children’s data privacy.
(2) Physical health.
(3) Mental health and well-being.
(4) Computer science.
(5) Children’s rights.”
However, nowhere does the law allow for comments or recommendations from, or the inclusion of, youth leaders. Even when listing stakeholders, the Act mentions “academia, consumer advocacy groups, and small, medium, and large businesses affected by data privacy policies” but not child representatives.
So is the California law all bad?
Not entirely, say some. Common Sense Media Policy Counsel Irene Ly told IAPP that the law forces companies “to re-prioritize, focusing more on kids’ health and well-being over profits and engagement.” Ly argued that the law will relieve parents of the burden of navigating hundreds of cumbersome privacy policies and website and app settings to protect their wards from online harms. Ly recommended that the next step should be legislation that “requires companies to make, and holds them accountable for, safer design choices.”
Recently, the state of New York also introduced a Bill on child data privacy and protection with provisions similar to the AADC. As per its summary, the Bill “requires data controllers to assess the impact of its products on children for review by the bureau of internet and technology,” among other things.
Aparajita Bharti, Founder of The Quantum Hub, who also spoke at PrivacyNama, said that Ireland and other countries are also considering such age-appropriate guidelines. She pointed out that this gives India plenty of literature to draw on when drafting its own provisions protecting children from online harms.
“It should be a Code under the PDP Bill. I agree. But I think we should not wait for the Bill to get passed to start discussing what that Code should look like in the Indian context,” she said.
This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
Also Read:

- Summary: Ireland’s Data Protection Commission Fines Meta’s Instagram For Threatening Child Privacy
- Summary: Haugen’s Disclosures Shape California’s Social Media Platform Duty To Children Act
- Facing The Consequences Of The Data Protection Bill On Children’s Digital Privacy
- #NAMA Children And Privacy On The Internet: Can And How Will Guardian Data Fiduciaries Manage Consent?