More than 50% of children use shared devices, and there isn’t a level playing field in access to the internet, said Siddharth Pillai of Aarambh India. This is the imperfect world that guardian data fiduciaries in the country will have to navigate, according to the provisions of the Personal Data Protection (PDP) Bill, 2019. These fiduciaries are supposed to verify the age of their users, and obtain consent from their guardians or parents if the user is a “child” — anyone under 18.
How will these fiduciaries comply with this complicated and confusing mandate? Held on December 9 with the support of Facebook and Google, our discussion explored how companies, NGOs and child welfare bodies will be affected by the PDP Bill. Watch the full video of the discussion:
A bit of context: The PDP Bill, 2019 has defined guardian data fiduciaries (GDF) as entities that (i) operate commercial websites or online services directed at children or (ii) process large volumes of personal data of children. This seemingly broad definition will potentially include educational institutions such as schools, edutech firms, gaming companies and so on.
- GDFs are prohibited from “profiling, tracking or behavioural monitoring of, or targeted advertising directed at, children”. Essentially, they cannot process children’s data in ways that can cause “significant harm” to the child.
- Failure to adhere to the provisions can attract a fine of up to ₹15 crore or 4% of the company’s global turnover, whichever is higher.
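The penalty ceiling above works like a simple maximum of two figures. A minimal sketch, assuming the “whichever is higher” reading of the provision (the function name and sample turnover figures are illustrative, not from the Bill):

```python
# Illustrative only: the maximum penalty under the provision described above,
# assuming the higher of the flat cap and the turnover-based cap applies.

def max_penalty(global_turnover_inr: float) -> float:
    """Return the penalty ceiling in rupees for a given global turnover."""
    FLAT_CAP = 15e7  # ₹15 crore = 150 million rupees
    return max(FLAT_CAP, 0.04 * global_turnover_inr)

# A company with ₹100 crore turnover: 4% is ₹4 crore, so the ₹15 crore cap applies.
print(max_penalty(1e9))   # 150000000.0
# A company with ₹1,000 crore turnover: 4% is ₹40 crore, which exceeds the flat cap.
print(max_penalty(1e10))  # 400000000.0
```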
What does this mean for companies?
The Bill suggests that all data fiduciaries will have to age-gate in some form, regardless of whether their activities are directed at children or a general audience, or whether they host content likely to be harmful to children, said Sreenidhi Srinivasan of Ikigai Law. She said that even websites not geared towards children, for instance sites about wars or world history, or even MediaNama.com, would have to have age-gating mechanisms in place. “I’m not sure if you will require sophisticated, complex age-gating tools, but you will need something regardless,” she said.
Srinivasan said that only websites with a zero-data-collection policy, where names, IP addresses, or anything else isn’t collected, would possibly be exempt from age-gating. However, if a website collects sensitive information, as a dating website does, it would need some sort of hard mechanism.
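The tiered approach Srinivasan describes — no gate for zero-data sites, something light for general sites, a hard check for sensitive ones — can be sketched as a simple policy function. This is a hypothetical illustration; the tier names and inputs are not drawn from the Bill or any real implementation:

```python
# Hypothetical sketch of a tiered age-gating policy, following the
# distinction described above. Names and tiers are illustrative only.

def required_age_gate(collects_personal_data: bool,
                      collects_sensitive_data: bool) -> str:
    """Pick an age-gating tier based on what data the service collects."""
    if not collects_personal_data:
        # Zero-data-collection sites may be exempt from age-gating altogether.
        return "none"
    if collects_sensitive_data:
        # e.g. a dating site: needs a "hard" verification mechanism.
        return "hard"
    # General-audience sites still need some gate, e.g. self-declaration.
    return "soft"

print(required_age_gate(False, False))  # none
print(required_age_gate(True, False))   # soft
print(required_age_gate(True, True))    # hard
```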
But how practical is it for websites to have such a system, and will it really be foolproof enough? Rahul Narayan, an independent lawyer, said that the sensible way for the regulation would be to apply a “reasonable standard” test. “You can’t expect perfection for this, but like the GDPR you just have to make the reasonable effort. So whatever the best technologically possible way, you identify like that. I don’t think they can go any further, they can’t expect perfection when none exists,” he said.
Across the world, there is some consensus on the limited practicality of age-gating, felt Srinivasan: it is hard to do in any foolproof manner without collecting more information, such as an Aadhaar OTP, for instance.
“That’s a disproportionate amount of data you’re collecting for something as simple as accessing a website. That flies in the face of data minimisation principle, which is another core tenet of data protection,” Srinivasan said.
Rajiv Chilaka, CEO of Green Gold Animation, which has children’s entertainment and gaming properties, said that businesses will have to work out how to go about implementing age restrictions. Speaking on edtech companies, he admitted that their business model depends on their ability to track the progress of a child. “In such a scenario, without tracking […], maybe the user can be anonymous or the user name be a pseudonym,” he suggested.
- Impact on business: At the same time, Chilaka agreed that a ban on the use and profiling of children’s data can have huge impacts on business. He took Green Gold’s YouTube channel as an example. In January this year, YouTube rolled out a series of changes to its platform that prohibited personalised ads from being shown to children. This change, Chilaka said, led to a 90 per cent drop in YouTube revenue for Green Gold.
- ‘No foolproof way for an app developer’: Chilaka explained the operational challenges for a mobile app developer, for whom there doesn’t exist a foolproof way to ascertain a user’s age. The only way to verify a child’s age is, perhaps, to do a video call and look at the user, but even then some 14-year-olds could pass for 18-year-olds. He added that 13 as the age of consent would be “fairer” on companies.
- On defining GDFs: During the conversation, Narayan pointed out problems with the creation of a wholly new entity in guardian data fiduciaries. What matters is that any app or entity dealing with children’s data should follow certain guidelines; the Bill, however, imposes these restrictions only on GDFs. If only GDFs are prohibited from profiling children using their data, does that mean anyone who isn’t a GDF is allowed to, he wondered. Srinivasan agreed, adding that the restriction on profiling could be better implemented through the construct of significant data fiduciaries (SDFs), where there are more transparency requirements for processing data.
Should educational, children’s institutions be dealt with differently?
One of the major concerns with the PDP Bill, according to most of the speakers, was the fact that even educational (or edutech) institutions will have to adhere to the same strict standards. For instance, a school might potentially count as a guardian data fiduciary. By extension, the preparation of report cards — which entails the collection of children’s personal data — might run into obstacles.
There has to be a distinction between schooling and other activities, said Narayan. He explained that in the UK, there is a general law that applies broadly, while the GDPR is primarily applicable to online services. We have one umbrella legislation that covers both educational and non-educational institutions, which is not a particularly smart way to go about it, he said. Instead, he argued for an activity-based approach to regulation: edutech companies or schools, for instance, could be allowed to process children’s data, provided they adhere to higher standards than other institutions. A video game company indeed shouldn’t be allowed to track or profile children, but schools should be able to, Narayan added.
Srinivasan agreed that a reading of the Bill does indicate that profiling, monitoring, tracking and behavioural advertising are not allowed. Similarly, it also takes the “hard view” of prohibiting any activity that can cause “significant harm”. She said that schools keep record cards, disciplinary records and performance certificates over time, as this helps them make better assessments and design curricula. Hence, there is a need for nuance on which use cases are harmful and which are beneficial in some sense, she said.
Pillai added that the current prohibitions will impact and complicate how child welfare committees (CWC) and NGOs work. He explained that CWCs, despite being a government body, work closely with NGOs on a regular basis. They often share children’s data with NGOs, which in turn use their resources to reach out to the children and help them. He wondered how this process could be impacted by the Bill, since sharing or processing of data would essentially be banned.
Narayan, responding to Pillai, reiterated the need for proportionality and for analysis of whether activities such as those of NGOs or CWCs are good or not, and for regulators to actually look into these issues. It makes you wonder whether there can be an absolute ban on any of these things without impacting things that should not be impacted, he added.
Use of artificial intelligence in data processing
The GDPR specifically states that the use of artificial intelligence shouldn’t result in discrimination or adverse treatment of a child; that’s an important safeguard that needs to be brought into the PDP Bill as well, said Narayan. Srinivasan agreed, adding that the GDPR allows users to object to processes where outcomes are entirely automated. This speaks to larger questions about the right to explainability, she said.
How have other countries done it?
The United Kingdom’s Information Commissioner’s Office (ICO) has come out with suggestions on likely methods, Srinivasan said. One popular method is self-declaration; another is knowledge-based tests, where it is presumed only adults will know the answers. “The UK ICO also sets out hard identifiers, like official documents — you have to assess the risks and see if the situation calls for it. Other tools may be linking to credit tools, assuming that that’s a product not many children will use,” she said. Another tool is artificial intelligence, which entails more data collection, she added.
In the United States of America, the Federal Trade Commission has a test that allows businesses to determine whether they need to implement age-gating mechanisms. For instance, if a business uses animated characters or celebrities popular with children, or if it has actual knowledge that a user is below 13, it crosses the threshold. The PDP Bill does not provide for that sort of differentiation, something that will likely be dealt with in the regulations and codes in the future, said Srinivasan.
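The FTC-style test described above amounts to a checklist where any applicable factor triggers the obligation. A hypothetical sketch modelling only the two factors mentioned in the discussion (the full rule has more):

```python
# Hypothetical sketch of the "directed at children" threshold test
# described above; only the two factors from the discussion are modelled.

def must_age_gate(uses_child_oriented_content: bool,
                  has_actual_knowledge_under_13: bool) -> bool:
    """A service crosses the threshold if either factor applies."""
    return uses_child_oriented_content or has_actual_knowledge_under_13

print(must_age_gate(True, False))   # True: child-popular characters trigger it
print(must_age_gate(False, True))   # True: actual knowledge triggers it
print(must_age_gate(False, False))  # False: neither factor applies
```

The design point is that the factors are disjunctive — a business cannot escape the obligation by failing one factor while meeting another.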
‘PDP Bill should focus on giving children agency’: Thoughts on improving the Bill
Speaking on the changes that he would like in the Bill’s next draft, Pillai said that the Bill should not try to protect children by taking away their agency. Children feel safer when they have more control, and taking away control doesn’t work, he said.
Chilaka said that there is indeed a need to make the internet safer for children, but reiterated that the age-restriction mechanisms seem very tricky to implement. “We have to have more privacy for everyone, and then a law for kids becomes easier to implement. I mean, if you visit a store to buy anything, even a grocery shop, the first thing they ask you is your mobile number […] Even at a ground level, they are collecting so much data,” he said, suggesting the need to discourage data collection across the board.
Narayan reiterated his stance on the need for distinctions based on the purpose of data collection, as well as on whether the parent’s consent or the child’s is what matters. In the context of education, the consent of the parent is important, but in cases like counselling, the consent of the child might matter more. There needs to be a distinction between these use cases, he said.
Srinivasan said that the Bill should avoid being overly prescriptive when it comes to processing of data, to account for beneficial use cases of profiling. In addition, she reiterated that the Bill should prescribe a “reasonable standard” for age restrictions, since platforms don’t have a foolproof way of implementing them.