
How do we protect children’s privacy online? Speakers weigh in at #PrivacyNama2022

Panelists at the PrivacyNama 2022 conference discussed how the protection of children and their privacy is coming to the fore in regulatory efforts globally


An expert panel got together virtually to discuss children’s privacy rights on 6 October 2022 at MediaNama’s PrivacyNama 2022 event. These policy professionals weighed in on children’s privacy rights in India, age-appropriate design codes, restricting children’s access to content, and more. We present a summary of the major talking points from the 90-minute discussion. The panel, hosted by MediaNama founder Nikhil Pahwa, featured Sonia Livingstone, media professor at the London School of Economics; Aparajita Bharti, co-founder of YLAC and The Quantum Hub; and Nivedita Krishna, founder of Pacta.

MediaNama hosted these discussions with support from Mozilla, Meta, Walmart, Amazon, the Centre for Communication Governance at NLU Delhi, Access Now, the Centre for Internet and Society, and the Advertising Standards Council of India.


Did the age-appropriate design code serve its purpose?

The age-appropriate design code is part of the U.K.’s data protection law, said Sonia Livingstone. “…it came into force a couple of years ago and has led to a number of improvements for children and children’s rights in Britain,” she added. The next three sections summarise her remarks:

About the U.K.’s age-appropriate design code

The code sets out 15 standards to be followed by digital platforms that are likely to be accessed by children. These include:

  • not geo-locating or making their (children’s) presence visible to others
  • not profiling them for ‘negative’ purposes that are against their best interest
  • not nudging them in ways that are against their rights and interests, etc.

How organisations complied with the design code

  • Instagram stopped adults from being able to send messages to children who don’t follow them
  • YouTube made its default setting private for those under 18 years
  • Twitter expanded its safety mode to block abusive messages to children

Impact of the design code

  • Attention has shifted from services which are directly intended to be accessed by children to those that are “likely” to be accessed by children.

“…and that language of likely to be accessed is very powerful.” – Sonia Livingstone

  • Before rolling out new innovations, big platforms are beginning to do child rights impact assessments to find out how their services might adversely impact children.

Dealing with vagueness in the design code

The design code must be followed by online platforms that are likely to be accessed by children. The host, Nikhil, asked Sonia how a phrase such as “websites likely to be accessed by children” should be interpreted.

“The fact that the definition of likely is vague, does not mean that the need for regulation is vague.” – Sonia Livingstone

Sonia suggested a risk-based approach:

  • A child is at low or zero risk when using a weather app, for example. Implementing the code is not needed in such cases.
  • But an app like Instagram harvests data and therefore implementation is required in such cases.
  • Complying with the design code will affect smaller players but the Data Protection Authority has to prioritise and take action where the most “egregious” treatment of children’s data is taking place.

How to legislate children’s privacy rights in India

“…let’s not wait for the PDP bill to get passed. As soon as they’re done with that draft, we should start discussing what this code (for children) should look like,” said Aparajita Bharti. Here is a summary of what she said:

  • The government is working on another draft bill and there is space to start discussing a code for children already.
  • The government is aware that children are more at risk than others, so the discussion about a children’s code should begin now.
  • India can learn from the experiences of other countries. The U.K. has already enacted a design code, California has passed a similar code and New York’s code is in the discussion phase. Ireland is also thinking along the same lines.
  • It has been mentioned that the upcoming Digital India Act will address online harms against children.

“It should be a code under the PDP bill. I agree. But I think we should not wait for the bill to get passed to start discussing what that code should look like in the Indian context.” – Aparajita Bharti

How the age of consent affects internet access

Nivedita Krishna, the founder of Pacta, weighed in on child-safe designs and getting the consent of children.

  • Most children in India access the internet using their parents’ phones or gadgets
  • “I’m a little bit sceptical about childsafe designs, particularly if we don’t acknowledge the importance of having OTTs take accountability for the sort of content that is on their platform”
  • A transparent algorithm is an important part of building child-safe design.
  • “…an artificial intelligence program can maybe verify a white, you know, a white Caucasian child, but AI is not going to be able to verify a brown, Asian, Indian sort of child or a black child for that matter, or their age; it can’t even verify the identity of black people.”
  • “The law requires an age of majority to be able to consent; the concept of non-contractual consent is absolutely foreign to India, we don’t even recognize non-contractual consent. It would be nice to bring that in, considering that some children these days are capable of understanding what they are getting into. But allowing for non-contractual consent and then backing it up with several other things, like algorithmic transparency and accountability by the OTTs for, you know, what is on their platforms. These sorts of things might make the internet safer.”

On verification by social media companies

  • Livingstone said that companies deny being able to identify whether a user is a child. However, there is another point of view: that companies can identify this if they know things such as whether the user is looking for a new pair of shoes and which colour they want.

“I think some of the AI being used by Google, for example, they look at how old your friends are, they look at, you know, what kind of likes and dislikes, they make a pretty fair guess. I think the core now is to make those processes transparent. Because we need to know what data they make these judgments on, we need to know where the errors are, in that judgment.” – Sonia Livingstone

    • Age verification would mean that everyone’s age will be verified, not just children’s, Livingstone said. “If people are checking the weather, you do not need to age verify; if they are signing up for a social media platform or want to access pornography, there’s a lot of grey in between.”
    • According to a European Commission direction in the recently passed Digital Services Act, age verification may not be needed on every platform.
    • Another principle the European Commission has put into law, in the Audiovisual Media Services Directive, is that data used for age verification cannot be used for any other purpose.
  • Bharti believes that over-reliance on parental consent is an issue.
    • “First, in general, even adults, do they understand consent? Are they giving it freely? Do they understand everything, even for themselves? There are enough records to say that there’s a lot of consent fatigue, and therefore, they themselves do not process those. So are they taking the right decisions for their children? Not always.”
    • Girls will be negatively impacted because parents have a different ‘social make-up’ in their minds about what girls are ‘allowed’ to do online, she added. This will also impact children from backward communities whose parents are less aware, she said.

Profiling of children

  • Nikhil asked whether all profiling of children is bad, since some profiling may lead to better content delivery. He also asked the panellists how one can enable helpful profiling while disabling harmful profiling.
  • Nivedita Krishna said that not all profiling is bad. She went on to say that users should have the flexibility to choose what kind of feed they have, whether it is based on their close friends or any other preference. If the algorithm is transparent, profiling does not become as problematic, Krishna added.

“And if a child wants to be on social media, they only have one Facebook, one Instagram, one LinkedIn to go to, and so consent is irrelevant at that point. I think algorithmic transparency, combined with settings that empower an individual to choose how they want the feed to appear on their profile, would be a more practical, useful way of allowing profiling to happen for commercial and other reasons, while also allowing children to drive safely.” – Nivedita Krishna

  • Livingstone went on to highlight the positive and negative impacts of profiling. Profiling can be used by ed-tech companies to identify students who are academically struggling, whereas it can also be used to nudge people in a particular commercial direction, she said. “I think, anyway, from a child rights point of view, going back to the age-appropriate design code, the argument is, you must not profile a child for commercial purposes, that is against their best interests.”

This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
