“Yesterday, I met a 5-year-old who is on TikTok and who is creating content that may be indecent by traditional wisdom. In a world where children are creating content, shouldn’t discussion about what they should be exposed to also evolve a little bit?” asked Snigdha Poonam, an independent journalist, at a MediaNama discussion on online content regulation held on September 26, with support from Amazon, Netflix and Facebook, and in partnership with NLU Delhi’s Centre for Communication Governance (CCG).

Poonam’s comment sparked a disagreement over who is responsible for controlling children’s access to online content: is it parents alone, or do platforms also carry some responsibility for protecting children?

It takes a village to raise a child

“I don’t think it’s a complete parental liability and a parental responsibility to purely take care of what content their child is consuming,” Ekta Rohra, Design Director at IBM India, said in response to Poonam. While parents are undoubtedly responsible for the content their children watch, platforms bear at least “20%” of the responsibility, since they “can recognise that it’s a child” viewing the content and can thus “serve appropriate content”, which can also improve attention spans, Rohra said. She added that YouTube’s parental controls are not good enough.

“The platforms that create and put out content can also be responsible about the content that they are putting out. Through AI, computer vision, facial recognition and mood recognition, it is possible to know today whether a certain video clip contains a child or not, or if the content is being consumed by a child.” — Ekta Rohra

Nikhil Pahwa, editor of MediaNama, highlighted the privacy concerns that come with applying any kind of facial recognition technology to the content we view; the concerns are doubly serious when children would be subjected to that kind of surveillance.

Ekta Rohra at the round table discussion

This responsibility, as per Rohra, can take the form of a “code of ethics, an assessment, an audit, rules, regulation, or a policy”. Agreeing with her, Anugrah Abraham, Humanitarian Resilience Officer at Change Alliance, called on technology companies to “be more proactive in sharing that responsibility”.

Children as content creators: Another way of looking at regulation

Traditionally, most discussions around content regulation have centred on protecting children from “inappropriate or indecent or objectionable content”. But with the rise of user-generated content, children are now creating content “that is by traditional wisdom inappropriate, indecent, objectionable”, Poonam pointed out.

Most content regulation has historically targeted the viewer, not the creator: it seeks to stop viewers below a certain age from watching “objectionable” content. “I think what you’re saying about children creating content is an interesting point because it flips the axis of regulation in some ways, which is something that doesn’t exist in our imagination right now,” Manish, a research associate at the Centre for Policy Research, told Poonam. If liability for content uploaded by children rested with the platforms, no platform could survive it, Pahwa noted, referring to the Supreme Court’s judgment.

Snigdha Poonam discussing children as content creators at the round table discussion

Another point to consider is the consent of the child as the creator and subject of the video. “Is the child even aware or capable of processing the implications of a video of her dancing?” asked Sohini Chatterjee, a Research Fellow at Shardul Amarchand Mangaldas. And who is the administrator of an account that primarily features a child? Although a minor’s consent is not legally valid, especially when it comes to data protection and privacy, we cannot ignore the changing content-creation landscape. Chatterjee also pointed out that no one had figured out how to build effective age verification controls on the internet.

For regulatory purposes, approaching content created by children through the lens of child pornography law would be more useful than trying to regulate the platform itself, Vakasha Sachdev, legal editor at The Quint, suggested. The amended definition of child pornography now includes any visual depiction of sexually explicit conduct involving a child.

“Whoever is responsible for creation of that content, including the parent, would be responsible under the new POCSO rules on child pornography anyway.” — Vakasha Sachdev

Sachdev cited an American case in which two minors were prosecuted for consensually creating a sex tape; because they were minors, they did not receive a severe punishment. (Read about a similar case here.) Still, just like streaming platforms, UGC platforms have to implement tools so that pornographic content is not uploaded, and so that, if it is, it is quickly flagged and removed.

But platforms can’t do the parents’ job

A number of people disagreed with Rohra and Abraham about the role of platforms. The Supreme Court has said in a judgment that parents had to exercise some amount of control over the kind of content their children had access to, said Abhishek Malhotra, Managing Partner at TMT Law Practice. “Let’s say that if a parent wants to buy a magazine that has content that is only meant for adults, that magazine could be in circulation in India, it could be a foreign magazine which has circulation in India also. … The [Supreme] Court said that you can’t stop the adults from viewing what they are legally entitled to view. The argument from the other side was [that] the magazine or the content could be lying on the dining table, or the drawing room table, and the children could be looking at it. The Court said we can’t regulate that kind of activity at such a minuscule level. … You can’t use the state mechanism to come into literally people’s drawing rooms,” he said.

Giving intermediaries such a responsibility would be “a parenting disaster”, said Avinash Ramachandra, a policy professional. “You can’t let people other than the parents take a call on what the kids should do,” he said. Just as with talent shows, perhaps the parents want their kid on TikTok and have given consent on the minor’s behalf, he added. Netflix’s Ambika Khurana also agreed that it was the parents’ responsibility to see what their kids did.

“The keys to controlling children need to lie with the parent and nobody else. … If the parent cannot control the kid going out on TikTok or watching on YouTube, it’s a parenting disaster. It’s as simple as that.” — Avinash Ramachandra

Ramachandra said that companies can only provide tools; it is the parents who have to use them. “It’s very easy to blame or shift the responsibility to somebody else. … Letting the AI engine figure out whether a child is watching it and block it is not the answer,” he said. Through parental controls on platforms, Malhotra noted, certain kinds of content that are fit only for adults are already excluded for children by design. And excessive regulation of content often leads to higher rates of piracy as “content is available in multiple places, [in] multiple modes”, Khurana pointed out. In such a scenario, “it’s the responsibility of the platform to provide tools to the parent”, but the tools ultimately have to be deployed by the parents themselves.

Netflix’s Ambika Khurana

Letting algorithms do the filtering is not the answer either: they are not yet good enough to filter content without also censoring useful information, Sumant Srivathsan said. In the absence of intuitive technology, a more granular definition and evaluation of content is the user’s responsibility, he said. Most parental advisories fail when parents don’t enforce them; at the end of the day, it is the parents who have to make sure that children leave the room when adult content is playing, he argued. “I would rather be the judge of what I want to see and what I would allow minors in my care to see than an algorithm,” he said.

Streaming platforms recognised these limitations of technology in classifying content as well. “Age classification needs to be done with a lot of sensitivity and maturity. It should not just be left to tech tools but be aided by tech tools and driven by a lot of human intelligence. It’s an art, it’s a science and it should be upheld,” Khurana said.

“A successful online content regulation would let creators continue creating what they want to, and would protect the most vulnerable sections (children) who spend a lot of time online,” Khurana said. When it comes to regulating online content for children, standardising age ratings and strengthening and using parental controls are the way ahead, she continued.

How to control access to content

If we’re going to have regulation in the content space, we could give users more means to control content, said Nikhil Narendran from Trilegal. “The reason why regulations around cable TV and broadcast existed was because the nature of delivery of content itself was very different. It was analog, which gave the user or family or someone at home very limited choices to control the content they watch. But that’s not the case with OTT. Let’s say a regulation needs to come: is there a possibility to work out a consent artefact, where more choice is given to the users through sub-groups, parallel permissions, age verification techniques and so on, to ensure that no one below a certain age watches certain content?”

Content can be controlled at three different levels, said Deepak Maheshwari from Symantec: at the source level, at the recipient (user) level, and at the conduit level. “In July 2006, the government had clarified that every telecom service provider who has a licence in India has to provide unfettered access to the internet, except for specific web pages or URLs blocked by the government. So, this is something which can be looked at not only at the level of the source or the recipient, but also at the conduit level.”

Note: Quotes have been edited for length and clarity.

*

Read our other reports from the discussion here.

Correction: (October 3, 2019 11:21 am): A quotation was incorrectly attributed to Shohini Sengupta (Esya Centre). It has been corrected to identify Sohini Chatterjee (Shardul Amarchand Mangaldas) as the speaker. We regret the error.