On October 26th and 27th, MediaNama held the 2023 edition of its annual privacy conference, PrivacyNama. This year’s sessions focused on various aspects of rulemaking under India’s recently passed data protection law, and the implications that different approaches might have for businesses and citizens’ rights.
Our objective for this discussion was to unpack the execution of various provisions of the Digital Personal Data Protection Act, 2023, like:
- How can companies obtain ‘meaningful’ consent for data processing from users based on digital ‘notices’?
- How should the government account for children’s agency when devising procedures for platforms to obtain ‘verifiable consent’ from their parents?
- What functions and skills do data protection officers need to successfully balance company interests against compliance with privacy laws?
- How can rules under the law be drafted to help users better access their data protection rights, including their rights to erasure, correction, and more?
- How do data protection authorities from around the world identify areas for regulatory attention, and how does their policymaking process work?
- How will global companies, and India’s digital economy, be impacted by a potentially restrictive cross border data flows regime?
- How should rules under the law distinguish between data processors and data fiduciaries?
- How can India Inc lead the way in helping the government identify priority areas under the law that require clarity in the rules?
- How should the Data Protection Board of India be ideally staffed and structured so as to fairly evaluate privacy complaints under the law?
Download a copy of the event report
Executive Summary
Companies need to make India’s data protection law intrinsic to their DNA by developing privacy programs that holistically balance the interests of business functions and consumers, privacy heads from Unlimit, LTIMindtree, and PhonePe concurred at PrivacyNama. This requires a deep understanding of technology and of where data is collected and processed within a company, sensitising internal teams on data protection compliance, taking granular consent from users, and implementing opt-out mechanisms for processing. Company boards should also provide privacy officers with independence, budgets, and investments in technology. Jurisdiction-specific compliance requirements will also shape policies: for example, while India’s data protection law allows companies to scrape publicly available data, the officers advised companies against doing so, as they become joint controllers over that data (in European legal terms), which would require taking consent from the user for that processing again.
They were less enthusiastic about rules mandating integrations with consent managers, given the level of agency companies have over privacy protection; larger companies already have internal consent management systems anyway. Mandatory integrations with consent managers may also complicate fiduciary interactions with regulators and auditors, who often demand consent-related artefacts and evidence—it is unclear how consent managers would share this information with the fiduciaries themselves in such cases. To aid compliance with the data protection law, the officers added that the government should bring out detailed guidelines, obligations for data processors, and civic awareness campaigns.
During the Data Protection Commissioners Roundtable, privacy regulators from Japan and Iceland reinforced the need for regulators to be independent and to have rulemaking powers in order to fairly deal with privacy conflicts. Unlike the proposed Data Protection Board of India, both can issue rules on specific data protection matters, like children’s privacy or cross border flows. In Japan, rules and guidance are often developed based on consumer or business inquiries. Stakeholder consultations with business, academia, and others are held to understand future areas of focus, or issues that require amendments or clarifications. Public consultations must also be held before drafting and issuing rules that impact people’s rights and interests. Offering advice for new data protection authorities, the Japanese regulator added that the best option is to develop rules that are as simple as possible, to enable a culture of compliance among the entities bound by them. Additionally, the Icelandic authority conducts detailed surveillance before identifying its priority areas of regulation each year, a process that involves auditing and investigating user complaints and data breach notifications.
In this light, the scope for clarificatory rulemaking under India’s data protection law is vast. For example, India’s law is unclear on whether notice has to be provided in the case of “legitimate uses”, that is, when data is processed without the user’s explicit consent. Failing to provide notice here could vitiate a user’s privacy rights, defeating the purpose of the law, so the rules may need to clarify this concern. Similarly, the rules may also need to clarify whether notices need to be issued in the case of vicarious consent, or when one person consents to data processing on behalf of another. The rules may also need to distinguish between core data processing and secondary or incidental data processing, to avoid misuse of the provision. For example, if notices are required in the case of legitimate uses, a car company may have to distinguish between marketing related to the consumer’s vehicle and marketing related to other current and future company products.
For notice in the case of explicit consent, speakers observed that the rules could prescribe formats for more “meaningful” and transparent data processing notices to help data principals provide informed consent—like providing information on what data is being collected, how it will be used, how long it will be stored, and users’ rights over the processing, including withdrawing consent. The rules could provide sector-specific notices for industries like healthcare, which deal with sensitive health data. They could include guidelines for categorising data (into sensitive buckets), how it can be processed, and the duration of these activities, to help patients make informed decisions. The rules may also need to provide relatively specific notice formats for high-frequency platforms (where users interact a lot with the platform, like social media), as opposed to lower-frequency platforms (like a health check-up platform). Notably, companies may no longer be able to send surprise deliveries, given that processing personal data like an address requires explicit consent from the recipient, an area that the rules may need to clear up as well.
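To make the suggested notice elements concrete, one could imagine a prescribed format represented as a structured record. The sketch below is purely illustrative: the field names and the NoticeRecord type are assumptions for this example, not formats drawn from the Act or any draft rules.

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative structure for a "meaningful" processing notice. The field names
# and NoticeRecord type are assumptions for this sketch, not prescribed formats.
@dataclass
class NoticeRecord:
    data_categories: List[str]      # what personal data is being collected
    purpose: str                    # how the data will be used
    retention_period_days: int      # how long the data will be stored
    principal_rights: List[str]     # rights over the processing, incl. withdrawal
    sector: Optional[str] = None    # room for sector-specific additions, e.g. "healthcare"

notice = NoticeRecord(
    data_categories=["name", "address", "purchase history"],
    purpose="order fulfilment and related communication",
    retention_period_days=365,
    principal_rights=["access", "correction", "erasure", "withdraw consent"],
)
print(notice)
```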
On deprecating consent—or automatically withdrawing consent for users who haven’t used a service in a while—some noted that processing has to be purpose-specific, and that data cannot be stored forever. Solutions for companies include setting up consent management centres that allow users to opt out of services they’re no longer interested in, or simply deleting data that no longer serves the company’s purposes. Companies may also have obligations to delete data should a user withdraw consent—if so, the rules could introduce timelines for this, so that companies regularly delete such data.
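As a rough sketch of how such deprecation might work in practice, the snippet below marks consent as lapsed after a period of inactivity and queues the associated data for deletion within a fixed window. The 18-month inactivity threshold and 30-day deletion window are invented for illustration; neither the law nor any draft rules specify these figures.

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative consent-deprecation check: consent lapses after prolonged inactivity
# and the associated data is queued for deletion within a fixed window. Both
# durations below are assumed values, not timelines from the law or rules.
INACTIVITY_THRESHOLD = timedelta(days=548)   # roughly 18 months, assumed
DELETION_WINDOW = timedelta(days=30)         # assumed deletion timeline

def review_consent(last_active: datetime, now: Optional[datetime] = None) -> dict:
    now = now or datetime.utcnow()
    if now - last_active > INACTIVITY_THRESHOLD:
        return {
            "consent_status": "lapsed",
            "delete_data_by": (now + DELETION_WINDOW).date().isoformat(),
        }
    return {"consent_status": "active", "delete_data_by": None}

print(review_consent(datetime(2022, 1, 10)))
```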
Consent is more complicated for children—parents or guardians must provide “verifiable consent” before the data of under-18s can be processed. However, this also requires verifying the age of parents. Companies can consider different mechanisms for this: ID card verification, cryptographic question-and-answer tests to safely determine age, self-disclosure, or biometric verification. But these methods can all be circumvented by children, raising the point that they may not want their digital access mediated by their parents in the first place. Research into the aspirations of Indian youth could help inform government rules on verifiable consent, and platform design. The rules may also have the space to consider a risk-based approach—based on the platform’s potential harm to the child—and mandate age verification accordingly. The rules could also consider a sliding scale for the age of consent, to help older children give consent without parental interference.
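The risk-based and sliding-scale ideas could translate into something like the sketch below, which picks a verification method based on a platform’s risk tier and the child’s declared age. The tiers, age cut-offs, and method names are hypothetical, meant only to illustrate how such rules might be operationalised, and are not drawn from the Act or any draft rules.

```python
# Illustrative mapping from a platform's risk tier to the strength of age or
# parental verification, with a hypothetical sliding scale for older teenagers.
# None of the tiers, ages, or methods below come from the Act or draft rules.
VERIFICATION_BY_RISK = {
    "low": "self-disclosure",
    "medium": "ID card or cryptographic question-and-answer age check",
    "high": "verifiable parental consent backed by ID or biometric verification",
}

def required_check(platform_risk: str, declared_age: int) -> str:
    if declared_age >= 18:
        return "standard consent"
    # Hypothetical sliding scale: 16-17 year olds on low-risk platforms consent themselves.
    if declared_age >= 16 and platform_risk == "low":
        return "teen self-consent with notice to parent"
    return VERIFICATION_BY_RISK[platform_risk]

print(required_check("high", 13))  # strongest parental verification
print(required_check("low", 16))   # lighter-touch consent
```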
Additionally, India suffers from a digital literacy problem among parents, while governments themselves lack awareness of how to protect the massive amounts of children’s data they collect. Teaching children and parents about digital consent early can help bridge this gap, alongside improving government awareness of best practices for sharing and protecting children’s data.
However, for data processing companies with no direct exposure to users, like B2B companies, issuing notice and obtaining consent may complicate their operations, an issue that the rules could carve out exceptions for. The rules could also allow users to give consent post-processing, with a clear facility to opt out of the processing itself. At a larger level, this highlights the need to distinguish between data fiduciaries and processors in the rules, while accounting for the possibility that the same company could be considered both, depending on the processing it undertakes (for example, B2B payments companies). The data protection law currently imposes the same kinds of obligations on both entities—such as undertaking processing for lawful purposes, and more. Various specific obligations may also be baked into contracts between data fiduciaries and processors—for example, ensuring the data has been collected lawfully, that consent was taken, and that an audit trail exists.
On the question of user rights, the law could do more to specify how companies are supposed to uphold them. For example, user rights may also be impacted by “state instrumentalities”—broadly defined entities that the government can exempt from following parts of the data protection law. The extent of this impact will depend on how the rules notify the scope of data fiduciaries that can be exempted, and on the volume of exemptions themselves.
The law also states that users have the right to access “readily available” grievance redressal mechanisms, without clearly defining what this term means. Grievance redressal rights and systems may also be complicated in settings where data is collected offline and digitised later (like hospitals)—it remains unclear how consumers would be made aware of the digitisation (and their new rights to redressal) after they’ve handed over their data. Citizens also have the right to request information on data processing—given that the formats of such disclosures are unspecified, the rules could prescribe principles, such as including information on third-party sharing too. The law also provides the right to nominate an individual to manage a person’s privacy in the event of death or incapacity of mind or body, but the provision doesn’t account for the large number of people in India who lack digital literacy skills.
India Inc also seeks predictability in India’s cross border data transfer regime, which currently leaves the task of designating countries to which data cannot be transferred entirely to delegated rules, on unspecified grounds. This impacts business and innovation: for example, cross border restrictions may limit the amount of data that artificial intelligence systems in India can take in, affecting the subsequent accuracy of their results. To avoid arbitrariness in government decision-making, baseline principles could be included in forthcoming rules: avoiding discrimination by deciding who to blacklist based on well-defined data protection criteria; imposing data responsibility or due diligence obligations on companies; and ensuring data resilience in the case of security breaches.
Top European Commission representatives present at PrivacyNama 2023 added that modern data protection regimes, including India’s, should outline broad methods to transfer data, such as the European Union’s (EU) popular model contract clauses used to govern data transfers from the EU to elsewhere. They added that they were closely following the evolution of India’s privacy law and were “looking forward to further developments”.
On data localisation, the enacted version of the law is silent. But the law allows other sectoral regulators (like the Reserve Bank of India) to prescribe regulations on cross border data flows, which may involve data localisation too. This raises concerns of overlapping rules for companies to comply with, based on different sector-specific understandings of data protection. However, another section of the law says that in cases of sectoral overlap, the data protection act will prevail—so whether companies will actually have to comply with multiple sectoral rules on cross border flows remains unclear.
The European Commission representatives also underlined the need for a strong regulator for cross border flows that can equip businesses with compliance tools. Similarly, industry representatives highlighted the need for a ‘privacy advocate’ within the law to guide companies on how best to proactively comply with it; without one, the law may end up being interpreted retrospectively. Alongside looking abroad for inspiration on data protection rulemaking, India Inc could help fast-track the fleshing out of rules and guidance by specifying the issues it would like clarity on. For example, the rules could consider a risk-based and staggered approach to data breach reporting by data fiduciaries, potentially based on the level of harm the breach causes a user. Additionally, the fines imposed by the Data Protection Board on companies for breaches should be ‘proportionate’.
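One way to picture a risk-based, staggered reporting regime is a simple tiering of deadlines by assessed harm, as in the sketch below. The harm levels, hour counts, and user-notification flags are assumptions made for illustration, not obligations found in the Act or any draft rules.

```python
# Illustrative tiering of breach-reporting obligations by assessed harm to users.
# The tiers, deadlines, and notification flags are assumed values for this sketch.
REPORTING_TIERS = {
    "severe":   {"report_to_board_within_hours": 24,  "notify_users": True},
    "moderate": {"report_to_board_within_hours": 72,  "notify_users": True},
    "low":      {"report_to_board_within_hours": 168, "notify_users": False},
}

def reporting_obligation(assessed_harm: str) -> dict:
    # Default to the strictest tier if the harm level is unrecognised.
    return REPORTING_TIERS.get(assessed_harm, REPORTING_TIERS["severe"])

print(reporting_obligation("moderate"))
```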
Speakers largely concurred that the government-appointed Data Protection Board of India is more of an administrator of government rules than a regulator. Nevertheless, some speakers argued that this weaker design might prevent the Board from becoming an ‘overactive’ institution, akin to the Telecom Regulatory Authority of India. Conversely, the Board can recommend the blocking of non-compliant companies’ content or services in India, a provision that may lack constitutional backing and appears misplaced in a data protection law.
The Board’s funding, procedures, and its members’ qualifications are also delegated to the rules. Clarifications are required on the Board members’ qualifications, as well as on the Board’s strength, appointment processes, enforcement powers, and whether privacy matters will be delegated to individual members. The rules could also consider extending the Board chairperson’s tenure beyond two years to ensure experienced leadership, as well as establishing additional state-level Boards to handle the expected large volume of privacy complaints. These concerns over capacity and expertise assume significance given that the Board will ultimately be the main actor interpreting the law—and thus impacting users’ privacy rights and company operations.
Nevertheless, the Board’s power to refer privacy complaints to alternative dispute resolution mechanisms may be a positive step to avoid overburdening the institution—it may even result in data principals securing monetary compensation during mediation (a right that is otherwise missing in the data protection law itself). However, future rules may need to prescribe the qualifications of mediators, as well as the procedures guiding mediation.
Videos and coverage
Day 1:
Day 2:
MediaNama’s coverage of the conference can be found here.
About the discussion
Speakers
Notice, Consent, and Grounds for Processing
- Sreenidhi Srinivasan (Ikigai Law)
- Rajeev Sharma (Tata 1mg)
- Abha Tiwari (Renault)
- Richa Mukherjee (PayU)
- B.G. Mahesh (Sahamati)
Children and Privacy
- Aparajita Bharti (The Quantum Hub)
- Manasa Venkataraman (Public policy professional)
- Uthara Ganesh (Snap)
- Nidhi Sudhan (Citizen Digital Foundation)
- Sonali Patankar (Responsible Netism)
Chief Privacy Officers Roundtable
- Rahul Narayan (Chandhiok & Mahajan)
- Vasudha Gupta (Unlimit)
- Jagannath P.V. (LTIMindtree)
- Bharat Saraf (PhonePe)
User Rights and Principles For the Rules
- Gangesh Varma (Saraf and Partners)
- Vrinda Bhandari (Lawyer)
- Swati Punia (Centre for Communication Governance)
- Amol Kulkarni (CUTS)
Data Protection Commissioner Roundtable
- Malavika Raghavan (Future of Privacy Forum)
- Junichi Ishii (Personal Information Protection Commission, Japan)
- Valborg Steingrímsdóttir (Data Protection Authority, Iceland)
Cross Border Data Flows: Adequacy, Rulemaking, and Sectoral Regulations
- Arindrajit Basu (Centre for Internet and Society)
- Bruno Gencarelli (European Commission)
- Venkatesh Krishnamoorthy (BSA | The Software Alliance)
- Bhawna Sharma (PricewaterhouseCoopers)
- Vivek Abraham (Salesforce)
Obligations for Data Fiduciaries: What Next?
- Prasanto Roy (FTI Consulting)
- Varun Sen Bahl (NASSCOM)
- Nehaa Chaudhari (Ikigai Law)
- Tamoghna Goswami (ShareChat)
- Pragya Mehrishi (Truecaller)
Data Protection Board: Adjudication and Next Steps
- Meghna Bal (Esya Centre)
- Anirudh Burman (Carnegie India)
- Alok Prasanna Kumar (Vidhi Centre for Legal Policy)
- S. Chandrasekhar (K&S Digiprotect)
- Arya Tripathy (PSA)
Participation
We saw participation from organisations including PayU Payments, People Interactive, Pierstone, PLR Chambers, Polygon Technology, Primus Partners, Disney Star India, DSK Legal, A&M India Pvt Ltd, ShareChat, ShreeJee Academy, Singh & Singh Law Firm LLP (The IP Law Firm), Supreme Court of India, Tattle Civic Technologies, Access Now, Accessibility Lab, ADVAG, Ajatshatru Chambers, Alleviate IT Consultancy Pvt Ltd, Amity Law School, APCO Worldwide, Arakoo AI, Ashwathh Legal, Australian High Commission, CCG, Zebra Technologies India Pvt Ltd, Zedsoftpoint, NLU Delhi, TechBridge, TechCrunch, Accenture, CCMG, Jamia Millia Islamia, CDAC, Centre for Civil Society, Centre for Policy Research, Centre for the Study of Developing Societies, Sarvada Legal, SeedAI, SFLC.in, Shardul Amarchand Mangaldas, Jagran New Media, Jawaharlal Nehru University, K&S Digiprotect, KPMG, Lakshmikumaran & Sridharan Attorneys, Chanakya National Law University, The Asia Group, IT for Change, TechLegis, Times Management, Truecaller, TrustLab, Tsaaro, UKIBC, The Data Firm, The Quantum Hub, Times Internet Limited, Chandhiok & Mahajan, Chargebee, Chemistry World, Chitale Verma & Associates, COAI, Competition Advisory Services (India) LLP, CQ, CRED, CUTS Institute for Regulation & Competition, Cybersecurity-Nxxt, Cyril Amarchand Mangaldas, Data Security Council of India, DataLEADS, DeepStrat, Deloitte, Newslaundry, Office of Dr Amar Patnaik, Razorpay, Saikrishna & Associates, Member of Parliament, Oncquest Laboratories Ltd, DigitalTrends, AWS, AZB & Partners, Azure Data Protection Consultants LLP, Banerjee & Grewal Advocates, Bar Council of Delhi, BLS International, BTG Advaya, BYJU’S, Cambridge University, FTI Consulting, FUN2DO LABS, G&W Legal, Games24x7, GGSIPU, Google, Hero Fincorp, Hindustan Times, IAMAI, ICRIER, IDfy (Baldor Technologies), Ikigai Law, India Press Agency, Indiamart Intermesh Limited, Indian Institute of Mass Communication, Indicc Associates, Inshorts, Dvara Research, EY, Foundation for Advancing Science and Technology (FAST India), J Sagar Associates, Saraf and Partners, Law Chamber of Anirban Sen, M&G Global Services, Malayala Manorama, Maruti, Meta, Ministry of Health & Family Welfare, MSS Law Chambers, Nagarro, NALSAR University of Law, Outlook Business, PRS Legislative Research, PSA, PW&CO LLP, PwC, Rapid IPR, Institute for Governance, Policies and Politics, Internet Society, Vidhi Centre for Legal Policy, VTG, and VVNT Foundation & VVNT SEQUOR.
Support and Partners
MediaNama hosted this discussion with support from Google, Meta, PhonePe, and Salesforce, and our community partners were CUTS and the Centre for Communication Governance at the National Law University, Delhi.
