In our other reports on the Consent panel at the #NAMAprivacy conference in Delhi, we look at whether we should do away with consent altogether, and whether we need a data protection and privacy regulator in India.

*

There are two key parts to consent: the first is providing information in a manner that is understandable to the user; the second is the process of obtaining specific sanction. Consent isn’t broken as a principle, but it is broken in implementation.

No unfair terms

The EU’s General Data Protection Regulation (GDPR) lays it out pretty straight, Jochai Ben-Avie, Senior Global Policy Manager at Mozilla, pointed out: “You can’t have a little box that’s already checked. You can’t do ‘by scrolling down this page, you have consented’, or ‘just by hitting this page you have consented to our terms’.”

“What this legalese focuses on is: what are the ways the user is making an affirmative action, and have you provided the transparency or the information to do that. In some cases, especially in low-literacy environments, we’re going to need new ways of communicating these things to the end user,” he added. The GDPR also requires companies to have easy-to-understand terms of service. “That’s progress in the right direction.”
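To make the “affirmative action” requirement concrete, here is a minimal sketch of what a compliant consent control might look like on a hypothetical Android signup screen; the view and function names are illustrative assumptions, not drawn from any panelist’s product:

```kotlin
import android.widget.Button
import android.widget.CheckBox

// Minimal sketch: consent requires an affirmative user action.
// The views and the function name here are hypothetical.
fun wireConsentControls(optIn: CheckBox, continueButton: Button) {
    optIn.isChecked = false            // never pre-ticked, per the GDPR
    continueButton.isEnabled = false   // no consent by scrolling or by merely visiting
    optIn.setOnCheckedChangeListener { _, isChecked ->
        // The user must actively tick the box before proceeding
        continueButton.isEnabled = isChecked
    }
}
```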

Renuka Sané, Associate Professor at NIPFP, pointed out: “The draft Indian Financial Code lays out a principle that says that financial contracts cannot have unfair contract terms, or you have to have fair disclosure, or that data controllers cannot engage in unfair conduct. These are timeless principles, and we can build on them in the data protection space. What does it mean to have an unfair contract term? That’s when consent becomes meaningful, because when the contract term is fair, I can meaningfully begin to process what I’m signing up to or not.” Prasanto Roy of NASSCOM pointed out that there have been examples of buyer-seller agreements being overturned.

Malavika Raghavan, Project Head – Future of Finance Initiative at IFMR Finance Foundation, pointed towards the legitimate interest test: “Firstly, the person or company processing the data needs to determine the lawfulness of the reason for processing data. Next, the firm needs to assess the necessity of the specific kind of data processing, i.e. whether they have a legitimate interest in accessing this data. For instance, if I’m a healthcare provider, I need health data. But if I’m a random mobile app which does payments, do I need your health data? In that situation, you as a firm need to check whether it’s legitimate to collect or use particular types of data. Finally, the firm has to conduct a preliminary balancing of their and data subjects’ interests: whether it’s in line with the customer’s reasonable expectations and will it cause them harm? You can get deeper and deeper into this. I’m not necessarily prescribing this for India, but it captures all of the stuff we’ve talked about.”

Raghavan continued: “Jochai (Ben-Avie) talked about no surprises, and that’s about reasonable expectations: is it a reasonable expectation for a consumer? If the legitimate interest is higher than the individual’s harm [it may be acceptable], but if the individual harm is really high, then maybe you shouldn’t go there, even when the company and the individual have a legitimate interest in this transaction.”

Color coding contract terms

One interesting solution came from Anugrah Abraham of Change Alliance, who said that one approach to accountability is color coding: if you have principles, Learned Intermediaries (a data protection regulator) could add color codes to a, say, 20-page contract, saying that “on this principle, it scores a green, on this it scores a yellow, and on this it scores a red.” He added: “If I see not just an app, but also a code that gives me a red flag, saying that there seems to be a problem in this area, color coding might be a solution.”

Ben-Avie also pointed out that literacy is a big challenge, and that experiments should be done with spoken prompts. The EU’s GDPR also includes a series of icons which may be adapted to convey, in different ways and places, how data processing is happening.

Adding limitations to consent

Prasanto Roy pointed towards the legal approach to statutory rape, where, as far as the law is concerned, a minor cannot consent to sex, and asked whether there are any limitations that can apply to consent from a data protection perspective.

Aditya Berlia, of the Svrán Group, pointed towards other industries for guidance: “There is a program called the international qualified investor program, under which clients have to go through a process before they are identified by banks as clients who can take additional risks, or give their consent beyond a point.”

Adding liability for people

Another idea that Berlia said could be learnt from was giving individuals liability for failing to ensure compliance: “a QP [Qualified Person] program which the EU has come out with for pharmaceutical companies, as third-party intermediaries. Post the Moody’s blowout, they said that third parties can only be single individuals with personal criminal liability, and cannot be organisations. QP people in the pharma companies actually go into factories, look at IP, at their secret formulas inside their drums, and then sign off on their safety. There are thousands of these people who have been trained.”

Time-limited and active consent

Chinmayi Arun, Executive Director of the Centre for Communication Governance at the National Law University, Delhi, said that the marital rape case currently going on is “a classic example of how consent should not work”: “You consent once and you’re not allowed to withdraw without divorce. I feel that a risk-based approach, with active and ongoing consent, where you’re not assuming that one round of consent is consent the next time, can be important, depending on the degree of invasiveness that we’re talking about.”

Ben-Avie said that policies and notices cannot be a substitute for in-the-moment permission requests. “You can’t do a privacy policy, even if it is easy to understand and in plain language, and wash your hands saying that my work is done here. Particularly when you’re dealing with sensitive information, you need to do an in-moment request for information. [For example] Do you really want to turn on your microphone? Do you really want to send your location data? We need to think about more sophisticated and continuous ways to get active consent.”
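As a rough illustration of what an in-moment request looks like in practice, here is a sketch using Android’s runtime-permission API; the function name and flow are our own assumptions, not something Ben-Avie described:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

const val REQUEST_RECORD_AUDIO = 1  // arbitrary request code for the callback

// Sketch: ask for the microphone at the moment the feature is used,
// instead of relying on a policy accepted at install time.
fun startRecordingWithConsent(activity: AppCompatActivity) {
    val alreadyGranted = ContextCompat.checkSelfPermission(
        activity, Manifest.permission.RECORD_AUDIO
    ) == PackageManager.PERMISSION_GRANTED

    if (alreadyGranted) {
        // ... proceed with recording ...
    } else {
        // Shows the system's in-moment prompt ("Allow <app> to record audio?");
        // the result arrives in the activity's onRequestPermissionsResult callback.
        ActivityCompat.requestPermissions(
            activity,
            arrayOf(Manifest.permission.RECORD_AUDIO),
            REQUEST_RECORD_AUDIO
        )
    }
}
```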

Gradations of privacy? Different rules for different sets of customers?

“Are some violations of privacy more serious than others?” Arun asked. “I think creating a standard would be very hard to do right now, but there is a difference between different kinds of violations of privacy.”

Renuka Sané of NIPFP said that in the financial sector context, “one way that we did think of it was that, for the lack of a better word, an ‘unsophisticated’ customer would have higher rights. There’s something called suitability advice: I have the right to a suitable product. It’s one way to think about consent and all of these other limitations. Greater restrictions apply when you’re dealing with certain customers, when you think that they can’t make up their minds, or can’t do so very quickly.”

Following up on Chinmayi Arun’s comment that even as we regulate for privacy, we also want to incentivise companies to come up with better ways of seeking consent and protecting privacy, Ben-Avie pointed towards how product development can be done by collecting more data from users who explicitly volunteer. “There’s a segment of people who have opted in, and opted into each individual experiment, on Test Pilot, which is gathering more data than your general release feature will. Then we can learn from that,” he said.

Registration for certain types of data

Malavika Raghavan of IFMR Finance Foundation pointed towards how African countries are handling this. “Eighteen different African countries have a registration requirement for people handling certain kinds of data. Half the problem here is that everybody collects data, so it seems ridiculous to try and regulate anyone. I’m not prescribing this, but they seem to have a regime that they’re moving towards, with registration requirements in relation to certain types of data.”

Putting users in control

Rahul Jain of Google, when asked by MediaNama about Google’s take on consent, especially in the context of the criticism of app store permissions, said that the company believes in putting users in control by giving them meaningful choices. Consent, he said, “has to be contextual. If companies have legitimate interests to protect the user, that is probably where you don’t need explicit consent. For example, securing a particular device, or checking spam. Then there could be situations where it’s sensitive information, where explicit consent is relevant. In other scenarios implied consent could be relevant. It’s dependent on the context.”

He said that Google has “tried to give more controls to users in terms of the apps they use. In the Android system, for each app, you can simply switch off the kind of permissions the app is asking for. Those are the kind of user controls we’re trying to build, to gain trust.” Amazon’s Uthara Ganesh concurred, saying that the company is about giving users control over the data they’re sharing, though she added that this becomes complicated in the case of IoT devices, especially when they are interoperable across geographies. Twitter’s Mahima Kaul said that the company had introduced a ‘Do Not Track’ button, hoping that it would become an industry standard, but it found very little uptake.

Giving users alternatives

Ben-Avie also added: “The current debate, and this has not been settled, is: if they reject, should they still be allowed to use the service? Some services will not work as well without the personalisation of cookies. I’d be okay with that for some services. Can you offer an equivalent service that isn’t the totally exploitative one?” He said that users send businesses many signals, and it’s up to those who process the data to listen to them. “For example, when a user opens a private browser, they signal the need to be private. So we don’t allow third-party trackers and other tracking services to load, which is not what some other browsers do, where they allow the trackers to load and then dump the data at the end of the session, which still allows you to get fingerprinted in the process. We saw that signal from the user and chose to act upon it from a product perspective.”

Chaitanya Kalbag, former editor of Business Today, had a similar suggestion: “Why not think of something very simplistic where you have choices, where you can upgrade or not upgrade, and therefore sacrifice certain functionality? Let people choose only parts, so that they get the basic services and not the bells and whistles.”

*

Updates: Malavika Raghavan sent in slight modifications to what she said, to provide greater specificity, and this post has been updated to reflect that.

The #NAMAprivacy conference was supported by Google, Facebook and Microsoft. To support/sponsor #NAMAprivacy discussions, contact harneet@medianama.com