At NAMApolicy on the Security and Privacy of IoT in Delhi, multiple stakeholders discussed what IoT devices’ policies revolve around, why we need to define sensitive data, and how to keep it safe. On consent, Beni Chugh from the IFMR Trust said, “Even if I’m following all OECD processes or the seven pillars of data protection, there can be discrimination, bias, price discrimination, tracking, racial profiling… Who’s regulating this flow of data? a) Can we technologically do that? And b) where do we draw the policy lines?”

This is a report on the second session about the security of IoT. Read the first part about the privacy of IoT here.

What follows is a paraphrased, not verbatim, transcript of the discussion.

Security: Default login passwords and suggestions

  • Normally, end-user devices are supplied by the companies providing the service, not the hardware manufacturer. So individual passwords can be assigned when devices are supplied to the end user. (Kapil Chawla, Alpha Consultants)
  • Instead of giving every device the same default password, the default can be a unique random string; if the default password is a random string, people will try and change it. (Sangeeta Mittal, Jaypee Institute of Information Technology)
  • I don’t think there’s an alternative to user education. Devices have to come with some security features: one is a customisable password, and the other is firmware that gets updated. On using external devices to secure IoT devices, one option is a firewall. Eventually, anti-virus programs will also mature to scan the devices on your network and check for vulnerabilities like default passwords. In a lot of places, WiFi access points are also open… We’re now seeing a lot of new software and devices which haven’t matured enough to be secure themselves. (Adnan Hasnain Alam, Nutanix)
  • For a WiFi router, we can have a one-time authentication process: a standard user ID and password for the first login, after which users are compelled to change the credentials. (Guneet Singh Gudh, Panag & Babu Law Offices)
  • We should use multi-factor authentication, where we have the password as well as a device, or else time-based OTPs. (Sheikh Raashid Javid, Amity University)
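The time-based OTPs suggested above can be sketched with Python’s standard library alone; this is a minimal illustration of RFC 6238 (TOTP), with illustrative names rather than any vendor’s SDK, and a real deployment should use a vetted library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, period=30, digits=6, at=None):
    """Derive a time-based one-time password from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of time steps since the Unix epoch.
    counter = int((time.time() if at is None else at) // period)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both the device and the server derive the code from a shared secret and the current time, a stolen password alone is not enough to log in, and each code expires after one period.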

Biometric passwords: The troubles, vulnerabilities 

  • Passwords are getting redundant because your body is becoming the password. Where your user interface is your body, how are you going to play with that? (Manpreet Dhillon, Jawaharlal Nehru University)
  • Biometrics make you vulnerable for the rest of your life, like a permanent password, which is frankly quite stupid. The most intelligent usage of biometrics for access control is where the data is stored locally on the device and encrypted, as opposed to being stored on a server and validated by the server. The market is moving towards server-based authentication. That’s going to become more problematic with time. (Nikhil Pahwa, MediaNama)
  • Because biometrics are stored on the device, and it’s connected to the internet, they’re prone to getting hacked… Despite this, most other access control is now happening with biometrics. (Debashish Bhattacharya, Broadband India Forum)
  • Biometrics are a big problem because my presence at home is required for many things. When I’m away, things don’t work because I’m not there. (Brajesh Jain, ISPAI)
  • Why shouldn’t IoT devices also provide 2FA? Password plus OTP. (Kapil Chawla, Alpha Consultants)
  • There is a need for session controls, because once you’re logged in, you’re logged in forever. For biometrics, there’s also spoofing: polymer resins (a whole batch was found in Surat) were being used to perform Aadhaar authentication in government on behalf of people. (Nikhil Pahwa, MediaNama)
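The session-control point above, that a login should not last forever, can be sketched as an expiring server-side session token. This is an illustrative sketch (the names, the in-memory store, and the 15-minute lifetime are all assumptions, not anyone’s product):

```python
import secrets
import time

SESSION_TTL = 15 * 60    # absolute session lifetime in seconds (illustrative)
_sessions = {}           # token -> expiry timestamp; stand-in for a real store

def create_session():
    """Issue an unguessable token that expires after SESSION_TTL seconds."""
    token = secrets.token_urlsafe(32)
    _sessions[token] = time.time() + SESSION_TTL
    return token

def is_valid(token):
    """Reject unknown or expired tokens instead of honouring a login forever."""
    expiry = _sessions.get(token)
    if expiry is None or time.time() > expiry:
        _sessions.pop(token, None)   # drop expired sessions eagerly
        return False
    return True
```

A device or service that checks every request against such an expiry forces periodic re-authentication, which limits how long a stolen or spoofed credential remains useful.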

On consent as more and more devices get connected

  • It’s very easy to get confused between two kinds of consumer harms:
    a. one is harm to your personal data: how to keep your data protected at all points (in transit, at rest, etc.).
    b. And the second is harms from your data: what can be done with my data; how much of it should be interoperable; how can it be used; for how long can it be retained?

Even if I’m following all OECD processes or the seven pillars of data protection, certain anti-consumer outcomes might still occur: discrimination, bias, price discrimination, tracking, racial profiling… despite following the processes. Who’s regulating this flow of data? a) Can we technologically do that? And b) where do we draw the policy lines? (Beni Chugh, IFMR Trust)

User controls, communication and harms of data sharing

  • We need something over and above consent. Regulating for harmful outcomes is one approach; but we also need to design processes where, because of the profession or service you’re providing… Even if my privacy were intact, can everybody use my data? (Beni Chugh, IFMR Trust)

Nikhil Pahwa: A question we got: Given that India is making a big push for smart cities, including in urban lighting management, energy, etc., what do trust and consent mean in this context? How can individual privacy be protected, especially in the context of public spaces?

  • I read a story where, in Germany, a smart meter was hacked to watch what people were watching on TV. There’s a huge disconnect between what you think you can control via consent, and what hackers can do to take control. (Amitabh Singhal)
  • The EU, in 2006, discussed the right to be forgotten in a limited sense: the litigant did not want certain information about him available on Google. They’ve tried to accommodate this under Article 17 of the GDPR. If you were to incorporate that into the privacy framework… there is at least the possibility of being able to take back your consent on some level. The government can also be a service provider for IoT, but then the government is both your service provider and your governing authority. When you vote your government into power, do you implicitly give it the power to be a service provider and have access to your data? (Renjini Rajagopalan, The Quantum Hub)

Consent and its complexities 

  • For security cameras, the whole point is that the homeowner knows who’s about to step in; but that person needn’t be made aware of the fact that they’re being monitored. In public spaces, though, there’s a general catch-all notice saying you’re under CCTV surveillance. How would you ensure that in the privacy of your own home? In the public space it works, to a certain extent, as long as you’re given the notification that you’re under CCTV surveillance: the act of walking into a mall says ‘yes, I agree to your terms and conditions; I’ll comply with your security requirements.’ As per the definition of biometric information under the bill, a facial image does qualify as biometric information; it’s as simple as a facial image, which could just be a picture. (Tuhina Joshi, Ikigai Law)
  • Explicit consent is only for sensitive personal data [in the data protection law]. Implied consent doesn’t work, and private use isn’t even an exception as far as I know. For CCTVs, under the current bill, implied consent doesn’t work, because this situation requires explicit consent. (Nehaa Chaudhari, Ikigai Law)

  • As far as privacy is concerned, it’s immaterial if you’re going into a mall or to someone’s home. If you’re being recorded, you have the right to be informed and to have a choice. You need consent everywhere. In a public place, you’ll be recorded; but you can always turn around and go back. (Rahul Ajatshatru, Ajatshatru Chambers)

  • Let’s put in all our efforts as lawmakers or policymakers for the country to decide on the issue of processing rather than gathering. (Ananth Padmanabhan, Centre for Policy Research)
  • I agree that consent is a must for collection itself, and not just for processing… There are so many concerns about the Huawei problem in the US; Australia and South Korea have banned procurement from Chinese companies. There are no questions being asked in India. (Smitha Francis, Institute for Studies in Industrial Development)
  • We could give consent through a timestamp… for one month, two months, one year, two years. That would, to a large extent, solve my problem as a consumer. I’m gone for a few days, but once I’ve given consent and download again, many times it doesn’t ask me for consent; it takes the previous consent. (Brajesh Jain, ISPAI)
  • I think the line of human rights ends where the public interest begins. If your fundamental rights are compromised, then probably there is a consumer interest, and the consent architecture has to take that into account… The law has to standardise that: if you cross this line with respect to the fundamental rights of the consumer, you’re penalised. (Sudhir Singh, iSPIRT Foundation)
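The time-stamped consent suggested above can be sketched as a record with an expiry, so a service must ask again once the window lapses instead of reusing a previous consent indefinitely. This is a hypothetical illustration; the class and field names are assumptions, not part of any bill or product:

```python
import time
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    purpose: str          # what the consent covers, e.g. "location sharing"
    granted_at: float     # Unix timestamp when consent was given
    valid_for: float      # lifetime in seconds, e.g. 30 days

    def is_current(self, now=None):
        """True only while the consent window is still open."""
        now = time.time() if now is None else now
        return now < self.granted_at + self.valid_for

# A consent to share location data, valid for one month from when it was given:
rec = ConsentRecord("location sharing", granted_at=time.time(),
                    valid_for=30 * 24 * 3600)
```

A service checking `is_current()` before every use of the data would, by construction, have to re-prompt the consumer after the chosen period.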

On medical data

  • When medical information is collected, I think the consent needs to be clearly linked to a specific purpose…  Since I’m giving consent only to that one service provider, we have to make sure that it doesn’t get transferred somewhere else. (Neharika Srivastava, Aon)

  • There’s a huge vacuum when we talk about medical data, which is collected by different hospitals and clinics; every small clinic is using small equipment to measure heart rate etc. We generally don’t know what the data is used for by that clinic or brand. Even a small thermometer can now store your temperature from the last ten times. (Guneet Singh Gudh, Panag & Babu Law Offices)

  • There needs to be a clear understanding and definition of what constitutes medical data in terms of the bill… Having a graded approach based on sensitivity may not be wise, because in aggregate, that data will be sensitive. (Tuhina Joshi, Ikigai Law)
  • The DISHA Bill tried to indicate that certain sectoral data should be anonymised. For example, data collected by hospitals is connected to insurance companies; there is a definite vulnerability there. Vis-à-vis data that’s collected through a band that’s tracking your heartbeat, which may also go to an insurance company: a sector-wise way of deciding vulnerability could be a possible way of understanding this. (Renjini Rajagopalan, The Quantum Hub)

On connected devices and regulation

  • The government regulation for SIM-based IoT devices is going to be a KYC system, as far as I understand: you know where the device is, and somebody takes responsibility for it. But 90% of non-SIM-based devices don’t have a KYC. There is no process; you don’t know who’s responsible for a device, what happens if it gets stolen, or what happens if one of those devices gets hacked and creates rogue devices. It turns rogue, corrupts the entire network, and as a result, you can create catastrophes. You can pull down the power system of a particular city; you can hack into any major network doing this. What is the way out? (Debashish Bhattacharya, Broadband India Forum)
  • There are a lot of manufacturers building devices and not keeping a record of what they produce. It’s literally just a factory that churns out things that other companies use to make their devices. There’s an issue at the sub-hardware level. The higher-level stuff is great, but until we fix some of these foundational issues, I fear that we’ll never get to fixing these problems. Gone are the days when you just thought protection was inside a network; your device can also do harm outside your own network. (Rajnesh Singh, ISOC)
  • Whether regional, local or international… you’d probably have to have a global or national framework of regulation controlling the production of these devices, where they adhere to certain standards. Those standards still need to be spelled out, and they’ll keep evolving over time… I think it’s necessary because you can’t just have a free-for-all, where people can play with your security because they’re not regulated.

I’m not one for heavy-handed regulation; at the same time, we could use light-touch regulation, where standards are created by industry bodies with governments overseeing them, while ensuring that these compliances are adhered to by the industry bodies, the players, the manufacturers and the service providers. By the time you’re doing all this, the consumer is also becoming aware of what they can expect from the industry. (Amitabh Singhal)

  • Consent is agency. It is the right to say no. And you also need regulation for harm, so you need both. Yes, there will be consent fatigue, but at every level where you have to sign something, you have the choice of saying no. And when you say no to something, it can’t be done. (Nikhil Pahwa, MediaNama)
  • In India you have mobile handsets from various countries which have the same identifier, so that can be spoofed. A radio device like a mobile phone that comes into the country has to meet testing mandated by a local regulator. A lot of countries have started to think about this. One idea is that every device has to be embedded with a global identifier for tracking, tracing, and so on. The problem is that if it’s a legitimate import coming through the customs border, it’s easy to control; but if someone carries it in on a flight or some other way, you will still have rogue devices, and you’ll still have them on your network. This is not something we can solve individually; it really needs international cooperation. (Rajnesh Singh, ISOC)
  • I have similar discussions with Sri Lankan policymakers as well as Japanese and Korean policymakers, and generally the consensus seems to be that they’ll have to use telecom type-approval regulations if devices are imported into the country: devices have to meet certain technical requirements. The downside is that a device could be worth two dollars; how much more are you going to put into it in terms of cost overheads to make sure the monitoring works? (Rajnesh Singh, ISOC)