MediaNama is hosting a full day discussion on privacy, called The Future of User Data in India, on the 6th of September 2017, supported by Google, Microsoft and Facebook. Apply to attend at http://www.bit.ly/namaprivacyapply. One of the key elements of current privacy discussions, and especially of the TRAI consultation on data protection, is the Justice AP Shah report on Privacy, submitted in 2012. This opinion piece by Bhairav Acharya looks at whether the framework suggested by that report is in consonance with current practices and needs.
Privacy principles born in the 1970s that still inform the law globally are now on the verge of obsolescence.
The prospects for consumer privacy in India have rarely been bleaker. A series of high-profile breaches and hacks reveal an economy thoroughly unprepared for safeguarding data privacy. But privacy suffers from something worse than widespread breaches: the state of the national privacy debate. Indian industry and civil society want regulation on the basis of a globally widespread set of privacy principles. The principles were born in the early 1970s in Europe and the US and have informed law ever since.
In 2012, a group appointed by the Planning Commission and chaired by Justice Ajit Prakash Shah proposed an extended version of the principles for India. But while they worked well for the last 40 years, the principles are now on the verge of obsolescence.
Notice and choice
The first casualties are the twin principles of notice and choice: service providers notify consumers of their data practices, and consumers choose whether to accept them. In a free market, the belief went, service providers would compete to offer the best privacy terms for consumers to select. But it has not worked out that way. Privacy policies are too convoluted to understand. That may be a deliberate ploy by lawyers or an unavoidable reality given the complexity of commercial data practices.
Choice is meaningless because there are no popular alternatives to mainstream data practices. The model has failed to create an informed consumer base to demand better privacy from service providers. In surveys, consumers claim to value their privacy but in practice, they sacrifice it for incremental convenience. There is a yawning disconnect at the heart of the notice and choice model.
Big data endangers the principles. Broadly speaking, big data refers to the massive amount of data collection taking place on a daily basis. Inevitably, new uses are found for that data beyond the purposes for which it was collected. For instance, roving location data collected from phones could help cities plan efficient mass transportation.
More data means more uses, so more data is collected. But the principle of collection limitation, which restricts the amount of collectable data, directly conflicts with big data. The principle of purpose limitation, which restricts how data can be used, prevents that data from being repurposed for new uses.
Smart data, the operational element of big data, which is best manifested in the Internet of Things, threatens most of the remaining principles. So far, data collection sensors—cameras, radio frequency identification readers, and such—have been unconnected. In the Internet of Things (IoT), sensors will be ubiquitous, connected, and freed from human interaction. Your phone, fridge and television will talk to each other without your input.
Eventually, there will be connections between a body area network of wearable devices, a local area network of home devices, a remote network of cars, and a very wide area network of municipal infrastructure. In a system designed to simultaneously share data, the principle of disclosure, which restricts data sharing, will be bent into an unrecognizable shape.
Smart data is growing. It will be integrated into the scalable Smart Cities project as the IoT proliferates. The principle of access, which calls for people to be able to review their personal data, will become unworkable as data is continuously collected. Contrary to the principle of security, data insecurity intensifies as more devices join the system.
The future of privacy
The Shah principles were designed for a pre-big data economy. They will be eclipsed twice over by smart data. The solution is to modify the principles where possible and, where not, devise a new approach. The notice and choice model does not work. It focuses on data collection even though people blindly agree to collection and multiple uses. Only a model focused on data use will work.
A use-focused model will categorize data uses on the basis of harm. Data can be tagged at the moment of its creation with a list of permissible uses. Discriminatory uses are clearly harmful, urban planning less so. Data tagging will code contextual integrity into the system. For instance, your phone’s roving location can be shared in real time with other phones to plot travel times, but not with your employer.
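The tagging idea described above can be sketched in code. This is a minimal illustration only, with hypothetical use categories; it shows data being tagged at creation with its permissible uses, and every later use being checked against those tags rather than against how the data was collected:

```python
from dataclasses import dataclass, field

# Hypothetical use categories, invented for illustration.
TRANSIT_USES = {"transit_planning", "real_time_travel_times"}

@dataclass
class TaggedRecord:
    """A data record tagged at the moment of creation with permissible uses."""
    value: object
    permitted_uses: set = field(default_factory=set)

    def allow(self, use: str) -> bool:
        # Use-based check: a use is permitted only if it was tagged at creation.
        return use in self.permitted_uses

# A phone's roving location, tagged when collected: it may be shared to
# plot travel times, but nothing in its tags permits employer tracking.
location = TaggedRecord(value=(28.6139, 77.2090), permitted_uses=TRANSIT_USES)

print(location.allow("transit_planning"))   # True: a tagged, permissible use
print(location.allow("employer_tracking"))  # False: not tagged, so refused
```

The design choice here is that the gate sits at the point of use, not the point of collection, which is the shift the use-focused model argues for.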
This does not mean that data collection will be unconstrained. Designing devices that minimize data collection but are compatible with the IoT is another solution. “Privacy by design” offers a better chance of success given people uncomprehendingly sacrifice privacy for convenience.
What about people who simply do not want any part of smart data? The data industry is not eager to cater to them, so their data will be collected by default. That should be changed so that the default mode is non-collection unless consumers opt in. But to create the demand for that, India needs a more sophisticated privacy debate.
This post was first published in Mint on Dec 25 2016. Crossposted here with permission from Mint.
At the time that this was published, Bhairav Acharya was an Open Technology Institute Program Fellow at New America. He is currently Public Policy Manager, India and South Asia at Facebook.