
Summary: A data governance framework based on granular consent

A global financial body argues for data control to rest with those who create it.

An ideal data governance system must restore control of data to data subjects with regard to the collection, processing, and sharing of their data, according to a research paper released by the Bank for International Settlements (BIS). It further stated that factors like technological protocols for notice and consent, purpose limitation, data minimisation, retention restriction, and use limitation are crucial to operationalising the data governance framework.

The paper proposed using granular consent, obtained through a digital system that can be secured and operated at low transaction cost, instead of broad and sweeping consent. It also offered an oversight framework for digital public infrastructure (DPI) comprising a regulatory authority, a self-regulating organisation, and a technology standards organisation.

“Large public digital infrastructures for commercial use are successful when designed as part of a public-private partnership, where the public sector creates the foundations (legal and policy underpinnings) for the private sector to develop and maintain the (technological) underpinnings of the consumer interface,” read the paper viewed by MediaNama.

The paper also put forth a template to be used as a benchmark for data governance systems across the world, intended to serve as a guide for any country that wishes to modify its data governance framework.

Data is often labelled the "new oil" of the 21st century, which underscores the importance of this resource. Several countries are looking to introduce legislation to extract the most from this resource and to put an effective mechanism in place to govern its use. This paper can help provide a way forward for such countries, including India.



What is the proposed consent-based data governance system?

The BIS paper has stated two points to keep in mind for an effective consent-based data-sharing system:

  • Data protection policy framework: The paper has suggested that data subjects must be empowered to use the data they create for their own benefit through two avenues.
    • Recognition: The framework should recognise the rights of data subjects over the data they create whether their data reside with them or not.
    • Prior consent: The system should ask for the consent of the data subjects prior to the processing and sharing of data.
  • Technological infrastructure: A technical system must be put in place for a user-friendly implementation of the data protection policy, the paper said. “The technological infrastructure should be sector-agnostic to allow for cross-sectoral applications. The platform upon which the infrastructure operates should be open, interoperable and non-discriminatory. Data management must be digitally based and scalable across large numbers of users to achieve cost savings.”
    • Confidential clean room: A confidential clean room, where data can be processed and results obtained, without extracting the personally identifiable data themselves can be explored to ensure data are controlled by those that generate them.
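As an illustration only (not from the paper itself), the confidential clean room idea can be sketched in a few lines: a query runs over personal records inside the room, and only the aggregate result leaves it. The function name, record fields, and the guard below are all hypothetical.

```python
def clean_room_query(records: list[dict], query) -> dict:
    """Illustrative 'confidential clean room' sketch: run an aggregate
    query over personal records and release only the aggregate result,
    never the personally identifiable rows themselves."""
    result = query(records)
    # Crude guard for illustration: only scalar aggregates may leave
    # the clean room, never the raw records.
    assert not isinstance(result, list), "raw rows must not leave the clean room"
    return {"result": result, "n_records": len(records)}

# Hypothetical usage: the data user learns the average income,
# but never sees a single identifiable record.
people = [{"name": "A", "income": 100}, {"name": "B", "income": 200}]
out = clean_room_query(people, lambda rs: sum(r["income"] for r in rs) / len(rs))
```

Real clean-room deployments add far stronger protections (access controls, output auditing, differential privacy); the point here is only the shape of the interface: data in, aggregate out.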

However, the paper has steered clear of recommending how to develop common, interoperable protocols for data-sharing across countries, due to “a lack of global consensus on an optimal data governance system – both within countries and across borders.”

Conditions for data-sharing

The data governance system must specify which data are requested for sharing, how long they will be retained by data users, and who will process them when there is data sharing between parties. The system should meet the following five standards:

  • Purpose limitation: The purpose for which data are being shared should be described in clear and specific terms.
  • Data minimisation: Share only as much data as is strictly necessary to achieve the stated purpose.
  • Retention restriction: Data should not be shared for longer than required to attain the given purpose.
  • Use limitation: Data should be used only for the purpose for which they were shared.
  • Operational resilience: Data should be secure and the system must fend off unauthorised access.
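To make the first four standards concrete, here is a minimal sketch (not from the paper) of how a data-sharing request could be checked against a consent record. All dictionary keys are illustrative assumptions.

```python
def check_request(request: dict, consent: dict) -> list[str]:
    """Return violations of the data-sharing standards for a hypothetical
    request, checked against what the data subject consented to."""
    violations = []
    # Purpose limitation / use limitation: the stated purpose must match
    # the one the data subject agreed to.
    if request["purpose"] != consent["purpose"]:
        violations.append("purpose mismatch")
    # Data minimisation: no fields beyond those strictly consented to.
    extra = set(request["fields"]) - set(consent["fields"])
    if extra:
        violations.append("excess fields: " + ", ".join(sorted(extra)))
    # Retention restriction: data may not be kept longer than agreed.
    if request["retention_days"] > consent["retention_days"]:
        violations.append("retention too long")
    return violations

# Hypothetical example: a request that violates all three checks.
consent = {"purpose": "loan underwriting",
           "fields": ["income", "tenure"], "retention_days": 30}
bad = {"purpose": "marketing",
       "fields": ["income", "contacts"], "retention_days": 90}
```

A compliant request (same purpose, a subset of the consented fields, retention within the agreed window) would return an empty list.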

Granular consent

The new system should ensure that consent is granular, specifying to whom data are provided, for how long, and for what purpose, rather than calling for broad and sweeping ex ante consent. The system must be open and interoperable given the presence of multiple players.

Source: BIS paper

The paper suggested that the consent system be built around the ORGANS principles (Open, Revocable, Granular, Auditable, Notice and Secure). It also recommended that users provide consent just before data are shared, that consent be revocable once provided, and that “data subjects should have the right to audit data-sharing transactions ex post”.
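Three of these properties lend themselves to a short sketch: Granular (one consent entry per recipient and purpose), Revocable, and Auditable via an append-only log. The class and its methods below are illustrative assumptions, not an interface defined in the paper.

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Hypothetical sketch of a Granular, Revocable, Auditable consent
    store: one entry per (subject, recipient, purpose), with every
    grant and revocation recorded in an append-only audit log."""

    def __init__(self):
        self.consents = {}   # (subject, recipient, purpose) -> active?
        self.audit_log = []  # append-only record of every event

    def _log(self, event, key):
        self.audit_log.append((datetime.now(timezone.utc), event, key))

    def grant(self, subject, recipient, purpose):
        key = (subject, recipient, purpose)
        self.consents[key] = True
        self._log("grant", key)

    def revoke(self, subject, recipient, purpose):
        key = (subject, recipient, purpose)
        self.consents[key] = False
        self._log("revoke", key)

    def is_active(self, subject, recipient, purpose):
        return self.consents.get((subject, recipient, purpose), False)

    def audit(self, subject):
        # Ex post audit: every event involving this data subject.
        return [e for e in self.audit_log if e[2][0] == subject]

# Hypothetical usage: grant, then revoke; both events remain auditable.
ledger = ConsentLedger()
ledger.grant("alice", "acme_bank", "loan underwriting")
ledger.revoke("alice", "acme_bank", "loan underwriting")
```

After revocation the consent is no longer active, but the audit trail retains both events, which is what makes ex post auditing by the data subject possible.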

Oversight of regulatory and technological standards

The paper suggested that a data governance framework must have an oversight framework of the kind used for other digital public infrastructures. The governance framework needs the following:

  • A policy framework that defines the architecture and the accompanying operational framework which implements the policy guidelines.
  • A technological component for the functioning (including the upkeep) of the architecture, including periodic auditing and updating of the technological protocols and standards.

The paper has proposed three institutions with the following roles and responsibilities:

  1. Regulatory authority: “The policymaking body, which may be a statutory, executive body of the government, should be responsible for creating a legal ringfence around the DPI, ensuring, through guidelines and executive policies, that the state provides the normative and institutional basis for the adoption and implementation of the technical architecture,” read the paper.
  2. Self-regulating organisation (SRO): “An SRO is responsible for the actual implementation of the framework defined by the regulator. The SRO – an alliance of entities that have a stake in the system – is self-governed with its own institutional framework and rules of business,” the paper explained.
    1. The SRO will have the following responsibilities:
      1. Be a conduit through which sectoral policy-makers and regulators interact with market participants on the DPI;
      2. Run the dispute resolution system between market participants;
      3. Certify the proper adoption of the DPI by market participants as they develop their own user-facing apps and services.
  3. Technology standards organisation (TSO): “A TSO maintains the technical standards that underpin the digital public infrastructure. While some may see the government as taking this role, there are risks to the government being a player, regulator and standard setter in the same market,” the paper cautioned. The model of the World Wide Web Consortium (W3C) can be followed to organise TSOs, the paper said.

Conditions for market adoption

The three building blocks crucial for market adoption in the case of data are:

  • Data portability: It involves principles such as data subjects having rights to the data they created, “including the right to move the data from one provider to another, as well as open and interoperable consent mechanisms that are user-friendly, low cost, real time, efficient and secure,” the paper elaborated.
  • Getting prices right: The prices charged need to be market-determined so participants can recover costs in the market system, the paper outlined.
  • Keeping the data marketplace competitive: There must be oversight of the system to ensure that the prices set do not bar access to data by data subjects, the paper said.

“In this framework, the official sector defines the regulatory framework and the private sector is encouraged to engage in innovation and manage the consumer interface. In other words, it seeks to build a balanced framework between protecting consumers on the one hand and supporting market innovation on the other,” the paper concluded.

Taking a leaf out of India’s DEPA

The paper said that India’s Data Empowerment and Protection Architecture (DEPA), and the way in which it manages data flows, can serve as a reference for designing a data governance system.

“This consent system embodies the protocols that translate privacy principles to the digital space, not least by mandating specialised data fiduciaries whose primary task is to ensure that data are shared in a fashion that respects widely agreed principles of effective data governance,” the paper stated.

The paper noted that DEPA went live in the financial sector in September 2021. “The system is market-driven, with participants remunerated for the costs they incur. It allows data subjects to benefit from the data they create, while providing for both financial information providers and users to benefit from activities that make full use of their information capital,” the paper said.

What are the problems around data?

“The combination of the expanding consumer footprint, increased availability of data and inexpensive storage has provided the foundations for high-performance computation. It has also enabled the harnessing of very large amounts of consumer data – often referred to as “big data” – into a valuable commodity,” the paper explained.

The paper bemoaned that generators of data such as consumers and small and medium-sized enterprises (SMEs) do not have control over the data they generate despite privacy laws.

“They are denied the opportunity to reap the full value from their use. Inaccessible data, including data walled off in silos owned and operated by big tech firms, represent a significant cost to consumers and to society,” the paper said.

It also said that existing rules such as the GDPR and the California Consumer Privacy Act may have benefitted large players at the expense of small entrants, for whom compliance is more expensive.

Why is there a problem?

The paper has suggested that consumers find it difficult to exercise their consent effectively because of the following:

  • Broad and sweeping: “A service provider usually seeks consent to use and transfer data at the time when a consumer agrees to participate in an activity with the service provider. This consent is sought ex ante and for a wide range of possibilities,” as described in the paper.
  • Remain in silos: “Newly created data are often gathered and retained in proprietary silos and stored in various institutions in incompatible formats. Consumers have only limited options for combining data requests across institutions,” the paper said.

This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.


© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
