Most decisions that educational institutions across the world make about the framework around examinations are driven by a single aim: preventing cheating during a test. While cheating is easier to surveil within four walls and under a proctor's watchful eye, the difficulty begins when examinations move online. There is no denying that curbing malpractice must be taken seriously, but it is equally necessary to consider the importance of privacy rights in cyberspace. With the growing dependence on Artificial Intelligence through proctoring tools, many educational institutions in India have turned to platforms that provide these services.
Proctoring can be done using Artificial Intelligence alone or with a combination of AI and human proctors. AI-enabled proctors can detect movement, use facial recognition technology, monitor screen shifts, and so on. Since the pandemic pushed the world to work from home, reliance on AI proctoring has increased, and it is now used by educational institutions for internal exams as well as large-scale entrance tests. For the former, an AI proctor is typically paired with a human proctor; for the latter, a solely AI-based proctor is used. Through these AI proctors, the personal data of students is collected and stored by the platforms providing these services.
The author has considered the privacy policies of three edtech providers that offer proctoring services and have been used by Indian educational institutions; the first part of the article examines these policies, and the latter part offers key comments:
Merittrac, for example, collects personal details such as name, address, and caste certificate; personal identifiers (Aadhaar, PAN, etc.); location information; photographs; and biometric data such as voice recognition and fingerprints. This would qualify as personal data, and part of it as sensitive personal data, placing a higher liability on the controller to obtain informed consent, prevent data leaks, and restrict third-party sharing.
Similarly, Mettl, another proctoring service provider, lists the types of personal data it collects, including contact information (name, email, phone number), live images, cookie data, IP address, etc. The policy places clear liability on the employer in situations where the platform receives data from the employer. The policy further lists the purposes for which it collects this data, one example being identity verification before sign-up. Interestingly, it contains a clause lacking in the others' policies: data used for internal assessment is aggregated and anonymised. On retention and use limitation, the provider is clear about restricting storage to only as long as required to provide its services, including the time frame given by the employer to utilise the data. However, the policy also states that the data may be used for any other permissible related purpose that might surface later. This broad and undefined language creates scope for privacy risks.
Wheebox, the third provider, relies on the safe harbour extended to intermediaries under Section 79 of the Information Technology Act, 2000. It characterises the data it collects as third-party information, since the platform is a mere intermediary working for the client and providing a service to the consumer. It thus assigns the client (for example, an organisation or educational institution) the primary role of directing the use and storage of data, which may be retained indefinitely for the client's or Wheebox's internal assessment.
This immunity, however, can only be invoked if the intermediary observes due diligence (Section 79(2)) while discharging its duties under the IT Act, 2000 and under any rules issued by the government. Ridding itself of liability and shifting it onto the client with regard to data retention for assessment purposes raises security concerns.
What’s going wrong?
Given that an exhaustive data privacy law has not yet been implemented in India, it becomes extremely necessary for third-party platforms/proctors and educational institutions to exercise due diligence with respect to privacy principles. The platforms have often shifted the liability for storing and retaining data onto the institutions (clients). While only the institutions know how they secure agreements with these third-party proctors, it is equally necessary that they give students enough knowledge about the data that is collected, stored, and transmitted primarily on the institutions' consent. Only then can the students' consent be truly informed. A similar case against the 'free consent' of students during online exams was made, through the lens of the GDPR, by the Amsterdam University Student Council, but the District Court held that proctoring served the 'legitimate interest' of the University, rejecting the claim that free will was absent. By contrast, the Srikrishna Committee report gives informed and voluntary consent an important place, highlighting the need for information symmetry while collecting individuals' data, which is precisely what is missing in the case of third-party proctors.
Most of these third-party proctoring services also use face scans, which have been banned outright in many parts of the world owing to their intrusive nature and major racial-bias concerns. Such biases have also been found in the algorithms of these proctoring services. Alongside the regulatory concerns, surveillance concerns arise when such platforms are given biometric access and extremely invasive access to a student's personal space. Combining the two, i.e. the lack of regulatory measures and the concerns arising from their use, especially in India, it becomes necessary to step back and look for alternative measures that ensure the complete data security and privacy of students. Recent reports found that ProctorU, a similar third-party provider also available in India, suffered an unfortunate data leak. The platform's policy states that certain leaks are beyond its control and that, in such an event, the provider is not liable for the leaked data. Such clauses are present in most policies, and as dependence on these service providers grows, the risk of bulk leaks of the data stored on their servers is high, with absolutely nobody to hold liable.
In light of the pandemic, the use of these extremely expensive third-party proctoring services has been at an all-time high. Students abroad have raised their voices against invasive exam services, citing concerns ranging from 'who on the other side has access to the students' computers' to the 'invasion of personal space', and have suggested alternatives such as time-barred assignment papers, along with a more evolved method of evaluation and learning.
Setting aside the extremely important fact that the use of such platforms, especially in India, cannot be viewed without considering societal disparities, the sole purpose at the end of the day is to ensure that students learn and apply knowledge. The author believes this can be achieved through alternative methods that do not carry surveillance and data security concerns in the absence of laws, especially during a pandemic.
Priyanshi Dixit is a final year law student and the co-founder of r-TLP, a platform for marginalised genders in the tech law and policy field.
This article was originally published here. It has been cross-posted with the author's permission.