The government of Nepal has proposed a new Information Technology Bill, 2075 (2018), which aims to replace the Electronic Transactions Act (ETA). The government claims the bill addresses long-standing concerns related to IT management; civil society activists, however, criticise it as stifling the growth of the emerging internet and technology industry.
Critics worry that vague language in the provisions of the draft legislation may become red tape for data-driven companies. Note that the bill is in Nepali; this post is based on a translation of Nepal’s Information Technology Bill provided to us by lawyer Babu Ram Aryal.
Registration of social networks; rationale missing
Section 91(1) of the proposed Act introduces a registration requirement: “any person who wants to run social network has to register in the Department pursuant to this Act”. The proposed registration provisions need to be scrutinised before their introduction, as they could effectively bar a social network operator from functioning.
In the absence of a definition of the term “social network”, mandating registration for an ambiguous category of platforms will create uncertainty during implementation.
The provision offers no rationale for requiring a social network operator to register. Without one, the requirement is likely to increase compliance costs and hamper operational flexibility, particularly as the industry is still young and full of start-ups.
Censorship or not?
Section 94 lists certain categories of content that cannot be published on a social network. It provides that
“(1) No one shall perform or cause to perform the following acts in the Social Network.
(a) Communicate such content that undermines the sovereignty of Nepal, geographical integrity, national security, national unity, independence, dignity or national interest or harmonious relations between federation and unit doing or causing to do or the incitement or encourage incitement for hatred…”
Section 92 and Section 94(2) of the Bill empower the Department of Information Technology to direct any social network operator to remove, or cause to be removed, content whose publication violates Section 94(1).
Section 94(1) gives the government direct control over the operation of social networks. A proper test for a violation of Section 94(1) requires analysis of clear facts and law; as most of the heads under the section are vague, future litigation over its legality is likely.
The Indian Supreme Court’s judgment in Shreya Singhal v. Union of India illustrates how provisions can be struck down for using “open-ended, undefined and vague” words that are “nebulous in meaning”.
In that case, the apex court struck down Section 66A of the Indian Information Technology Act, 2000, observing that the section was arbitrary, excessive and disproportionately invaded the right to free speech. Heads under the impugned section such as “annoying” and “insulting” were too wide in scope and amounted to expansive restrictions.
For the legislative intent to be specific, the words of a provision must be clear in meaning. Online service providers already face many restrictions, and pushing for more control, such as the registration requirement, only adds hurdles; the nation cannot expect innovation amidst such restrictions. Meanwhile, as reported by the Business Standard, the Nepal government has defended the bill, saying it is only trying to regulate those who might misuse internet-based platforms.
Data retention limited to 30 days; exceptions missing
Section 67(4) of the bill provides that “personal information collected or stored under the law for a specific purpose shall be destroyed within thirty days after the purpose or use of the information is served”. The law arbitrarily obliges service providers to delete data within 30 days, irrespective of the purpose.
The Government should reconsider this data-retention obligation: 30 days is an extremely strict time-frame by global best practices, and several key consumer concerns are associated with such a tight limit.
For fin-tech services, records are important not only for processing transactions but also for other necessary purposes such as refunds and fraud detection.
For cross-border service providers, complying with multiple data-retention obligations for the same data is complex. It is therefore important that the law allow flexibility in the retention period.
The provision includes no exception to this short retention limit for cases of legal necessity. In the absence of an exception clause, there may be unnecessary legal uncertainty.
The European Union’s GDPR requires that the period for which personal data is stored be limited, but vests in the data controller the authority to decide the appropriate time-limit. The controller periodically reviews the data to ensure compliance with the retention policy its organisation has set according to the purpose of processing.
In the USA, the regulations that govern the use of personal data are sector-specific, providing different retention limits according to the needs of each sector. For instance, the Health Insurance Portability and Accountability Act (HIPAA) protects the confidentiality and security of healthcare information and provides that records of a company’s policies be kept for six years. Internal Revenue Service regulations provide that records of IRS audits be kept for at least six years. Various state data-security and data-breach laws also require businesses to retain data and records of breaches for certain periods.
Intermediary not protected; provisions left vague
The provisions relating to service providers are contained in Sections 89 and 90 of the proposed Act. On liability, Section 89 provides that “Service providers are not liable, in the following situation, for any criminal liability arises from any factor particulars only because they provided access to such information or data or link”.
Limiting intermediary liability is a global best practice, as the policies that govern an intermediary’s liability can substantially impact freedom of speech and expression and the right to privacy. The Bill recognises the practice but fails to be explicit and clear about it.
Intermediary liability is limited only if the condition under Section 89(b) is fulfilled: that the service provider has “not selected the user by its own and the Service Provider did not select or altered the information its own”.
The term “selected the user on its own” is vague: the provision does not specify which acts of a service provider constitute “selection”. Whether serving targeted content based on a user’s choices or requirements means the user has been “selected” by the algorithm is unclear.
Under the proviso, the service provider has a duty to remove any information it is “directed by a public agency or tribunal to remove or disable declaring the content as unlawful”. However, no adequate due-process safeguards have been built into Section 89(c). In the absence of procedural detail on jurisdiction, identification, documentation, etc. of unlawful content, the question of intermediary liability remains unclear.
While the proposed Act recognises the principle of limited intermediary liability, it fails to serve the principle’s stated purpose in an explicit and defined manner. The Bill should simply clarify that no liability accrues to an intermediary for unlawful third-party content.
The worldwide-recognised guiding principles on intermediary liability are the Manila Principles, touted as a “best practice” roadmap to protect rights and promote innovation.
The Manila Principles broadly state that: 1) intermediaries should be shielded from liability for third-party content; 2) requests for restrictions on content must be clear and unambiguous, follow due process of law, and comply with the tests of necessity and proportionality; 3) laws providing for content restriction must also follow due process of law; and 4) transparency and accountability must be built into the process of requesting that content be taken down or blocked.