We missed this earlier.
The Asia Internet Coalition (AIC) said in June that Bangladesh’s Digital Security Act (BDSA) creates several obstacles to the conducive use of the internet ecosystem owing to vague obligations, unchecked powers, disproportionate penalties, and unworkable compliance requirements. The coalition, whose members include Facebook, Google, Amazon, LinkedIn, Twitter, Yahoo!, Apple, Expedia Group, Line, Rakuten, Airbnb, Grab, and Booking.com, pointed out that the Act can have a chilling effect on free speech, and highlighted issues with how offences are laid out in it.
Bangladesh passed the Digital Security Act, 2018 in September last year. The law has drawn protests, and Amnesty International has called it an attack on freedom of expression.
The coalition laid out its issues with the Act, and also made some recommendations:
The Act can have a ‘chilling effect on free speech’; offences under the Act vague and subjective
AIC said that certain provisions of the Act, such as Sections 21, 25, and 31, will have a “chilling effect on speech” because they are “vaguely drafted”. It cited Section 66A of India’s IT Act, which the Indian Supreme Court struck down for being “open ended, undefined, and vague”. It also urged the Bangladeshi government to bear in mind the “well established” tenets of international human rights law, such as Article 19(3) of the International Covenant on Civil and Political Rights. It pointed out issues specific to different clauses:
- Section 21 of the Act contains the punishment for any type of propaganda or campaign against the Liberation War, the Father of the Nation, the National Anthem, or the National Flag.
Recommendation: The AIC said that the current definition of this offence is “not sufficiently precise” and can also result in “disproportionate penalties”.
- Sections 25 and 31 of the Act penalise individuals for content that is offensive, fear-inducing, or annoying – the AIC said that the understanding of these three words is “inherently subjective”. It said that the “threshold of annoyance or insult” varies from person to person, and this lends a great deal of vagueness to the provision.
Recommendation: AIC also said that interpretations of content that is ‘spread[ing] confusion’ can lead to widely varying outcomes; it is unclear how the executive or judiciary will understand this term. The group also said that these provisions will have a “chilling effect” on free speech and will curtail the growth of healthy journalism in Bangladesh.
- Section 27 provides the offence and punishment for cyber-terrorist activities; it makes damaging or destroying the supply of daily necessities or public services, or causing adverse effects on Critical Information Infrastructure, an offence under the BDSA.
Recommendation: The group said that Section 27 of the Act requires more clarity since, in its present form, it is vague.
- Section 28 provides for punishing anything that could be construed as hurting religious sentiments, but doesn’t link it to any actual threat.
Recommendation: A law carrying penal consequences should tie those penalties only to an actual threat or discrimination.
- Section 29 penalises defamatory content without linking the same to an actual offence of defamation in law.
Recommendation: The section does not clearly set out the standards for what may constitute defamatory content, said the AIC. It also noted that a vast amount of content on the internet is commentary on others – such as political reporting or satire – and said that this section could be an impediment to important democratic expression.
Need for a ‘more predictable regime’ for content regulation; jurisdictional issue with telecom regulator
The coalition said the Act should have a more predictable regime to regulate content, with an established procedure to serve takedown notices on private entities and adequate procedural safeguards to avoid misuse. A social media service provider should not be held liable for any non-compliance if the order in question does not follow these procedural safeguards and due process requirements, it said.
Issue with takedown requests to the Bangladesh telecom regulator: Under Section 8 of the Act, the Digital Security Agency can request the Bangladesh telecom regulator – BTRC – to remove or block any information if it threatens a breach of digital security. Similarly, law enforcement agencies can request BTRC to remove any information or data that undermines the solidarity of the country, economic activities, defence, religious morality, or public order. The BTRC then has to remove access to the data. The coalition pointed out that the regulator has limited jurisdiction: “this is not a requirement that is likely to be applicable to all service providers, but rather only to those entities (such as telecom and internet service providers) that the BTRC has jurisdiction over, which is also restricted to the territory of Bangladesh.”
AIC also recommended establishing proper “checks and balances” within the provision so that there is some scope for assessment. It recommended:
- Blocking orders must be issued only where there is a valid court/judicial finding that the content in question violates Bangladeshi law. The order should also provide evidence and reasoning sufficient to document its legal basis.
- Blocking orders should be issued only when there is no alternative method to remove the content.
- Blocking orders must be in writing, reasoned, and be narrowly scoped to specific URLs or content.
- The time period for which the content should be restricted should also be indicated.
Safe harbour provisions under the Act inadequate; may force platforms to remove content without being asked to
Under Section 38 of the Act, intermediaries will not be held liable if it is proven that the violation in question was committed without their knowledge, or that they had taken all measures to prevent the occurrence of the offence. The key issues with the framing of the provision, according to the AIC, are:
- This safe harbour has been framed inadequately since it only applies to offences covered under the BDSA. It doesn’t take into account other civil, criminal, or regulatory frameworks which may operate to regulate content and create liability.
- Greater clarity is required on the provisions relating to knowledge and precautionary measures of intermediaries. Currently, the provisions are loosely framed and might force an intermediary to proactively remove content in the absence of a valid request from an independent authority.
- Intermediaries are exempt from liability usually because they do not actively monitor all content that is communicated through their platforms. However, in its present form, the law could be interpreted in such a manner as to entirely subvert the intention of limiting intermediary liability.
- Placing the burden on the intermediary to prove that it had no knowledge and that “all possible steps to stop the commission of an offence” had been taken is a high burden of proof. It is also subjective.