Sri Lanka blocked multiple social media networks, including Facebook and WhatsApp, following the terrorist attacks on Easter Sunday, in order to halt the spread of misinformation. YouTube, Instagram, Snapchat, Facebook Messenger and Viber were also inaccessible, according to internet freedom tracker Netblocks. The bombings targeted churches and luxury hotels on Easter Sunday, killing 250 people and wounding 500 more. The social media ban was still in place as of this morning, with multiple social media apps failing to load. Twitter appears to be accessible, as some government ministers and journalists have been tweeting out information.

The Sri Lankan government said the decision was taken because “false news reports were spreading through social media”, and that the block would remain in effect until investigations into the attacks concluded. A presidential adviser to Sri Lanka added that the decision was unilateral; the block came out of fear that misinformation about the attacks and hate speech could spread, provoking more violence, per the New York Times. The publication added that the government preemptively blocked the internet before any social media-triggered violence was known to have taken place.

What misinformation we know spread

Officials arrested 13 people following the bombings but did not release the names of any suspects, and no group has claimed responsibility for the attack. However, there was widespread speculation online about the possible perpetrators. According to BuzzFeed News, news outlets published names of suspects that had not been verified by officials. One video that named a man in connection with the bombings and showed a photograph received hundreds of thousands of views across Twitter and YouTube. Some websites used old photos with provocative headlines to promote those same names, adding to the spread of misinformation.

Concerns have been raised over the impact to citizens’ ability to communicate and impart information in the midst of crisis. Many Sri Lankan internet users are complaining about difficulty checking up on friends and family following the attacks.
– Netblocks.org

The Sri Lanka Red Cross had to dispel misinformation claiming that one of its buildings had been attacked in the bombings.

Nikhil adds: I think that in times of crisis, governments worry about disinformation campaigns on social media fueling violence and disrupting public order. By shutting down social media, they lose an opportunity to counter rumours through reliable, verified speech. I do think that shutting down social media is a disproportionate restriction on free speech: it ends up censoring everyone, including, for example, public requests for blood donors, fundraising for those in need of help, and friends and family checking on each other’s safety.

Facebook’s statement (from CNN):

“Our hearts go out to the victims, their families and the community affected by this horrendous act. Teams from across Facebook have been working to support first responders and law enforcement as well as to identify and remove content which violates our standards. We are aware of the government’s statement regarding the temporary blocking of social media platforms. People rely on our services to communicate with their loved ones and we are committed to maintaining our services and helping the community and the country during this tragic time.”

Last year’s Facebook block after anti-Muslim attacks

In March last year, the Sri Lankan government ordered internet and mobile service providers to temporarily block Facebook, WhatsApp, Instagram, and Viber following a wave of attacks on Muslims by members of the majority Buddhist Sinhala community. The order was intended to stop the spread of hate speech that could trigger more violence against the Muslim community.

Eventually, the government did temporarily block access to Facebook as a last resort. Facebook had reportedly ignored years of calls from both the government and civil society groups to rein in groups that spread hate speech and incited violence. Government officials, researchers, and local NGOs had pleaded with Facebook as far back as 2013, asking it to better enforce its own policies against using the platform to target people for their ethnicity or religion. They repeatedly raised the issue with Facebook representatives in private meetings, through in-depth research shared with the company, and in public forums, but the company did nothing in response. Moreover, the company had, at least until then, done little to address hate speech posted in Sinhala (rather than English), owing to a lack of Sinhala-speaking moderators.