Governments don’t understand technology well, and the more knowledgeable they become, the better they will be able to hold Facebook accountable, said Mark Luckie, a witness before the Delhi Legislative Assembly’s Peace and Harmony Committee. He argued that Facebook could not be trusted to self-regulate, and that regulation was a job for governments. Luckie, a former Facebook employee based in Atlanta in the United States, left the company in 2018 after accusing it of not doing enough to address its lack of diversity.

This was one of several hearings held by the Committee, which has been probing Facebook’s alleged role in aggravating the riots and violence in northeast Delhi in February this year. The Committee was constituted after the riots to foster harmonious relations among the general populace, and sprang into action in August after a Wall Street Journal report accused Facebook India’s public policy team of not policing hate speech by members of the ruling Bharatiya Janata Party (BJP). It had earlier also summoned Facebook India’s head Ajit Mohan, who then approached the Supreme Court to obtain a stay until further orders.


Read our full coverage of the Delhi Assembly’s Peace and Harmony Committee here.


Luckie, who deposed remotely from the US, worked at Facebook between October 2017 and November 2018 as “Strategic Partner Manager for Influencers”, according to his LinkedIn page. When Luckie resigned, he sent a memo to his colleagues, which he subsequently made public. In the now well-known memo, he said that “Facebook has a black people problem”.

Luckie told the Committee on Thursday that he had worked with multiple teams, including content moderation, product development, and research and development. The Committee’s questions largely concentrated on Luckie’s own experience while working at Facebook, and on what he thought were the company’s deficiencies. Understandably, Luckie had little concrete or specific input to offer on Facebook’s role in the Delhi riots, which is the primary brief of the Committee. However, this was the first time the Committee had heard from someone with first-hand experience of the organisation’s internal processes.

Facebook is not politically agnostic

Luckie also commented on Facebook’s public policy teams, whose composition, he said, puts it at odds with the company’s claim that it is a politically agnostic platform. The policy teams generally consist of people drawn from the very organisations or fields they will work with on Facebook’s behalf. In effect, he implied, former officials at government departments relevant to Facebook are often hired by the company to work with their former employers. “Facebook would like you to think it’s a politically agnostic platform, but that will never be true as long as it only hires from particular groups,” he said.

The only job of Facebook’s public policy team, according to Luckie, is to influence governments to bring in legislation that favours the company. What is good for the company often differs from country to country, and this is what these teams assess and work with government officials on, he said, adding: “You can tell how much Facebook cares about a government or region by how many meetings they have either with Mark [Zuckerberg, CEO] or with senior representatives.”

Multiple witnesses in the past have noted how Facebook India’s public policy team, headed by Ankhi Das until recently, has had close relations with the ruling BJP government.

Lacunae in content moderation

Luckie flagged several issues with Facebook’s content moderation process during the hearing:

  • Delays in moderation: Most reported content is moderated automatically by artificial intelligence-based (AI) algorithms, he said. If a report reaches a human moderator, the process can be further delayed. For instance, if a post is deemed newsworthy or concerns an international incident, it can get escalated further up the chain while the post remains up on the platform. Giving on-ground moderators the freedom and information they need to intervene would curb the rate at which hateful or offensive content spreads, he argued, and he speculated that such delays could have occurred during the Delhi riots.
  • Lack of cultural context: Luckie argued that many moderators lack the cultural context for the content they are supposed to moderate. For instance, slang associated with black people could be misread, leading to faulty moderation. This problem has been flagged by previous witnesses too: hate speech against Rohingya Muslims in Myanmar, for example, was left unchecked due to a shortage of moderators who understood the Burmese language.

Luckie also noted how opaque the entire moderation process is, and, in his view, deliberately so: “Transparency would be Facebook’s downfall. They would want people to know as little information about the company as possible. All Facebook has is trust from the users. If users don’t trust the platform, they won’t use it.”

Facebook doesn’t change unless it’s under public, media scrutiny

Luckie spoke at length about his memo, and why he chose to make it public. “I published it because Facebook doesn’t change unless the public holds it accountable. It goes out of its way to make sure employees don’t discuss matters publicly, and I wanted to make sure that I didn’t follow that,” he said.

Facebook, he said, responds mostly to news events and negative public attention, which is when top executives step into the picture. Many things happen globally that also deserve their attention but never get it because they go unreported. In either case, Facebook’s main goal is to make the issue go away and thereby mitigate the negative attention. “Facebook will rarely do anything that is not in its self-interest,” he said.

Facebook algorithm propagates divisive, hateful content

The “fundamental problem with Facebook is that it has a lot of information that the public sees and much more that the public doesn’t see. A lot of that is used to fuel what people see in their news feed, what Facebook determines as most popular or what they should see that day. […] Facebook is basically changing people’s world view because of how it is using this data,” said Luckie.

Luckie said that user engagement, and not peace and harmony, is Facebook’s goal. The company treats things like violent outbreaks or harassment on the platform as anomalies; instead, he said, it needs to proactively work to prevent such incidents before they happen.

  • Is Facebook a mere intermediary? Luckie disagreed with Facebook’s notion that it has no say in how users use the platform and isn’t liable for their actions. Facebook claims it is like a telephone, a platform with no say in how people use it. But this is not true, he said: Facebook is not a telephone. It actively interferes in what people do and don’t see, and it changes its algorithms. Facebook does aid violence on its platform, and unfortunately people are dying because of it, he said.
  • Hateful content drives engagement: Luckie said that hateful and divisive content often has the most shares and likes, the two metrics Facebook uses to determine whether a post should be shown to others. When more people see this content, they also see the accompanying advertising. “So yes, Facebook is profiting off of hate,” he said.

“Facebook has the best algorithms in the world, that show you the content that you want to see. It has algorithms that show you the ads that you want to see. It has a complete history of what a person has posted, how many times they have been flagged. Yet, none of those resources have been deployed to the best of their abilities to address hate speech” — Mark Luckie, former Facebook employee

Lack of diversity within company

Luckie spoke at length about what is perhaps his main complaint against Facebook: the lack of diversity and representation within the company’s workforce. He argued that this has a real and profound impact on how the platform empowers or victimises certain communities. Users, he said, are wielding the platform as a weapon against minorities in the US, such as the Latino and black populations. Under-representation within the company leaves these communities without a voice that could advocate on their behalf, he said.

He said that dominant communities often game Facebook’s content moderation system to inflict harm on minorities. He noted how posts by black persons in the US calling for equality or commenting on political issues would get taken down after being flagged as “hate speech”. This is a practice seen on Facebook all over the world, he said.
