Big internet platforms will have to open up their algorithms for European regulators to scrutinise, as part of the region’s upcoming proposals to regulate digital services. Platforms will have to provide more information on how their algorithms work, submit regular reports on the content moderation tools they use and their effectiveness, and explain why users see certain ads, Margrethe Vestager, the EU Competition Commissioner, said last week.

Vestager was talking about Europe’s Digital Services Act, a draft of which is scheduled to be out in December. The DSA package is an attempt by the EU to modernise the legal framework surrounding digital services in its jurisdiction. It proposes rules that will define the responsibilities and legal obligations of digital services — and going by Vestager’s remarks, algorithmic accountability will be a major part of the proposal, since platforms’ algorithms can often be a “black box”, making decisions in ways that no one really understands, not even the people who design them.

“So the rules we’re preparing would give all digital services a duty to cooperate with regulators. And the biggest platforms would have to provide more information on the way their algorithms work, when regulators ask for it. They’d also have to give regulators and researchers access to the data they hold – including ad archives,” Vestager said.

This development comes at a time when decisions made by algorithms are facing scrutiny around the world, with lawmakers in several countries, including India, calling for rules on algorithmic accountability. Recently, a UK-based trade group representing cab drivers in the country sued Uber for allegedly firing drivers on the basis of assessments made by its algorithms alone.

Algorithmic accountability under the upcoming Digital Services Act

Vestager said that the rules will ensure that “regulators get the information they need, to understand and govern the way algorithms work”. She said that artificial intelligence can allow discrimination to creep more easily into the way platforms work, especially if they’re trained on biased data. “Sadly, our societies have such a history of prejudice that you need to work very hard to get that bias out. And if we don’t know how they’re making their decisions, we can’t be sure that those choices aren’t based on harmful stereotypes – and to challenge those decisions, if they’re unfair,” she said. Other aspects of the package:

  • Increased accountability to users: Internet platforms will have to inform users when they take content down, and give them adequate rights to challenge such removals, Vestager said. “They’ll also have to give us the ability to influence the choices that recommender systems make on our behalf,” she added.
  • Access for researchers: Researchers need to have access to data on platforms’ algorithms, “to understand how those algorithms are affecting our society”, Vestager argued. “The impact of the choices they [algorithms] make isn’t always obvious, until you dig down into the data and fully understand what is going on. And since those choices affect us all, that data can’t be a sort of esoteric knowledge, that only a small priesthood who work for these big platforms gets to see,” she added.

“So we can’t just leave decisions which affect the future of our democracy to be made in the secrecy of a few corporate boardrooms. That’s why one of the main goals of the Digital Services Act that we’ll put forward in December will be to protect our democracy, by making sure that platforms are transparent about the way these algorithms work – and make those platforms more accountable for the decisions they make.” — Margrethe Vestager, EU Competition Commissioner

Our data can’t be accessed by third parties: Google

In its response to the European Union’s public consultation on the Digital Services Act package, Google said that data related to its digital services cannot be accessed by third parties, except under contractual conditions, as reported and aggregated information, or under a request to port data to another service.

The Mozilla Corporation had recommended that the European Union hold online platforms accountable for practices and processes that can amplify illegal and harmful content, and ensure that their business practices do not amplify such content.
