
Key takeaways from China’s pioneering draft regulations on recommendation algorithms

China’s attempt to regulate algorithms marks a major departure from data regulation in any other part of the world.

“This policy marks the moment that China’s tech regulation is not simply keeping pace with data regulations in the EU, but has gone beyond them,” Kendra Schaefer, a researcher at Trivium China, tweeted in response to the draft regulations for recommendation algorithms published by the Cyberspace Administration of China on August 27.

China has been heavily regulating large parts of its economy in the last year, particularly the tech sector. It has launched investigations into and reprimanded large companies like Alibaba, Tencent, Didi, ByteDance, and others, and, most recently, passed the Personal Information Protection Law (PIPL), one of the world’s strictest data privacy laws. China’s crackdown has repercussions not only within the country but throughout the world: its approach to regulating the tech industry will shape how other countries approach theirs.

Here are key takeaways of the country’s latest draft regulations for recommendation algorithms.

The following content is based on unofficial translations of the original Chinese version provided by Stanford’s DigiChina center and China Law Translate.

What are the new regulations targeting?

According to Article 2, the rules will apply to recommendation algorithms used within China, including the following types: generative or synthetic, personalised recommendation, ranking and selection, search filtering, and dispatching and decision-making.


Responsibilities of companies using recommendation algorithms

  • Should be used to disseminate positive energy and for good: Article 6 of the draft bill states that companies must use recommendation algorithms to uphold mainstream value orientations, optimise algorithmic recommendation service mechanisms, vigorously disseminate positive energy, and advance the use of algorithms upwards and in the direction of good.
  • Should not use unlawful or discriminatory user tags: Article 10 states that companies should not enter unlawful or harmful information as keywords into user interests or make them into user tags to use them as a basis for recommending information content, and may not set up discriminatory or biased user tags.
  • Companies must be able to identify and stop harmful or illegal content: Article 9 requires companies to be able to improve their means to identify illegal and harmful content and cease the transmission of such content as soon as it is discovered. Companies should also store relevant records and report them to the concerned government departments.
  • Marking algorithmically generated content: Article 9 requires that algorithmically generated or synthetic information should only be disseminated once it is marked as such.
  • Content on key sections like front pages should conform to mainstream value orientations: In addition to establishing and perfecting mechanisms for manual intervention and autonomous user choice, Article 11 states that companies must present information conforming “to mainstream value orientations in key segments such as front pages and main screens, hot search terms, selected topics, topic lists, pop-up windows, etc.”
  • Should be transparent and understandable: Article 12 states that companies must make recommendation algorithms used for search results, rankings, selections, push notifications, and other such use cases, transparent and understandable in order to avoid creating a harmful influence on users or triggering controversies or disputes.
  • Provide clarity on why and how recommendation algorithms are used: Article 14 states that companies must clearly notify users of the circumstances of the algorithmic recommendation services they provide and publicize the basic principles, purposes, operational mechanisms, etc., of these algorithmic services.
  • Algorithms targeting minors should be adapted to account for this: Article 16 states that recommendation algorithms providing service to minors must protect minors online in accordance with the law and must allow minors to obtain information beneficial to their physical and mental health through models that are suited for minors.
  • Algorithms must protect labour rights: Targeted at algorithms that determine work schedules, such as that of delivery workers for food delivery platforms, Article 17 states that such algorithms should work in ways that protect laborers’ rights and interests.
  • Algorithms must protect consumer rights and must not extend unreasonably differentiated treatment: Article 18 states that companies that use recommendation algorithms to sell products or provide services to consumers must protect the consumers’ lawful rights and interests and must not use algorithms to carry out an unreasonably differentiated treatment in conditions such as transaction prices on the basis of consumers’ tendencies, trading habits, and other such characteristics.
  • Obligations of providers of recommendation algorithms with public opinion properties or having social mobilisation capabilities: Article 20 states that providers of recommendation algorithms with public opinion properties or having social mobilisation capabilities should report the provider’s name, form of service, domain of application, algorithm type, algorithm self-assessment report, content intended to be publicised, and other such information within 10 working days of providing services. Article 23 states that these providers must also conduct a security assessment according to relevant state regulations.
  • Personal information of users must be safeguarded: Article 25 states that related bodies and personnel participating in the security assessment, supervision, and inspection of recommendation algorithms must maintain strict confidentiality of the personal information, private information, and commercial secrets they learn and should not disclose, sell, or illegally provide this to other persons.
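Two of the obligations above describe concrete mechanisms: Article 9's requirement that algorithmically generated content be marked before dissemination, and Article 10's ban on unlawful or discriminatory user tags. The sketch below is purely illustrative of how a platform might wire these in; the tag blocklist and all function names are hypothetical, since the draft does not enumerate specific tags or prescribe an implementation.

```python
from dataclasses import dataclass

# Hypothetical blocklist: Article 10 bars unlawful, harmful,
# discriminatory, or biased user tags, but does not enumerate them.
DISALLOWED_TAGS = {"ethnicity", "religion", "health_condition"}

@dataclass
class ContentItem:
    text: str
    synthetic: bool = False  # algorithmically generated or synthesised

def prepare_for_display(item: ContentItem) -> str:
    """Article 9: synthetic content must be marked before it is disseminated."""
    label = "[algorithmically generated] " if item.synthetic else ""
    return label + item.text

def validate_user_tags(tags: set[str]) -> set[str]:
    """Article 10: drop disallowed tags before they feed recommendations."""
    return {t for t in tags if t not in DISALLOWED_TAGS}
```

In practice, a platform would also need the Article 9 record-keeping step (storing and reporting flagged content), which is omitted here for brevity.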

What should recommendation algorithms not be used for?

  • Should not lead users to addiction or high-value consumption: Article 8 states that companies should regularly examine, verify, assess, and check algorithmic mechanisms, models, data, and application outcomes, etc., and may not set up algorithmic models that go against public order and good customs, such as by leading users to addiction or high-value consumption.
  • Should not be used to harm national security or social order: Article 6 also states that companies using recommendation algorithms should not engage in activities harming national security, upsetting the economic order and social order, infringing the lawful rights and interests of other persons and other such acts prohibited by laws and administrative regulations.
  • Algorithms should not be used to make fake accounts, likes, comments: Article 13 states that companies must not use algorithms to create fake accounts, give false likes, comments, reshares, etc, or engage in traffic hijacking.
  • Algorithms should not be used to manipulate results: Article 13 also states that companies must not use algorithms to block certain information, over-recommend, manipulate topic lists or search result rankings, or control hot search terms or selections. Companies should also not carry out self-preferential treatment, unfair competition, influence online public opinion, or evade oversight using algorithms.
  • Should not lead minors towards harmful tendencies or online addiction: Article 16 states that algorithms targeted at minors should not incite the minor to imitate unsafe behavior, carry out acts violating social norms, or lead minors towards harmful tendencies that affect their physical or mental health. Algorithms should also not lead minors to online addiction.

What rights are users granted by the draft regulations?

  • Users must be provided control over what information is used by algorithms: Article 15 states that users must be provided with a choice to not have their individual characteristics targeted by recommendation algorithms and also be allowed to choose, revise, or delete user tags used by these algorithms.
  • Users must have the option to opt-out of recommendation algorithmic services: Article 15 states that companies must provide users with a convenient option to switch off algorithmic recommendation services and should immediately cease providing related services if the user chooses to switch off.
  • Users can seek remedy if algorithms have a major influence on their rights and interests: Article 15 says that if recommendation algorithms have a major influence on the rights and interests of users, these users have the right to seek an explanation as well as seek measures to improve or remedy the situation from the company deploying these algorithmic services.
  • Users must be able to conveniently file complaints: Article 26 states that companies must allow social oversight and set up convenient systems for users to file complaints, and such complaints must be promptly handled. Users must also be able to file appeals and receive feedback.
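Article 15's user controls amount to a small, concrete interface: an opt-out switch that immediately stops the related services, plus view/revise/delete access to user tags. A minimal sketch of what such a preferences object might look like follows; every class and method name here is hypothetical, not drawn from the draft text.

```python
class RecommendationPreferences:
    """Illustrative user-facing controls suggested by Article 15."""

    def __init__(self) -> None:
        self.personalisation_enabled = True
        self._tags: set[str] = set()

    def add_tag(self, tag: str) -> None:
        # Tags are only collected while personalisation is on.
        if self.personalisation_enabled:
            self._tags.add(tag)

    def view_tags(self) -> set[str]:
        return set(self._tags)

    def revise_tag(self, old: str, new: str) -> None:
        self._tags.discard(old)
        self._tags.add(new)

    def delete_tag(self, tag: str) -> None:
        self._tags.discard(tag)

    def opt_out(self) -> None:
        # Opting out must immediately cease related services,
        # modelled here by clearing all collected tags.
        self.personalisation_enabled = False
        self._tags.clear()
```

Note the design choice of clearing tags on opt-out: the draft says companies "should immediately cease providing related services," and discarding the profile is one conservative reading of that obligation.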

Role of government

  • Graded classification of companies providing recommendation algorithms: Article 18 states that the government will establish a categorised and graded management system of companies providing recommendation algorithms on the basis of public opinion properties or capacity for social mobilisation, the content types, the scale of users, the sensitivity of the data handled by the algorithmic recommendation technology, the degree of interference in user conduct, etc.
  • Companies should cooperate with government assessments: Article 24 states that relevant competent government departments must conduct security assessment, supervision, and inspection of recommendation algorithms, give suggestions to correct discovered problems, and provide a time limit for rectification. Companies must cooperate with the departments carrying out such work and provide the necessary technical and data support and assistance. Article 23 also states that companies must retain logs for a period of at least 6 months.
  • Penalties for violations: Depending on which article was violated, the penalty can range from a mere warning and order of rectification to fines up to 30,000 yuan and suspension of services. Where other laws, such as the Personal Information Protection Law, are violated, companies will be prosecuted according to the said law.
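Article 23's six-month log retention requirement is the most directly implementable rule in this section. The sketch below shows one way a platform might enforce a retention window when purging old records; the 183-day figure, the record schema, and the function name are all illustrative assumptions, since the draft only sets a six-month minimum.

```python
from datetime import datetime, timedelta, timezone

# Article 23 requires retention for at least six months;
# 183 days is an assumed conservative reading of that minimum.
RETENTION = timedelta(days=183)

def purge_expired(logs, now=None):
    """Keep only (timestamp, payload) records still inside the
    retention window. Records older than the window may be purged,
    since the regulation sets a floor, not a ceiling, on retention."""
    now = now or datetime.now(timezone.utc)
    return [(ts, payload) for ts, payload in logs if now - ts <= RETENTION]
```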


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
