Mobile apps in China that can influence public opinion or have social mobilisation capabilities will have to undergo a security assessment before launching, according to a Reuters report. The new requirement is part of draft rules issued by the Cyberspace Administration of China (CAC) for governing mobile apps.
The CAC did not specify any apps or outline the assessment process other than to say it should be carried out in accordance with national regulations, the report added. However, the rules define mobile internet apps as application software that runs on mobile intelligent terminals to provide information services to users.
China has reportedly been wary of the growing clout of its tech companies which have attained mammoth valuations in the last few years. These measures can be viewed as yet another attempt to increase oversight of the country’s tech companies and keep them in check.
What other rules have been proposed for mobile app providers?
The draft rules state that they will cover all activities that provide users with the production, copying, publishing, and dissemination of text, pictures, voice, video, and other information through instant messaging, news information, knowledge Q&A, forum communities, webcasts, e-commerce, network audio and video, life services, and similar offerings.
Ask users for identity authentication: Application providers will have to ask for mobile phone numbers, ID numbers or unified social credit codes for registration if they provide users with information release, instant messaging and other services, under Article 6 of the rules.
Licence to disseminate news: “If an application provider provides Internet news and information services through the application, it shall obtain a licence for Internet news and information services, and it is prohibited to carry out Internet news and information service activities without permission or beyond the scope of the licence,” under Article 7.
No engagement in illegal activities: Application providers and distribution platforms will have to adhere to the constitution and not use applications to “engage in activities prohibited by laws and regulations, such as endangering national security, disrupting social order, and infringing on the legitimate rights and interests of others” under Article four of the draft rules.
Protect interests of minors: “Application providers shall adhere to the principle that is most beneficial to minors, pay attention to the healthy growth of minors, fulfil various obligations of network protection for minors, strictly implement the real-name registration and login requirements of underage users’ accounts,” as stated under Article 14.
Establish information content audit management mechanism: Application providers will have to maintain an information content audit management mechanism under Article 9 of the rules. They will also have to look after user registration, account management, information audits, daily inspections, emergency response, and other management measures, and be equipped with professionals and technical capabilities corresponding to the scale of their services. They are also responsible for reporting to the authorities any users who violate relevant laws, regulations, and service agreements.
Social supervision: “Application providers and application distribution platforms shall consciously accept social supervision, set up convenient complaint reporting entrances, publicise complaint reporting methods, improve the acceptance, disposal, feedback and other mechanisms, and deal with public complaints and reports in a timely manner,” according to Article 22.
China’s attempt to regulate recommendation algorithms
China’s cyber watchdog released another document governing companies’ use of recommendation algorithms, which will take effect in March this year. The CAC released the draft rules to regulate recommendation algorithms in August last year. These rules sought to regulate the following types of algorithms:
- generative or synthetic,
- personalised recommendation,
- ranking and selection,
- search filter,
- dispatching and decision-making.
Here are some of the responsibilities of tech companies using algorithms:
- Upholding mainstream value orientations, optimising algorithmic recommendation service mechanisms, vigorously disseminating positive energy, and advancing the use of algorithms in the direction of good.
- Preventing the use of unlawful or harmful information as keywords for user interests, and avoiding the set-up of discriminatory or biased user tags.
- Making recommendation algorithms used for search results, rankings, selections, push notifications, and other such use cases, transparent and understandable in order to avoid creating a harmful influence on users or triggering controversies or disputes.
- Protecting labourers’ rights and interests when devising work schedules, such as those of delivery workers on food delivery platforms.
- Protecting consumers’ lawful rights and interests, and avoiding the use of algorithms to apply unreasonably differentiated treatment, such as in transaction prices, on the basis of consumers’ tendencies, trading habits, and other such characteristics.
- Maintaining confidentiality of personal information, private information, and commercial secrets.
What should they avoid using these algorithms for?
- Algorithmic models that go against public order and good customs, such as by leading users to addiction or high-value consumption. Companies should also regularly examine, verify, assess, and check algorithmic mechanisms, models, data, and application outcomes.
- Activities harming national security, upsetting the economic order and social order, infringing the lawful rights and interests of other persons, and other such acts prohibited by laws and administrative regulations.
- Fake accounts, false likes, comments, reshares, and the like, or traffic hijacking.
- Self-preferential treatment and unfair competition by influencing online public opinion or evading oversight, in addition to blocking certain information, over-recommending, and manipulating topic lists or search result rankings. They should also not control hot search terms or selections.
- Influencing minors to imitate unsafe behaviour or carry out acts that violate social norms, or leading minors towards harmful tendencies that affect their physical or mental health.
Also read:
- Yahoo becomes the latest US technology firm to call it quits in China
- Why LinkedIn is being replaced with a bare-bones job search platform in China
- Why is China’s crypto ban its most severe one? and other FAQs on the latest measures
- Summary: China passes GDPR-like data privacy law, except that many restrictions do not apply to the government
- Reading List: Why is China cracking down on tech companies?
