China’s New Regulations on Controlling Algorithmic Recommendations in Applications
On 31 December 2021, the Cyberspace Administration of China (CAC), together with three other authorities, jointly published the Regulations on the Management of Algorithm Recommendations for Internet Information Services (the "Regulations"). The Regulations, which will take effect on 1 March 2022, aim to regulate the use of algorithm recommendations in internet information services and to enhance the protection of users' rights. This article highlights some of the requirements, especially those protecting users' rights and interests, that are relevant to internet service providers using algorithm recommendations.
In the following discussion, the text inside square brackets consists of my own comments, added to put the requirements in context and facilitate the reader's understanding.
Chapter 1 sets out the legal basis of the Regulations, the general principles, and the definition of "algorithm recommendations".
As used in the Regulations, "algorithmic recommendation technology" refers to the use of algorithm technologies to provide users with information, including the direct use of data, individualized pushing [e.g. profiled ads], prioritization and selection [e.g. hotel recommendations], search filtering [e.g. search engine optimization], and scheduling and dispatch decision-making [e.g. booked-ride or food delivery dispatch].
Chapter 2 sets down the general provisions, such as the developers' periodic review and audit of the algorithms used, the developers' assurance of the algorithms' compliance with relevant laws, and the assurance that the algorithms do not induce users into excessive use or spending [e.g. addictive video games].
In the Regulations, the entire Chapter 3, consisting of seven articles, stipulates the rights of several categories of users. The algorithms of apps or applications targeted at the following categories of users must ensure that:
| User category | Requirement |
| --- | --- |
| Minors | The algorithm shall ensure a "minor-friendly" mode is offered in the apps / websites to prevent minors from obtaining information inappropriate to their age, or that may lead them to imitate unsafe behaviors or conduct contrary to social values, and must not use algorithmic recommendation services to induce minors' addiction to the internet. |
| Seniors | The algorithm shall facilitate seniors in exercising their rights, taking into consideration their needs for travel, obtaining medical treatment, spending and handling personal affairs [e.g. a "senior mode" recommending senior discounts or accessible transportation]. The algorithms shall provide monitoring, identification and actions to protect seniors from fraud targeting them [e.g. removing search results for fraudulent services aimed at seniors]. |
| Persons seeking employment | The algorithms shall protect job seekers' lawful rights and interests, such as the rights to receive compensation, statutory holidays and entitled leave [e.g. not directing users to known fraudulent recruitment websites]. |
| Consumers | Consumers must not be disadvantaged or treated unfairly because of their buying history [e.g. the price is not intentionally elevated because the user has previously purchased luxury items]. |
Chapters 4 to 6 contain the enforcement and penalty clauses as well as the effective date.
In my personal and humble opinion, despite the Regulations' good intentions and positive direction, the language used in the Regulations could have been written more precisely, with well-defined terminology. From a data protection point of view, the Regulations imply that profiling and the use of personal data exist; the Regulations merely seek to ensure that the data are used "in good faith". The requirements in Chapter 3 could not be met without data profiling, as well as some weighting and filtering. However, whether this weighting or filtering would be applied fairly to users not falling into the above categories could call for a separate research study.
Article provided by INPLP member: Chris Yau (SGS, Hong Kong)
Dr. Tobias Höllwarth (Managing Director INPLP)