The Commission has presented a proposal for a new Regulation laying down rules to prevent and combat child sexual abuse online, which seeks to permanently replace the temporary regime of Regulation 2021/1232, set to expire in August 2024. The current reliance on a temporary ‘stop-gap’ instrument underlines the difficulty of reaching a compromise when regulating an issue situated at the crossroads of several existing instruments of European law: the e-Privacy Directive with its rules on the confidentiality of communications, the General Data Protection Regulation on the protection of personal data, and the upcoming Digital Services Act on the regulation of intermediary services.
The Regulation proposal aims to prevent and combat two types of behaviour which constitute online child sexual abuse: the dissemination of child sexual abuse material and the solicitation of children (‘grooming’). Regulation 2021/1232 merely provided a legal framework for the voluntary monitoring of communications by certain interpersonal communications services to detect and remove child sexual abuse material. By contrast, the Regulation proposal intends to impose mandatory, wide-ranging obligations on a broader array of online services deemed vulnerable to misuse for the abovementioned purposes: hosting services, interpersonal communications services, software application stores and internet access services.
Key measures
Providers of hosting services and interpersonal communications services will have to conduct a risk assessment every three years and put in place effective and transparent measures to mitigate the risks they have identified. These measures are to be reviewed, discontinued or expanded as necessary (e.g., adapting content moderation or recommender systems, enforcing terms and conditions, implementing age verification). The Regulation proposal therefore strongly incentivises the adoption of mitigation measures by service providers. Similar obligations, although with a reduced scope, will apply to providers of software application stores.
The reporting obligations imposed on hosting services and interpersonal communications services are far-reaching. They include periodic reporting on risk assessments and mitigation efforts to the competent ‘Coordinating Authority’ designated pursuant to the Regulation, as well as ad hoc reporting to the future EU Centre on Child Sexual Abuse where providers become aware of “any information indicating potential online child sexual abuse” on their services (which the EU Centre may in turn transfer to Europol).
Further, the competent ‘Coordinating Authority’ designated pursuant to the Regulation will have the power to request judicial or administrative authorities to issue orders for the detection of online child sexual abuse and for the removal or disabling of access to child sexual abuse material, depending on the category of service provider concerned. The types of order that may be issued to each category of service provider are summarised below:
- hosting services: detection, removal and disabling of access
- interpersonal communications services: detection or disabling of access
- internet access services: disabling access
Detection orders
Detection orders represent a major derogation from the principle of confidentiality of communications enshrined in the e-Privacy Directive. Stakeholders have argued that detection orders pose serious threats to the protection of fundamental rights and to the effectiveness of encryption, and that they in effect amount to a privatisation of law enforcement tasks in the fight against online child sexual abuse.
Detection orders may be issued where (i) there is evidence of a significant risk of the service being used for the purpose of online child sexual abuse, and (ii) the reasons for issuing the detection order outweigh the negative consequences for the rights and legitimate interests of all parties affected. In addition, as detection orders may relate to both known and new child sexual abuse material, as well as to ‘grooming’ behaviour, detection will require sophisticated analysis by service providers and thus, in all likelihood, the use of automated or AI-based tools.
The Commission claims that the Regulation proposal is technology-neutral, in that service providers may choose which technologies to install and operate to detect online child sexual abuse. While the EU Centre on Child Sexual Abuse (to be established pursuant to the Regulation) will make such technology available to service providers free of charge or subject to relevant licensing conditions, service providers may use any technology that is (i) reliable (with a low error rate), (ii) effective in detecting the dissemination or solicitation concerned, (iii) unable to extract any information from the relevant communications other than that strictly necessary to detect such dissemination or solicitation, and (iv) the least intrusive in terms of privacy and confidentiality, in accordance with the industry state of the art.
Liability and enforcement
Similar to the Digital Services Act, the Regulation proposal currently contains a ‘Good Samaritan’ clause, whereby service providers will not be liable for child sexual abuse offences solely because they have carried out activities to comply with the Regulation (e.g., detecting, identifying or removing online child sexual abuse).
Regarding enforcement, national competent authorities (Coordinating Authorities) will be designated and tasked with the enforcement of the Regulation. They will be granted investigatory powers (e.g., requiring the provision of information and conducting on-site inspections), enforcement powers (e.g., issuing cessation orders, imposing fines and/or penalty payments, adopting interim measures) and other powers. Additionally, a new European Union Agency to prevent and combat child sexual abuse is to be established in The Hague (‘the EU Centre on Child Sexual Abuse’).
Next steps
The Regulation proposal is open for feedback until at least 20 July 2022; the feedback received will be summarised and presented to the European Parliament and the Council.
For further information, please contact:
Lisa Gius, Bird & Bird
lisa.gius@twobirds.com