| Document register | Siseministeerium |
| Reference | 5-1/29-1 |
| Registered | 26.05.2022 |
| Synchronised | 05.12.2025 |
| Type | Outgoing letter |
| Function | 5 EU decision-making process and international cooperation |
| Series | 5-1 European Union decision-making process documents (AV) |
| File | 5-1/2022 |
| Access restriction | Public |
| Addressee | Politsei- ja Piirivalveamet, Kaitsepolitseiamet, Sisekaitseakadeemia, Päästeamet, Siseministeeriumi infotehnoloogia- ja arenduskeskus, Häirekeskus, Sotsiaalkindlustusamet, Lastekaitse Liit, Riigiprokuratuur, Õiguskantsleri Kantselei, Andmekaitse Inspektsioon, Startup Estonia, Eesti Väike- ja Keskmiste Ettevõtjate Assotsiatsioon, Eesti Tööandjate Keskliit, Küberkriminalistika ja küberjulgeoleku keskus, SA Eesti Inimõiguste Keskus, Tarbijakaitse ja Tehnilise Järelevalve Amet, Eesti Infotehnoloogia ja Telekommunikatsiooni Liit, Riigi Infosüsteemi Amet |
| Responsible | Barbara Haage (Secretary General's area of governance; area of the Deputy Secretary General for Internal Security, Law Enforcement and Migration Policy; Law Enforcement and Criminal Policy Department) |
Council of the European Union
Brussels, 13 May 2022
(OR. en)
Interinstitutional File: 2022/0155(COD)
9068/22 ADD 1
JAI 641 ENFOPOL 256 CRIMORG 69 IXIM 119 DATAPROTECT 149 CYBER 170 COPEN 182 FREMP 98 TELECOM 216 COMPET 332 MI 388 CONSOM 117 DIGIT 97 CODEC 690 IA 71
COVER NOTE
From: Secretary-General of the European Commission, signed by Ms Martine DEPREZ, Director
Date of receipt: 12 May 2022
To: General Secretariat of the Council
No. Cion doc.: SWD(2022) 209 final
Subject: COMMISSION STAFF WORKING DOCUMENT IMPACT ASSESSMENT REPORT Accompanying the document Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse
Delegations will find attached document SWD(2022) 209 final.
Encl.: SWD(2022) 209 final
EUROPEAN COMMISSION
Brussels, 11.5.2022
SWD(2022) 209 final
COMMISSION STAFF WORKING DOCUMENT
IMPACT ASSESSMENT REPORT
Accompanying the document
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
laying down rules to prevent and combat child sexual abuse
{COM(2022) 209 final} - {SEC(2022) 209 final} - {SWD(2022) 210 final}
Contents
1. INTRODUCTION: POLITICAL AND LEGAL CONTEXT
2. PROBLEM DEFINITION
2.1 What is the problem?
2.2
2.3
3.
3.1
3.2
3.3
4.
4.1
4.2
5.
5.1
5.2
5.3
6.
6.1
6.2
7.
7.1
7.2
8.
8.1
8.2
8.3
8.4
9. HOW WILL ACTUAL IMPACTS BE MONITORED AND EVALUATED?..................115
ANNEXES...................................................................................................118
Term/Acronym Definition
AI Artificial Intelligence
API Application Programming Interfaces
Classifiers A form of artificial intelligence, an algorithm that sorts data into labelled
classes or categories
CSA Child Sexual Abuse
CSA online CSA content refers to text-based exchanges, photos, videos and other material illegal under EU law (CSA Directive). In this document it refers to the three main types of abuse: known CSAM, new CSAM and grooming
CSA Directive Directive 2011/93/EU of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography
CSAM Child Sexual Abuse Material, e.g. images and videos
CSEA Child Sexual Exploitation and Abuse
Darkweb Websites not indexed by conventional search engines, making use of masked IP addresses, which are only accessible with a special web browser
DSA Digital Services Act: Proposal for a Regulation on a Single Market for Digital Services and amending Directive 2000/31/EC, COM(2020) 825 final
E2EE End-to-end Encryption
EECC Directive (EU) 2018/1972 of 11 December 2018 establishing the European Electronic Communications Code
E-evidence Electronic evidence: electronically stored data such as subscriber information, metadata or content data
Encryption Process of changing electronic information or signals into a secret code or
cipher
Grooming Offenders building trust and a relationship with a child in an effort to gain access to the minor for sexual exploitation or abuse. Also known as solicitation
Hash A unique digital code created by applying a mathematical algorithm ("hashing") to a file; the code becomes that file's signature, or its hash value
Hotline Child sexual abuse hotlines deal with questions about or reports of child sexual
abuse. They can report content to law enforcement, take action for CSAM to be removed from the internet and act as interest groups
IP address Internet Protocol address: a unique identifier allowing a device to send and receive packets of information; a basis for connecting to the Internet
ISCO International Standard Classification of Occupations
Malware Any type of software designed to disrupt the normal functioning of a computer, server, or computer network
NCMEC National Centre for Missing and Exploited Children (US private, non-profit organisation) to which online service providers are required to report under US law instances of potential child sexual abuse that they find in their networks
OTTs Over-the-Top communications services enable direct interpersonal and interactive exchange of information via electronic communications (i.e. the Internet), without connecting to the public telephone network
P2P Peer-to-peer sharing describes networks in which each computer can act as a server, allowing files to be shared directly without the need for a central server
PhotoDNA The most widely used tool based on hashing technology, available free of charge, based on a licensing agreement tailored to avoid abuse and use for any other purpose than the detection of CSA
Safety-by-design The embedding of the rights and safety of users into the design and functionality of online products and services from the outset
SDGs Sustainable Development Goals, a set of 17 interlinked goals established by the UN in 2015 as "a blueprint to achieve a better and more sustainable future for all people and the world by 2030"
SMEs Enterprises that do not exceed a staff headcount of 250 people, a turnover of EUR 50M and an annual balance sheet total of EUR 43M
Trusted flagger program A program under which an organisation designates certain persons or organisations whose reports of online CSA are trusted to meet sufficiently high standards, and may be treated differently, for example by being given higher priority for review
URL Uniform Resource Locator, i.e. the address of an internet object (e.g. an image, a video, or an entire website)
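The "Hash" and "PhotoDNA" entries above describe the detection technique for known material that recurs throughout this report. As a purely illustrative sketch (not the PhotoDNA algorithm, which is perceptual, i.e. robust to re-encoding and minor edits), the Python fragment below shows the basic idea of hash-matching: a file is reduced to a short digest that serves as its signature and can be compared against a list of digests of previously verified material. The hash list and file paths are hypothetical.

```python
# Illustrative only: exact hash-matching with a cryptographic hash (SHA-256).
# Tools such as PhotoDNA use perceptual hashes instead, which also match
# visually similar (e.g. re-encoded or resized) images.
import hashlib

# Hypothetical digests of previously verified material, as would be
# curated by e.g. NCMEC or the proposed EU Centre.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: str) -> str:
    """Compute the SHA-256 digest of a file: its 'signature', or hash value."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):  # stream in 8 KiB chunks
            h.update(chunk)
    return h.hexdigest()

def is_known(path: str) -> bool:
    """True if the file's digest appears in the list of known hashes."""
    return file_hash(path) in KNOWN_HASHES
```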
1. INTRODUCTION: POLITICAL AND LEGAL CONTEXT
Children face a number of risks in their daily lives, both online and offline, from which they cannot fully protect themselves. One of these risks is that of being sexually abused during childhood. The initiative assessed here aims to complement the existing EU framework by defining the responsibilities of certain online service providers to protect children against sexual abuse. In the absence of harmonised rules at EU level, providers of social media platforms, gaming services, and other hosting and online communications services find themselves faced with divergent rules across the internal market. The proliferation of rules is increasing, with recent legislative changes in the Netherlands and Germany, and at the same time there is evidence that current efforts at national level are insufficient to successfully address the underlying problem.
Children have the fundamental right to such protection and care as is necessary for their well-being, and their best interests must be a primary consideration in all actions relating to them1. Consequently, the fight against child sexual abuse (CSA) is a priority for the EU2. In the July 2020 EU strategy for a more effective fight against child sexual abuse, the Commission set out eight concrete actions, implementing and developing the right legal framework and catalysing multi-stakeholder efforts in relation to prevention and investigation of these crimes and assistance to victims and survivors.
The legislative proposal that this impact assessment accompanies responds to the commitment undertaken in the strategy to propose the necessary legislation to tackle child sexual abuse effectively, online and offline3. In particular, this initiative:
1. sets out obligations to detect, report and remove child sexual abuse online to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse4; and
2. establishes an EU Centre to prevent and counter child sexual abuse to provide comprehensive support for the implementation of the proposed Regulation by service providers and to Member States, in the fight against child sexual abuse5.
The commitment and this initiative respond to the calls for action from the Council, the European Parliament, and the European Economic and Social Committee6, and
1 EU Charter of Fundamental Rights, Art. 24(1) and (2).
2 EU strategy for a more effective fight against child sexual abuse, COM(2020) 607, 24 July 2020, p. 2.
3 Ibid., p. 6.
4 Ibid., p. 5.
5 Ibid., p. 12. This initiative is the outcome of the commitment in the strategy to start working towards the possible creation of an EU Centre to prevent and counter child sexual abuse.
6 European Parliament resolution of 26 November 2019 on children's rights on the occasion of the 30th anniversary of the UN Convention on the Rights of the Child (2019/2876(RSP)); Council conclusions on combatting the sexual abuse of children of 8 October 2019, No. 12862/19; European Economic and Social Committee, Combatting child sexual abuse online, TEN/721; COM(2020) 568 final, 2020/0259 (COD), 29 October 2020.
globally in multiple forums7, including by online service providers8 and in the media9, as it has become evident that current measures are falling short of effectively protecting the right of children to live free from sexual violence. This initiative has therefore been expected, as the need to better prevent and combat child sexual abuse through additional legislation was already clear during the preparation of the 2020 strategy, and also during the inter-institutional negotiations of the Interim Regulation (see below).
The initiative aims to build on and complement the existing policy instruments in the fight against CSA, which can be grouped into legislation, coordination and funding10.
1. Legislation
The existing legal framework consists of measures in the areas of criminal law,
protection of privacy and personal data, and the internal market, regulating online and telecommunications services and content moderation. It includes:
horizontal instruments in the area of data protection and online privacy (e.g. GDPR11 and e-Privacy Directive12 and its proposed revision13), and of the single market for digital services (e.g. e-Commerce Directive14 and the proposed Digital Services Act15),
sector-specific legislation, such as the Child Sexual Abuse Directive16, the Europol Regulation17 and its proposed revision18, the Interim Regulation
7 E.g. at the December 2019 summit of the WePROTECT Global Alliance to End Child Sexual Exploitation Online, or by the "Five Eyes" (US, UK, Canada, Australia and New Zealand) in 2019.
8 See, for example, a call for clear legal frameworks to deal with harmful content by Facebook: Referring Former President Trump's Suspension From Facebook to the Oversight Board, blog post by Nick Clegg, VP of Global Affairs, 21 January 2021.
9 See, for example, the series of New York Times articles published from September 2019 to February 2020, which exposed to the public the depth and complexity of the problem.
10 Annex 5 contains additional information on relevant legislation and policy.
11 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC ('General Data Protection Regulation'), OJ L 119, 4.5.2016.
12 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector ('Directive on privacy and electronic communications'), OJ L 201, 31.7.2002.
13 Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), COM/2017/010 final - 2017/0003 (COD).
14 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce'), OJ L 178, 17.7.2000.
15 Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, of 15 December 2020, COM/2020/825 final.
16 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA, OJ L 335, 17.12.2011.
17 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53-114.
18 Proposal for a Regulation of the European Parliament and of the Council amending Regulation (EU) 2016/794, as regards Europol's cooperation with private parties, the processing of personal data by Europol in support of criminal investigations, and Europol's role on research and innovation, of 9 December 2020, COM/2020/796 final.
derogating from the application of certain rights and obligations under the
ePrivacy Directive19, and the Victims' Rights Directive20.
Horizontal instruments
The General Data Protection Regulation (GDPR)
What it does: the GDPR sets out rules on the processing of personal data relating to individuals, specifying the fundamental right to protection of personal data.
How CSA-related responsibilities are distributed between EU and Member States: as a horizontal instrument, the GDPR does not contain CSA-specific provisions, but it applies to all activities of processing personal data, including those related to CSA, except for those carried out by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, which are covered by Directive (EU) 2016/68021. Member States are notably responsible for enforcement through their data protection authorities, and the European Data Protection Board (EDPB) is tasked with the consistent application of the GDPR.
How the proposed legislation builds on and interacts with the GDPR: the proposed legislation builds on the GDPR, including its Article 6, which allows, e.g., processing of personal data to comply with a legal obligation (Art. 6(1)(c)), or when processing is necessary for the purpose of legitimate interest (Art. 6(1)(f)).
The ePrivacy Directive and its proposed revision
What it does: the ePrivacy Directive and the proposed Regulation for its revision particularise and harmonise national rules to ensure an equivalent level of protection of fundamental rights and freedoms, in particular the right to privacy and confidentiality of communications, with respect to the processing of personal data in electronic communications and services. These ePrivacy rules complement the GDPR.
How CSA-related responsibilities are distributed between EU and Member States: as horizontal instruments, the ePrivacy Directive and the proposed successor Regulation do not contain CSA-specific provisions; they apply to any processing of specified data categories in electronic communications. Member States are responsible for enforcement through their competent national authorities.
19 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse, OJ L 274, 30.7.2021, p. 41-51.
20 Directive 2012/29/EU of the European Parliament and of the Council of 25 October 2012 establishing minimum standards on the rights, support and protection of victims of crime, and replacing Council Framework Decision 2001/220/JHA, OJ L 315, 14.11.2012.
21 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.
How the proposed legislation builds on and interacts with the ePrivacy
Directive and its proposed revision: the proposed legislation would limit the
scope of certain rights and obligations which are currently in the ePrivacy Directive, notably those on the confidentiality of communications and related data in order to enable companies to identify child sexual abuse taking place on their
systems after the issuance of a detection order, subject to strict safeguards.
The eCommerce Directive
What it does: the eCommerce Directive sets out a framework for the provision of information society services in the internal market. One of its key principles is a conditional liability exemption framework for providers of specific categories of information society services. In principle, providers may not be held liable for information (including illegal content) that they host (store), cache (temporarily store) or transmit during the provision of their services, subject to the conditions laid down in the Directive. For example, this means that providers of hosting services may not be held liable for information they host, unless they gain actual knowledge or awareness of the illegality and fail to act expeditiously. The Directive also prohibits Member States from imposing general obligations to monitor their services or to actively seek facts or circumstances indicating illegal activity. The eCommerce Directive does not establish a legal basis for any processing of personal data.
How CSA-related responsibilities are distributed between EU and Member States: as a horizontal instrument, the eCommerce Directive does not contain CSA-specific provisions. It governs activities of relevant service providers. Member States are responsible for enforcement through their national authorities.
How the proposed legislation builds on and interacts with the eCommerce Directive: the proposed legislation imposes narrowly targeted obligations to
detect, report and remove child sexual abuse online, based on specific indicators and requirements to ensure compatibility with the eCommerce Directive (see box
9).
The Digital Services Act proposal
What it does: the Digital Services Act (DSA) proposal, if adopted as proposed, and building upon the eCommerce Directive's framework, would provide a horizontal standard for content moderation by providers of intermediary services. It would remove a number of disincentives for providers' voluntary efforts to detect, remove or disable access to illegal content (including child sexual abuse material, CSAM) and would create obligations for them to provide information on their content moderation efforts when requested by national authorities. The DSA would also create additional due diligence obligations tailored to specific categories of providers of intermediary services (e.g. hosting services, online platforms, very large online platforms) as well as transparency reporting obligations. For instance, it would require hosting services to put in place notice and action mechanisms enabling any user or entity to notify them of the presence of suspected illegal content. Furthermore, the DSA would oblige very large online platforms to implement risk mitigation measures on their services. The DSA would also establish rules on its implementation and enforcement, including as regards the cooperation of and coordination between
the competent authorities. The DSA would not establish a legal basis for any
processing of personal data.
How CSA-related responsibilities are distributed between EU and Member States: as a horizontal instrument covering all types of illegal content, the DSA does not contain CSA-specific provisions. The DSA would create a framework at EU level for the notification of materials noticed by users to companies, with obligations for companies to respond to orders issued by public authorities in Member States, as well as additional due diligence requirements for very large platforms. For the very large platforms, a stronger role for the Commission in the enforcement process is also being considered during the ongoing inter-institutional negotiations at the time of writing.
How the proposed legislation builds on and interacts with the DSA as proposed: the proposed legislation complements the DSA notably by specifying mandatory removal of CSAM when ordered and a comprehensive reporting obligation tailored to the specificities of CSA online, which often takes place hidden from public view and demands specific follow-up where identified. These specificities require a different approach from the horizontal one of the DSA. Finally, as the DSA aims to maintain some of the main principles of the eCommerce Directive, including the prohibition of general monitoring obligations and the unavailability of the liability exemption for hosting services if failing to act after obtaining actual knowledge or awareness of the illegality of the content, the considerations above made for the eCommerce Directive also apply to the DSA.
The Victims' Rights Directive
What it does: the Victims' Rights Directive establishes minimum standards on the rights of, support for and protection of victims of crime and ensures that they are recognised and treated with respect. They must also be granted access to justice.
How CSA-related responsibilities are distributed between the EU and Member States: as a horizontal instrument, the Victims' Rights Directive, applicable to all victims of crime, does not contain CSA-specific provisions. The EU adopted specific rules for victims of child sexual abuse and sexual exploitation under the Child Sexual Abuse Directive (see below), to respond more directly to the specific needs of those victims.
How the proposed legislation builds on and interacts with the Victims' Rights Directive: whilst the proposed legislation focuses on strengthening the functioning of the internal market by setting common rules aimed at preventing and combating the misuse of online services for CSA-related purposes, it could also help support and facilitate the work of Member States on assistance to victims of CSA, notably through the creation of the EU Centre to prevent and counter CSA, which would facilitate research and the exchange of best practices among Member States. The proposed legislation does not create new obligations for Member States in this respect.
Sector-specific legislation
The Child Sexual Abuse Directive
What it does: the Child Sexual Abuse (CSA) Directive's main objective is to harmonise minimum criminal law rules at EU level concerning the definitions of child sexual abuse and exploitation offences and corresponding sanctions, and to require the establishment of prevention measures in this area. It also requires Member States to ensure the provision of assistance and support to victims before, during and after the conclusion of criminal proceedings. In terms of websites disseminating CSAM, the Directive requires Member States to take the necessary measures to ensure the prompt removal of webpages hosted in their territory and to endeavour to obtain the removal of such pages hosted outside their territory. It also enables Member States to take voluntary measures to block access to web pages containing or disseminating CSAM within their territory, while providing safeguards (the restriction is limited to what is necessary and proportionate; users are informed of the reason for the restriction and of the possibility of judicial redress). The Child Sexual Abuse Directive does not establish a legal basis for any processing of personal data.
How CSA-related responsibilities are distributed between EU and Member States: the Directive defines a minimum set of standards at EU level to define and sanction these crimes, prevent them and assist victims. Member States are required to comply with these minimum rules and may go beyond them if they consider it necessary. Similarly, the Directive defines the responsibilities of Member States but leaves it to national authorities to comply with those responsibilities in the way that best suits national specificities (e.g. on prevention programmes).
How the proposed legislation builds on and interacts with the Child Sexual Abuse Directive: the former is intended to reinforce and complement the latter without creating unnecessary overlaps. Whereas the Directive focuses on defining the roles and responsibilities of Member States' authorities in the fight against CSA using the tools of criminal law, the proposed legislation focuses, from an internal market angle, on defining the roles and responsibilities of private companies offering their services in the Single Market, notably concerning the detection, reporting and removal of CSA online. Nonetheless, the proposed legislation could help support and facilitate the efforts by Member States to meet the obligations defined in the CSA Directive relating to prevention and assistance to victims, notably through the creation of the EU Centre to prevent and combat CSA. The proposed initiative cannot address remaining implementation issues with the Directive. A study has been launched to prepare the evaluation of the CSA Directive and at the moment there are ongoing infringement procedures against 21 Member States. The majority of the challenges Member States face in the implementation concern offline prevention measures (in particular prevention programmes for offenders and for people who fear that they might offend) and criminal law definitions. Exchanges between the Commission and Member States are ongoing to ensure that they swiftly address these remaining issues. The Commission has also organised dedicated expert workshops with Member States to facilitate the exchange of lessons learned and of best practices in national experiences in the implementation of the CSA Directive. That said, the present
legislative initiative could indirectly have a positive effect on the implementation
of the Directive, in particular through the EU Centre as an expert hub and facilitator of exchanges of knowledge and best practices.
The "Interim Reguたtim"
What it does: voluntary detection of CSAM and grooming in certain online communication services like instant messenger and email has, as of 21 December 2020, been made subject to the ePrivacy Directive's rules on confidentiality of communications, because changes to the definitions in the European Electronic Communications Code became effective and those services consequently fell under the ePrivacy Directive. To address this issue, the Commission proposed a temporary derogation from the application of certain rights and obligations under the ePrivacy Directive, for the sole purpose of detecting and reporting CSA and removing CSAM. The Interim Regulation22, which entered into force on 2 August 2021, enables those services to continue such practices on a voluntary basis, provided those practices are lawful and, in particular, meet a range of conditions. The Regulation ceases to apply three years after its entry into force. The Interim Regulation does not establish a legal basis for any processing of personal data.
How CSA-related responsibilities are distributed between EU and Member States: the Commission is responsible for making a list of names of organisations acting in the public interest against CSA to which providers report CSA online, for requesting the European Data Protection Board (EDPB) to issue guidelines for the purpose of assisting the supervisory authorities in assessing whether processing falling within the scope of the Regulation complies with the GDPR, and for preparing a report on the implementation of the Regulation. Member States are notably responsible for enforcing the Regulation and for statistics related to the detection, reporting and follow-up of the CSA reports.
How the proposed legislation builds on and interacts with the Interim Regulation: the proposed legislation replaces the Interim Regulation, and uses it as a reference to present a long-term framework that maintains some of its elements and covers a wider range of services, including private communications.
The Europol Regulation and its proposed revision
What it does: the Europol Regulation sets out the mandate of the European Union's law enforcement agency, which is to support and strengthen action by competent authorities of the Member States and their mutual cooperation, including in preventing and combating serious forms of crime, such as sexual abuse and sexual exploitation. Among other tasks, Europol's current mandate allows the agency to collect, store, process, analyse and exchange information, including criminal intelligence; to notify the Member States of any information and connections between criminal offences concerning them; and to coordinate, organise and implement investigative and operational actions to support and
22 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse.
strengthen actions by the competent authorities of the Member States. The proposed revision of Europol's mandate would notably allow it to receive data from private parties directly, subject to certain conditions.
How CSA-related responsibilities are distributed between EU and Member States: Europol can support Member States' actions in preventing and combating CSA crimes. In particular, Europol receives reports from online service providers via the US National Centre for Missing and Exploited Children (NCMEC) for 19 Member States23, completes these reports with its own information (if any) and forwards them to the Member States' authorities.
How the proposed legislation builds on and interacts with the Europol Regulation and its proposed revision: the proposed legislation creates an EU Centre to prevent and counter CSA, which will work closely with Europol. The Centre will receive the reports from online service providers, check that they are likely to be actionable, i.e. that they are not manifestly unfounded and can thus in principle be acted upon, and forward them to Europol so that it can enrich the reports with additional criminal intelligence, as well as to national law enforcement agencies. This would ensure that Europol and national law enforcement resources are focused on key investigative tasks such as swiftly rescuing victims from ongoing abuse, rather than on e.g. filtering out the reports that are not relevant. The revised Europol mandate would complement the proposed legislation in particular on the ability for Europol to receive and process reports from the EU Centre originating from online service providers.
2. Coordination
The existing legal framework is complemented by practical efforts at EU level to step up the fight against CSA in all areas: investigations, prevention, and assistance to victims.
EU level cooperation in investigations
What it does: Europol provides EU level coordination for the investigation of cross-border cases. In addition, the EU policy cycle (EMPACT)24 serves to coordinate the operational priorities of Member States' law enforcement authorities in the area of combating CSA, to organise joint operations and strategic approaches to specific phenomena from a law enforcement perspective. Europol also helps coordinate investigations involving law enforcement agencies in third countries and in the Member States.
How CSA-related responsibilities are distributed between EU and Member States: Europol supports operational action by law enforcement agencies in Member States at their request. Europol does not have executive powers (i.e. it is not a "European FBI").
How the proposed legislation builds on and interacts with existing EU level cooperation in investigations: the proposed legislation aims to support the existing cooperation in investigations by ensuring that the reports from online service providers that reach Europol and national law enforcement agencies are actionable and relevant. The EU Centre would not have any operational capability
23 The rest of Member States have chosen to receive the information directly from NCMEC due to e.g. their national data retention regimes, which require extremely swift action.
24 More information can be found here.
on investigations, but would support them indirectly by facilitating the process of
detection, reporting and removal of CSA online by service providers.
EU level cooperation in prevention
What it does: at the moment, EU level cooperation in prevention of CSA is fragmented and limited to ad hoc expert meetings organised by the Commission to support Member States in the implementation of the CSA Directive, and initiatives on awareness raising under EMPACT and Europol. The 2020 CSA Strategy aimed to boost EU level efforts on prevention by making it one of its pillars. Specifically, the Strategy included the EU Centre to prevent and counter CSA, which will also carry out certain tasks relating to prevention. The Strategy also announced the launch of a prevention network of practitioners and researchers to support the EU Member States in putting in place usable, rigorously evaluated and effective prevention measures to decrease the prevalence of child sexual abuse in the EU. The network will aim to give structure and regularity to exchanges of knowledge and best practices between Member States.
How CSA-related responsibilities are distributed between EU and Member States: the CSA Directive requires Member States to implement provisions while leaving it to them to determine exactly what these measures or programmes are. The degree to which the requirements of the Directive are fulfilled varies among the Member States (see section 2.2.3).
How the proposed legislation builds on and interacts with existing EU level cooperation in prevention: the proposed legislation will establish the EU Centre, which will be the driving force of the work relating to preventing and combating CSA at EU level. Whilst the Centre would principally focus on its tasks set out in the envisaged legislation connected to the common rules for online service providers to combat CSA online, the Centre could also contribute to and facilitate Member States' work relating to prevention, for instance through the involvement of multiple stakeholders and the sharing of best practices and lessons learned across Member States. The proposed legislation will not create new obligations for Member States on prevention.
EU level cooperation in assistance to victims
What it does: EU level cooperation in assistance to victims currently takes place through the Victims' Rights Platform25, which deals with horizontal issues relevant for victims' rights. The platform brings together representatives of EU level networks, agencies, bodies and civil society organisations relevant for the implementation of the EU Strategy on victims' rights.
How CSA-related responsibilities are distributed between EU and Member States: the platform facilitates the implementation of the EU strategy on victims' rights, which details key actions for the European Commission and for Member States. Also, the CSA Directive requires Member States to implement provisions related to assistance to victims, while leaving it to them to determine exactly what these measures are. The degree to which the requirements of the Directive are fulfilled varies among the Member States (see section 2.2.3).
25 More information is available here.
How the proposed legislation builds on and interacts with existing EU level cooperation in assistance to victims: apart from its main tasks in the process of combating CSA online, the EU Centre could also facilitate and support Member States' action in assistance to victims of CSA, specifically by serving as a hub of expertise to support evidence-based policy development and by helping develop research on assistance to victims, including victims' needs and the effectiveness of short-term and long-term assistance programmes. The Centre will also support victims, at their request, in having their images and videos taken down by assisting them in exchanges with the relevant online service providers. The EU Centre could participate in the Victims' Rights Platform to contribute to the discussion of horizontal issues concerning victims and to the implementation of the EU strategy on victims' rights. The proposed legislation will not create new obligations for Member States on assistance to victims.
Multi-stakeholder cooperation at EU and global level
What it does: at EU level, the Commission facilitates multi-stakeholder cooperation between service providers and national authorities in the fight against CSA online through the EU Internet Forum26, which brings together online service providers and ministers of interior of all Member States. At global level, the Commission continues to contribute to raising voluntary standards for the protection of children against sexual abuse by promoting multi-stakeholder cooperation through the WeProtect Global Alliance to End Child Sexual Exploitation Online (WPGA)27.
How CSA-related responsibilities are distributed between EU and Member States: at EU level, the Commission organises the EU Internet Forum, in which Member States participate at ministerial level (once a year), and at various levels in the technical discussions. Depending on the initiative, Member States and/or the Commission may be responsible for the execution. At global level, the Commission participates in the policy board of the WPGA, as one of its founding members. Member States are WPGA members and notably participate in its biannual global summit (the next one will take place in Brussels in June 2022 and will be co-hosted by the Commission and the French Presidency of the Council of the EU).
How the proposed legislation builds on and interacts with existing multi-stakeholder cooperation at EU and global level: the proposed legislation builds on the experience of the EU Internet Forum and the WPGA and aims to boost multi-stakeholder cooperation in the EU and globally in the fight against CSA, through the EU Centre. The Centre will be an independent facilitator that will bring together all the relevant actors in the EU and beyond in any aspect of the fight against CSA, including investigations, prevention and assistance to victims, to ultimately facilitate and support Member States' action in those areas. The Centre will have a more operational focus than the EU Internet Forum and the WPGA,
26 More information is available here.
27 The WeProtect Global Alliance to End Child Sexual Exploitation Online is a not-for-profit organisation resulting from the merger between the UK-led WePROTECT and the Global Alliance Against Child Sexual Abuse Online launched by the Commission in 2012. Its aim is to raise standards and to foster a stronger and more coherent response around the globe and across stakeholder groups. It includes 98 countries, 45 companies and 65 civil society organisations and international institutions.
which are centred on policy and are not designed to play a role in facilitating day-to-day efforts on the ground.
3. Funding
What it does: the 2020 strategy includes a commitment to continue providing funding for fighting child sexual abuse, e.g. to support the development of national capacities to keep up with technological developments. The Commission has organised regular calls for project proposals to fight the online and offline
aspects of child sexual abuse, with a total value of EUR 61 million in the last 10 years (funded under Horizon 2020 and the Internal Security Fund28). Notable
examples of EU-funded projects include: The INHOPE network of hotlines, where users can report child sexual abuse materials they encounter online (formerly funded through the Connecting Europe Facility programme, and currently under the DIGITAL Europe programme). The content is analysed, and if assessed as illegal, hotlines
notify the relevant online service providers requesting the swift removal of the
content, and report the case to the relevant law enforcement agency for victim identification purposes. National hotlines are an important element of
implementation of Article 25 of the CSA Directive, as a majority of Member States has chosen to implement most of this article through the hotlines. As of January 2022, the INHOPE network consists of 46 hotlines in 42 countries (including all Member States except Slovakia);
o The International Child Sexual Exploitation (ICSE) database at Interpol,
which is an important tool enabling law enforcement to identify victims
globally. The database has helped identify 23,564 victims worldwide at the time of writing29.
The Commission has also financially supported the adoption of the Barnahus model of child-friendly, multidisciplinary protection of child victims during criminal proceedings, which includes limiting the number of interviews of child victims and conducting them by trained experts, as a standard in the EU.
How CSA-related responsibilities are distributed between EU and Member States: the Commission manages the funding instruments mentioned above. That said, part of the Internal Security Fund is managed by Member States under the supervision of the Commission, and Member States also contribute their own funding to the efforts, to a varying extent.
How the proposed legislation builds on and interacts with existing funding mechanisms: the creation of the EU Centre requires dedicated EU funding, and no
changes will be made to existing funding mechanisms. However, increased coordination and cooperation in prevention efforts facilitated by the EU Centre
may also result in more targeted and higher-quality proposals during future
funding rounds.
28 The latest open call for proposals of EUR 16M to prevent, assist victims, and combat child sexual abuse was launched on 16 December 2021, with a deadline for submission of proposals of 24 February 2022.
29 Interpol, International Child Sexual Exploitation database, accessed in January 2022.
Relevant Sustainable Development Goals (SDGs)
The most relevant SDGs for this initiative are 5.2 (eliminate all forms of violence against women and girls) and 16.2 (end abuse, exploitation, trafficking and all forms of violence against children).
Other SDGs of particular relevance are those that address risk factors of CSA, such as SDG 1 on poverty (e.g. children forced by their parents to be sexually abused online), SDG 3 on health (e.g. given the short- and long-term negative health consequences of CSA on children), SDG 4 on education (e.g. prevention campaigns to raise awareness of CSA online risks), and SDG 9 on industry, innovation and infrastructure (e.g. as the initiative aims to support service providers' efforts to fight against CSA online, including through the EU Centre).
2. PROBLEM DEFINITION
Table 1 shows the intervention logic (problem, drivers, objectives and options) that will be described and analysed in the impact assessment:
Table 1: problem, problem drivers, objectives and options (intervention logic)

Problem: Some child sexual abuse crimes are not adequately addressed in the EU due to challenges in their detection, reporting and action by relevant service providers, as well as insufficient prevention and assistance to victims. Diverging national responses negatively affect the Internal Market.

Problem drivers:
1. Voluntary action by online service providers to detect online child sexual abuse has proven insufficient
2. Inefficiencies in public-private cooperation between online service providers, civil society organisations and public authorities hamper an effective fight against child sexual abuse
3. Member States' efforts to prevent child sexual abuse and to assist victims are limited, divergent and lack coordination and are of unclear effectiveness

General objective: improve the functioning of the Internal Market by introducing clear, uniform and balanced EU rules to prevent and combat child sexual abuse.

Specific objectives:
1. Ensure the effective detection, removal and reporting of online child sexual abuse where they are currently missing
2. Improve legal certainty, transparency and accountability and ensure protection of fundamental rights
3. Reduce the proliferation and effects of child sexual abuse through harmonisation of rules and increased coordination of efforts

Options:
A (non-legislative): practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
B (legislative): Option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, 3) expanding the EU Centre to also support detection, reporting and removal
C (legislative): Option B + mandatory detection of known child sexual abuse material
D (legislative): Option C + mandatory detection of new child sexual abuse material
E (legislative): Option D + mandatory detection of 'grooming' (solicitation of children)
2.1. What is the problem?
2.1.1. Definition and magnitude
The problem that this initiative tackles is that providers of certain online services offered in the EU face divergent rules at national level when it comes to their responsibility for preventing and combating child sexual abuse on their services. At the same time, the existing responses at national level to some child sexual abuse30 crimes are proving insufficient.
Challenges persist in detection, reporting and action by relevant service providers, as well as insufficient prevention, assistance to victims and cooperation. The divergence of national responses to the problem creates legal fragmentation which negatively affects the Internal Market.
Prevalence
At least one in five children falls victim to sexual violence during childhood31. A global study of childhood experiences in 2021 found that more than one in three respondents (34%) had been asked to do something sexually explicit online during their childhood, and more than half (54%) had experienced a form of child sexual abuse online32. A recent survey in Spain concluded that two out of five Spanish adults suffered sexual abuse when they were children33. The majority of victims are girls, who are more than twice as likely to be abused as boys34.
Vulnerable children are more likely to fall victim to CSA online. In a recent survey about childhood experiences:
59% of respondents who identified as transgender and non-binary experienced online sexual harm, compared to 47% of cisgender respondents;
65% of respondents who identified as LGBQ+ experienced online sexual harm, compared to 46% of non-LGBQ+ respondents;
57% of disabled respondents experienced online sexual harm, compared to 48% of non-disabled respondents.
30 This document refers to child sexual abuse for simplicity but it should be understood as covering also child sexual exploitation and child sexual abuse material.
31 One in Five Campaign, Council of Europe, 2010-2015.
32 Economist Impact survey of more than 5,000 18 to 20 year olds in 54 countries, published in the 2021 Global Threat Assessment, WeProtect Global Alliance, 2021. The forms of child sexual abuse online surveyed (referred to as "online harms") include 1) Being sent sexually-explicit content from an adult or someone they did not know before they were 18; 2) Being asked to keep part of their sexually-explicit online relationship with an adult or someone they did not know before a secret; 3) Having sexually-explicit images of them shared without consent (by a peer, adult, or someone they did not know before); and 4) Being asked to do something sexually-explicit online they were uncomfortable with (by a peer, adult, or someone they did not know before).
33 M. Ferragut, M. Ortiz-Tallo, M. J. Blanca, Prevalence of Child Sexual Abuse in Spain: A Representative Sample Study, Journal of Interpersonal Violence, 21 September 2021.
34 Collin-Vézina, D., et al., Lessons learned from child sexual abuse research: Prevalence, outcomes, and preventive strategies, 18 July 2012, p. 6. See also M. Stoltenborgh, M.H. van IJzendoorn, E.M. Euser, M.J. Bakermans-Kranenburg, A global perspective on child sexual abuse: Meta-analysis of prevalence around the world, 2011, pp. 79-101.
"Offline" and online CSA
The sexual abuse of children can take multiple forms, both offline (e.g. engaging in sexual activities with a child or exploiting a child for prostitution) and online (e.g. forcing a child to
engage in sexual activities via live streaming, or viewing or distributing online child sexual abuse images and videos).
The offline and online aspects of the crimes have become increasingly intertwined, and most CSA cases today contain an online component35. For example, an offender may abuse a child offline, record the abuse, and share it online. Or the offender may establish a first contact with children online and then lure them to meet offline and sexually abuse them36. It is therefore not possible to separate categorically between online and offline abuse.
That said, this initiative focuses on the online aspects of the crime with relation to detection, reporting and removal efforts, in particular by the providers of the services used. This is because the internet has become the main medium for sharing CSAM, as well as for contacting children with the aim of abusing them. The internet facilitates the creation of communities in which offenders share materials and experiences. The volume of CSAM shared online has grown exponentially in recent years, while sharing of such material offline, e.g. via mail services, remains at a very low level and was not signalled as a common issue encountered by law enforcement in CSA investigations during stakeholder consultations.
The Member States have sought to address this growing phenomenon through rules at the national level, reinforcing existing legislation or adopting new rules to improve the detection and follow-up on online child sexual abuse. This has inadvertently created a fragmentation of the internal market which negatively impacts the provision of certain online services, while at the same time failing to stem the proliferation of this particularly harmful content. Therefore, this initiative addresses detection, reporting and removal in the online sphere, which enables and fuels offline and online abuse, as well as prevention and assistance to victims, where the online and offline aspects are also closely related.
Interlinkages between detection, reporting and action, prevention, and assistance to victims
In addition to the online-offline interlinkages, all the different areas of the problem are also
closely related: detection, reporting and action (i.e. follow-up to the reports, including removal by service providers and action by law enforcement), prevention, and assistance to victims. In general, for public authorities to be able to act and assist the victim, the crime has to be detected and reported, which in turn may prevent future crimes from happening (e.g. if the offender is arrested and the victim is rescued). This also applies to detecting grooming and to stopping the circulation of CSAM (known and new), which are both criminal behaviours. In addition, the continued circulation of CSAM has a particularly harmful societal impact: the distribution of CSAM is a form of re-victimisation that occurs every time the images and videos are seen. The knowledge that the images and videos are being distributed is a continuous source of distress for victims. In addition, viewing of CSAM can lead to hands-on abuse as it supports potential offenders in normalising and rationalising their behaviour; recent surveys even indicate that this may often be the case37. When CSAM is detected by service providers and investigated by law enforcement, it frequently leads to stopping
ongoing or future abuse of child victims by the offenders caught distributing CSAM and/or grooming the child (see box 1 below).
35 Two thirds of law enforcement authorities surveyed indicate that over 70% of child sexual abuse cases have an online component (see the targeted survey of law enforcement authorities, Annex 2).
36 ECPAT, Summary Paper on Child Sexual Exploitation, November 2020, p. 6.
37 Protect Children, CSAM Users in the Dark Web: Protecting Children Through Prevention, 2021.
Box 1: importance of detection, reporting and action in prevention and assistance to victims
The distribution of CSAM is closely linked to its production, and therefore physical sexual abuse of children. The detection and reporting of CSAM is therefore a key prevention tool and an important way to assist victims by also preventing re-victimisation.
The detection of CSA online frequently leads to stopping ongoing or future physical sexual abuse. This is clearly the case for new CSAM and grooming, which often reveal ongoing and/or imminent physical sexual abuse. But it is also the case for known CSAM, as viewing it often leads to hands-on abuse. In an anonymous online survey in the Darkweb, 37% of individuals who viewed CSAM had sought direct contact with a child after viewing the material38. Also, half of the offenders sentenced in the US in 2019 for CSAM-related offences (non-production) engaged in aggravating sexual conduct prior to, or concurrently with, the CSAM charge39. The detection of CSAM also stops its distribution, which fuels demand for more and new material and therefore new abuses. Offenders not only exchange CSAM bilaterally but are typically required to contribute new material to join online communities trading it. 44% of offenders convicted in the US for CSAM-related offences (non-production) participated in an online community, and 77% required sentencing enhancements for possession of 600 or more images40. The material demanded has become more and more extreme. In the same 2019 US data, 52% of cases included images or videos of infants or toddlers and 84% of cases required sentencing enhancements for images depicting sadistic or masochistic conduct or abuse of an infant or toddler.
Detection, reporting and action
The proportion of cases where CSA is discovered in a timely manner and prevented or stopped is very limited. Oftentimes, children do not manage to seek help themselves, and those in their 'circle of trust' (i.e. family and other close contacts), charged with providing protection and care, are often the abusers41. One in three victims will never tell anyone and at least four in five CSA cases are not reported to public authorities42. There are indications that the COVID-19 crisis has exacerbated the problem43, especially for children who live with their abusers44.
In this context, online service providers and in particular 'online intermediaries'45 such as messaging services, online forums, and online platforms (such as video-sharing and media-sharing platforms, social networks, etc.) have acquired an important role.
38 Protect Children, CSAM Users in the Dark Web: Protecting Children Through Prevention, 2021.
39 United States Sentencing Commission, Federal Sentencing of Child Pornography (non-production offences), June 2021.
40 Ibid.
41 Gewirtz-Meydan, A., Finkelhor, D., Sexual Abuse and Assault in a Large National Sample of Children and Adolescents, 16 September 2019.
42 Ibid. See also M. Ferragut, M. Ortiz-Tallo, M. J. Blanca, Prevalence of Child Sexual Abuse in Spain: A Representative Sample Study, Journal of Interpersonal Violence, 21 September 2021, which found that only 27.5% of Spanish adult victims of CSA had told someone about their experience while still a child.
43 Europol report on online child sexual abuse during the pandemic, 19 June 2020.
44 Unicef et al., COVID-19 and its implications for protecting children online, April 2020.
45 See also the Impact Assessment accompanying the Proposal on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, SWD(2020) 348 final, December 2020, p. 7 (para 15).
First, online intermediaries are often the only ones with any possibility to detect the ongoing abuse. Frequently, the abuse is only discovered thanks to the efforts of online service providers to detect CSAM on their services, and to protect children from being approached by predators online. The key role of these reports is evidenced by the fact that in some Member States, up to 80% of investigations are launched due to reports from service providers46. This is particularly the case in electronic (private individual or group) communications, which offenders frequently use to exchange CSAM and approach children, and where the service provider is the only one that can detect the abuse. It is reflected in recent statistics showing that the vast majority of reports (more than 80% in 2020, up from 69% in 2019) originate in interpersonal communication services (e.g. messenger applications and email)47, and in surveys. In a recent survey, two-thirds of respondents who received sexually explicit material online as children, whether from an adult they knew or someone they did not know, received it through a private messaging service (68%), most commonly on their own personal mobile device (62%)48.
Secondly, the internet has also given offenders a new way of approaching children. They contact children on social media, gaming platforms and chats and lure them into producing compromising images of themselves or into offline meetings. In addition, children are spending more time online than ever before49, increasing the risk of coming into contact with online predators50.
Third, offenders frequently record the sexual abuse for repeat viewing and sharing. Where CSAM is shared online, the harm is perpetuated. The exponential development of the digital world has facilitated the global sharing of materials and the creation of networks of offenders via online intermediaries. The images and videos of CSA continue to circulate long after the abuse itself, and survivors often find themselves powerless to ensure removal of the online content depicting their abuse51. In some cases, offenders continue to traumatise victims long after the abuse has taken place by creating fake accounts with the actual names of the victims. These accounts typically do not contain illegal content, but they attract offenders familiar with the CSAM depicting those victims, who discuss the past abuse and the current personal information of the victims (e.g. where they live or work, or their family situation)52.
It is estimated that, at any given moment, more than 750 000 individuals across the world are online exchanging CSAM, streaming live abuse of children, extorting children to produce sexual material or grooming children for future sexual abuse53.
The problem and problem drivers considered in the impact assessment apply to the three main types of abuse: known CSAM, new CSAM and grooming, also referred to collectively as CSA online.
46 Targeted survey of law enforcement authorities (see annex 2, section 1).
47 NCMEC, 2019 and 2020 data.
48 Economist Impact, WeProtect Global Alliance Global Threat Assessment, 2021.
49 Europol, European Union serious and organised crime threat assessment, 12 April 2021.
50 UNSW Sydney, The impact of COVID-19 on the risk of online child sexual exploitation and the implications for child protection and policing, May 2021.
51 NCMEC, Captured on Film, 2019.
52 WeProtect Global Alliance, Global Threat Assessment 2021.
53 U.N. General Assembly, Human Rights Council, Report of the Special Rapporteur on the sale of children, child prostitution and child pornography, 13 July 2009.
Box 2: current system to detect and report CSA online in the EU
The CSA detection efforts of online service providers fall into three categories: first, the detection of 'known' CSAM, that is, images and videos that have been reported or detected before and that have already been verified as constituting CSAM; secondly, the detection of 'new' CSAM, i.e. images and videos that have not previously been detected and verified; and third, the detection of 'grooming' (also referred to as solicitation of children), where offenders trick or threaten children into sharing compromising images or into meeting them offline for the purposes of sexual abuse54.
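As a purely illustrative aside (the detection technologies themselves are described in annex 8): detection of known CSAM amounts, conceptually, to checking a fingerprint ('hash') of an image or video against a database of fingerprints of previously verified material. The minimal sketch below shows that lookup; all names and values in it are invented for the example, and deployed systems rely on perceptual hashes, which remain stable across resizing and re-encoding, rather than the exact-match cryptographic hash used here.

```python
import hashlib

# Invented placeholder standing in for a database of hashes of previously
# verified material; not a real hash list.
KNOWN_HASHES: set[str] = {"0" * 64}

def is_known_material(file_bytes: bytes) -> bool:
    """Check a file's fingerprint against the known-hash database.

    A cryptographic hash only matches byte-identical files; production
    systems use perceptual hashing to survive re-encoding and resizing.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

New CSAM and grooming cannot be found this way, since by definition there is no prior fingerprint to match against; they require classifiers trained for the purpose (see annex 8).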
Currently, EU legislation allows certain online communication services, such as instant messengers and email, to continue voluntary measures to detect and report child sexual abuse online, provided that their activities are lawful and, in particular, meet a set of specific conditions55. In general, the measures that providers take vary widely, and proactive detection of CSA online is still a rarity among service providers active in the EU.
The vast majority of CSA reports from service providers reaches law enforcement authorities in the EU through the US National Centre for Missing and Exploited Children (NCMEC)56, which is therefore of key importance for the fight against CSA in the EU. While US law does not oblige providers to detect CSA online in their services, it does oblige service providers to report it to NCMEC where they become aware of the abuse. NCMEC determines the relevant jurisdiction(s) from where materials were uploaded. Where the report relates to an EU Member State, the report is forwarded to the US Department of Homeland Security Investigations (HSI) for onward transfer to Europol, or directly to the relevant EU Member State law enforcement authorities. HSI plays an intermediary role as currently Europol cannot receive information directly from private parties, including NCMEC or service providers. Reports received by Europol are cross-checked and forwarded to the relevant Member State authorities. For reports relating to the US, NCMEC is able to provide a number of additional services, such as verifying that the reported content constitutes CSA according to the definitions under US law, and providing information on where the same content has been detected previously. This service cannot be provided for non-US reports due to the much higher volumes (in 2020, 98% of the reports were non-US related)57.
NCMEC also has a hotline function to receive reports from the public (independent from the above reporting by online service providers). It is part of the INHOPE network of national hotlines, which includes hotlines in most EU Member States, where users can report CSAM that they may encounter accidentally; the hotlines then forward these reports to law enforcement and contact the relevant providers to ensure removal. However, such reports from the public make up less than 2% of the content found, as it is rare for people to come across CSAM and report it58. The INHOPE hotlines facilitate the takedown of CSAM hosted outside the territory of the country where it is reported by identifying the country where the material is hosted and forwarding the information to the relevant hotline in that country for further notification to public authorities, or to the service provider if no hotline exists.
54 The functioning of the technology to detect the various types of CSA online is explained in detail in annex 8.
55 See section 1 on the "Interim Regulation".
56 Annex 6 contains details on reporting and the processing of CSAM reports.
57 NCMEC, 2020 data: out of 21.7 million reports, 494 000 originated in the US.
58 NCMEC, 2020 data: out of 21.7 million reports, 21.4 million were from service providers.
While still only very few companies engage in voluntary detection of child sexual abuse, the past few years have nonetheless seen a strong increase in reports of CSA online submitted by online service providers globally through NCMEC: from 1 million reports in 2010 to over 21 million in 2020. The number of reports concerning the EU (e.g. images exchanged in the EU, victims in the EU, etc.) has also increased dramatically: from 17 500 in 2010 to more than 1 million in 2020.
Figure 1: EU-related reports submitted by online service providers, 2010-2020
[Bar chart, values in thousands of reports per year: from 17.5 in 2010, rising through 20.34, 24.28 and 28.38 in the following years and then through 142.58, 270.69, 461.3 and 722.98, to 1 046.35 in 2020.]
Box 3: new CSAM and self-generated content
Part of the increase in new CSAM is driven by self-generated child sexual abuse material. The IWF reported a 77% increase from 2019 to 2020 globally60. Whereas the first time the material is shared may be consensual, further resharing is typically not. In a 2020 survey conducted by Thorn, 1 in 6 children aged 9 to 12 admitted that they had seen non-consensually reshared nudes of other children, up from 1 in 9 in 201961. A separate survey by Economist Impact of 18-20 year olds on their childhood experiences found similar data: 18% of them reported experiencing a sexually explicit image of themselves being shared by a peer without consent62.
First-time sharing of self-generated material may be consensual, but it may also be the result of online grooming. In the same survey conducted by Thorn, 50% of the children aged 9 to 17 said that they had sent the nudes to someone they had never met in real life, up from 37% in 201963.
The number of grooming cases reported globally increased by 98% in 2020 compared to the previous year (37 872 in 2020 vs 19 147 in 2019), presumably due to the pandemic, when both children and offenders spent more time online and at home64.
The reports that service providers submitted in 2020 in relation to cases in the EU included 3.7 million images and videos of known CSAM, 528 000 images and videos of new CSAM, and more than 1 400 grooming cases65.
59 NCMEC, 2020 data. The data does not include the UK in the first years of the period, to ensure comparability.
60 Internet Watch Foundation (IWF), Annual Report 2020.
61 Thorn, Self-Generated Child Sexual Abuse Material: Youth Attitudes and Experiences in 2020, 2020.
62 Economist Impact, WeProtect Global Alliance Global Threat Assessment, 2021.
63 Thorn, Self-Generated Child Sexual Abuse Material: Youth Attitudes and Experiences in 2020, 2020.
64 NCMEC, Online Enticement Reports Skyrocket in 2020, 21 January 2021.
Reports indicate that some companies active and with servers in the EU have now become the largest hosts of CSAM globally (from hosting more than half of all CSAM detected in 2016 to 85% in 2020, with 77% hosted in the Netherlands)66.
Given the worsening situation, Member States have started to take action unilaterally, adopting sectoral rules to deal with the challenge, which are necessarily national in scope and risk further fragmenting the Internal Market (see problem driver section 2.2.2.).
Stakeholders' views: EU citizens are concerned about these developments. 93% consider important the principle that children should be protected in the online environment, with 73% of respondents considering this principle very important for inclusion in a potential future list of EU digital principles67.
Prevention
Prevention is an essential component for tackling the problem at its roots.
There are two main types of prevention efforts:
1. Prevention efforts focused on children and their environment, aimed at decreasing the likelihood that a child becomes a victim. Examples include awareness-raising campaigns to help inform children, parents, carers and educators about risks and preventive mechanisms and procedures, as well as training, and efforts to detect and stop online grooming.
2. Prevention efforts focused on potential offenders, aimed at decreasing the likelihood that a person offends68. Examples include prevention programmes for persons who fear that they might offend, and for persons who have already offended, to prevent recidivism69.
Setting up effective prevention programmes remains challenging. Resources are limited and lack coordination, and efforts, where present, are rarely evaluated to assess their effectiveness (see section 2.2.3. on problem drivers).
Assistance to victims
Assistance to victims is essential to mitigate the harm and severe consequences for children's physical and mental health caused by child sexual abuse (see section 2.1.3).
Victims require both immediate and long-term assistance, before, during and after criminal proceedings, taking into account the best interests of the child. This assistance must be specific, i.e. follow an individual assessment of the special circumstances of each particular child victim, taking due account of the child's views, needs and concerns70.
However, immediate and long-term assistance remains limited, insufficiently coordinated between relevant actors within and between Member States, and of unclear effectiveness (see section 2.2.3.). This leads to information gaps, hampers the sharing of best practices and lessons learnt, and decreases the efficacy of efforts.
66 NCMEC, 2020 data. Internet Watch Foundation (IWF), Annual Reports of 2016 to 2020.
67 Eurobarometer survey conducted in September and October 2021 (26,530 respondents from the 27 EU Member States).
68 In a recent survey of offenders in the Darkweb, 50% of offenders stated that they wanted to stop offending and expressed feelings of shame, guilt and self-harm. See Protect Children, CSAM Users in the Dark Web: Protecting Children Through Prevention, 2021.
69 Di Gioia, R., Beslay, L., Fighting child sexual abuse - Prevention policies for offenders, 3 October 2018.
70 As required by Article 19(3) of the CSA Directive.
2.1.2. Why is it a problem?
The fact that some child sexual abuse crimes are not adequately addressed in the EU is a problem because it results in victims not being rescued and effectively assisted as soon as possible, in children being less protected from crimes, and in offenders enjoying impunity. It affects public security in the EU and infringes children's fundamental rights under the Charter of Fundamental Rights of the EU (Charter)71, including the right to such protection and care as is necessary for their well-being, the right to human dignity and the right to privacy. The continued presence and dissemination of manifestly illegal images and videos online, and the very heterogeneous approach of service providers, affect private and public interests, hampering trust, innovation and growth in the single market for digital services, in particular due to the fragmentation created by divergent national approaches trying to address the problem of CSA online (see problem driver section 2.2.2.).
Additionally, CSA has societal and economic costs. In particular, it contributes to an increased risk of serious mental and physical health problems across the lifespan, and exerts a substantial economic burden on individuals, families and societies. There are negative consequences at all stages:
Before the crime is committed: in the absence of proper preventative interventions, individuals who could have been stopped from abusing children may become first-time offenders, offenders are more likely to re-offend, and children are more likely to become victims if they and their carers lack awareness of the threat when using online services.
While the crime is being committed: the consequences of not detecting and addressing the crimes swiftly include prolonged suffering and harm for victims. In addition, it reinforces the perception of impunity, reducing deterrence and facilitating further offending.
After the crime has been committed: the consequences of not acting effectively after the crime include the inability to provide proper immediate and long-term assistance to victims, with negative effects for victims and society as described above. In addition, it may not be possible to prosecute offenders, which reduces opportunities for rehabilitation before, during and after criminal proceedings to prevent reoffending.
2.1.3. Who is affected and how?
First, children in the EU and elsewhere, who may fall victim to sexual abuse and suffer its negative effects, both immediate and long-term72. Immediate effects include physical injuries and psychological consequences (e.g. shock, fear, anxiety, guilt, post-traumatic stress disorder, denial, withdrawal, isolation, and grief), sexual behaviour problems and over-sexualised behaviour, academic problems, substance abuse problems, increased likelihood of involvement in delinquency and crime, and increased likelihood of teen pregnancy73. Long-term effects include psychological and social adjustment problems that can carry over into adulthood and affect married life and parenthood. They include negative effects on sexual and overall physical health; mental health problems including depression, personality and psychotic disorders, post-traumatic stress disorder, self-mutilation, and attempted or completed suicide; and relational and marital problems, including fear of intimacy and spousal violence.
Secondly, online service providers. Member States' efforts to tackle the challenge at national level create distortions in the single market for digital services (see problem driver section 2.2.2.), as providers have to comply with sector-specific rules under national laws in at least
71 See section 6.1.3 below.
72 Institut National de Santé Publique, Gouvernement du Québec, Consequences of child sexual abuse, accessed on 20 April 2021; ODI Report: The cost and economic impact of violence against children, p.20.
73 Masumova, F., A Need for Improved Detection of Child and Adolescent Sexual Abuse, May 2017; Darkness to Light, Child Sexual Abuse Statistics, accessed on 20 April 2021.
some of the jurisdictions where they are active, resulting in a more challenging business environment for companies, in particular for smaller companies that already face difficulties competing with their largest counterparts.
Third, users of online services. The detection, reporting and removal of CSA online currently lacks clarity, legal certainty and transparency. As a consequence, the rights and interests of users can be negatively affected. This can occur, for instance, in relation to unjustified reporting or removals, which may affect not only the users initiating the communications in question but also those at the receiving end. The existing uncertainty may also have a 'chilling effect' on legitimate forms of communication, or hamper the full participation of children in online services as their parents and carers become more and more aware of the risks but do not have access to transparent information about the levels of risk and about what measures services take to protect children.
Fourth, governments and public authorities. The competent public authorities (e.g. law enforcement or governments at national, regional and local levels) dedicate significant resources to act against CSA. In particular, they put in place prevention programmes and measures to assist victims, and conduct investigations after they become aware of possible CSA. Inefficiencies in the current system lead them to seek local solutions to incentivise and obtain more information from providers.
Finally, society in general, given that CSA has consequences not only for the victims but also for society as a whole74. Social costs correspond to the non-monetary consequences of the criminal acts, and include diminished quality of life for society and increased feelings of insecurity among individuals. Economic costs include those of police and judicial services (e.g. criminal prosecution, correctional system), social services, victim support services and victim compensation programmes, education, health, and employment costs.
Box 4: estimated costs of child sexual abuse
Victims of child sexual abuse require immediate and long-term assistance. The costs of providing such assistance can be significant. For example, the total lifetime cost of assistance to victims arising from new substantiated cases of child sexual abuse in the United States in 2015 was estimated at USD 1.5 billion per year75.
The long-term effects of child sexual abuse on victims also include lifelong loss of potential earnings and productivity76. The total lifetime cost of such losses arising from new substantiated cases of CSA in the US in 2015 was estimated at USD 6.8 billion per year77.
Overall, the total costs of child sexual abuse in the US in 2015 were estimated at USD 11 billion per year78.
2.2. What are the problem drivers?
2.2.1. Voluntary action by online service providers to detect online child sexual abuse has proven insufficient
Voluntary action varies significantly among companies
74 Institut National de Santé Publique, Gouvernement du Québec, accessed on 20 April 2021.
75 Letourneau, E., The Economic Burden of Child Sexual Abuse in the United States, May 2018.
76 Ibid.
77 Ibid., based on combined estimated productivity losses for non-fatal and fatal cases.
78 Ibid. The USD 11 billion/year includes the costs due to violence and crime, and suicide deaths (USD 1 billion/year), and the costs due to loss of quality-adjusted life years (USD 1.6 billion/year), in addition to the victim assistance costs and productivity losses.
Online service providers are often the only entities capable of detecting that abuse involving their services is taking place. Because detection is voluntary, some online service providers take comprehensive action, others take some action, and some take no action against CSA at all. In addition, service providers often do not have access to reliable information on what content and behaviour is illegal in the EU to facilitate accurate detection, proactively and voluntarily, resulting in a risk of both over- and under-reporting.
There are currently 1 630 companies registered to report to NCMEC, which is the main entity receiving reports of the proactive searches that companies perform on their systems, and the de facto global clearinghouse of reports of CSA online. This is a fraction of the online services used to commit these crimes. In 2020, of these 1 630 companies, one, Facebook, sent 95% of the reports, 5 sent 99% of the reports, and only 10% sent one report or more79. There is no evidence that 95% of all current cases of CSA online (including sharing of known and new CSAM, and grooming) occur on the services of that single company. In fact, experts agree that comparable levels of abuse occur on similar services of other companies, and that the difference in detection levels is rather due to the different intensity of detection efforts80. For example, some providers may make efforts to detect abuse only in certain services they provide, or may make efforts to detect only certain types of abuse. This means that a substantial amount of CSA online likely remains undetected.
Figure 2: breakdown of reports submitted by online service providers globally in 202081
[Bar chart on a logarithmic scale: Facebook accounts for the overwhelming majority of reports, followed at a distance by Google (around 547 000) and a long tail of other companies.]
In addition, a number of service providers take action against users for suspected sharing of CSAM, e.g. by banning user accounts, but do not report. For example, WhatsApp indicates that it bans around 300 000 accounts per month for this reason alone82. However, it has been reported that WhatsApp reports to NCMEC only about 10% of these cases, as the evidence recovered is circumstantial only and, in line with US legislation, insufficient for a criminal
79 National Centre for Missing and Exploited Children, 2020 Reports by Electronic Service Providers.
80 NetClean, Report 2019: A Report about Child Sexual Abuse Crime, p.7, 32-33; NetClean, Report 2016: 10 Important Insights into Child Sexual Abuse Crime, p.33.
81 National Centre for Missing and Exploited Children, 2020 Reports by Electronic Service Providers.
82 WhatsApp, How WhatsApp helps fight child exploitation, accessed on 20 September 2021.
investigation83. Where that is so, there is on the one hand a risk that users are banned on the basis of unclear and potentially insufficient evidence, while on the other hand actual abuse may not be reported and investigated. This can have a significant negative effect on the fundamental rights of users84, and on the affected children.
These different approaches and the related risks also create asymmetries in the single market for digital services, as they have prompted a number of Member States to adopt or consider national legislation to create a stronger and more effective approach (see problem driver section 2.2.2).
Voluntary action is susceptible to changes in companies' policies.
Because detection is voluntary, companies may decide to change their policies at will. One example is Facebook's decision to implement end-to-end encryption (E2EE) by default on its private messaging service.
Existing detection efforts risk being severely hampered by the introduction of encryption in online services, which, in spite of its benefits for cybersecurity and the protection of users' fundamental rights such as freedom of expression, privacy and data protection, also makes the detection of CSA online and the protection of the fundamental rights of the victimised children more difficult85, if not impossible.
Box 5: end-to-end encryption, a policy change impacting child sexual abuse detection
In March 2019, Facebook announced plans to implement end-to-end encryption (E2EE) by default in its instant messaging service86. These plans have since been reiterated87, with the implementation taking place "sometime in 2023"88. In the absence of accompanying measures, it is conservatively estimated that this could reduce the total number of reports of CSA in the EU (and globally) by more than half89, and by as much as two-thirds90. These estimates were confirmed after Facebook stopped detecting CSA in its instant messaging service in December 202091, given the legal uncertainty it considered to be caused by the entry into force of the European Electronic Communications Code (see the information on the Interim Regulation in section 1). From 1 January to 30 October 2021, the number of reports received by law enforcement in the EU dropped by two-thirds compared to the same period in 2020 (972,581 reports vs 341,326 reports)92, a loss of 2 100 reports per day. In total in 2021, while there was a 35% increase in global reports, the number of reports relevant for the EU dropped by 47%93. Whereas in this case the tools to
83 Wired, Police caught one of the web's most dangerous paedophiles. Then everything went dark, May 2020. The number of Facebook reports in Figure 2 includes all Facebook platforms (i.e. also WhatsApp). According to the above, the number of WhatsApp reports would be around 400 000, versus around 20 million reports from the Facebook platform.
84 Impact Assessment accompanying the DSA proposal, SWD(2020) 348 final, December 2020, p.17.
85 EU strategy, p.2.
86 Facebook, A Privacy-Focused Vision for Social Networking, 12 March 2019.
87 Including during the UK's Home Affairs Committee hearing of 20 January 2021 on Online Harms.
88 Davis, A. (Head of Safety at Meta), We'll protect privacy and prevent harm, writes Facebook safety boss, Sunday Telegraph, 21 November 2021.
89 NCMEC, End-to-end encryption: ignoring abuse won't stop it, accessed 20 April 2021.
90 EU strategy (footnote 79), p.15.
91 Facebook, Changes to Facebook Messaging Services in Europe, 20 December 2020.
92 NCMEC.
93 NCMEC, 2021 data. The drop in reports is in particular due to the fact that Meta, the company responsible for the majority of reports, stopped its detection efforts in the EU in December 2020 and did not resume them until November 2021.
detect CSA were not used due to legal concerns, the practical effects are likely the same as those an implementation of E2EE without mitigating measures94 would cause: the impossibility of detecting CSA, since the detection tools as currently used do not work on E2EE systems.
Google announced in November 2020 that it had started to roll out E2EE on Google Messages95. Other similar services with E2EE already incorporated (with presumably similar if not higher levels of CSA96) include WhatsApp, Apple's iMessage, Signal and Telegram.
In addition to affecting the detection of CSA online and the protection of the fundamental rights of the victimised children, the use of E2EE without mitigating measures reduces the means to prevent and combat CSA overall by "turning off the light" on a significant part of the problem, i.e. decreasing the evidence base, including data on the scale of detectable CSA online, which is essential to fight overall CSA effectively through assistance to victims, investigations, and prevention97. In the absence of mitigating measures (e.g. tools that can detect CSA online in E2EE systems, see annex 9), currently the possible ways to detect CSA online in E2EE systems are:
o user reports, i.e. either the child or the offender reports the abuse; and
o metadata, i.e. the data related to the online exchange other than the content. This also includes e.g. the time of the exchange, user names and suspicious patterns of activity (e.g. if someone repeatedly sets up new profiles or messages a large number of people they do not know98).
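To make the metadata option concrete, the sketch below flags accounts that repeatedly message people they do not know, one of the suspicious patterns mentioned above. It is purely illustrative: the data structure, threshold and signal are invented for the example and do not describe any provider's actual system.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class MessageEvent:
    sender: str           # account identifier
    is_new_contact: bool  # True if the recipient was never messaged before

# Invented threshold for the example only.
NEW_CONTACT_THRESHOLD = 50

def flag_suspicious_senders(events: list[MessageEvent]) -> set[str]:
    """Flag accounts messaging an unusually large number of new contacts.

    Only metadata (who contacts whom) is inspected, never message content,
    mirroring the limited signals available on an E2EE service.
    """
    counts = Counter(e.sender for e in events if e.is_new_contact)
    return {sender for sender, n in counts.items() if n >= NEW_CONTACT_THRESHOLD}
```

As the following paragraphs note, such signals alone rarely suffice to trigger a report, let alone an investigation.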
Relying on user reports implies that the responsibility for reporting will be borne solely by the child victims of sexual abuse in grooming cases, who in many cases are shamed or threatened into silence (see section 2.1.1. on underreporting), as the offender will obviously not report the abuse. This is already evident from the low number of user reports today.
Service providers do not consider metadata an effective tool for detecting CSA99. In addition, the use of metadata is usually insufficient to initiate investigations100. Moreover, it is likely to generate a much lower number of reports than the detection of content, despite the level of abuse being the same (if not higher). As an example, consider WhatsApp (E2EE, and therefore using metadata as the basis of detection) and Facebook Messenger (not E2EE, and therefore using content as the basis of detection). Whereas WhatsApp has around 50% more users than Facebook Messenger (2 billion vs 1.3 billion101), and therefore presumably a higher level of abuse in proportion to the number of users, there were around 35 times fewer reports
94 Mitigating measures refer to deploying E2EE in a way that enables the continued detection of CSA online.
95 Google, Helping you connect around the world with Messages, 19 November 2020.
96 NSPCC, End-to-end encryption: Understanding the impacts for child safety online, April 2021.
97 WeProtect Global Alliance to end child sexual exploitation online, Global Threat Assessment, 2021.
98 Davis, A. (Head of Safety at Meta), We'll protect privacy and prevent harm, writes Facebook safety boss, Sunday Telegraph, 21 November 2021.
99 Pfefferkorn, R., Stanford Internet Observatory, Content-Oblivious Trust and Safety Techniques: Results from a Survey of Online Service Providers, 9 September 2021. See in particular p.10-11.
100 WeProtect Global Alliance to end child sexual exploitation online, Global Threat Assessment, 2021, reporting a statement by the Virtual Global Taskforce, an international alliance of law enforcement agencies (including Europol, Dutch Police, Interpol, US Homeland Security Investigations, UK National Crime Agency, Colombian Police and others) against CSA online. See also Wired, Police caught one of the web's most dangerous paedophiles. Then everything went dark, May 2020.
101 Statista, Most popular global mobile messenger apps as of October 2021, based on number of monthly active users. The overall numbers of users were the same in 2020.
from WhatsApp than from Facebook Messenger submitted to NCMEC in 2020 (400 000 vs
14 million)102.
Europol reports that the widespread use of encryption tools, including E2EE apps, has lowered the risk of detection for those who offend against children103. Offenders are well aware of the possibilities that E2EE presents to hide their abuse. In an analysis of offender forums in the Darkweb, it was found that a majority of discussions focused on topics such as technical tools for direct messaging or how to securely acquire and store content104.
Voluntary action leaves decisions affecting fundamental rights to service providers and lacks harmonised safeguards
A voluntary system leaves private companies to make fundamental decisions with significant impact on users and their rights105. The challenges of this system are particularly evident when dealing with CSA, where there are fundamental rights and interests at stake on all sides, including the child's right to protection of their well-being and to privacy, and the right to privacy and freedom of expression and information of all users. As a result, if the rights of the child are deemed important enough to justify interfering with the rights of all users and of service providers, then it may not be appropriate to leave the decision on whether, and if so how, to do so to the service providers.
In addition, the current voluntary action by online service providers to detect CSA online lacks a long-term perspective and harmonised safeguards applicable to all relevant service providers, including transparency. This is especially important as some of the voluntary measures that companies decide to take may interfere with users' rights, including those to privacy and data protection. It is unclear which tools are in use and how they are used, or which procedures are in place to improve the tools and limit the number of false positives. While there is an obvious need not to warn off perpetrators or inadvertently provide guidance on how to avoid detection, there may be room for more information. As a result, users at present may have no effective redress in case of erroneous removals; the possibilities of scrutiny are limited; and there is no effective oversight by regulators. In addition, the existence and effectiveness of procedural safeguards differs widely across providers.
The Interim Regulation introduced a number of safeguards, such as annual transparency reports, consultation with data protection authorities on the processing carried out to detect CSA online, and complaint mechanisms, so that content that has been removed erroneously can be reinstated (see section 1).
A number of important safeguards are contained in the DSA proposal, which lays down harmonised transparency requirements for content moderation undertaken on providers' own initiative106, as well as in relation to mechanisms for removal and related user complaints107.
102 NCMEC and Wired, Police caught one of the web's most dangerous paedophiles. Then everything went dark, May 2020.
103 Europol, Europol Serious and Organised Crime Threat Assessment (SOCTA), April 2021.
104 Analysis conducted in February 2021, as reported in WeProtect Global Alliance to end child sexual exploitation online, Global Threat Assessment, 2021.
105 Impact Assessment accompanying the DSA proposal, SWD(2020) 348 final, December 2020, p.25.
106 See in particular Article 13(1)(c).
107 These include a statement of reasons in case a provider of hosting services decides to remove or disable access to content, and the possibility for the recipient of the service to challenge any content moderation decision; see Articles 15, 17 and 18.
Given the gravity of the impact on both sides - for the child victims, the circulation of materials depicting their abuse and the risk of (further) abuse, and for the suspected user, an accusation of having circulated CSAM - the above safeguards form an important baseline but do not go far enough in the present context. In particular, the stakeholder consultations have shown the importance of a universal reporting obligation for CSA online for the providers, using dedicated secure and fast channels, as well as of additional requirements on the technologies employed for automatic detection to ensure that they are both effective in detecting abuse and limit the number of false positives to the maximum extent technically possible.
Voluntary action has failed to remove victims' images effectively
Victims are left on their own when images and videos of their abuse end up online. Under national criminal laws, hotlines in the EU are in principle not allowed to proactively search for images and videos of a given victim, on the victim's behalf, to effect removal. For the same reason, the victims themselves are also prohibited from searching for their own images and videos, as the possession of CSAM is illegal per se. Absent a requirement for service providers to take proportionate measures to detect, report and remove specified content, an effective removal system has not developed108.
Box 6: voluntary principles to counter online child sexual abuse
The US, UK, Canada, Australia and New Zealand (the 'Five Eyes'), together with leading online service providers, civil society and academia, announced in 2020 a set of voluntary principles for companies to tackle child sexual abuse online109. These address notably the detection, reporting and removal of CSAM, as well as the detection and reporting of grooming. Although multiple companies have committed to implementing the voluntary principles, including Facebook, Google, Microsoft, Roblox, Snap and Twitter, there is a lack of transparency on the actions that companies are taking to implement those principles. As a consequence, there is a lack of evidence of tangible results of that commitment.
2.2.2. Inefficiencies in public-private cooperation between online service providers, civil society organisations and public authorities hamper an effective fight against CSA
This section describes the inefficiencies in public-private cooperation between the main actors in the fight against CSA, online and offline. In the majority of cases, the inefficiencies relate to regulatory issues.
Cooperation between public authorities and service providers
Cooperation between public authorities and service providers is of critical importance in the
fight against CSA, particularly in relation to service providers' efforts to detect and report CSA online and remove CSAM.
108 While the legislative proposal would mandate the Centre to proactively look for CSAM and could include a targeted liability exemption to shield the Centre and hotlines where necessary and appropriate, the Centre may in addition need an authorisation from its host Member State to exclude that it is held liable for its proactive searches under national criminal law. Such an authorisation would be part of the conditions for establishing the EU agency in a given Member State (see section 5.2.2.1.). Similarly, to ensure service providers will not be held liable when searching their systems, the legislative proposal could include a specific exemption from liability, building on the exemption contained in the DSA.
109 The voluntary principles are available here.
Legal fragmentation affecting the Internal Market
Currently, although obligations under national law are increasingly being introduced, companies offering online services in the EU still detect, report and remove CSA online from their services on a voluntary basis. There are at present no effective procedures under EU law for service providers to report to public authorities, to exchange information in a timely manner or to react swiftly to requests and complaints. This hampers investigations and creates obstacles to addressing CSA and to protecting victims.
This has led a number of Member States to prepare and adopt individual legislative proposals at national level to create stricter rules for providers who fail to cooperate with public authorities or do not put sufficient effort into detecting and reporting CSAM. Some Member States adopted new legislation as recently as 2021 (e.g. Germany,110 Austria) and others are currently preparing legislative proposals (e.g. Germany, France, the Netherlands) (see Annex 5). These efforts often involve establishing dedicated public authorities or designating existing authorities to enforce the new rules111, as well as strict time limits for service providers to remove CSAM upon becoming aware of it, subject to fines if they fail to do so112. At the same time, the reach of these efforts varies and they are constrained by the national laws of the Member States. The scope of the relevant national laws and their obligations differ in terms of the services covered. For instance, some focus on social networks in general113, others on hosting providers managing websites containing illegal content114 and yet others on online platforms above a certain threshold (e.g. number of registered users and annual revenue)115. Approaches are by nature limited to national jurisdictions. Given the cross-border nature of the Internet, and by implication of many service providers operating online as well as of online CSA, such a fragmented approach hampers the proper functioning of the internal market.
Moreover, such a fragmented approach cannot ensure the effective detection, reporting and removal of CSAM and the fight against grooming across the EU, beyond the borders of the individual Member States having the above-mentioned national legislation in place. Compared to one horizontal framework established at EU level, such a Member State-based approach increases the costs of doing business in the EU as service providers have to adapt to various different sets of rules, which creates uncertainties and challenges, in particular for smaller providers seeking to expand to new markets in the EU, and can stifle innovation and competition.
110 April 2021 modification of the Netzwerkdurchsetzungsgesetz (NetzDG) to include detailed reporting obligations in case of child pornography; see annex 5, section 3 for further information.
111 For instance, in Germany, the draft Act amending the Protection of Young Persons Act provides for the restructuring of a national media supervising body into a federal agency to oversee the implementation of the draft Act's provisions. In France, the draft law to regulate online platforms aims to create a new national (administrative) authority equipped for protecting minors (including combatting the commercial exploitation of the image of children under sixteen years of age on online platforms). See annex 5, section 3 for further information.
112 For instance the NetzDG in Germany, the Avia law in France or the draft law on fighting child sexual abuse in the Netherlands. See annex 5, section 3 for further information.
113 For example the NetzDG in Germany. See annex 5, section 3 for further information.
114 For instance Decree No 2015-125 of February 5, 2015 in France. See annex 5, section 3 for further information.
115 For example the draft law on measures to protect users on communication platforms (Communications Platform Act) in Austria. See annex 5, section 3 for further information.
Box 7: the CSAM issue in the Netherlands
As highlighted above, reports indicate that some service providers active and with servers in the EU have now become the largest hosts of CSAM globally, with more than half of all CSAM hosted in the Netherlands, given its strong internet infrastructure. The Dutch government has made several commitments to address this issue, including investing in partnerships between the Dutch Government and the private sector. This included a new free service called 'Hash Check Service' (operated by the EU co-funded Dutch INHOPE hotline EOKM) made available to companies to scan their servers for known CSAM. Given that there is a small group of Dutch companies that only cooperate to a lesser extent, and some companies not at all, the Netherlands is also preparing a new law to deal with companies that fail to cooperate. In the near future, companies will be under the supervision of a governing body that will have the authority to impose administrative sanctions on companies that fail to cooperate. In addition to criminal law, this procedure specifically aims to eradicate CSAM in a fast and efficient manner.
The national approaches create fragmentation in the Internal Market, hindering effective cooperation between public authorities and service providers in the fight against CSA. The continued presence and dissemination of CSAM, and the very heterogeneous approaches of service providers, affect both private and public interests, hampering trust, innovation and growth in the Internal Market (i.e. the single market for digital services). Such fragmentation increases the compliance and operational costs of actions in the fight against CSA for stakeholders such as online service providers that operate in several Member States, and may lead to legal uncertainty. Non-compliant service providers may move to, and continue operating from, Member States where national rules are less strict. Given the cross-border and international dimension of online service provision as well as of child sexual abuse online, a patchwork of national measures does not effectively protect children, and creates distortions in the functioning of the single market for digital services.
The proposed Digital Services Act will not be able to reduce this fragmentation to the extent necessary, given its horizontal nature and the specific challenges posed by CSA (see section 5.1.). For example, the DSA would not create removal obligations. Some Member States have already gone further, like Germany, which for certain providers such as social networks has imposed removal obligations by law116, as well as reporting obligations in case of detection of CSAM, specifying the data to be reported to federal law enforcement, an obligatory notification to the user and other aspects117.
Varying quality of reports
While reports from service providers via NCMEC have led to many cases of children being rescued from ongoing abuse, and of offenders being arrested, law enforcement authorities estimate that only around 75% of the reports they receive from service providers are actionable118. The most common reason is that the report contains material that does not constitute child sexual abuse under the Member State's law119. This is largely due to a simple fact: US-based service
116 Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz - Network Enforcement Law), BGBl. 2017 I Nr. 61, 7.9.2017, § 3 n. 2 and 3.
117 Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität (Law to combat right-wing extremism and hate crime), BGBl. 2021 I Nr. 13, 1.4.2021, Art. 7 n. 3.
118 Median of the estimated percentage of reports that are actionable; see targeted survey of law enforcement authorities (annex 2, section 1.1.3).
119 Targeted survey of law enforcement authorities (see annex 2, section 1.1.3).
providers report to NCMEC material that may constitute CSA under US law120, which may include content that is not illegal in the EU and omit content that is illegal in the EU. For example, the CSA Directive leaves it up to Member States whether to criminalise sexual abuse material involving individuals appearing to be a child but in fact older than 18, whereas US legislation requires that the material involve an "identifiable minor" to be illegal. On the other hand, the CSA Directive criminalises grooming only when the child is below the age of sexual consent, whereas in the US it is always illegal for any person under 18121.
Further challenges arise as a result of the lack of unified reporting requirements which clearly set out the information to be included in reports. While US service providers are obliged to make reports to NCMEC, much of the information to be included in the report is left to the discretion of the provider122. The service that NCMEC provides for US-related reports (i.e. human review of the reports to ensure that they are actionable) is typically not available for EU-related reports, due to resource constraints. A lack of sufficient information is also one of the most common reasons cited by the law enforcement authorities of the Member States for a report not being actionable123.
Lack of resources in law enforcement agencies
Absent the support provided by NCMEC to US authorities, each national law enforcement authority is left to its own devices when analysing CSAM, despite the support provided by Europol to help coordinate cases. This requires a significant investment of resources, makes it very difficult to deal effectively with the large number of reports these authorities receive, and prevents effective public-private cooperation against CSA.
Lack of feedback from public authorities to service providers.
Currently, there is no mechanism for systematic feedback from law enforcement to companies on their reports. Where providers report content that is not illegal under the law of the relevant Member State, the provider is not made aware of that fact. This increases the likelihood of the provider reporting the same or similar content again in the future.
Challenges due to the international and cross-border nature of CSA
There are several international and cross-border aspects to the fight against CSA online. In many cases, these are inherent in the cross-border nature of the Internet. As a result, a single incident of online abuse may involve perpetrators and victims located in multiple jurisdictions. While certain minimum standards relating to CSA crimes have been widely adopted in criminal law in many countries, and within the EU the CSA Directive contains specific requirements providing for a degree of harmonisation, specific national definitions and offences differ from one country to another.
In addition, long-standing difficulties with regard to cross-border access to electronic evidence pose a particular problem for the investigation of CSA online. Law enforcement frequently needs additional information during investigations from service providers, which are often located in another Member State or in a third country. Existing judicial cooperation is too slow, and direct cooperation between service providers and public authorities is unreliable, inconsistent and lacks transparency and accountability. Several legislative proposals and other ongoing initiatives aim to address these issues (see box 2 in Annex 6).
120 'Duty to Report', 18 U.S.C. § 2258A(a).
121 See Articles 5(7) and 6 of the CSA Directive, and 18 U.S. Code § 2252A and § 2422 respectively.
122 'Contents of Report', 18 U.S.C. § 2258A(b).
123 Targeted survey of law enforcement authorities (see annex 2, section 1.1.3).
Furthermore, due to the existing legal framework and the often important or even dominant market position of US service providers, Member States are heavily dependent in their fight against CSA on reports received from a third country, the US, through NCMEC.
Cooperation between civil society organisations and service providers
Cooperation challenges in notice and action procedures.
When they receive a notice from civil society organisations requesting them to remove content, service providers in more than 25% of cases refuse to take action to remove the notified content or take a considerable time to do so124. Whilst there can be justified reasons for not taking action or for some delays in individual cases (for instance, because of uncertainty as to whether the notified content actually constitutes CSAM under the applicable laws), there is a particularly problematic group of providers known as 'bulletproof hosting providers', which refuse to assume any responsibility for the content stored on their servers125. It should be recalled that, at present, EU law does not provide for an obligation for providers to report or act upon notified content, not even where it manifestly constitutes CSAM. Under the eCommerce Directive (Art. 14) and the proposed DSA (Art. 5, see section 5.1.), hosting service providers' failure to act expeditiously to remove or disable access to illegal content (including CSAM) would lead to loss of the benefit of the liability exemption. In such cases, the service providers may - but will not necessarily - be liable under the applicable national laws of the Member States, depending on whether these national laws provide for liability of service providers.
Cooperation between public authorities and civil society organisations
Limited impact of hotlines' action in the EU due to regulatory gaps.
Inability to search proactively. As noted, hotlines operating in Member States are, under national criminal law, in principle not allowed to search for CSAM proactively. They therefore tend to rely exclusively on reports from the public, which are limited in number and of fluctuating quality. The number of user reports is significantly lower than the number resulting from proactive efforts, as the situations in which someone comes across CSAM unintentionally and reports it are limited126. Also, user reports are often inaccurate, in particular compared with reports from proactive searches127. For example, the only hotline in Europe that conducts proactive searches, the IWF in the UK, reported that whereas about half of the reports it manages come from the public and half from proactive searches, only 10% of the total CSAM it finds traces back to public reports, versus 90% to proactive searches128.
Inefficiencies in cooperation on assistance to victims.
For long-term assistance to victims, there is room for improvement in the cooperation between public authorities and NGOs to ensure that victims are aware of the resources available to them. In addition, currently there is no cooperation between public authorities and
124 27% of allegedly illegal content URLs notified to service providers were not removed within 3 days; INHOPE 2020 Annual Report, May 2021.
125 See for example these cases in the Netherlands here and here.
126 In 2020, whereas service providers reported through NCMEC 65 million images and videos globally, INHOPE hotlines processed globally 1 million images and videos, which originated from both the public and proactive searches by a limited number of non-EU hotlines.
127 About 25% of the reports the hotlines receive from the public include illegal content; see INHOPE Annual Report, April 2020.
128 IWF, 2020 Annual Report, April 2021.
hotlines or other NGOs to support victims, at their request, in searching for and taking down the material depicting them.
Inefficiencies in cooperation on prevention.
Inefficiencies in cooperation exist notably on prevention programmes for offenders and for persons who fear that they might offend. In some Member States, NGOs carry out these programmes with limited support from public authorities129. In addition, the coordination between public authorities and NGOs on the programmes they respectively offer at different stages is also limited (e.g. between the programmes that public authorities offer in prisons and the reintegration programmes that NGOs offer after the offender leaves prison)130.
Cooperation between public authorities, service providers and civil society organisations
Lack of legal certainty:
For service providers. The Interim Regulation did not create an explicit legal basis for service providers to proactively detect CSA; it only provided a temporary and strictly limited derogation from certain articles of the e-Privacy Directive to allow the continuation of voluntary measures to detect CSA, provided that these are lawful. Whereas some service providers invoke legal bases provided for in the GDPR for the processing of personal data involved in carrying out their voluntary actions to tackle CSA, others find the GDPR legal bases not explicit enough. The uncertainty thus deters some service providers from taking such voluntary action.
For hotlines. The operation of hotlines is not explicitly provided for in EU law, and only five Member States explicitly regulate it131, with others relying on memorandums of understanding. This leads to the inability of hotlines to assess the content of reports from the public in some Member States, or to notify the service provider directly, leading to fragmentation and ineffectiveness across the EU132.
Lack of operational standards:
Law enforcement agencies, online service providers and civil society organisations have
separate systems and standards used in the detection, reporting and removal of CSA online.
They vary not only between the different types of stakeholders (e.g. between law enforcement and service providers) but also within the same type of stakeholder (e.g. between law enforcement agencies in different Member States). This includes the use of multiple, differing databases of hashes in the detection of known CSAM. This hampers the collective ability to efficiently and effectively detect, report and remove CSAM, to identify and rescue victims, and to arrest offenders.
Stakeholders' views
Public authorities133 identified among the main challenges when investigating CSA cases: a) inefficiencies in public-private cooperation between service providers and public authorities, and b) inefficiencies/difficulties with access to evidence due to technical challenges. Over 80% referred to the increased volume of CSAM detected online in the last decade and further flagged that there are insufficient human and technical resources to deal with it. These same stakeholders state that a common baseline (also in terms of a common classification
129 Di Gioia, R., Beslay, L., Fighting child sexual abuse - Prevention policies for offenders, October 2018.
130 See for example the results of the 2020 evaluation of Circles UK, and the EU-funded project CIRCLES4EU.
131 ICF et al., Study on framework of best practices to tackle child sexual abuse material online, 2020.
132 Ibid.
133 The term 'public authorities' in the stakeholders' views boxes refers to law enforcement authorities and
other public authorities such as government ministries.
system and terminology) is required to support better law enforcement and judicial cooperation and information sharing, consistent with the cross-border nature of CSAM offending. Civil society organisations stressed the need to improve cooperation between them and law enforcement authorities (74%) in the fight against CSA online (including by providing funding to enable cooperation, organising joint trainings/meetings and ensuring better information sharing, as well as the need for legal recognition and a clear legal basis for the national hotlines). In addition, 73% of the respondents from civil society organisations pointed out that improved cooperation with service providers is needed.
Service providers highlighted the need for coordinated actions at a global level, and the importance of the exchange of best practices.
2.2.3. Member States' efforts to prevent child sexual abuse and to assist victims are limited, divergent and lack coordination, and are of unclear effectiveness
Prevention efforts
Limited. In relation to the two main types of prevention efforts described in section 2.1.:
o Prevention efforts to decrease the likelihood that a child becomes a victim. Awareness raising134 and training are limited in availability, particularly for organisations and persons that come in regular and direct contact with children as part of their jobs or vocational activities, in addition to carers and parents. A vast majority of the abuse occurs in the circle of trust of the child. At the same time, those in regular and direct contact with children should have the knowledge and tools to ensure that children do not become victims, given their proximity to the child.
o Prevention efforts to decrease the likelihood that a person offends. Research into what motivates individuals to become offenders is scarce and
fragmented. This current lack of research makes it difficult to put in place effective programmes before a person offends for the first time, in the course of or after criminal proceedings, both inside and outside prison. As a result, there are
currently very few programmes in place135.
Uncoordinated. Multiple types of stakeholders need to take action to enact a preventive approach that delivers results. This includes public authorities, the research community, NGOs, and providers of online services used by children. The various types of
practitioners in this field do not communicate sufficiently with each other and with researchers on the effectiveness of the programmes, lessons learned and best practices; language can be a further barrier. Expertise and resources to establish and implement such initiatives are not evenly distributed in the EU, and successful programmes are
mostly local endeavours. There are overlapping efforts in some areas, e.g. Member States designing similar programmes and campaigns in parallel136, whereas other areas, such as reaching out to potential offenders, are not sufficiently addressed.
Unclear effectiveness. The few programmes that exist are rarely evaluated to assess their effectiveness and usability137. A recent systematic review of the published empirical literature on child sexual abuse perpetration prevention interventions found only five
134 The Commission-funded network of Safer Internet Centres is a good example. It raises awareness on online safety and provides information, resources and assistance via helplines and hotlines on a wide range of digital safety topics including grooming and sexting.
135 For an overview of prevention programmes in the EU and third countries, see Di Gioia R., Beslay, L. (2018), Fighting child sexual abuse: prevention policies for offenders - Inception Report, EUR 29344 EN, doi: 10.2760/48791.
136 Di Gioia, R., Beslay, L., Fighting child sexual abuse - Prevention policies for offenders, 3 October 2018.
137 Ibid.
published evaluation studies, and these were methodologically limited (e.g. four examined the same intervention only on adults in Germany, and the other one focused only on children aged 5 to 12)138.
Assistance to victims' efforts
Limited. Victims of CSA do not always receive the tailored and comprehensive assistance required139, such as support in trying to stop the sharing and distribution online of the images and videos depicting their abuse, which perpetuates the harm.
Uncoordinated. Victims of CSA require comprehensive support that brings together all relevant sectors, including health, legal, child protection, education and employment. Such coordination between relevant actors within and between Member States is lacking. The existing initiatives do not systematically make use of existing best practices and lessons learned in other Member States or globally. This translates into information gaps on help resources, gaps in specialised support, and overall inefficiency of efforts.
Unclear effectiveness. There is little data on whether survivors have access to appropriate support, and existing research suggests that the level of satisfaction with the support received is low140.
Box 8. Main sources of evidence on current efforts on prevention and assistance to victims
The CSA Directive requires Member States to put in place prevention measures or programmes of the two main types described in section 2.1.1. (i.e. programmes focused on children or on possible offenders), as well as assistance to victims measures. The Commission has been monitoring the transposition of the CSA Directive since 2013, when the deadline for Member States to transpose it expired. One of the main challenges for Member States concerns the transposition of the articles on prevention and assistance to victims141.
Member States have generally struggled to put in place the required prevention programmes or measures, in particular those for offenders and for people who fear that they might offend, as well as assistance to victims programmes. In some cases, these programmes have not yet been put in place, and in others they are in place but do not fully comply with the requirements of the Directive. The Commission organised six dedicated workshops in 2018 and 2019 to support Member States in the transposition of these and other provisions and to better understand the challenges.
These workshops, together with additional bilateral exchanges between the Commission and Member States, revealed a need for more structured and continuous support, as some aspects of prevention and assistance to victims have traditionally not been an area of focus for Member States' action in the fight against CSA. The shortcomings typically originate in a lack of expertise in relevant areas, as well as difficulties in communication and coordination between key actors, e.g. different ministries. In particular when it comes to measures targeting (potential) offenders, there remains significant room for improvement.
138 Sto, M.; Letourneau, E.; Overview of perpetrator prevention evidence and existing programmes, October 19, 2021.
139 Unicef, Action to end Child Sexual Abuse and Exploitation: A Review of the Evidence 2020, 2020.
140 For example, a recent study by the Dutch hotline EOKM shows that 81.7% of the boys who had been victims of sextortion and were in touch with a counsellor were not satisfied with the support received.
141 Report from the Commission assessing the extent to which the Member States have taken the necessary measures in order to comply with Directive 2011/93/EU of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, COM(2016) 871 final.
In addition to the evidence gathered through monitoring the transposition of the Directive and supporting its implementation, the feedback from stakeholders during the consultation activities, in particular NGOs focused on children's rights, shows the need to improve awareness and education of children, parents, and caregivers. This feedback also included the need to improve the availability of effective prevention programmes for offenders and persons who fear that they might offend, as well as of assistance to victims programmes142.
2.3. How likely is the problem to persist?
The problem of CSA is likely to continue worsening, driven by the issues identified in the
problem drivers section.
Children will continue to spend more time online and thus be more exposed to predators operating online. Similarly, predators will most likely also be spending more time online than
before, as teleworking arrangements expand and become part of the post-pandemic new
normal, and in response to the increase in opportunities to encounter children online.
Relevant services will continue to be misused for the purpose of CSA, in particular those that do not adopt meaningful voluntary measures. It is unrealistic to expect that, in the absence of incentives or obligations, the relevant service providers would implement sufficient voluntary measures, given that many have failed to do so to date despite the evident proliferation of CSA online. Images and videos will continue to stay online. Smaller players in particular will continue to be dissuaded by the lack of legal certainty. The fragmented legal framework can also lead to high compliance and operational costs for all service providers offering their services in the EU, since their obligations might differ and be more burdensome in one Member State than in another.
In the absence of EU action, Member States will see a need to step up and fill the gap, as some have already done or are in the process of doing. The increasing legal fragmentation concerning obligations on service providers to detect and report CSA online (known and new material and grooming) and to remove that material, as well as the uneven application of
voluntary measures, would continue, in particular after the Interim Regulation expires. There are already inefficiencies in public-private cooperation between online service
providers and public authorities (such as law enforcement authorities) in exchanging information in a timely manner or swiftly reacting to requests and complaints. This hampers investigations and creates obstacles to addressing child sexual abuse online and to protecting victims. Such inefficiencies would continue and potentially escalate as the overall volume of illegal activity and content grows.
The current technical solutions used to detect CSA online do not function in E2EE electronic communications. It is likely that more service providers would incorporate end-to-end encryption without effective measures to protect children. Encryption is an essential tool for ensuring cybersecurity and the protection of users' fundamental rights such as freedom of expression, privacy and personal data, but at the same time it makes the detection of CSA online (and therefore the protection of the fundamental rights of the child) much more difficult, if not impossible. This could result in more online 'safe havens' where offenders can freely exchange CSAM without fear of discovery and reprisal, normalise these crimes, actively encourage others to abuse children to generate new material, and where children may be groomed and abused online.
142 Targeted online roundtable with NGOs and feedback from the open public consultation (see annex 2, section 3).
It is unlikely that, across the board, companies will unilaterally divert investment into developing technical solutions that allow reliable detection of CSA in encrypted systems, as well as a high level of privacy and protection of other fundamental rights, security against unauthorised access and transparency (see Annex 9 for a possible set of assessment criteria for these technical solutions). Deployment of these technical solutions would require financial resources to develop the solution for feasible deployment at scale and align it with companies' current infrastructures. Smaller companies with limited resources are especially likely to encounter more difficulties, since work in this area is relatively novel and technical tools, although available, must be tailored to the specific service.
An example of the development of these tools is the announcement of new 'Child safety' initiatives143 by Apple. Apple is working towards deploying technical tools to detect known CSAM on users' devices prior to encryption and storage in the cloud. The solution uses well-developed hashing technology to generate a hash of the image the user is uploading and match it against a database of hashes of verified CSAM (see Annex 8). This takes place on the user's device prior to the image being encrypted, and does not interfere with the encryption safeguarding the transfer of data, preserving in this respect the privacy and security of data while allowing detection of known CSAM.
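To make the mechanism concrete, the following is a minimal illustrative sketch in Python of hash-based matching of known material. It is not Apple's actual implementation: the names and the placeholder database are assumptions for illustration, and a cryptographic hash is used only to keep the sketch self-contained, whereas deployed systems such as Apple's NeuralHash or Microsoft's PhotoDNA rely on perceptual hashes (robust to resizing and re-encoding) and, in Apple's case, on private set intersection rather than a plain lookup.

```python
import hashlib

# Hypothetical database of hashes of verified CSAM, as maintained in
# practice by dedicated organisations (placeholder value, not a real entry).
KNOWN_HASH_DATABASE = {"00" * 32}  # placeholder digest

def compute_hash(image_bytes: bytes) -> str:
    # A cryptographic hash keeps this sketch self-contained; real systems
    # use perceptual hashes so near-duplicates of a known image still match.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    # On-device check: hash the image and look it up in the database of
    # known hashes before the image is encrypted and uploaded.
    return compute_hash(image_bytes) in KNOWN_HASH_DATABASE

if __name__ == "__main__":
    upload = b"example image bytes"
    if matches_known_material(upload):
        print("Match against known-hash database: flag for review.")
    else:
        print("No match: proceed with encryption and upload.")
```

Because the matching happens before the content is encrypted, the encrypted transfer itself is never inspected, which is the property described above.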
However, a number of companies and privacy NGOs state that there is no way to deploy such tools to detect CSA in the context of encrypted electronic communications that would ensure protection of the privacy and security of communications. While these tools do not interfere with the encryption as such, they are seen as violating the spirit of end-to-end encryption to the extent that it promises a wholly private exchange where even illegal content is shielded, for the benefit of ensuring everyone's privacy. It is therefore likely that spontaneous developments in encrypted communications that take into consideration children's safety and privacy and all fundamental rights at stake will remain limited, given in particular the legal uncertainty and vocal opposition from some stakeholders.
As children will be increasingly exposed to predators online, prevention will play a
particularly important role. Parents and children will need the knowledge and tools to protect themselves. Without a solid and structured approach to awareness raising and education to benefit children, parents and caregivers, children will continue to fall victim to sexual abuse in greater numbers. This concerns online abuse, which may be followed by crimes committed offline, but it applies also to purely offline abuse. While awareness of the problem is currently on the rise in a number of Member States when it comes to abuse in organised sports or other activities targeting children, an effective and systematic prevention response is still lacking. Whether sexual abuse takes place offline or online, children will therefore often continue to lack information on where to seek help, and the adults around them will not be in a position to notice or remedy the problem.
On the opposite side of the problem, people who are attracted to children will continue using the online space to find victims. Those who may want to seek support to overcome this attraction will often not dare to come forward in fear of legal consequences and social stigma. Instead, they will likely continue to seek information online, and often become drawn in by other predators into committing crimes, rather than finding professional help. Therefore,
143 For more information see Apple's post on Expanded Protections for Children. On September 3, 2021, Apple announced that it would delay the implementation of the tools to gather additional feedback before deploying them. At the time of writing, two of the three tools announced have been deployed (a tool to detect nudity in Messages, and expanded guidance in Siri, Spotlight, and Safari Search), whereas the tool to detect known CSAM remains to be deployed.
initiatives addressing the more apparent aspects of prevention, such as awareness raising initiatives, will not be enough to address the entire problem, and the CSA issue is likely to continue growing. While there are some initiatives that reach out to persons who fear they may offend, without EU-level support and coordination they will likely continue to be limited, unevenly distributed and of varying effectiveness.
Increased online activity and the consequent exposure of children to predators will unavoidably result in more victims. Victims will continue to have difficulties accessing long-term assistance. Without more developed support systems in all EU Member States, the situation of victims will continue to vary. However, even in Member States with more advanced support systems, many victims will be left to face the psychological, physical and economic
consequences of CSA without proper assistance, once the immediate proceedings around the crime are closed. In cases where the crime is never reported, victims and their families may not know where to seek help, or that they should be entitled to it.
Another problem that victims will likely continue to face on their own is the effort to have their images and videos taken down swiftly and effectively. As this is a matter of practical action against illegal content rather than of harmonised criminal law, it could not adequately be addressed in a revision of the CSA Directive or the Victims' Rights Directive144, and it is too specific a problem to have been included in the DSA proposal. As long as there is no proactive search for these images and videos, they will often stay online.
3. WHY SHOULD THE EU ACT?
3.1. Legal basis
In accordance with settled case law by the Court of Justice of the EU, the legal basis of a
legislative initiative has to be determined in light of the content and aim of the envisaged measures. Given that these measures are in part still under assessment, at this stage, no definitive conclusions can yet be drawn in this respect.
That said, given the problems that this impact assessment is addressing and the solutions
proposed, Article 114 TFEU was identified as the most likely legal basis for an EU intervention. Article 114 TFEU is the basis for measures which have as their object the establishment and functioning of the internal market. In particular, Article 114 is the
appropriate legal basis to address differences between provisions of Member States' laws which are such as to obstruct the fundamental freedoms and thus have a direct effect on the
functioning of the internal market, and to prevent the emergence of future obstacles to trade
resulting from differences in the way national laws have developed145.
This initiative aims to ensure the proper functioning of the internal market, including through the harmonisation of rules and obligations concerning certain online service providers in relation to providing services which are at high risk of being used for child sexual abuse and
exploitation online. As highlighted above under Section 2.2.2, Member States have started
taking action unilaterally, adopting or considering rules to deal with the challenge posed by child sexual abuse online, which are necessarily national in scope and risk fragmenting the
Digital Single Market. This initiative aims to ensure common rules creating the best conditions for maintaining a safe online environment with responsible and accountable behaviour of service providers. At the same time, the intervention provides for the appropriate
supervision of relevant service providers and cooperation between authorities at EU level, with the involvement and support of the EU Centre where appropriate. As such, the initiative should increase legal certainty, trust, innovation and growth in the single market for digital services.
144 Directive 2012/29/EU of 25 October 2012 establishing minimum standards on the rights, support and protection of victims of crime, OJ L 315, 14.11.2012.
145 See C-380/03 Germany v European Parliament and Council, judgment of 12 December 2006.
Articles 82 and 83 TFEU, which constitute the legal basis for the CSA Directive, provide a basis for criminal law rules concerning, inter alia, the rights of victims of crime and the definition of criminal offences and sanctions in areas of particularly serious crime with a cross-border dimension, such as sexual exploitation of children. As the present initiative would not seek to harmonise criminal law, Articles 82 and 83 TFEU are not the appropriate legal basis.
3.2. Subsidiarity: necessity of EU action
A satisfactory improvement as regards the rules applicable to relevant online service
providers active on the internal market aimed at stepping up the fight against CSA cannot be
sufficiently achieved by Member States acting alone or in an uncoordinated way. In particular, a single Member State cannot effectively prevent or stop the circulation online of a CSA image or video, or the online grooming of a child, without the ability to cooperate and coordinate with the private entities that provide services in several (if not all) Member States. As presented above under Section 2.1., several Member States took, or are in the process of taking, the initiative to adopt national laws in order to step up against the proliferation of CSA online. Although these approaches share the same objective, their way of achieving that objective differs, targeting for instance different types of services and introducing varying requirements and different enforcement measures.
In the absence of EU action, Member States would have to keep adopting individual national laws to respond to current and emerging challenges, with the likely consequence of fragmentation and diverging laws likely to negatively affect the internal market, particularly with regard to online service providers active in more than one Member State (see problem driver section 2.2.2.). Individual action at Member State level would also fail to
provide a unified system for cooperation in the fight against these crimes between public authorities and service providers, leaving them to deal with different legal systems and
diverging rules instead of one harmonised approach.
This initiative would build on the DSA proposal, which creates a harmonised baseline for addressing all illegal content, to create a coherent system throughout the EU for the specific case of CSA content, which is characterised in particular by its non-public nature and the gravity of the crimes. Such a coherent system cannot be achieved at Member State level, as also set out in detail in the Impact Assessment accompanying the DSA proposal146.
3.3. Subsidiarity: added value of EU action
Reduce fragmentation and compliance/operational costs, improving the functioning of the internal market
Legal fragmentation (divergence in national legislation addressing these issues) increases the compliance and operational costs of actions in the fight against CSA for stakeholders such as online service providers that operate in several Member States, and may lead to legal uncertainty, in particular when the fragmentation also causes conflicts of laws. EU action would provide legal certainty and a coherent approach applicable to entities operating in
146 Impact Assessment accompanying the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC, 15 December 2020, SWD(2020) 348 final.
several Member States, facilitating the scaling up and streamlining of their efforts in the fight against CSA and improving the functioning of the Digital Single Market.
Given the cross-border aspects of the problem, having regard to the inherent cross-border nature of the Internet and to the many services provided online, the number of policy areas concerned (single market for digital services policy, criminal law, economic issues, and fundamental rights including the rights of the child, freedom of expression, privacy and data protection), and the large range of stakeholders, the EU seems the most appropriate level to address the identified problems and limit legal fragmentation. As previously described, CSA, in particular in its online aspects, frequently involves situations where the victim, the abuser, and the online service provider are all under different national legal frameworks, within the EU and beyond. As a result, it can be very challenging for single countries to effectively define the role of, and cooperation with, online service providers without common rules and without fragmenting the Single Market (see problem driver section 2.2.2.).
Facilitate and support Member States' action on prevention and assistance to victims to increase efficiency and effectiveness
While Member States are best placed to assess the gaps and needs, and to implement action in their local context, they often lack information on what prevention and assistance to victims programmes are available, how effective they are, and how to approach their implementation in practice - who needs to be involved, and what the technical and legal prerequisites and estimated costs are. EU level action can provide a forum for the exchange of necessary information and expertise to avoid duplication of efforts and blind spots. EU action can also help identify best practices and lessons learned at national level (from Member States or third countries) and incorporate them into EU-level initiatives, so that other Member States can benefit from them. This may also prevent a "whack-a-mole" effect in which a Member State successfully addresses a problem in its territory but the problem just moves to another Member State (e.g. hosting of CSAM online).
While some exchange in this area exists, the feedback from experts in the field indicates there is a need for a structured framework for such exchanges. EU level action promoting and
disseminating research would help to enrich the evidence base in both areas and could
possibly even link initiatives across Member States, boosting efforts. EU action could also include practical support to local interventions, e.g. translations of existing materials from another Member State, possibly leading to significant cost savings at national level.
The EU level action on prevention and assistance to victims at issue here would not impose any additional obligations beyond those included in the CSA Directive. Indeed, the main focus of the present initiative is on strengthening the functioning of the internal market by setting common rules aimed at combating the misuse of online services for CSA-related purposes. Nonetheless, the action could also contribute to facilitating and supporting Member States' work to comply with the existing obligations, notably through the sharing of expertise and best practices, benefitting from the central position it occupies in connection with its principal tasks regarding the detection and reporting of online CSA.
Reduce dependence on and facilitate cooperation with third countries
Currently, in practice, law enforcement authorities of the Member States depend almost entirely on NCMEC, a private organisation located in the US, as the main source of reports of CSA online. EU action could ensure, among other things, that such dependence is reduced and that the detection, reporting and removal of CSA online is done through EU mechanisms that operate according to EU rules, including the necessary safeguards. In addition, EU
mechanisms could be more closely linked to what is illegal in the EU and its Member States,
rather than relying on definitions from third-country jurisdictions. This would enhance the
precision of efforts, reduce the impact on third parties, and better target measures.
4. OBJECTIVES: WHAT IS TO BE ACHIEVED?
4.1. General objective
The general objective is to improve the functioning of the internal market by introducing clear, uniform and balanced EU rules to prevent and combat CSA, notably through imposing detection, reporting and removal obligations on certain online service providers.
4.2. Specific objectives
There are 3 specific objectives that address the problem drivers identified in section 2.2.:
1. Ensure the effective detection, reporting and removal of online CSA where these are currently missing. This specific objective is of particular relevance to problem driver 1, as the current voluntary action by online service providers under diverging national laws is insufficient to effectively detect, report and remove CSA online across the EU, i.e.
by not detecting some crimes or by not being effective in dealing with those detected. It is also of relevance to problem driver 2, since part of the current inefficiencies in the
detection, reporting and removal process are due to inefficiencies in public-private cooperation.
2. Improve legal certainty, transparency and accountability and ensure protection of fundamental rights. This specific objective is of particular relevance to problem driver 1, as the current voluntary action by online service providers and the action taken under
diverging national laws is not sustained on a clear, uniform and balanced EU-level framework that provides long-term legal certainty, transparency and accountability and ensures protection of fundamental rights. This objective therefore reflects the need to create a clear framework, with the appropriate safeguards to ensure respect for children's rights and all users' rights, including the right to freedom of expression, right to private life and communications as well as data protection, and to provide regular information about its functioning, including e.g. transparency reports on technologies used for the identification of CSA content.
3. Reduce the proliferation and effects of CSA through harmonisation of rules and increased coordination of efforts. This specific objective is of particular relevance to
problem drivers 2 and 3. Coordination issues are at the core of the inefficiencies in public-private cooperation in problem driver 2, and improved coordination could boost Member States' efforts on prevention and assistance to victims.
Contribution to relevant SDGs
The three specific objectives directly contribute to achieving the SDGs most relevant to this initiative: 5.2., eliminate all forms of violence against women and girls, and 16.2., end abuse, exploitation, trafficking and all forms of violence against children.
Specific objectives 1 and 3 also directly contribute to achieving other SDGs of relevance, such as SDG 1 on poverty and SDG 3 on health, by reducing the proliferation and effects of CSA and ensuring the detection, reporting and removal of CSA online where it is currently missing. Contributing to preventing and/or stopping the abuse can reduce the negative consequences for health, including mental health, which may have a negative impact on the economic future
of the child (e.g. through substance abuse or decreased productivity). Specific objective 3 helps achieve SDG 4 on education (e.g. through the awareness raising campaigns or the exchange of related best practices facilitated by the EU Centre). Finally, specific objective 2 helps achieve SDG 9 on industry, innovation and infrastructure (e.g. as the initiative aims to support service providers' efforts to fight against CSA online, including through increasing legal certainty and providing the required safeguards, without hampering innovation in the technologies to detect, report and remove CSA online).
5. WHAT ARE THE AVAILABLE POLICY OPTIONS?
5.1. What is the baseline from which options are assessed?
In the baseline scenario, no further EU policy action is taken. The following section assesses the most likely scenario in the absence of the initiative, i.e. how the existing and already planned policy instruments would address the problems and objectives identified for EU action:
1. Legislation
Existing and upcoming EU legislation is not likely to effectively address challenges in the detection, reporting and removal of CSA online, the prevention of CSA, and assistance to victims. The proliferation of CSA online would be expected to continue in line with current developments. Specifically, the added value (i.e. what it can achieve in preventing and combatting CSA) and the limitations of the existing and upcoming EU legal instruments are the following:
Horizontal instruments
The GDPR.
What it can achieve in the fight against CSA: online service providers have relied on legal bases in the GDPR for the processing of personal data required in relation to their voluntary activities to combat CSA online, e.g. under legitimate interest (Art. 6(1)(f)) or vital interest (Art. 6(1)(d)) considerations.
Limitations: the GDPR as a horizontal instrument does not contain CSA-specific
provisions, i.e. provisions that explicitly allow or mandate the processing of personal data
for the purpose of combatting CSA online.
The ePrivacy Directive and its proposed revision
What it can achieve in the fight against CSA: the ePrivacy Directive and its proposed revision allow restrictions of certain rights and obligations under their scope, inter alia to prevent or prosecute CSA. Such restrictions require a proportionate legislative measure under national or EU law. With the entry into force of the Interim Regulation, subject to compliance with a set of conditions, certain rights and obligations are temporarily limited (Articles 5(1) and 6(1) of the ePrivacy Directive, for certain providers of online communications services), for the sole purpose of detecting and reporting CSA online and removing CSAM.
Limitations: as horizontal instruments, the ePrivacy Directive and its proposed revision do not contain CSA-specific provisions. Member States are notably responsible for enforcement through their competent national authorities (see also the Interim Regulation below).
The eCommerce Directive
What it can achieve in the fight against CSA : with regard to hosting services, the eCommerce Directive is notably the basis for the notice and action mechanism in which
parties such as users or hotlines notify online service providers of the presence of CSAM available in their services, so that it can be removed.
Limitations: the eCommerce Directive does not contain CSA-specific provisions, i.e. provisions that explicitly enable or oblige online service providers to detect, report or remove CSA online. Furthermore, as noted, while failure to act expeditiously can lead to hosting service providers not being able to invoke the liability exemption (and they could thus be held liable under national law), there is no legal obligation upon the service providers to act, even when notified of manifestly illegal CSA.
The Digital Services Act
What it can achieve in the fight against CSA: the DSA proposal147, once adopted, will:
o provide a horizontal standard of obligations for content moderation by providers of intermediary services; eliminate disincentives for these providers' voluntary efforts to detect, identify and remove, or disable access to, illegal content; and create obligations for them to provide information on their content moderation activities and on their users when requested by national authorities. These provisions are likely to encourage providers to implement voluntary measures and will also create more transparency and accountability for providers' content moderation efforts in general;
o create due diligence obligations tailored to certain specific categories of providers (notice and action mechanism148, statement of reasons, internal complaint-handling system, reacting swiftly to notices issued by trusted flaggers, notification of
suspicions of criminal offences etc.) and transparency reporting obligations. In
particular, it will oblige very large platforms to assess risks and implement the
necessary risk mitigation measures on their services. These measures will
encourage users and trusted flaggers to report suspected illegal content and
providers to follow up on these reports more swiftly. The obligations on very large platforms are also likely to contribute to lessening the prevalence of illegal content online and users' exposure to such content;
o establish rules on its own implementation and enforcement, including as regards the cooperation of and coordination between the competent authorities. This can lead to faster and more efficient content moderation efforts across the EU,
including with regard to CSAM.
Limitations. Due to its general and horizontal nature and its focus on public-facing content, the DSA only addresses the issue of CSA partially. Its approach is appropriate for the wide range of heterogeneous illegal content for which the DSA sets the overall baseline, but it does not fully address the particular issues concerning the detection, reporting and removal of CSA online. Specifically:
o Voluntary detection: the DSA does not specify the conditions for the processing of personal data for the purpose of voluntarily detecting CSA online;
o Mandatory detection: the DSA does not include any obligation to detect CSA online. Obligations to carry out risk assessments and take effective risk
147 Impact Assessment accompanying the DSA proposal, SWD(2020) 348 final, December 2020.
148 The DSA proposal includes an obligation on providers of hosting services to process the notice received (e.g. by hotlines combatting CSAM), including taking a decision on any follow-up to it, and the possibility of sanctions for non-compliance.
mitigating measures, as applicable, apply only to the largest online platforms, consistent with their general nature;
o Reporting: although it contains some provisions in this respect, the DSA does not provide for a comprehensive CSA reporting obligation, since it focuses on cases where an offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place. Also, given the diverse nature of the content that could be concerned, the DSA does not determine specific reporting requirements (i.e. what minimum information the report should contain) and does not provide for the involvement of a body like the EU Centre in the reporting process.
o Removal: like the eCommerce Directive (see above), the DSA sets out liability exemptions that encourage removal, but it does not include any removal obligations149.
In particular, while the DSA, once adopted, should have a significant impact especially when it comes to publicly accessible content, its effect is likely to be less pronounced on content exchanged secretly and in non-public channels (e.g. in interpersonal communications), as is typical for the majority of CSA online. Considering this and the above limitations, the DSA will not eliminate the risks of legal fragmentation introduced by the national initiatives on combatting CSA online. These are likely to provide a more specific and targeted approach than the DSA, partially targeting different services, in order to ensure an effective and targeted response to CSA online.
The Victims' Rights Directive
What it can achieve in the fight against CSA: as a horizontal instrument, the Victims'
Rights Directive covers the assistance, support and protection to all victims of crime. The CSA Directive contains additional specific rules that respond more directly to the specific needs of CSA victims.
Limitations: the Victims' Rights Directive refers to the need to cooperate with other Member States to improve victims' access to the rights set out in the Directive, but it does not contain specific mechanisms to do so. Moreover, as mentioned above, this Directive does not address only CSA victims, for whom dedicated mechanisms to facilitate the exchange of best practices, taking into account their specific needs, may be required.
Sector-specific legislation
The Child Sexual Abuse Directive
What it can achieve in the fight against CSA: the CSA Directive focuses on defining the role of Member States and their public authorities in preventing and combating these crimes, and in assisting victims. Specifically, the Directive defines criminal behaviour online and offline, sets the minimum level of maximum sanctions, and requires Member States to ensure adequate assistance and support to victims, as well as to put in place prevention measures.
Limitations: as a criminal law instrument, the CSA Directive does not aim to regulate online service providers and so it does not provide sufficient specification of the role of
149 The DSA proposal (and the e-Commerce Directive) establish the conditions under which a service provider cannot be held liable in relation to illegal content in its services and not the conditions under which a
provider can be held liable, as this is up to national or EU law (such as this proposal on CSA) to determine.
service providers and the procedures to apply. In addition, the scope of the actual obligation (as a criminal law instrument) has to be limited to each Member State's own territory, which makes it a less effective tool given the global nature of the Internet.
The Interim Regulation
What it can achieve in the fight against CSA: it makes it possible for providers of number-independent interpersonal communications services to continue or resume their voluntary measures to detect and report CSA online and remove CSAM, provided these are lawful and, in particular, meet the conditions set.
Limitations: as a temporary measure with the aim of bridging the period until long-term legislation (that is, the present initiative) is put in place, it applies only for three years (until 3 August 2024) and does not establish a legal basis for any processing of personal data. The service providers within the scope of the Interim Regulation would therefore not be able to continue their voluntary activities when the Regulation ceases to apply. In
addition, the Interim Regulation is not suitable to offer a long-term solution, since it only addresses one specific part of the problem, for a limited subset of services (number independent interpersonal communication services), and relies fully on voluntary approaches.
The Europol Regulation and its proposed revision
What it can achieve in the fight against CSA: the revised mandate of Europol should enable Europol, in cases where private parties hold information relevant for preventing and combatting crime, to directly receive, and in specific circumstances exchange, personal data with private parties. Europol would analyse this data to identify all Member States concerned and provide them with the information necessary to establish their jurisdiction. To this end, Europol should be able to receive personal data from private parties, inform such private parties of missing information, and ask Member States to request other private parties to share further additional information. These rules would also introduce the possibility for Europol to act as a technical channel for exchanges between Member States and private parties. Such a development would contribute to increasing the level of cooperation between the three aforementioned stakeholders, potentially improving the effectiveness of CSA investigations.
Limitations: in and of itself, the revised mandate of Europol will not contribute to a comprehensive solution to address CSA online, which requires a multi-faceted approach. Enabling a more efficient exchange of personal data between Europol and private parties is a necessary but not a sufficient condition for achieving this objective.
2. Coordination
EU level cooperation in investigations
What it can achieve in the fight against CSA: the existing EU level cooperation in investigations in the fight against CSA150 has produced significant successes and will likely continue to do so.
Limitations: the ability of Europol and law enforcement agencies in the EU to examine unique CSAM in crime investigations is limited by the resources that they can allocate to this area. For example, Europol has only been able to examine 20% of the 50 million images and videos in its database151. The EU Centre could play an important role in supporting Europol in these tasks.
150 See for example here and here.
EU level cooperation in prevention
What it can achieve in the fight against CSA: the network of experts on prevention will continue developing and adding more members, both researchers and practitioners, mostly from the EU but also globally, so that it can ultimately support Member States in implementing the prevention articles of the CSA Directive.
Limitations: currently, the Commission services themselves are supporting the work of the network by coordinating its work and providing a secretariat. However, there are limits to the level of support these services can provide to the network, in particular as the network expands. The activities of the network could therefore be constrained to a level that would not allow it to reach its full potential of support to Member States.
EU level cooperation in assistance to victims
What it can achieve in the fight against CSA: the Victims' Rights platform would facilitate the exchange of best practices, mostly on horizontal issues related to victims' rights and mostly on policy-related issues.
Limitations: the focus on horizontal issues could limit the effectiveness of the platform for CSA victims, given the specificities of these crimes and their short- and long-term effects on victims.
Multi-stakeholder cooperation at EU and global level
What it can achieve in the fight against CSA: at EU level, the EU Internet Forum (EUIF) has facilitated discussion between public authorities and online service providers in the EU in the fight against CSA at all levels, from ministerial to technical (see annex 8 for an example of the output of technical discussions under the EUIF). It is expected that similar discussions will continue in the future. At global level, the WPGA has advanced countries' commitment towards a more coordinated response to the global fight against CSA, based on global threat assessments and a model national response. These have helped to clarify the challenges and assist member countries in setting achievable practical goals, and it is expected that they will continue to do so in the future.
Limitations: at EU level, the focus of the EUIF is to facilitate targeted exchanges between public authorities and online service providers. The forum is not designed for discussions with a wider variety of stakeholders, including practitioners. Moreover, participation is voluntary and there are no legally binding obligations. At global level, the EU will continue supporting global efforts through the WPGA. In the absence of a single European information hub, exchanges of expertise and best practices with leading centres worldwide (e.g. the Australian Centre to Counter Child Exploitation, NCMEC, the Canadian Centre for Child Protection) will be limited. This will in particular concern initiatives on prevention and assistance to victims, leaving EU Member States to their own devices.
3. Funding
What it can achieve in the fight against CSA: action using EU funding is likely to continue in the current project-based form, both as calls for proposals as well as research.
151 European Parliament Intergroup on Children's Rights expert meeting on EU legislation in the fight against child sexual abuse online, 15 October 2020, see 59:29.
made by service providers152, the number of such reports (and therefore overall reports) could eventually decrease significantly; a similar drop in reports could be expected with the broader deployment of E2EE by default in these services;
- Member States' law enforcement authorities would continue to receive the (fewer) reports through NCMEC, submitted by a small number of service providers and assessed in accordance with US law, which has different definitions of illegal content than EU law. The quality of the reports would remain at today's levels;
- victims' images and videos will continue to circulate online. Law enforcement authorities will be unaware of the undetected crimes and unable to identify and rescue victims and investigate and prosecute these cases;
- the full potential of the hotlines would remain underutilised, as they would continue to lack a legal basis to search for CSAM proactively, despite the higher effectiveness compared to being totally dependent on users' reports;
- without harmonised standards on the responsibilities and actions expected from service providers in the fight against CSA, their different approaches will fail to offer a reliable standard for the protection of users' rights153;
- the worsening situation would increase pressure on Member States to take action at a national level once the Interim Regulation expires to address the legal vacuum, creating a risk of further fragmentation of the Single Market. A patchwork of national measures would not effectively protect children, given the cross-border and international dimension of the issues, and would create distortions in the functioning of the single market for digital services. While these will be partially addressed by the DSA, once adopted, a significant degree of fragmentation is expected to persist and possibly grow, given the manifestly illegal nature of CSAM and the specific channels for its dissemination and proliferation (see problem driver section 2.2.2.);
- without further EU facilitation of efforts, Member States' action on prevention and assistance to CSA victims is not likely to significantly improve. The sharing of best practices between Member States will continue to be punctual and unstructured, and the current limitations in the effectiveness of existing programmes are likely to persist, as well as the duplication of efforts.
Baseline costs
In the baseline scenario, no costs would be incurred by the creation and running of the Centre or any new organisation. However, the inefficiencies in the prevention and investigation of child sexual abuse and in assistance to its victims are expected to have a negative economic impact on society. A higher number of victims will experience a diminished quality of life, likely resulting also in productivity loss, and will require significant support, putting a strain on
public services.
The economic impact on public authorities will depend upon the level of action taken by service providers, which will dictate the number of reports received by those authorities. The economic impact on service providers will depend on their level of engagement against these crimes. The existing legal fragmentation and legal uncertainty would remain and could act as a barrier to growth and innovation within the single market for digital services and hamper
152 See section 2 and annex 6, section 2.
153 As noted in the impact assessment for the DSA, in the absence of a targeted regulatory framework, companies are setting and enforcing the rules themselves, driven mainly by their commercial interests and not consistently addressing the societal concerns inherent to the digital transformation they are enabling.
the fight against CSA. In the absence of a central hub, fragmented efforts would continue,
driving up the economic costs for individual entities.
As seen in box 4, the impact of CSA on its victims generates significant costs. Assuming similar costs and prevalence of CSA in the US and in the EU, and adjusting for the larger population of the EU, the estimated annual cost of CSA in the EU (and therefore the cost of no action) is EUR 13.8 billion154.
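Schematically, the population adjustment underlying this figure can be written as below (the notation is ours; the underlying US cost estimate and population figures are those referenced in box 20 and section 6.2.2.):

\[
\text{Cost}_{\mathrm{EU}} \;\approx\; \text{Cost}_{\mathrm{US}} \times \frac{\text{Population}_{\mathrm{EU}}}{\text{Population}_{\mathrm{US}}} \;\approx\; \text{EUR 13.8 billion per year}
\]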
5.2. Description of the policy options
In the determination of available policy options, three main considerations played a decisive role.
First, there are important rights at stake: on the one side, the rights of the child to be
protected and the interest in preventing the circulation of CSAM as illegal content violating the intimacy and right to privacy of the victim; on the other side, the rights of all users
especially to freedom of expression, privacy of communications and data protection. Naturally, the rights and interests of the providers, such as freedom to conduct business, are to be taken into account as well.
Second, offenders have proven savvy at moving to services that are less effective in detecting CSA online. Consequently, the policy options need to ensure an even application of the rules, in order to avoid simply pushing the problem from one platform onto another.
Third, more effective measures may not amount to imposing a general obligation on
providers of intermediary services to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal activity. The Commission has
recently confirmed its commitment to this principle, as reflected at present in Article 15(1) of the e-Commerce Directive155 and in Article 7 of the DSA proposal.
Box 9: prohibition of general monitoring obligations
The exact meaning and extent of the prohibition to impose a general monitoring obligation is
only gradually becoming clear. A case-by-case assessment is required to determine whether in a given situation the prohibition is respected or violated. The Court of Justice of the EU
(CJEU), in its case law, has indicated certain criteria for deciding whether an obligation to monitor the information which intermediary service providers transmit, or to actively seek facts or circumstances indicating illegal activity, is to be considered general and thus
prohibited. Thus far, the CJEU has dealt with this question in the context of copyright infringement and defamation, where the illegality or not of content may not be immediately apparent. It has not yet had to assess a similar obligation with regard to manifestly illegal content such as most CSAM. Also, the case law available thus far relates to obligations resulting from orders based on national law, not EU legislation. The precise content and scope of the obligations in question are naturally also an important factor to be considered.
Based on the case law of the CJEU, it is required that a fair balance be struck between all relevant and conflicting fundamental rights at stake, such as those mentioned above. For
instance, it ruled156, in the context of combating intellectual property rights infringements, that it is not allowed to impose an obligation which cumulatively meets the following conditions:
154 Includes direct costs (victims' assistance) and lifelong loss of potential earnings and productivity, see section 6.2.2. on benefits for more details (box 20).
155 OJ L 178, 17.7.2000, p. 1-16.
156 Cases C-70/10 and C-360/10 - SABAM.
- applies for all customers in abstracto and as a preventative measure, in particular without further specification of the content to be identified;
- at providers' own cost;
- for an unlimited period; and
- is based on a system for filtering most of the information to identify electronic files (stored on a provider's servers), including future content.
In a different context, namely an order aimed at tackling a particular item of content that the national court had held to be defamatory, as well as content equivalent thereto, the CJEU ruled157 in essence that:
- a service provider can in principle be ordered to take measures to detect and remove the item of defamatory content, even if it means monitoring the content provided by users other than the one who had initially posted the content;
- such an obligation can also be extended to content equivalent to the defamatory content, subject however to a number of conditions (only minor differences as compared to the defamatory content, sufficient specifications by the court issuing the order, no need for an independent assessment by the service provider).
All policy options that can be considered therefore need to meet a number of specific requirements in order to limit any interference with fundamental rights to what is strictly necessary and to ensure proportionality and compliance with the prohibition of general monitoring obligations:
Obligations have to be targeted to those services which are at risk of being used for
sharing CSAM or for grooming children.
They have to strike an appropriate balance between the interests and
(fundamental) rights associated with ensuring an effective approach to combating CSA and protecting children and their rights, on the one hand, and on the other hand the interests and rights of all users, including freedom of expression, privacy of communications and data protection, as well as avoiding an excessive burden on the service provider. To ensure that balance, they have to contain appropriate conditions and safeguards to ensure proportionality, transparency and accountability. Given the significant impact on fundamental rights, the effectiveness of the measures and of these conditions and safeguards should be subject to dedicated monitoring and enforcement mechanisms.
In line with the above requirements, the policy options assessed take a graduated approach, addressing the problem drivers from different angles and in various degrees, with an
increasing level of obligations and intrusiveness. This cumulative logic was chosen because the measures that form the options are not only not mutually exclusive, but also complementary, presenting synergies that the combined options can benefit from.
As a result, in addition to the baseline, five options are retained for assessment, as first
presented in the intervention logic in table 1. The building blocks of these options are the retained policy measures that resulted from scoping and analysing the full spectrum of
possible EU intervention, from non-legislative action to legislative action.
Figure 3 below shows how the measures combine to form the retained policy options:
157 Case C-18/18 - Facebook Ireland.
Figure 3: overview of policy options and corresponding measures
[Figure: maps the measures onto options O (no EU action) and A-E. Measures 1 (practical measures to enhance voluntary efforts) and 2 (EU Centre on prevention and assistance to victims) form the non-legislative option A; option B combines measure 1 with measures 3 (EU Centre on prevention and assistance to victims and combating CSA online), 4 (legislation specifying the conditions for voluntary detection) and 5 (obligation to report and remove CSA online); the legislative options C, D and E cumulatively add measures 6 (obligation to detect known CSAM), 7 (obligation to detect unknown CSAM) and 8 (obligation to detect grooming).]
The retained policy options were selected for their potential to contribute to creating a level
playing field across the EU, lessening legal fragmentation, increasing efficiency in tackling the problem (e.g. by facilitating Member States' action through sharing of expertise), and
creating more balanced circumstances for all the affected providers, while also contributing to
reducing their compliance and operational costs.
5.2.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
This option is non-legislative and includes practical measures to stimulate cross-sectorial
cooperation among relevant stakeholders in prevention and assistance to victims, and enhance voluntary detection, reporting and removal of CSA online by relevant online service providers, within the boundaries of the existing legal framework (measure 1). This
option also includes an EU Centre to support and facilitate information sharing on
prevention and assistance to victims (measure 2).
1. Practical (i.e. non-legislative) measures to enhance and support voluntary efforts of relevant information society service providers to detect, report and remove CSA online, and to enhance prevention and assistance to victims. Examples of practical measures to enhance detection, reporting and removal include developing codes of conduct and standardised reporting forms for service providers, improving feedback mechanisms and communication channels between public authorities and service providers, and facilitating the sharing of hashes and detection technologies between service providers. Examples of practical measures to enhance prevention and assistance to victims include facilitating research and the exchange of best practices, facilitating coordination, and serving as a hub of expertise to support evidence-based policy in prevention and assistance to victims.
2. EU Centre on prevention and assistance to victims.
This measure would create an EU-funded expertise hub, managed by the Commission with support from a contractor (similar to the Radicalisation Awareness Network,
RAN158). Among others, it would support Member States in implementing the relevant
provisions of the CSA Directive (e.g. through expert workshops), and serve as a hub of
expertise to support evidence-based policy and avoid duplication of efforts. It would also
help develop and disseminate research and expertise, and facilitate dialogue among stakeholders. This would allow Member States to benefit from best practices and lessons learned in the EU and globally. Having both prevention and assistance to victims in the same hub would increase the possibilities for coherence and cross-fertilisation between both strands of work.
The purpose of prevention efforts led by the EU Centre would be to support Member States in putting in place tested and effective prevention measures that would decrease the prevalence of CSA in the EU and globally. The scope of these efforts would cover the two main types of prevention initiatives, i.e. 1) those that reduce the likelihood that a child becomes a victim (e.g. awareness raising and educational campaigns and materials for schools), and 2) those that reduce the likelihood that a person (re)offends. The Centre would facilitate Member States' action on prevention by serving as a hub of
expertise at the service of Member States, notably to help avoid duplication of efforts and to foster an evidence-based approach to prevention policies.
Under the lead of the EU Centre, a network of experts on prevention would facilitate the development of these efforts, the involvement of multiple stakeholders and the
sharing of best practices and lessons learned across Member States. The network would
enabling the cascading down of best practices and new developments from EU and
global level to national and regional levels. The Centre would support the work of the network by e.g. hosting relevant repositories of best practices, providing statistics and other data relating to the prevalence of offending, offender profiles and pathways, and new crime trends, particularly those relating to perpetrators' use of technology to groom and abuse children.
The EU Centre will not have any power to impose any prevention initiative on Member States, i.e. it will not coordinate in the sense of determining "which Member State is obliged to do what". Its tasks in this respect will be ancillary to its principal tasks, which relate to the implementation of the detection and reporting processes.
With regard to assistance to victims, the Centre would play a similar role: facilitate the
implementation of the practical measures on assistance to victims by serving as a hub of expertise to support the development of evidence-based policy and research on assistance to victims, including victims' needs and the effectiveness of short and long-term assistance programmes. In addition, the Centre could provide resources to help victims find information on support that is available to them locally or online. The Centre would not provide assistance to victims directly when those services are already provided or would be best provided at national level, to avoid duplication of efforts. Also, the Centre would serve as a facilitator at the service of Member States, including by sharing best practices and existing initiatives across the Union. In that sense, it would facilitate the coordination of Member States' efforts to increase effectiveness and
158 See here for more information about the Radicalisation Awareness Network. The hub would not take the form of an agency.
efficiency. Similarly to prevention, the Centre will not have any power to impose any initiative on assistance to victims on Member States, including on issues concerning health, legal, child protection, education and employment.
The possibility to create an EU Centre on prevention and assistance to victims is further
explored in Annex 10, as implementation choice A. As existing entities or networks cannot be expected to fulfil this role, a central entity is the most viable solution. The Centre could also help to improve the cooperation between service providers and civil
society organisations focusing on prevention efforts.
5.2.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal
This option combines the non-legislative option A with legislation to improve the detection,
reporting and removal of CSA online, applicable to service providers offering their services in the EU. It would 1) provide a long-term regulatory framework for voluntary detection (measure 4); 2) put in place mandatory reporting and removal where CSA online is found (measure 5); and 3) set up an EU Centre to facilitate detection, reporting and removal of CSA online, as well as prevention and assistance to victims (measure 3).
1) Legal framework for voluntary detection of CSA online. This measure would build on and complement the DSA proposal, to address the specific challenges inherent in CSA that cannot be addressed with general systems building on notification by users and trusted flaggers as envisaged by the DSA, and provide a framework for relevant service
providers to voluntarily detect CSA online, including known and new CSAM and
grooming. It would replace the Interim Regulation, building on its safeguards in a more
comprehensive framework, covering all relevant services, i.e. also those defined in the DSA and not only the electronic communications services within the scope of the Interim Regulation (i.e. providers of instant messaging and email). The legal framework would provide increased legal certainty also when it comes to the basis and conditions for processing of personal data for the sole purpose of detection of CSA online.
Given in particular the impact on fundamental rights of users, such as personal data
protection and confidentiality of communications, it would include a number of
mandatory limits and safeguards for voluntary detection. These would notably include
requiring service providers to use technologies and procedures that ensure accuracy, transparency and accountability, including supervision by designated national authorities. The legislation could set out the information rights of users and the mechanisms for complaints and legal redress.
Stakeholders' views from the open public consultation on voluntary measures
The percentage of responses to the open public consultation from each of the main stakeholder groups that indicated that the upcoming legislation should include voluntary measures to detect, report and remove CSA online was the following: public authorities 25%, service providers 13%, NGOs 9%, and general public 10%. The support for voluntary measures was highest for known material and lowest for grooming (e.g. 11.3% for known material, 9.7% for new material and 6.5% for grooming in the NGO group).
2) Legal obligation to report CSA online. Relevant service providers would be required to
report to the EU Centre any instance of suspected CSA that they become aware of, based on voluntary detection measures or other means, e.g. user reporting. This obligation would build on and complement the reporting obligation set out in Article 21 of the DSA
proposal, covering the reporting of criminal offences beyond those involving a threat to
the life or safety of persons (e.g. possession of CSAM). In order to enforce the reporting obligations, competent national authorities in the Member States would be designated. The legislation would also include a number of conditions (e.g. to ensure that the reports contain actionable information) and safeguards (e.g. to ensure transparency and
protection of personal data, see section 5.2.3.).
Legal obligation to remove CSA online. As mentioned earlier, under the eCommerce Directive and the DSA proposal, hosting service providers are required to expeditiously remove (or disable access to) CSAM that they obtain actual knowledge or awareness of, or risk being held liable due to the resulting unavailability of the liability exemptions contained in those acts. Given that this system encourages but does not legally ensure removal, it would be complemented by rules ensuring a removal obligation in cases of confirmed CSA online; where necessary, national authorities would be empowered to issue a removal order to the concerned providers requiring them to remove the specific CSAM on their services. The rules would be accompanied by the necessary conditions (e.g. to ensure that the removal does not interfere with ongoing investigations) and safeguards (e.g. to ensure transparency and protection of personal data and freedom of expression), including rules on redress. Member States' national authorities would be competent for enforcement, relying where relevant also on the expertise of the Centre.
SMEs would also be required to report and remove in accordance with the above rules, benefiting however from additional support by the Commission and the Centre through:
- tools to facilitate the reporting and removal, made available by the EU Centre at no cost, for SMEs to use in their services if they wish, reducing their financial and operative burdens;
- guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations; and
- specific training, delivered in collaboration with Europol and the national authorities.
3) EU Centre to prevent and counter CSA. The Centre would incorporate the supporting functions relating to prevention and assistance to victims of measure 2 and add the
ability to support the detection, reporting and removal efforts, including by helping ensure transparency and accountability. Specifically, it would:
- facilitate detection by providing online service providers with clear information on what constitutes CSA in the EU through access to a database of CSA indicators (e.g. hashes, AI patterns/classifiers) to detect CSA in their services. The Centre would help create and maintain this database of indicators that would reliably enable the detection of what is defined as CSA according to EU rules (notably the CSA Directive), as determined by courts or other independent public authorities. The material would come from multiple sources including previous reports from service providers, concluded investigations by law enforcement, hotlines or direct reports from the public to the EU Centre (e.g. from survivors requesting the Centre's support to have materials depicting their abuse taken down). The Centre would also facilitate access (in particular for SMEs) to free-of-charge technology that meets the highest standards for the reliable, automatic detection of such content;
- facilitate reporting, by becoming the recipient of the reports of CSA concerning the EU that providers detect in their online services. The Centre would serve as an
intermediary between service providers and other public authorities (notably law enforcement authorities), supporting the reporting process by 1) reviewing the reports so that those other public authorities do not need to spend time filtering out reports that are not actionable and can make the most effective use of their resources (see the illustrative sketch after this list); and 2) facilitating the communication between those other public authorities and service providers in case of requests for additional information from public authorities or requests for feedback from service providers (if needed);
- facilitate removal, by notifying in certain cases service providers of materials considered to be known CSAM and requesting removal, as well as following up on these requests. This would entail supporting victims who request to have material that features them taken down; no such service exists to date. The Centre could also be given a mandate to conduct in certain cases searches of CSAM, using the databases of indicators159. The Centre could track whether the removal has taken place. Where removal is not effected in a timely manner, the Centre could refer the matter to national authorities for action (e.g. issuing of removal orders).
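The review step under "facilitate reporting" can be pictured with a minimal Python sketch; all field names, categories and the threshold below are hypothetical illustrations, not part of the proposal:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ProviderReport:
        provider_id: str
        content_hash: str                 # fingerprint of the reported material
        matched_indicator: bool           # True if it matched the indicator database
        classifier_score: Optional[float] = None  # set when flagged by a classifier

    def triage(report: ProviderReport, min_score: float = 0.98) -> str:
        # Route a report: forward actionable ones to law enforcement, return
        # likely false positives to the provider, queue the rest for human review.
        if report.matched_indicator:
            return "forward"                 # matched confirmed known material
        if report.classifier_score is not None and report.classifier_score < min_score:
            return "return_to_provider"      # likely false positive
        return "human_review"                # new material: manual check first

In this sketch only reports matching a confirmed indicator are forwarded automatically; everything else is either bounced back to the provider or reviewed by a person, mirroring the filtering role envisaged for the Centre.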
Box 10: distribution of tasks between the EU Centre and Member States
Prevention and assistance to victims: although this would not constitute its principal task, the Centre could, through the functions described in section 5.2.1., help facilitate Member States' efforts in these two areas, notably to comply with their obligations under the CSA Directive. This initiative would not introduce new obligations on Member States on prevention and assistance to victims, including in relation to cooperation with the Centre, which would remain an optional resource at the service of Member States that wish to benefit from it.
Detection, reporting and removal of CSA online: the Centre, through the functions described above, will also serve as a facilitator of Member States' efforts on investigations, as well as a facilitator of service providers' efforts to comply with the obligations under this initiative,
particularly in relation to detection and reporting. The Centre would not have the capacity to initiate or conduct investigations, as these will remain under the responsibility of national law
enforcement, or coordinate them, as this will remain under the responsibility of Europol. It will not be empowered to order service providers to remove CSAM, either.
Given the key functions above, the Centre would become a fundamental component of the
legislation, as it would serve as a key safeguard, by acting both as the source of reliable information about what constitutes CSA online and as a control mechanism to help ensure the effective implementation of the legislation. The Centre would ensure
transparency and accountability, by serving as a European hub for the detection, reporting and removal of CSA online. In receiving reports, the Centre would notably have visibility on the effectiveness of detection (including rates of false positives), reporting and removal
measures, and on the spreading of CSAM and grooming across different platforms and
jurisdictions.
Box 11: independence of the EU Centre
To be able to play its main role as a facilitator of the work of service providers in detecting, reporting and removing the abuse, and of the work of law enforcement in receiving and investigating the reports from service providers, it is essential that the Centre be independent:
159 The proactive search could be done using a "web crawler", similar to the one used in Project Arachnid by the Canadian Centre for Child Protection.
- from service providers, to be able to serve both as the source of reliable information about what constitutes CSA online, providing companies with the sets of indicators on the basis of which they should conduct the mandatory detection, and as a control mechanism to help ensure transparency and accountability of service providers; and
- from law enforcement authorities, as the Centre must be neutral to be an effective facilitator and must ensure that it maintains an objective, fair and balanced view.
To ensure that, it will be subject to periodic reporting to the Commission and to the public. The Centre should also be independent from national public entities of the Member State that would host it, to avoid the risk of prioritising and favouring efforts in this particular Member State.
The Centre would also reduce the dependence on private organisations in third countries, such as NCMEC in the US, for the fight against CSA in the EU. The Centre would operate within the EU and under EU rules and would reduce the need for international transfers of
personal data of EU residents to third countries, notably the US.
To be able to carry out its functions, specifically to support the process of detection, reporting and removal, the Centre would, in accordance with the EU's personal data acquis, be provided with the appropriate legal basis to allow it to process personal data where needed. The Centre would be able to cooperate with service providers, law enforcement and EU institutions, but also with similar entities worldwide, such as NCMEC, given the global nature of CSA.
Discussion of the implementation choices for the Centre
This section summarises the process to determine the preferred implementation choice for the
Centre, explained in detail in Annex 10.
The process had three stages: 1) mapping of possible implementation choices; 2) analysis of the choices and selection of the most promising ones for further analysis; 3) qualitative and
quantitative analysis of the retained choices and determination of the preferred choice.
1) Mapping of possible implementation choices
Currently there is no entity in the EU or in Member States that could perform the intended functions for the Centre without significant legislative and operational changes, and therefore no obvious/immediate choice for the implementation of the Centre.
The process to determine the implementation choices started with a mapping of existing entities and their present functions and forms, in order to identify possibilities to build on existing structures and make use of existing entities, or simply use them as references or benchmarks for setting up a new entity of the same type. For the mapping purposes, the examples were divided into two main types, depending on whether they required specific legislation to be set up:
1) entities that do not require specific legislation to be set up:
a) Centre embedded in a unit in the European Commission (DG HOME, e.g. Radicalisation Awareness Network, RAN).
b) Entity similar to the EU centre of expertise for victims of terrorism.
2) entities that require specific legislation to be set up:
a) Centre fully embedded in an existing entity:
- EU body: Europol; Fundamental Rights Agency (FRA).
- Other: national entity (public or private, such as an NGO); international entity (e.g. INHOPE network of hotlines).
b) Centre set up as a new entity:
- EU body: executive agency (e.g. European Research Executive Agency (REA), European Education and Culture Executive Agency (EACEA)); decentralised agency (e.g. European Monitoring Centre for Drugs and Drug Addiction (EMCDDA), European Institute for Gender Equality (EIGE), European Union Intellectual Property Office (EUIPO)).
- Other: national entity, such as a foundation set up under national law (e.g. Academy of European Law (ERA), set up under German law) or a Member State authority (e.g. the new Dutch administrative authority to combat CSA and terrorist content online, under preparation); international entity, such as an inter-governmental organisation (e.g. European Space Agency (ESA), European Organisation for the Safety of Air Navigation (EUROCONTROL)), a joint undertaking (public-private partnership, e.g. Innovative Medicines Initiative, Clean Sky Joint Undertaking), or a non-governmental organisation (e.g. CEN/CENELEC, Eurochild).
The mapping also included three relevant entities outside of the EU, which carry out similar functions to those intended for the EU centre, and which could provide useful references in some areas (e.g. costs, organisational issues, etc).
US National Centre for Missing and Exploited Children (NCMEC); Canadian Centre for Child Protection (C3P); and
Australian Centre to Counter Child Exploitation (ACCCE).
Finally, the mapping also included possible combinations of the above choices (i.e. functions distributed between several entities), in particular with Europol:
Europol + a unit in the Commission;
Europol + an NGO (e.g. a hotline);
Europol + new national entity.
2) Analysis of the choices and selection of the most promising ones for further analysis
The analysis of the possible choices took into account the following criteria:
Functions, i.e. the ability to effectively carry out the intended functions to contribute to achieving the specific objectives of the initiative. Specifically:
o Facilitate prevention efforts.
o Facilitate support to victims.
o Facilitate the detection, reporting and removal of CSA online, including by ensuring accountability and transparency.
Forms, i.e. the form in which the Centre is set up, and the extent to which that supports carrying out the intended functions. Specifically:
o Legal status: both the legal basis to set up the centre (if any) and the legislation that would allow the centre to carry out its functions (e.g. processing of personal data).
o Funding: the sources of funding that would allow it to perform its functions and ensure the long-term sustainability and independence of the centre, while avoiding conflicts of interest.
o Governance: it should ensure 1) proper oversight by the Commission and other relevant EU institutions and Member States; 2) participation of relevant stakeholders from civil society organisations, industry, academia and other public bodies (in particular considering that the Centre would need to work very closely with Europol, the Fundamental Rights Agency, and national authorities); 3) independence and neutrality of the centre from overriding private and political interests, to be able to maintain a fair and balanced view of all the rights at stake and to play its main role as facilitator.
Each of the possible implementation choices mapped earlier was analysed according to the above criteria. This detailed analysis led to discarding a number of possible choices, in
particular having the Centre fully embedded in Europol, notably due to:
Challenges to carry out certain tasks in connection with the assistance to victims and prevention, particularly by acting as a hub for information and expertise, some of which are significantly different from the core law enforcement mandate of Europol. Adding these tasks would require a revision of the mandate and significant capacity building efforts, with the risk that these tasks are eventually deprioritised compared to the core tasks of supporting investigations. While Europol has an explicit empowerment to set up centres under Art. 4 of the Europol Regulation, these centres are of a different nature and refer to internal departments focusing on implementing Europol's existing mandate in relation to specific types of crime. This empowerment therefore cannot be used to expand Europol's mandate to cover the new tasks.
Constraints of being part of a larger entity. Being part of a larger entity could limit the ability of the centre to dispose of its own resources and dedicate them exclusively to the fight against CSA, as it could be constrained by other needs and priorities of the
larger entity. It may also limit the visibility of the centre, as child sexual abuse is only one of the many types of crime Europol deals with. Moreover, embedding fully the Centre in Europol could create an imbalance and it would be difficult to justify that
Europol expands its mandate to cover prevention and assistance to victims only in the area of child sexual abuse. This could lead to Europol gradually deviating from its core law-enforcement mandate and covering prevention and assistance to victims in
multiple crime areas, becoming a "mega centre" too complex to attend adequately to the specificities of the different crime areas. Difficulties to appear as an independent and neutral facilitator. The intended main role for the Centre is to serve as a facilitator to both service providers and law enforcement authorities of the process of detection, reporting and removal of CSA online. Europol's core mandate, however, is to support law enforcement. This may prevent Europol from appearing to all parties involved as an independent and neutral facilitator in the entire detection, reporting and removal process. Furthermore, service
providers expressed during the consultations legal concerns about working too closely with law enforcement on the detection obligations, in particular if they are required to use the database of CSA indicators made available by the Centre for these detection
obligations. There is a risk that content data of CSA online (i.e. images, videos and text) could not be used for prosecution in the US. This is because the US legal framework (US Constitution) prevents the use of content data detected by companies acting as "agents of the state", as could be the case if the companies were mandated to detect content data using a database of indicators (e.g. hashes/AI classifiers) provided by law enforcement rather than by a non-law enforcement entity.
Another choice that was discarded following analysis was setting up the Centre as a private law body under the national law of the Member State hosting it. The main reason is that the Centre would not be able to carry out effectively the function of supporting the detection, reporting and removal of CSA online. These tasks imply implementing EU law, which in principle only Member States or the Commission can do.
The detailed analysis of all the possible implementation choices resulted in three "legislative" choices (i.e. that require legislation to set up the Centre) retained for the final assessment160:
1. Creating a self-standing, independent EU body (i.e. a dedicated decentralised agency) with all the intended centre functions: to support detection, reporting and removal of CSA online, and facilitate Member States' efforts on prevention and assistance to victims.
2. Tasking Europol with supporting detection, reporting and removal of CSA online and
creating an independent private-law entity (or tasking an existing one) for
prevention and assistance to victims. 3. Tasking the Fundamental Rights Agency (FRA) with all functions.
3) Qualitative and quantitative analysis of the retained choices and determination of the
preferred choice.
Qualitative analysis
1. Centre as a self-standing EU body (decentralised EU agency):
Arguments in favour:
Independence, which would allow it to help ensure transparency and
accountability of companies' efforts to detect CSA online and serve as a major safeguard and a fundamental pillar of the long-term legislation. Independence is essential to the centre's key function as facilitator and intermediary between private companies and public authorities. The legislation setting it up could be designed in a
way that 1) guarantees the sustainability of the Centre through stable EU funding; and 2) ensures a governance structure providing appropriate oversight by the Commission and including the participation of Member States and relevant stakeholders.
Ability to dispose of its own resources, fully dedicated to the fight against CSA. Staff dedicated solely to the mandate of the Centre, rather than having to meet other
objectives as part of a larger entity. Possibility to receive secured funding from the EU
budget. Political accountability for its financial management would be ensured
through the annual discharge procedure and other rules ordinarily applicable to decentralised agencies. Greater visibility of EU efforts in the fight against CSA, which would help facilitate the cooperation between the EU and stakeholders globally.
160 The non-legislative choice (i.e. practical measures) was also retained for final assessment for comparison purposes (see annex 10), excluded here for simplicity. Legislation is required to enable the Centre to achieve its intended objectives, notably to support detection, reporting and removal of CSA online (e.g. to manage the database of indicators, or to review the reports from the service providers).
Possibility to carry out all relevant functions in the same place (contribute to the detection of CSA online, support and assist victims and facilitate prevention) and liaise with all relevant stakeholder groups, which creates higher EU added value and a more effective and holistic response against CSA.
Arguments against:
Annual costs would likely be slightly higher than in the other choices. These annual costs are indicative and could be higher or lower depending on the precise set-up and number of staff needed (see cost summary table in the quantitative assessment section below). The budget to cover this funding would need to be found within the scope of the 2021-2027 Multiannual Financial Framework, from the Internal Security Fund budget. It will require significantly more time and effort to set up (including the decision on the seat of the agency) and get it fully operational, as we cannot build on existing institutional legal frameworks (although these could serve as a reference) and would have to create a new mandate, and find, hire and train a number of dedicated non-law enforcement experts, including for management and control functions. The need for increased supervision would entail an increased workload at DG HOME and additional staff could be needed.
The cooperation with Europol and national law enforcement would have to be created anew.
2. Part of the Centre within Europol and part as an independent entity:
Arguments in favour:
The annual costs will most likely be lower than creating a new body, as the Centre would benefit from economies of scale with Europol (e.g. building, infrastructure, governance, management and control system), although building and governance costs could be offset by those of the new entity (see cost summary table below). The part of the Centre within Europol could directly benefit from Europol's expertise and established mechanisms (including concerning personal data protection) to deal with the reports from service providers.
Arguments against:
The ability of the Centre to serve as a major player and safeguard in the detection and reporting process, a key feature of the long-term legislation, would appear limited as it would not be independent from law enforcement.
In the case of false positives, companies would be reporting innocent persons to law enforcement directly. The ability of the Centre to dispose of its own resources and dedicate them to the
fight against CSA may be limited by other needs and priorities of Europol in other crime areas. This could also jeopardize its ability to deliver on these additional and visible tasks.
Europol would be dedicating a substantial amount of resources to tasks such as
manually reviewing the reports from companies to filter false positives, determining the jurisdiction best placed to act, etc. That may not be the best use of law enforcement's resources, which could be otherwise dedicated to conduct
investigations leading to the rescue of victims and the arrest of offenders, given the limited availability of law enforcement officers.
Less visibility of EU efforts in the fight against CSA, as these would be split between
two entities, and Europol's area of focus is vast, which could limit its ability to facilitate the cooperation between the EU and stakeholders globally.
3. Tasking the Fundamental Rights Agency (FRA) with all functions:
Arguments in favour:
Annual costs would most likely be slightly lower than creating a new body, as the centre could benefit from economies of scale with FRA (e.g. governance, management and control system). The initial costs would also be slightly lower than creating a new
body or in the Europol + option, thanks to the possibility to leverage the existing
independence, which is key to help ensure transparency and accountability of
companies' efforts to detect CSA online and of the outcome of the follow up of the
reports by law enforcement. This would also allow FRA to serve as a major safeguard of the detection process. In the case of false positives, companies would not be reporting innocent persons to law enforcement directly.
Possibility to carry out all relevant functions in the same place (contribute to the detection of CSA online, support victims and facilitate prevention) and liaise with all relevant stakeholder groups.
Arguments against:
The ability of the Centre to dispose of its own resources and dedicate them to the
fight against CSA may be limited by other needs and priorities of FRA. This could
jeopardize its ability to deliver on these additional and visible tasks.
Although it would be possible to build on the existing institutional framework to some
extent, repurposing it may still entail significant effort to accommodate these new tasks in a long-existing and established entity. The setup of FRA and its governance structure are specific to its current mandate. Significant changes to that mandate and the governance structure would be required in order to integrate the EU Centre into FRA. Given past difficulties in revising the mandate of FRA, there would also be significant additional risks in reopening the relevant regulation. The cooperation with Europol and national law enforcement would have to be created anew.
The annual and initial costs may be lower than creating a new body but they will still be substantial, e.g. to find, hire and train a number of dedicated non-law enforcement
experts, and to carry out the centre functions (including manually reviewing the
reports from companies to filter false positives, determining the jurisdiction best
placed to act, and supporting Member S tates on prevention and assistance to victims). There would be a significant imbalance in FRA's mandate: as it would double in
size, half of it would be dedicated to CSA and the other half to its current tasks.
Quantitative analysis
Costs. The following table summarises the estimated costs for the three retained implementation choices of the EU Centre161:
161 These cost estimates refer to 2022 costs and to the Centre operating at full capacity. The estimates do not take into account inflation and the related accumulated costs during the ramp-up period until the Centre operates at full capacity. See the legislative financial statement accompanying the legislative proposal for more exact cost estimates taking into account inflation and the breakdown of different staff positions.
Table 2: summary of estimated costs for the implementation options of the EU Centre
[Table: for each of the three retained implementation choices (1. EU body (e.g. agency); 2. Europol + separate entity; 3. FRA), the table breaks down staff numbers for the detection, reporting and removal, prevention, and assistance to victims functions (operational and overheads staff), total staff (number of people)162, staff costs (MEUR/year), infrastructure costs (initial and annual, MEUR/year), operational expenditure (MEUR/year), and total annual and total initial costs (MEUR).]
162 28 posts corresponding to the prevention and assistance to victims functions in all options could be non-EU staff and be covered by a call for proposals/grant. They would therefore not be part of the EU establishment plan and would not have an impact on the future EU budget (e.g. pensions, etc.).
As a reference, existing agencies of comparable size have the following actual annual costs:
[Table: for reference agencies, including EMCDDA with around 100 staff, the table lists staff (number of people; MEUR/year; people/MEUR), infrastructure (MEUR/year), operational expenditure (MEUR/year) and total (MEUR/year).]
As indicated above, 28 posts corresponding to the prevention and assistance to victims functions in all options could be non-EU staff and be covered by a call for proposals/grant. In
particular, in the case of option 2, Europol + separate entity, the possibility to cover these
posts through a call for proposals/grant would not remove the need for a separate entity, as the
envisaged prevention and assistance functions are currently not carried out by any organisation. Even if an existing entity applied for the potential call for proposals/grant, it would need to expand to accommodate the 28 posts, with the estimated infrastructure costs of
e.g. rental of buildings, IT systems and audits, and the operational expenditure costs of e.g. support to expert networks, translation and interpretation, dissemination of knowledge and communication (see Annex 10, section 4.2.). Furthermore, a single separate entity should deal with both the prevention and assistance to victims functions to ensure organisational efficiency, given the strong interlinkages between both functions.
Annex 4 includes additional information on the points considered in the above estimates.
Benefits. The main quantitative benefits derive from savings as a result of reduction of CSA associated
costs, i.e. savings relating to offenders (e.g. criminal proceedings), savings relating to victims
(e.g. short and long-term assistance), and savings relating to society at large (e.g. productivity losses).
It is assumed that the implementation choice that is the most effective in fulfilling the functions of the Centre would also be the one helping achieve the highest reduction of CSA and therefore the one with the highest benefits. Annex 4 contains estimates of these benefits, to be taken into account for the sole purpose of comparing the options. As it is expected that a dedicated EU agency would be the most effective in fulfilling the Centre functions, it would also be the one generating the highest benefits.
Preferred option
The analytical assessment and comparison process above indicates that the preferred implementation option for the Centre would be a dedicated EU decentralised agency163. This is the option that would best contribute to achieving the specific objectives of the initiative, while respecting subsidiarity and proportionality and protecting fundamental rights. It will be possible to provide the EU agency with the necessary legal framework to carry out its functions, in particular those in relation to facilitating the detection, reporting and removal of CSA online. The Centre would be set up as a dedicated decentralised EU agency, in accordance with the common approach agreed by the European Commission, the European Parliament and the Council of the EU in 2012164. As an EU agency, it would be financially independent and be funded by the EU, which would further support the Centre's independence.
In addition to the periodic reporting to the Commission and to the public described above, the Commission and Member States would further supervise the Centre and its activities, in
accordance with the general rules applicable to decentralised EU agencies165. These rules include in particular a governance structure that supports both the independence of the agency and the participation of relevant stakeholders, notably through a management board with
representatives of all Member States and the Commission, an executive board, and an executive director appointed following an open and transparent selection procedure.
In terms of organisation, the Centre would work closely with the European Union Agency for Law Enforcement Cooperation (Europol), the EU Agency for Fundamental Rights (FRA) (e.g. in contributing to
transparency and accountability as well as to assessments of the fundamental rights impact of new measures), national law enforcement and other relevant authorities, as well as the national hotlines. This setup would ensure that existing resources can be relied upon to the maximum extent possible while preserving the independence that is fundamental to the role of the Centre.
Box 12: relations between the Centre as a new EU agency and Europol
The Centre as a new EU agency would cooperate closely with Europol, in particular on
facilitating the reporting of CSA online, as described above.
The Centre would be the recipient of the reports from service providers. It would review these reports and ensure that they are actionable, i.e. that they are not manifestly unfounded and could thus lead law enforcement authorities to initiate an investigation where they deem this necessary and appropriate. In doing so, the Centre would ensure that possible false positives do not reach law enforcement and that the service providers are informed of the possible errors. These tasks could free up resources at Europol and national law enforcement agencies, which are currently dedicated to filtering the reports.
Once the Centre confirms that the report is actionable, it would forward it to Europol and/or national law enforcement for action in accordance with the existing rules, including as
regards Europol's mandate. Europol could enrich with criminal intelligence the reports
163 To be funded by the Internal Security Fund managed by the European Commission Directorate-General for Migration and Home Affairs.
164 Joint Statement of the European Parliament, the Council of the EU and the European Commission on decentralised agencies, 2012.
165 See the Joint Statement of the European Parliament, the Council of the EU and the European Commission on decentralised agencies, 2012.
received from the Centre, identifying links between cases in different Member States, sharing
the reports with national law enforcement agencies and supporting these agencies by facilitating cross-border investigations. The Centre would not have any competence to launch
investigations; this would remain under the exclusive competence of national law enforcement authorities.
The Centre would also notably cooperate closely with Europol on the preparation of the databases of indicators, on the basis of which the service providers would be required to detect CSA online, building on existing databases at Europol and at national level. New material from reports (from service providers, hotlines and/or the public) and finished
investigations by law enforcement will, where justified and subject to confirmation by courts or independent administrative authorities, be added to these databases in the form of newly generated indicators, to ensure that they remain updated and as relevant as possible.
Box 13: European Parliament views on the EU Centre
The European Parliament has welcomed166 the idea to establish the European Centre to
prevent and counter child sexual abuse that the Commission first announced in the 2020 EU
strategy for a more effective fight against child sexual abuse, following the call of the Parliament in 2019 for an EU child protection centre167 that would help ensure an effective and coordinated response to child sexual abuse in the EU.
In addition, during the negotiations for the Interim Regulation, Members of the European Parliament repeatedly expressed their expectations that an EU Centre could help limit the international transfers of personal data of EU citizens to the US, hold companies accountable, and publish transparency reports about the detection, reporting and removal process.
Stakeholders' views on the EU Centre to prevent and counter CSA
All the main stakeholder groups that responded to the open public consultation supported the creation of an EU Centre that would provide additional support at EU level in the fight against CSA online and offline, to maximize the efficient use of resources and avoid duplication of efforts. The support was highest among academia and research institutions (100% of responses), as well as public authorities and NGOs (85% of
responses). 40% of the responses from service providers, business associations and the general public expressed explicit support.
More than half of the responses (51% of all responses to the consultation) indicated that the Centre could support Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU. It could also support victims in ensuring removal of child sexual abuse material online depicting them. The Centre could serve as a hub for connecting, developing and disseminating research and expertise, as well as facilitating the communication and exchange of best practices between practitioners and researchers.
Public authorities pointed out that the Centre could maintain a single EU database of hashes of known CSAM in order to facilitate its detection in companies' systems (76% of responses from this group). The Centre could also support taking down CSAM identified through hotlines (62% of responses from this group).
Service providers indicated in the targeted consultations that they would prefer to report to an EU Centre rather than to law enforcement directly, as they currently do in the US with NCMEC.
Stakeholders' views on new CSA legislation from the open public consultation
166 European Parliament resolution of 17 December 2020 on the EU Security Union Strategy (2020/2791(RSP)).
167 European Parliament resolution of 26 November 2019 on children's rights on the occasion of the 30th anniversary of the UN Convention on the Rights of the Child (2019/2876(RSP)).
Respondents from public authorities (62% of the total responses from this group), companies (56%), business associations (60%) and civil society organisations (74%) supported new legislation to ensure legal certainty for those involved in the fight against CSA. In particular, the legislation should notably:
- provide the right incentives for the detection of CSAM;
- provide a clear legal basis for the processing of personal data to detect, report and remove CSA online;
- clarify and resolve conflicts and fragmentation in existing, pending and proposed legislation across Member States as well as at EU level; and
- be future-proof (i.e. remain effective despite future technological developments).
5.2.3. Option C: option B + mandatory detection of known CSAM
This option builds on option B and imposes on relevant providers an obligation to perform a risk assessment on whether their services are likely to be used for the sharing of known CSAM and to propose mitigating measures to reduce that risk. Where the risk assessment (after proposing the mitigating measures) reveals a level of risk that is not minor, national
competent authorities would issue orders to detect material that has previously been
reliably confirmed by courts or other independent public authorities as constituting CSAM. These orders would be limited in time and would apply regardless of the technology used in the online exchanges, including whether the service is encrypted, to ensure that the legislation is technology neutral. The obligation to detect would be limited to relevant service providers in this context, i.e. those identified as the main vectors for sharing and exchange of known CSAM. Only a subgroup of the providers required to submit a risk assessment would receive a detection order, based on the outcome of the risk assessment taking into account the
proposed mitigating measures. The legislation would list possible risk factors that the
providers should take into account when conducting the risk assessment. In addition, the Commission could issue guidelines to support the risk assessment process, after having conducted the necessary public consultations.
Known CSAM is the most common type of CSA online currently detected (in 2020 service
providers reported seven times more known images and videos than new ones, and 2600 times more known images and videos than grooming cases, see section 2.1.1.). The detection of new CSAM and grooming would remain voluntary, whereas reporting and removal (upon the reception of a removal order) would be mandatory for all types of CSA online, as described in option B. In order to ensure its effectiveness, effective and proportionate sanctions would be instituted for providers who fail to comply with the obligation. These sanctions would be imposed by Member States' competent national authorities. More specifically, the process would look as follows:
Mandatory risk assessment
Relevant service providers would be required to assess the risk that their services are misused to distribute known CSAM. The risk factors to consider could include, depending on the service concerned (a purely illustrative scoring sketch follows this list):
- the business model of the service provider and its corresponding user base, including whether the service is available directly to end users (as opposed to, e.g., providing services to businesses),
- the verification of user identity in the registration process,
- the possibility to share images and videos with other users, e.g. by message or through sharing of a link to resources hosted on the service provider,
- in services offering a chat/messaging functionality, the possibility to create closed groups, which can be joined upon invitation from a member only,
- the way in which the services are designed and operated,
- the ways in which the services are actually used, and any corresponding impact on the risk of distribution of known CSAM,
- previous detection of CSAM on the service or on a similar service with a comparable risk profile.
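Purely as an illustration of how such factors might be combined, the following Python sketch scores a service against a checklist; the initiative prescribes no formula, so the factor names, weights and threshold here are invented:

    # Hypothetical weights for the risk factors listed above.
    RISK_FACTORS = {
        "direct_to_end_users": 2,
        "no_identity_verification": 1,
        "image_video_sharing": 3,
        "closed_groups": 2,
        "previous_csam_detections": 4,
    }

    def risk_score(present_factors: set) -> int:
        # Sum the weights of the factors present on the service.
        return sum(w for f, w in RISK_FACTORS.items() if f in present_factors)

    # Example: a consumer messaging service with media sharing and closed groups.
    score = risk_score({"direct_to_end_users", "image_video_sharing", "closed_groups"})
    print(score, score >= 5)  # 7 True -- above an (invented) threshold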
As part of the risk assessment, the service provider could request support from the Centre and/or competent national authorities in performing detection tests on representative anonymised samples, in order to establish the presence or not of known CSAM.
Providers would then be required to report to the competent national authority on the risk assessment and on any mitigating measures that they plan to adopt or have already adopted. The competent national authority would review the risk assessment and determine whether the assessment has been properly conducted and whether the mitigation measures proposed by the service provider are sufficient. If needed, the competent national authority could request the service provider to resubmit the risk assessment or additional information pertaining to it.
Detection order
On the basis of this risk assessment and the criteria laid down in the initiative, the competent national authority would decide whether a detection order for known CSAM should be issued to each specific service provider, by a court or an independent administrative authority (which could be the national authority if it meets the independence criteria). A service
provider falls under the jurisdiction of the Member State in which it has its main establishment or in which - if it has no main establishment in the EU - it has designated a legal representative, building on the approach already adopted in the Terrorist Content Online Regulation168 and proposed in the DSA. Competent national authorities would cooperate in a network to ensure harmonised application of the rules, building where possible on the structures to be put into place for the DSA. The detection order would be limited in time and renewable based on an updated risk assessment, and would be accompanied by specific supervisory powers for the authorities, including on the detection technology deployed, and by measures to ensure transparency. Suitable redress for affected service providers would be
provided for.
Support by the EU Centre
The EU Centre would support service providers in three ways:
1) By providing practical or technical information to service providers that could help them give effect to their legal obligations, and by contributing to the preparation of guidance and best practices documents where needed;
2) By making available to service providers a database of indicators of known material (e.g. hashes and URLs169) that providers would be required to use to facilitate accurate detection of known CSAM. The indicators would correspond to material confirmed as illegal in the EU, as set out above (a minimal lookup sketch follows this numbered list). In addition, the Centre would also facilitate access for service providers to free-of-charge detection tools. These tools would be automated, have a high accuracy rate, and have proven reliable for over a decade (see box 14 below and annex 8, section IV)170. Providers would not be mandated to use the tools provided by the
168 OJ L 172, 17.5.2021, p. 79-109. 169 The URLs in this database would point to a specific image or video, rather than an entire website. 170 They have to date been made available inter alia by NCMEC and are available for use subject to a licensing agreement that limits the use of the tool to the detection of CSAM, to the exclusion of any other content.
Centre, as long as their tools meet the requirements (safeguards) specified in the legislation (see below). Responsibility for the use of these tools, and for any resulting decisions by the service providers, would remain with the service providers themselves.
3) By reviewing the reports submitted by service providers to ensure accurate reporting to law enforcement, and providing support, including through feedback on accuracy, to further improve accuracy levels, to prevent imposing excessive obligations on the
providers and in particular to avoid imposing the obligation to carry out an
independent assessment of the illegality of the content detected.
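Purely by way of illustration, the triage role described in point 3) can be sketched as a simple filtering step: reports that are manifestly unfounded are returned to the provider as accuracy feedback rather than forwarded to law enforcement. The function and report structure below are hypothetical, not part of the proposal.

    # Illustrative sketch (hypothetical names): the Centre screens provider
    # reports and forwards only those that are not manifestly unfounded,
    # returning the rest to the provider as accuracy feedback.
    def review_reports(reports, is_manifestly_unfounded):
        forwarded, feedback = [], []
        for report in reports:
            if is_manifestly_unfounded(report):
                feedback.append(report)   # fed back to improve detection accuracy
            else:
                forwarded.append(report)  # passed on to law enforcement
        return forwarded, feedback

    # Example: a toy predicate that treats empty reports as unfounded.
    forwarded, feedback = review_reports(["match: hash ...", ""], lambda r: not r)
    print(len(forwarded), len(feedback))  # 1 1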
The support of the Centre would be particularly useful to SMEs, which would also be subject to the above requirements and could thus also receive a detection order from national authorities. The Centre and the Commission could provide additional support to SMEs in the form of guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations. It may also be possible to provide specific training, in collaboration with Europol and the national authorities.
Box 14: hashing and URL detection tools
Hashing is the most common technology to detect known CSAM. The most broadly used
example is Microsoft's PhotoDNA171. It creates a unique digital fingerprint ('hash') of the
image or video and compares it to a database containing hashes of material verified as being CSAM. If the hash is not recognised, no information is kept. The technology does not identify persons in the image/video and does not analyse the context.
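The hash-and-compare principle can be illustrated in a few lines of code. The sketch below is not PhotoDNA (whose perceptual algorithm is proprietary); it substitutes an ordinary cryptographic hash, which, unlike a perceptual hash, would not survive resizing or re-encoding, and the database entries are placeholders.

    # Illustrative sketch of hash matching. SHA-256 stands in for a perceptual
    # hash such as PhotoDNA; the set of known hashes is a placeholder for the
    # database of material verified as CSAM.
    import hashlib

    known_hashes = {"placeholder_hash_1", "placeholder_hash_2"}

    def matches_known_material(image_bytes: bytes) -> bool:
        """Answer only yes/no; if the hash is not recognised, nothing is kept."""
        return hashlib.sha256(image_bytes).hexdigest() in known_hashes

    print(matches_known_material(b"uploaded image bytes"))  # False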
PhotoDNA has been in use for over 10 years by organisations globally, including service providers, NGOs and law enforcement in the EU172. Its rate of false positives is estimated at no more than 1 in 50 billion, based on testing173. Microsoft provides PhotoDNA for free, subject to a licensing agreement requiring strict limitation of use to the detection of CSAM. Organisations wishing to use the technology must register and follow a vetting process by Microsoft to ensure that the tool will be used by the right organisation for the sole purpose of detecting CSAM.
Other examples of hashing technology used for these purposes, and operating on similar principles, include YouTube CSAI Match174 and Facebook's PDQ and TMK+PDQF175. The largest database of hashes is held by NCMEC, with more than four million hashes of CSAM images and 500 000 hashes of CSAM videos176. Every hash contained in the database has been viewed and agreed upon as being CSAM by two experts at NCMEC on the basis of strict criteria (see Annex 8).
URL lists are also used to detect known CSAM. Currently they are typically prepared by national authorities (e.g. law enforcement, such as the National Centre for Combating Child
Pornography in Italy, or the Judicial Police in France, OCLCTIC, supervised by the National Commission on Computing and Freedoms, CNIL, and supported by the national hotline Point
171 Microsoft's information on PhotoDNA.
172 More information is available here.
173 Testimony of Hany Farid, PhotoDNA developer, to House Committee on Energy and Commerce, Fostering a Healthier Internet to Protect Consumers, 16 October 2019.
174 YouTube CSAI Match.
175 Open-Sourcing Photo- and Video-Matching Technology to Make the Internet Safer.
176 NCMEC, as of September 2021.
de Contact177) and transmitted to internet service providers to block access178. Some Member States (e.g. Bulgaria) use Interpol's Worst of List ('WOL'), which contains addresses with images and videos that depict severe abuse, with real children, younger than 13, and which have been verified by public authorities from at least two different countries or agencies179.
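Conceptually, a URL-list check amounts to normalising an address and testing membership in a list verified by public authorities, as the hypothetical sketch below illustrates; consistent with footnote 169, the entries point to specific files rather than entire websites.

    # Illustrative sketch of URL-list based blocking; list contents are invented.
    from urllib.parse import urlsplit

    blocklist = {("hosting.example", "/images/abc123.jpg")}  # specific file, not a site

    def is_blocked(url: str) -> bool:
        parts = urlsplit(url)
        return ((parts.hostname or ""), parts.path) in blocklist  # hostname is lowercased

    print(is_blocked("https://HOSTING.example/images/abc123.jpg"))  # True
    print(is_blocked("https://hosting.example/"))                   # False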
Stakeholders' views from the open public consultation on mandatory detection
Public authorities that responded to the consultation were in favour (81% of respondents) of mandatory detection, including in encrypted systems.
Some companies (31%) and business associations (40%) took the view that such an obligation should not apply, regardless of whether these services use encryption. Business associations also stressed the role of encryption in ensuring the online safety and confidentiality of communications of marginalised groups and groups at risk, and that encryption should not be weakened.
Children's rights NGOs were in favour of mandatory detection also in encrypted systems, while pointing out that it should be in line with applicable privacy and other laws.
Privacy rights NGOs stressed the need to preserve strong encryption, and opposed all solutions identified to detect CSA in encrypted systems.
Individuals stressed that service providers should not be obliged to detect CSA online in encrypted services.
Conditions and safeguards
The obligation to detect known CSAM would apply regardless of the technology deployed in the online exchanges. As described in the problem definition (section 2.2.1.), some
technologies used in online exchanges require adaptation of existing detection technology to detect CSA online: for example, while the principal methodology of comparing hashes would remain unchanged, the point in time at which identification is performed would need to be
adjusted in end-to-end encrypted communications, to take place outside the communication itself. In addition, a number of companies have developed tools that seek to identify CSA online using metadata. While these tools are not yet comparable to content-based analysis tools180 in terms of accuracy, child protection and accountability, they could possibly develop to an equivalent standard in the future. Also, some providers have already deployed tools that
perform content-based detection in the context of end-to-end encrypted communications,
demonstrating the swift development of technologies in this area.
The legislative proposal should remain technology-neutral also when it comes to possible solutions to the challenge of preventing and detecting online child sexual abuse. Under this
option, the obligation to detect known CSAM would therefore be an obligation of results,
meaning that detection has to be of sufficient overall effectiveness regardless of the
technology deployed. For example, in a test sample where a specified percentage of material constitutes known CSAM, the detection tool should correctly identify a comparable amount of CSAM, in line with the state of the art in detection technology when it comes to accuracy. This is to be demonstrated by the service providers. The legislation would set out conditions for the technologies deployed and corresponding supervision powers for national authorities, without however specifying the technologies that must be put in place to enable detection, to
177 CNIL, Rapport d'Activité 2020.
178 Article 25 of the CSA Directive includes a provision for voluntary blocking of websites containing and disseminating CSAM. For more information, see the report from the Commission assessing the implementation of that Article, COM(2016) 872.
179 Interpol, Blocking and categorizing content.
180 Pfefferkorn, R., Stanford Internet Observatory, Content-Oblivious Trust and Safety Techniques: Results from a Survey of Online Service Providers, 9 September 2021. See in particular p. 10-11.
ensure that the legislation remains proportionate, technology neutral and future proof. Service providers would be free to implement the technical solutions that are most compatible with their services and infrastructures, provided they meet the standards (see below for details on standards).
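An 'obligation of results' of this kind could, for instance, be verified against a seeded test sample: run the provider's detector over material a specified share of which is known CSAM, and measure what share of that material it finds. The sketch below is purely illustrative; the 90% benchmark is a hypothetical figure, not one set by the proposal.

    # Illustrative sketch: measure a detector's effectiveness on a seeded test
    # sample. Threshold and detector are hypothetical.
    def effectiveness(detector, sample):
        """Share of the items labelled as CSAM that the detector also flags."""
        positives = [item for item, is_csam in sample if is_csam]
        return sum(detector(item) for item in positives) / len(positives)

    def meets_obligation(detector, sample, threshold=0.90):
        return effectiveness(detector, sample) >= threshold

    sample = [(b"a", True), (b"b", True), (b"c", False)]
    detector = lambda item: item == b"a"       # toy detector finds 1 of 2 positives
    print(meets_obligation(detector, sample))  # False: 50% falls short of 90%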
The obligation to detect regardless of the technology used in the online exchanges is
necessary to ensure not only that the services that, following the risk assessment, should be
detecting known CSAM, can do so in practice, but also to prevent creating a negative incentive to put in place certain technologies solely to avoid the detection obligations. It would therefore ensure that the legislation achieves its general objective of improving detection, reporting and removal of CSA online.
The obligation to detect regardless of the technology used in the online exchanges, together with all the required safeguards (see below), is also necessary to help ensure a fair balance of the affected fundamental rights181.
Box 15: Detection of CSA online in end-to-end encrypted communications
End-to-end encryption (E2EE) is an important example of a technology that may be used in certain online exchanges. While beneficial in ensuring privacy and security of
communications, encryption also creates secure spaces for perpetrators to hide their actions, such as trading images and videos, and approaching and grooming children without fear of detection182. This hampers the ability to fight these crimes and lowers the protection of the fundamental rights of the child, and therefore creates a risk of imbalance in the protection of all the fundamental rights at stake. Any solution to detect CSA needs to ensure a fair balance between:
- on the one hand, the fundamental rights of all users, such as privacy and personal data protection, and the freedom to conduct a business of the providers, and
- on the other hand, the objective of general interest associated with tackling these very serious crimes and with protecting the fundamental rights of children at stake, such as the rights of the child, human dignity, the prohibition of torture and inhuman or degrading treatment or punishment, and privacy and personal data protection.
The Commission organised in 2020 an expert process under the EU Internet Forum to answer the following question: given an E2EE electronic communication, are there any technical solutions that allow the detection of CSA content while maintaining the same or comparable benefits of encryption (e.g. privacy)?183 Annex 9 summarises the work of experts from academia, service providers, civil society organisations and governments, which
181 As announced in the EU Strategy to tackle Organised Crime 2021-2025, in parallel to this initiative, the Commission is steering a process to analyse with the relevant stakeholders the existing capabilities and approaches for lawful and targeted access by law enforcement authorities to encrypted information (i.e. any kind of content, not necessarily illegal in and of itself) in the context of criminal investigations and prosecutions, and will suggest a way forward in 2022. The scope of this process is therefore different from proactive detection by online service providers, solely on their own systems, of whether CSAM is being exchanged or grooming is taking place. While different in scope, both initiatives will need to be coherent with the general position of the Commission to promote strong encryption and avoid any general weakening.
182 See in particular Interpol, General Assembly Resolution on Safeguarding children against online child sexual exploitation, 24 November 2021.
183 In a different process with a different scope, the Commission is also analysing with relevant stakeholders the
existing capabilities and approaches for lawful and targeted access to encrypted information in the context of criminal investigations and prosecutions. The Commission will suggest a way forward in 2022 based on a
thorough mapping of Member States' efforts to deal with encryption and a multi-stakeholder process to
explore and assess concrete options.
finished at the end of 2020. The expert group mapped the possible solutions and highlighted
the most promising ones following a technical assessment across five criteria: effectiveness, feasibility, privacy, security and transparency. In relation to the question asked, the expert group concluded at the time that such technical solutions did exist at different levels of development, but had not been deployed at scale yet184.
In August 2021, Apple announced the launch of its new 'Child Safety' initiatives185, including on-device detection of known CSAM. This solution, similar to two of the solutions identified by the expert group as the most promising, appears to be a viable and technically mature solution to detect known CSAM outside the context of electronic communications, and regardless of whether or not any electronic communication is encrypted186. In September 2021, Apple announced that the deployment of this solution would be delayed to gather additional feedback from customers, advocacy groups, researchers, and others before launching it, in view of criticism in particular from privacy advocacy groups187. It has since deployed detection of images containing nudity sent or received by a child through on-device analysis of incoming and outgoing images, providing a warning to children not to view or send them. When sending or receiving such images, children have the option to notify someone they trust and ask for help188.
Meta's WhatsApp, which is end-to-end encrypted, has also been deploying tools to identify CSAM on its messaging service, based on unencrypted data associated with the communication189. However, Meta has also acknowledged the limitations of its current detection tools in public government hearings, indicating that it expects lower numbers of detection compared to unencrypted communications190, and has referred far fewer cases to NCMEC compared to Meta's Facebook Messenger191.
While companies would be free to decide which technology to deploy, the competent national
authority will be empowered and required to supervise. If needed, it could make use of the technical expertise of the EU Centre and/or independent experts to determine relevant technical or operational issues that may arise as part of the authority's assessment of whether the technology that a given service provider intends to use meets the requirements of the legislation. In particular, the competent national authorities would take into account the availability of the technologies in their decision to impose a detection order, ensuring the effective application of the obligation to detect. In cases in which the technology to detect CSA online was not yet available to be deployed at scale, the legislation could foresee for the competent authorities the possibility to consider this circumstance when deciding the start date of application of the detection order on a case-by-case basis. The EU Centre and the
184 Technical solutions that could be applied to identify CSAM URLs in E2EE communications are already in use today. For example, services like WhatsApp or Signal scan the URLs of a message before it is encrypted for spam and malware, and to show the user a preview of the webpage the URL points to.
185 For more information see: https://www.apple.com/child-safety/.
186 For a technical summary of how the tool works, see here. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices. Differently from the solutions identified in the expert process under the EU Internet Forum, Apple's solution does the hashing and matching when the image is uploaded to iCloud, not when the image is sent or received in a communication (as in the expert process' solutions).
187 The plans in relation to the launch of the tool remained unchanged at the time of writing, see here.
188 As reported on CNET.
189 See WhatsApp's FAQs on this matter.
190 House of Commons, Home Affairs Committee hearing of 20 January 2021, Q 125-142.
191 NCMEC and Wired, Police caught one of the web's most dangerous paedophiles. Then everything went dark, May 2020.
Commission could facilitate the exchange of best practices and cooperation among providers in the deployment of new technologies.
The legislation would specify the necessary safeguards to ensure proportionality and a fair balance between all the affected fundamental rights. In particular, as service providers put in
place technical solutions that allow the detection of CSA online regardless of the technology used in the online exchanges, there is a need to regulate the deployment of these solutions, rather than leaving to the service providers the decision on what safeguards to put in place.
Service providers have strong incentives already to ensure that all tools they deploy are reliable and as accurate as possible, to limit false positives. In addition, safeguards are of
particular importance to ensure the fair balance of fundamental rights in the context of
interpersonal communications, where the level of interference with the relevant fundamental
rights, such as those to privacy and personal data protection, is higher compared to e.g. public websites.
The legislation would set out three types of safeguards, on 1) what standards the
technologies used must meet, 2) safeguards on how the technologies are deployed, and 3) EU Centre-related safeguards. They would, as far as possible, build on the detailed safeguards of the Interim Regulation, to ensure coherence and minimise disruption. These safeguards could include or be based on:
1) Standards the technologies must meet:
- be in accordance with the state of the art in the industry;
- be sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the detection of CSA, subject to independent expert certification;
- be the least privacy-intrusive, including with regard to the principles of data protection by design and by default laid down in the GDPR;
- not be able to deduce the substance of the content of the communications, but solely be able to detect patterns which point to possible CSA (i.e. only determine whether the content matches known CSAM, without assessing or extracting anything else);
- make use of the indicators provided by the EU Centre to detect known CSAM (see below on EU Centre-related safeguards).
2) How the technologies are deployed, i.e. when deploying these technologies the providers should:
- conduct a prior data protection impact assessment and a prior consultation procedure as referred to in the GDPR, to be repeated when the technologies are significantly modified;
- establish internal procedures to prevent abuse of, unauthorised access to, and unauthorised transfers of, personal and other data;
- ensure human oversight, where necessary. While the tools for detection of known CSAM are accurate to such a high degree that human review of each and every hit is not required, the oversight should encompass spot checks and tests to ensure continued reliability and verify consistent accuracy rates;
- establish appropriate redress mechanisms to ensure that users can lodge complaints with them within a reasonable timeframe for the purpose of presenting their views;
- inform users in a clear, prominent and comprehensible way:
  o of the fact that the service providers use technologies to detect known CSAM and how they use those technologies;
  o of which consequences such use may have for the users and of avenues for redress related thereto;
- retain the content data and related traffic data processed for the purpose of detecting known CSAM and its subsequent actions (reporting, removal and possible other consequences, redress, responding to competent law enforcement or judicial authorities' requests) no longer than strictly necessary for those purposes, and no longer than the maximum period defined in the legislation;
- give competent authorities access to data, solely for supervisory purposes; and
- publish transparency reports on how the technologies used have been deployed, including operational indicators such as error rates (see section 9 on monitoring and evaluation).
3) EU Centre-related safeguards. The Centre would be a fundamental component of the legislation and would serve as a key safeguard by:
- making available to service providers the indicators that they should use to detect known CSAM according to EU rules (notably the CSA Directive), as determined by courts and other independent public authorities (see description of the EU Centre under option B);
- reviewing the reports submitted by the companies and contributing to ensuring that the error rate stays at a minimum, in particular by making sure that reports possibly submitted by mistake by service providers (i.e. that do not contain CSA online) are not forwarded to law enforcement, and providing feedback to service providers on accuracy and potential false positives to enable continuous improvement;
- facilitating access to free-of-charge technology that meets the highest standards for the reliable, automated detection of CSA online;
- publishing annual transparency reports, which could include the number and content of reports received, the outcome of the reports (i.e. whether law enforcement took action and, if so, what the outcome was), and lists of service providers subject to detection orders, removal orders and sanctions (see section 9).
Given the key role of the Centre, the legislation should also include a set of safeguards to ensure its proper functioning. These could include:
- carrying out independent and periodic expert auditing of the databases of indicators and of the management thereof;
- carrying out independent expert verification or certification of tools to detect, report and remove CSA online that the Centre would make available to service providers;
- creating clear and specific legal bases for the processing of personal data, including sensitive personal data, necessary for the performance of the Centre's functions, with the appropriate limitations and safeguards.
In addition, as a decentralised EU agency, the Centre would be subject to all corresponding transparency and accountability obligations that generally apply to such agencies, including supervision by the EU institutions.
Stakeholders' views on safeguards from the open public consultation
Public authorities indicated that it is critical to implement robust technical and procedural safeguards in
order to ensure transparency and accountability as regards the actions of service providers.
NGOs pointed out that the new legislation should provide legal certainty for all stakeholders (e.g. service providers, law enforcement and child protection organisations) involved in the fight against CSA online and improve transparency and accountability. Almost 75% of views from NGOs underlined that transparency reports should be obligatory and standardized in order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as about the scale of CSA online. Legislation could foster the development of an EU-wide classification of CSAM.
Business associations highlighted that it is critical to publish aggregated statistics on the number and types of reports of CSA online received in order to ensure transparency and accountability regarding actions of service providers (40% of their replies). Moreover, some respondents (including companies and business associations) reflected that fully harmonised definitions (beyond the minimum harmonisation provided by the CSA Directive) would help reduce EU fragmentation.
Academic and research institutions also stated that transparency reports should be obligatory, and evaluated
by an independent entity (75% of their replies). All of them stated that these reports need to be standardized in
order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online.
5.2.4. Option D: option C + mandatory detection of new CSAM
This option is the same as option C but adding mandatory detection of material that has not been previously verified as CSAM (i.e. 'new', as opposed to 'known', CSAM). As described in section 2.1.1., the detection of new content (i.e. not previously identified as CSAM) often reveals ongoing or recent abuse and therefore implies a heightened need to act as soon as possible to rescue the victim.
As in option C, to ensure that the legislation is technology neutral, the obligation would apply regardless of the technology used in the online exchanges.
The detection of grooming would remain voluntary, whereas reporting and removal of confirmed CSA would be mandatory for all types of CSA online, as described in option B .
Mandatory risk assessment
Expanding the risk assessment outlined in Option C, service providers of relevant services,
notably providers of interpersonal communication and hosting services, would be required to also assess the risk that their services are misused to distribute new CSAM. As there is no difference between "known" and "new" CSAM beyond its having been seen and confirmed
by an authority, the distribution vectors are typically identical. Hence, risks and experiences relating to the detection of known CSAM could be taken into account in this regard. However, the risk factors would also take into account the specificities of new CSAM , and in
particular the risk that the service is used to distribute self-generated material (see box 3 in the
problem definition, section 2.1.1 .). For interpersonal communications services, the risk assessment should also include an analysis of objective factors that may point to a heightened likelihood of sharing of CSAM, which could possibly include group size, gender distribution,
frequency of exchange and frequency and volume of images and videos shared. In addition, the risk assessment could be based, e.g., on spot checks, particularly in the absence of previous experience on the same or comparable services.
The service providers would be required to report to the competent national authority on the risk assessment, including the mitigating measures that they plan to adopt or have already adopted, and the same considerations as in option C would apply.
Detection order
Similarly to option C, on the basis of this risk assessment, the competent national authority would decide whether a detection order for new CSAM should be issued to a service provider, for one or more relevant services it provides. The order should be limited to what is strictly necessary; where possible and technically feasible, particularly for interpersonal communications services, based e.g. on the objective factors identified in the risk assessment, it should be limited to relevant parts of a given service. The detection order would be limited in time and renewable based on an updated risk assessment. Suitable redress for affected service providers would be provided for.
Support by the EU Centre
The EU Centre would support service providers in three ways:
1) By making available to providers the database of indicators of new material (e.g. AI
classifiers) that providers would be required to use to detect new CSAM, while
ensuring a technology neutral approach. The indicators would be based on material
determined by courts or other independent public authorities as illegal under EU law.
2) By making available to providers, free-of-charge, technologies to facilitate detection.
Providers would not be mandated to use the technologies provided by the Centre and
would be able to use other tools, as long as they meet the standards and provide for the
safeguards specified in the legislation (see below).
3) By reviewing the reports submitted by service providers to ensure accurate reporting to law enforcement, and providing support, including through feedback on accuracy, to prevent imposing excessive obligations on the providers and in particular to avoid
imposing the obligation to carry out an in-depth assessment of the illegality of the content detected, which can be relevant in particular in borderline cases. If possible CSAM is detected by the EU Centre, it will be added to the database of indicators of known CSAM only after public authorities have confirmed the illegality of the content. It could then also be used to improve the database of new CSAM indicators.
The support of the Centre would be particularly useful to SMEs, which would also be subject to the above requirements and could thus also receive a detection order from national authorities. The Centre and the Commission would provide additional support to SMEs in the form of guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations. It may also be possible to provide specific training, in collaboration with Europol and the national authorities.
Box 16: technology to detect new CSAM
New CSAM often depicts ongoing abuse and therefore implies an urgency to act swiftly to rescue the child. Given the importance of this material, making its detection mandatory would ensure that more of it is detected and therefore more victims can be swiftly safeguarded.
The detection of 'new' content, as compared to that of known content through hashes,
typically relies on an algorithm which uses indicators to rank the similarity of an image to
images already reliably identified and hence identify the likelihood of an image or video
constituting CSAM. While the patterns that the AI algorithm is trained to identify cannot be
equated one to one to known material, they are similarly designed to identify equivalent content. The reliability of such tools, as with any algorithm, depends on the specificity of the content and availability of quality training data, i.e. content already reliably identified as CSAM. Given the large volumes of "known" CSAM, automated identification of new CSAM has had a good basis for development and would be rendered more effective through the continuous expansion of the database of known CSAM confirmed by independent authorities. In addition, as opposed to situations where context is of relevance and needs to be analysed
(e.g. a slanderous expression reported on in a press article), the dissemination of CSAM is
always illegal regardless of context. As a result, the challenge for automated detection is
significantly lower in detecting what is often termed "manifestly illegal" content, compared to
performing context-dependent assessments.
It is important to note that the process is similar to that for detection of known CSAM in that the classifiers are not able to deduce the substance of the content of the communications but are solely able to detect patterns which point to possible CSAM. In other words, they are solely able to answer the question "is this content likely to be CSAM?", yes or no, and they are not able to extract any other information from the content such as identifying specific persons or locations (i.e. they ignore all other content information transmitted).
The detection of new content is in general more complex than the detection of known content. Due to the nature of new material, after it is flagged by software, it requires systematic human review to ascertain its potential illegality. The accuracy rate nonetheless lies significantly above 90% (see annex 8, section 2 for an industry example where it can be set at 99.9%, which means that only 0.1% of the content automatically flagged is non-illegal). Annex 8, section 2 contains additional information on new CSAM detection technology.
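As an illustration of the yes/no behaviour described above, a classifier pipeline can be reduced to a scoring function and a threshold, with every hit queued for human review; the scoring function and threshold below are placeholders, not an actual trained model.

    # Illustrative sketch of the classifier pipeline in box 16. The scoring
    # function is a dummy stand-in for an AI model trained on verified CSAM.
    def classifier_score(image_bytes: bytes) -> float:
        return 0.42  # placeholder similarity score in [0, 1]

    def flag_for_review(image_bytes: bytes, threshold: float = 0.99) -> bool:
        # Only a boolean leaves this step: no persons, locations or other
        # information is extracted; positives go to systematic human review.
        return classifier_score(image_bytes) >= threshold

    # At 99.9% precision, of every 1 000 items flagged for review, on average
    # only 1 turns out not to be illegal.
    print(flag_for_review(b"..."))  # False with the placeholder score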
Conditions and safeguards
As in option C, the obligation to detect new CSAM would apply regardless of the technology deployed in the online exchanges, and as an obligation of results, to ensure that the legislation remains technology neutral and as future proof as possible.
Also, as in option C, the competent national authorities, on the basis of the risk assessment conducted by the service provider (including mitigating measures adopted), and, if needed, in consultation with the EU Centre and its technical experts on the technologies deployed, would determine whether a detection order should be issued to a given service provider. They would remain competent to verify the compliance with conditions and safeguards and to
supervise the tools deployed, in cooperation with data protection authorities and the EU Centre's technical experts, where appropriate.
The legislation would specify the necessary safeguards to ensure a fair balance between all the affected fundamental rights. The safeguards could include all those described in option C extended to new CSAM, on 1) the technologies used, 2) how they are deployed, and 3) EU Centre-related safeguards. Given the high but comparatively lower accuracy rates that detection tools for new content can have, the tools should be deployed in such a manner as to limit the number of false positives to the extent possible. The final determination of whether an image or video constitutes CSAM has to be made by a court or independent national authority. In addition, the material used to prepare and improve the indicators (AI classifiers) made available by the EU Centre could be subject to periodic expert auditing to ensure the quality of the data used to train algorithms.
5.2.5. Option E: option D + mandatory detection of grooming
This option includes the policy measures of option D and adds mandatory detection of
grooming for certain providers of interpersonal communications services as the key vectors for online grooming. It would therefore comprise the mandatory detection of the three main forms of CSA online: known and new CSAM and 'grooming' (solicitation of children), limited to the service providers relevant for each of the types of content, which are different for grooming: while CSAM can be shared in various ways, such as by message, sharing links
to image hosts or other means, grooming requires a direct communication channel between
the offender and the child. Whereas known and new CSAM depict crime scenes of abuses
already committed, grooming can indicate abuse that is ongoing and/or about to happen and which therefore could be prevented or stopped, protecting the child from harm.
As in options C and D, to ensure that the legislation is technology neutral, the obligation would apply regardless of the technology used in the online exchanges. Reporting and removal (upon the reception of a removal order) would be mandatory for all types of CSA
online, as described in option B .
The services in scope in options C, D and E could be:
- for the risk assessment, reporting and removal obligations: relevant providers that provide or facilitate access to services enabling the dissemination of CSAM and grooming;
- for the obligations to detect known and new CSAM: a narrower category of relevant service providers, in particular providers of hosting and interpersonal communication services;
- for the obligation to detect grooming: interpersonal communications services.
Mandatory risk assessment
Expanding the risk assessment outlined in options C and D, relevant service providers would be required to also assess the risk that their services are misused for grooming. Subject to further assessment, the risk factors to consider specific to grooming could include:
- the user base, including whether the service is available directly to end users (as opposed to, e.g., providing services to businesses), the verification of user identity in the registration process, and whether the services are likely to be accessed by children or otherwise where children make up a significant proportion of a service's user base;
- the existence of functionalities of the service enabling adults to search for other users of the service (including children), e.g. if the profiles are searchable by default to all users;
- the existence of functionalities of the service enabling adults to contact other users (including children), in particular via private communications, e.g. if private messaging is enabled by default to all users and if private messaging is an integral part of the service;
- whether the services enable sharing images and videos via private communications for all users;
- whether robust age verification measures are in place (in particular to prevent adults from pretending to be children);
- whether the service offers grooming reporting tools that are effective, easily accessible and age appropriate;
- past experience with grooming on the same or a comparable service.
The service providers would then be required to report to the competent national
authority the risk assessment, including any mitigating measures that they plan to adopt or have already adopted.
Detection order
Similarly to options C and D, on the basis of this risk assessment, the competent national authority would decide whether a detection order for grooming should be issued to a service provider, for one or more of its services. Where it is possible based on the risk assessment and technically feasible to limit the detection to a part of the service, the order should be limited to what is strictly necessary: for example, to perform detection only in one-on-one exchanges as opposed to groups. This detection order would also be limited in time and renewable based on an updated risk assessment. Suitable redress for affected service providers would be provided for.
Support by the EU Centre
The EU Centre would support service providers in three ways:
1) By making available to providers the database of indicators of grooming (e.g. AI
classifiers) that providers would be required to use to detect grooming, while ensuring a technology neutral approach. The indicators would be based on grooming cases
determined by courts or other independent public authorities.
2) By making available to providers, free-of-charge, technologies to facilitate detection.
Providers would not be mandated to use the technologies provided by the Centre and
would be able to use other tools, as long as they meet the requirements and provide for
the safeguards specified in the legislation (see below).
3) By reviewing the reports submitted by service providers to ensure accurate reporting to law enforcement, and providing support, including through feedback on accuracy, to prevent imposing excessive obligations on the providers and in particular to avoid
imposing the obligation to carry out an independent assessment of the illegality of the content detected. If possible grooming is detected by the EU Centre, it could be used to improve the database of grooming indicators, after public authorities have confirmed the illegality of the content.
The above three-way support of the Centre would be particularly useful to SMEs, which would also be subject to the above requirements and could thus also receive a detection order from national authorities. The Centre and the Commission would provide additional support to SMEs in the form of guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations. It may also be possible to provide specific training, in collaboration with Europol and the national authorities.
Box 17: technology to detect grooming
The detection of grooming, as compared to that of known content through hashes, typically relies on an algorithm which uses content indicators (e.g. keywords in the conversation) and metadata (e.g. to determine age difference and the likely involvement of the child in the communication) to rank the similarity of an online exchange to online exchanges reliably identified as grooming, and hence determine the likelihood of an online exchange constituting grooming. The classifiers are not able to deduce the substance of the content of the communications but are solely able to detect patterns which point to possible grooming. In other words, they are solely able to answer the question "is this online exchange likely to be grooming?", yes or no, and they are not able to extract any other information from the content such as identifying specific persons or locations (i.e. they ignore all other content information transmitted).
The accuracy rate lies around 90%, which means that 10% of the content automatically flagged for human review is determined by the reviewers to be non-illegal. The detection of grooming is therefore also based on AI patterns/classifiers, like the detection of new CSAM, and is in general more complex than the detection of known CSAM. Due to the nature of grooming, after it is flagged by software, it requires systematic human review to ascertain its potential illegality. In addition, the tools are constantly fed with data to continuously improve
the detection process. Annex 8 section 3 contains additional information on grooming
technology.
Despite the increase in grooming (see section 2.1.1.) and the value of grooming detection in stopping ongoing abuse and preventing imminent abuse, only one third of service providers that detect any form of CSA online detect grooming192.
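By way of illustration, the combination of content indicators and metadata described in this box can be sketched as a simple scoring function; the keywords, weights and threshold below are invented for the example and bear no relation to any deployed tool.

    # Illustrative sketch of grooming detection as described in box 17:
    # combine keyword indicators with metadata (age difference, one-on-one
    # setting) and answer only yes/no. All values are hypothetical.
    GROOMING_KEYWORDS = {"secret", "don't tell", "how old"}

    def grooming_likelihood(messages, age_gap_years, is_one_on_one):
        text = " ".join(messages).lower()
        score = 0.2 * min(sum(kw in text for kw in GROOMING_KEYWORDS), 3)
        score += 0.2 if age_gap_years >= 10 else 0.0   # metadata: age difference
        score += 0.2 if is_one_on_one else 0.0         # rarely occurs in groups
        return min(score, 1.0)

    def flag_exchange(messages, age_gap_years, is_one_on_one, threshold=0.6):
        """A positive answer only routes the exchange to human review."""
        return grooming_likelihood(messages, age_gap_years, is_one_on_one) >= threshold

    print(flag_exchange(["this is our secret. how old are you?"], 25, True))  # True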
Conditions and safeguards
As in options C and D, the obligation to detect grooming would apply regardless of the technology deployed in the online exchanges, and as an obligation of results, to ensure that the legislation remains technology neutral and as future proof as possible.
As in options C and D, the competent national authorities would be given the necessary competences for effective oversight to determine whether conditions and safeguards are
respected, also in terms of the deployment of technologies.
The legislation would specify the necessary safeguards to ensure proportionality and a fair balance between all the affected fundamental rights. The safeguards could include all those described in option C extended to grooming, on 1) the technologies used, 2) how they are deployed, and 3) EU Centre-related safeguards. In addition:
- the material used to prepare and improve the grooming indicators (AI classifiers) made available by the EU Centre could be subject to periodic expert auditing to ensure the quality of the data used to train algorithms;
- the service provider could be obliged to report back to the competent data protection authority on the measures taken to comply with any written advice issued by the competent supervisory authority for technologies to detect grooming, following and in addition to the prior data protection impact assessment and consultation;
- the technologies used to detect grooming should be limited to the use of relevant key indicators and objectively identified risk factors such as one-on-one conversations (as grooming very rarely takes place in a group setting), age difference and the likely involvement of a child in the scanned communication.
Stakeholders' views on mandatory detection from the open public consultation
Public authorities indicated that mandatory detection of known (71% of responses) and new CSAM (57%), and grooming (48%) should be covered by the possible legislation.
Child rights NGOs were in favour of mandatory detection and removal of known (78% of responses) and new CSAM (61%), and grooming (51%).
Privacy rights organisations opposed any mandatory detection measures and stressed the need to respect the
requirements of necessity and proportionality to ensure the respect of fundamental rights of users, also with
regard to privacy and confidentiality.
Service providers expressed little support for imposing legal obligations to detect known CSAM (12.5% of responses), new CSAM (6%) and grooming (6%). They flagged that, if there are any obligations, they should be formulated in terms of best reasonable efforts at the current state of technology, be in line with other EU legislation (e.g. the e-Commerce Directive and the DSA), and should not impose an excessive burden on SMEs. They raised questions of conflict of laws between the US and the EU emerging from detection and reporting obligations.
Individuals that responded to the open public consultation also expressed little support for imposing legal obligations for service providers to detect known CSAM (20% of responses), new CSAM (14%) and grooming (13%). At the same time, there was general support for a possible role of the EU Centre managing a single EU database of known CSAM to facilitate detection.
192 Survey carried out by the WeProtect Global Alliance, WeProtect Global Alliance Global Threat Assessment 2021.
Box 18: YouGov survey on views on online child protection and privacy
A recent survey193 carried out in eight Member States (DE, FR, IT, NL, PL, SE, ES, HU) in September 2021, in which nearly 9 500 adults participated, found that:
A majority (73%) of respondents believed that children are not safe online.
Nearly 70% of respondents said they would support a European law to mandate online
platforms to detect and report CSAM images and grooming, with technology scanning their photos and messages, even though this means giving up certain personal privacy.
A majority of respondents (76%) considered detection of CSA online to be as or more important than people's personal privacy online.
Most respondents in the qualitative research groups did not know that hash detection tools
to address online CSAM existed or that anti-grooming tools had been developed. Once
participants learnt about these tools, "they were angry that they weren't being used and
turned on at all times". Participants in these groups held to this view even when they were
told that their data could be scanned to achieve this.
A majority of respondents (68%) felt that there is not much, if any, privacy online, versus 25% of respondents who believed that there is.
5.3. Measures discarded at an early stage
The process of building the retained options started with scoping the widest spectrum of measures and discarding a number of them along the way, which included notably:
Indefinite continuation of the Interim Regulation, i.e. extending indefinitely the current
period of application of three years. This measure was discarded because it would not address in a satisfactory way the problem drivers, in particular problem driver 1,
concerning the insufficient voluntary action by online service providers, and 2 on the lack of legal certainty (the Interim Regulation does not establish a legal basis for any processing of personal data). Also, the Interim Regulation only covers a subset of the service providers whose services are affected by CSA online. The possible combination of this measure with other options (including the practical measures in option A) would not be able to address these fundamental shortcomings.
Obligations to detect CSA online (known and/or new CSAM, and/or grooming) limited to technologies that currently make possible such detection (e.g. unencrypted services). These measures were discarded because the legislation would not be effective in
achieving the general objective of improving the functioning of the internal market by introducing harmonised EU rules for improving identification, protection and support for victims of CSA . Moreover, rather than improving the fight against CSA online, these measures could worsen it, by unintentionally creating an incentive for certain providers to use technologies in their services to avoid the new legal obligations, without taking effective measures to protect children on their services and to stem the dissemination of CSAM .
Annex 10 contains a further analysis of discarded options for the Centre.
193 ECPAT, YouGov, Project Beacon, November 2021.
6. WHAT ARE THE IMPACTS OF THE POLICY OPTIONS?
6.1. Qualitative assessment
The qualitative assessment of the policy measures (which form the policy options), is available in annex 4, section 1. This section focuses on the qualitative assessment of the
policy options retained for analysis. It analyses the most relevant impacts, i.e. social, economic and fundamental rights, in addition to those related to the UN SDGs. The consistency of the options with climate law, the 'do no significant harm' principle and the 'digital-by-default' principle was taken into account throughout the assessment where relevant.
6.1.1. Social impact
All proposed measures except the baseline scenario would improve, to differing degrees, the
protection of online users, particularly the young and vulnerable, and enhance the ability of authorities to prevent and respond to cases of online CSA.
6.1.1.1. Option A: practical measures to enhance prevention, detection, reporting and
removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
The practical measures to enhance voluntary detection, removal and reporting of online CSA would improve the prevalence and effectiveness of voluntary measures to some extent, and would increase the number of related reports and investigations. The measures would also
likely improve the efficiency and quality of reporting from service providers to law enforcement authorities, and allow more efficient use of resources by both. Uncertainty as to the legal basis for the necessary processing of personal data would remain, leading to
fragmented efforts.
Establishing an EU Centre that could perform certain tasks relating to prevention and assistance to victims would help facilitate coordination and the implementation of practical measures in these areas. While these measures would to some extent improve efficiency in
public-private cooperation, a number of difficulties would remain, in particular regarding a reliable source of hashes, a single European reporting point, accountability and transparency regarding providers' efforts, and the need for clear and comprehensive information on the
prevalence of CSA online.
Finally, this option would likely not be sufficient in providing effective assistance to victims of CSA, or to prevent CSA. While the practical measures included here may facilitate
dialogue and exchange of information, they would not be sufficient to support the
implementation of a holistic, evidence-based approach. The Centre's impact would be limited, as it would be supported by minimal resources and the support it could offer would be restricted. In particular in view of the significant impact of providers' efforts on the wellbeing of children and the rights of all users, the resulting continuation of a patchwork approach would fall short of the objectives.
Therefore, this option would not fully address the problem drivers.
6.1.1.2. Option B: option A + legislation 1) specifying the conditions for voluntary
detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and
3) expanding the EU Centre to also support detection, reporting and removal
This option would specify the conditions for service providers' voluntary detection, reporting and removal of online CSA, eliminating key obstacles to voluntary efforts by providing legal certainty. This would allow services within the scope of the ePrivacy Directive (and its proposed revision) to adopt or continue voluntary efforts, following the
lapsing of the Interim Regulation in 2024, as well as other relevant services. The reporting obligation would ensure both swift investigations to identify offenders and, where possible, identify and rescue victims, as well as independent verification of the illegality of the content.
The removal obligation would help ensure that service providers that have become aware of the existence of CSAM in their services take it down swiftly. This would limit revictimisation and would contribute to prevention efforts, given the effect that viewing CSAM has on
increasing the probability of future offending (see box 1).
These obligations would also help create a level playing field for relevant providers active in the EU, as they would all need to comply with one framework for the detection, reporting and removal obligations.
The creation of EU-level databases of indicators of CSA online would facilitate service
providers' determination of what constitutes CSA online under EU law. By maintaining a
single, reliable database in the EU of indicators to facilitate detection of CSA online in
companies' systems, the Centre would lead to significant improvements in the relevance of
reports received by EU law enforcement authorities, reducing the number of reports of materials that do not constitute CSA online under the laws of the relevant Member State, and further eliminating erroneous removals. An increase in the volume of reports can be expected with the introduction of mandatory reporting and the creation of an EU database. Importantly, the database and the support provided by the EU Centre can be expected to contribute to an
improved quality of reports. This in turn can be expected to result in greater numbers of victims rescued and of perpetrators identified, prosecuted and convicted. The
consequential deterrence effects can support the prevention of future offending. The Centre would also act as a central point for reporting in the EU, supporting both service providers and hotlines, reducing the reliance on reports from third country organisations, and improving the ability of relevant authorities to respond to cases of online CSA also in particular across
jurisdictions.
In addition, the Centre could facilitate, directly and in cooperation with hotlines, the removal of CSAM relating to a victim, at the request of that victim, by conducting searches and by notifying providers of content requesting it to be removed. Moreover, the creation of a dedicated EU Centre would send an important message about the dedication of the EU as a whole to combating child sexual abuse more effectively and to ensuring that rules apply online as they do offline. It would place the EU at one level with those leading the fight against child sexual abuse worldwide, and would reduce dependence on third-country entities, both for operational reports and for strategic and horizontal information about threats and trends, areas where the EU and its Member States to date have very limited visibility. The social impact of the creation of an EU Centre to prevent and counter child sexual abuse is described further in annex 10, sections 4-6.
However, there are also some drawbacks to this option from the perspective of social impacts. As described in Section 2, experience has shown that service providers' voluntary action by itself has been insufficient. Only 12% of service providers responding to the open public
consultation on the DSA reported that they used automated systems to detect illegal content
they host194. This is reflected in the annual reports provided by NCMEC, which show that
only a small percentage of providers registered to make reports to NCMEC have done so, that
many of those who do make reports make very few of them, and that tools for the detection of CSA online are not widely used. Therefore, beyond ensuring that voluntary measures in interpersonal communications services can continue after the Interim Regulation expires, clarification of the legal basis is unlikely to cause a significant increase in the use of
voluntary measures.
Therefore, while option B would have a greater impact than option A through greater support for detection, reporting and removal efforts, it still would not fully address the problem drivers.
6.1.1.3. Option C: option B + mandatory detection of known CSAM
This option differs from Option B in two important aspects when it comes to its social impact. First, because it would introduce an obligation to detect known CSAM, and secondly because it would do so regardless of which technology is in use in the online exchanges.
The additional benefits of this option compared to Option B would be to ensure that the detection of known CSAM would no longer be dependent only on the voluntary action of
providers. Detection would be focused on specific items of CSAM, which have earlier in an
independent, reliable, specific and objective manner been found to be illegal. The detection would also be case-specific and limited in time, whilst assistance, safeguards and independent oversight would be provided for. Together with the aim of tackling particularly serious
crimes, this all contributes to the conclusion that the obligation is in line with the prohibition on imposing general monitoring obligations. This option would also ensure that detection of known CSAM is performed regardless of the technology used. This would create a level playing field for relevant service providers, counteracting fragmentation, and hence would have a positive effect on the realisation of the Single Market, building on the baseline harmonisation that the DSA is expected to provide.
In terms of the protection of children against the circulation of materials depicting their
abuse, the obligation to detect is expected to have a positive impact. Over time, the overall number of images and videos depicting CSA available on services within scope should be reduced significantly, and, with it, the instances of secondary victimisation inherent in the continued viewing of the abuse. At the same time, it should entail a significant increase in the number of relevant service providers participating, in the volume of detection and reporting, and hence in the proportion of overall cases investigated and number of children identified and removed from abusive situations.
This would also have a positive impact on the overall confidence of users in services, as their
exposure to CSAM would also be reduced. This positive impact would extend also to
society's expectation that services do not facilitate the sharing of CSAM. While the targeting of specific services would somewhat reduce the overall effectiveness of the obligation, which could be greater if more services were included in scope, this can be justified in light of the greater impact that such targeted detection might have.
For the detection of known content, the availability of reliable indicators of what constitutes CSAM under EU law and of free-of-charge technologies facilitating automatic detection would support service providers in their identification of relevant content and help
194 Out of a total of 362 providers. Impact Assessment accompanying the DSA proposal, p. 59.
ensure proportionality of requirements. Known CSAM is the most common type of child sexual abuse online. The tools to detect it (see annex 8, section 1) have a high accuracy rate and have been reliably used for over a decade. The obligation to detect known material would level the playing field and ensure the detection of that content where it is currently missing, with all the necessary safeguards. The EU Centre would make available the database of indicators of known material (e.g. hashes, URLs) that providers should use. The detection
obligation might also encompass materials that victims have referred for detection and removal, or materials from concluded law enforcement investigations that have been verified as CSAM by public authorities.
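To illustrate the indicator-matching step described above, the following minimal Python sketch checks an image against a hypothetical indicator database. It is a simplification under stated assumptions: the database, its single entry and the function name are invented for illustration, and a plain cryptographic hash is used only to keep the sketch self-contained, whereas deployed tools rely on perceptual hashing (such as PhotoDNA) that tolerates re-encoding and resizing (see annex 8, section 1).

import hashlib

# Hypothetical indicator database of hashes of known CSAM, standing in for
# the database that the EU Centre would make available. The single entry is
# the SHA-256 digest of the string "test", used purely so the sketch runs.
KNOWN_INDICATORS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_indicator(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the indicator database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_INDICATORS

print(matches_known_indicator(b"test"))   # True: the digest is in the database
print(matches_known_indicator(b"other"))  # False: no indicator matches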
As a downside, such an obligation could result in occasional false positives, that is, in images and videos erroneously identified as CSAM. Given the gravity of an allegation of being involved in CSA, reporting could have a negative impact in the case of false positives and needs to be accompanied by safeguards ensuring that false positives are prevented as much as
possible and that, where they occur, all data generated in relation to the false positives are
erased, other than what is required for the improvement of automatic detection tools.
Therefore, the Centre could provide an independent verification of the illegality of the
content, eliminating manifestly unfounded reports, before forwarding reports that are not
manifestly unfounded to Europol and national law enforcement authorities for action. Those authorities would, in addition, naturally still carry out their own assessments to determine whether further action is necessary and appropriate in each individual case.
Given the impact on fundamental rights of all users, additional strict safeguards would apply, building on and going beyond those set out above for voluntary detection and for the
reliability of the database of indicators. These could include independent expert auditing of the database of indicators and regular supervision and verification of the procedures of the Centre (with the involvement of data protection authorities as needed), independent expert certification of tools for automated detection to ensure accuracy, as well as additional
transparency and accountability measures such as regular reporting. The legislation could also set out information rights of users and mechanisms for complaints and legal redress (see section 5.2.3.).
The application of an obligation regardless of the technology used in the online exchanges (including encryption) would ensure a level playing field regardless of service providers' choice of technology and would likely significantly increase the effectiveness of the
obligation. On the other hand, it could potentially limit the effective exercise of users' right to
privacy when it comes to the content of their communication and increases the burden on service providers, as detection currently remains more challenging in E2EE communications. It is therefore only in light of the particularly egregious nature of CSA that such an obligation can be considered. This option would need to take into account the requirement of ensuring that the benefits of encryption for the privacy of all users are not compromised in the process of protecting children and identifying offenders. Technical solutions would therefore need to be carefully considered and tailored to balance these objectives. The obligation to detect would apply following a decision by the competent national authorities on a case-by-case
basis, following the analysis of a risk assessment submitted by the service provider and taking into account technical feasibility.
The uniform application by all relevant online service providers to detect, report and remove known CSAM, regardless of the technology used in the online exchanges, would, over time,
significantly affect the availability of CSAM on services falling within the scope of the initiative. It would decrease the blind spot caused by perpetrators' use of certain technologies
to share CSAM and abuse and exploit child victims. This would make private
communications safer for children and help ensure that evidence of CSA can be found,
leading to the identification of child victims.
6.1.1.4. Option D: option C + mandatory detection of new CSAM
The impacts of this option would be the same as option C, plus those of establishing a legal obligation for mandatory detection of new CSAM regardless of the technology used in the online exchanges.
The basic rationale for treating previously identified (i.e. known) and new CSAM the same is that both concern the same types of content, the difference being that the former has been
independently confirmed as constituting illegal material under EU law whereas for the latter that has not (yet) occurred.
The additional challenge lies in the fact that detection of new CSAM relies on a different
technology, which does not use hashes or URLs for individual images and videos but rather relies on pattern recognition, as set out in annex 8, section 2. The reliability and efficacy of such technologies is quite advanced, with error rates in the low percentages, yet the burden on relevant service providers in ensuring the accuracy of efforts is significantly higher and would require an additional degree of human oversight and human confirmation of suspected CSAM.
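The human-in-the-loop arrangement described above can be pictured with a short sketch. Everything in it is hypothetical: score_image stands in for a pattern-recognition classifier of the kind discussed in annex 8, section 2, and the threshold is an arbitrary illustration of restricting automated flagging to high-confidence cases pending human confirmation.

from collections import deque

# Items awaiting confirmation by trained human reviewers before any report.
review_queue: deque = deque()

def score_image(image_bytes: bytes) -> float:
    """Stand-in for a classifier returning the probability that an image
    depicts previously unknown CSAM (hypothetical placeholder)."""
    return 0.0  # a real model would return a learned probability

def triage(image_bytes: bytes, threshold: float = 0.98) -> None:
    """Queue an image for human review only when the classifier is confident;
    no report is made until a reviewer confirms the suspicion."""
    if score_image(image_bytes) >= threshold:
        review_queue.append(image_bytes)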
Whereas the proportion of materials currently flagged as suspected new CSAM is
significantly lower than that of known CSAM, new CSAM requires systematic human verification. The additional burden would need to be proportionate and compatible with the
prohibition of general monitoring and active fact-finding as well as the need to strike a fair balance between the relevant fundamental rights at stake.
Such a balance may be supported by important objectives with respect to the interest of the child that would not otherwise be accomplished. Whereas the detection of known material reduces the re-victimisation of the child depicted in those images and videos and, at times, the investigation initiated with such a report may lead to uncovering ongoing abuses, this material depicts past abuse, which in some cases may be years old. By its nature, previously undetected CSAM usually depicts more recent and at times still ongoing abuse, provides particularly valuable leads, and is therefore treated as highest priority by law enforcement. The added value of detecting new CSAM in terms of the ability to identify and rescue children is significant. The positive social impact on children's welfare consequently is
significantly higher than in the case of detection of known content alone.
The prompt detection of new material also allows for prevention of its distribution, and the
possibility of it 'going viral' in circles of abusers, by adding it to the databases of known material that feed the automated detection tools. The subsequent detection based on the
comparison with these databases can also provide important information about the way in which CSAM is disseminated online and the circles of abusers, facilitating detection and effective action against such groups, which would have a significantly positive social impact by tackling the problem closer to its roots.
The application of an obligation to detect new CSAM regardless of the technology used in the online exchanges carries similar considerations as those laid out under Option C. It would ensure that obligations are applicable to all service providers regardless of choice of
technology, which is likely to produce better effectiveness of the obligation to detect new CSAM. In particular, any solution used in this context would have to ensure both the benefits that encryption provides for privacy of all users and the protection of the fundamental rights
of children. Solutions would need to be carefully considered and tailored to balance these objectives. This obligation is likely to increase the burden on service providers to deploy technical solutions that detect new CSAM in E2EE communications, including administrative burdens similar to those for the detection of new CSAM in unencrypted communications, to ensure accuracy and mitigate error rates, including through human review.
Similarly to Option C, uniform application by all relevant online service providers to detect, report and remove new CSAM, regardless of the technology used in the online exchanges, would, over time, significantly affect the availability of CSAM on services falling within the
scope of the initiative.
6.1.1.5. Option E: option D + mandatory detection of grooming
The social impacts of this option would be the same as option D, plus those of establishing a
legal obligation on relevant service providers for mandatory detection of grooming regardless of the technology used in the online exchanges.
Whereas the current number of reports of suspected grooming is significantly lower than that of CSAM, in particular known CSAM, grooming requires systematic human verification. The additional burden would need to be proportionate and compatible with the prohibition of
general monitoring and active fact-finding as well as the need to strike a fair balance between the relevant fundamental rights at stake.
Such a balance may be supported by important objectives with respect to the interest of the child that would not otherwise be accomplished. Whereas the detection of known material reduces the re-victimisation of the child depicted in those images and videos and, at times, the investigation initiated with such a report may lead to uncovering ongoing abuses, this material depicts past abuse, which in some cases may be years old. In contrast, the identification and stopping of grooming is a measure that can serve to protect children from falling victim to imminent abuse, or to stop ongoing abuse. This is of particular relevance in the current pandemic situation, where children have been exposed to a significantly higher degree of unwanted approaches online, including grooming. The positive social impact on children's welfare consequently is significantly higher than in the case of detection of CSAM alone.
The detection of grooming typically relies on tools for automatic text analysis, which are trained on verified grooming conversations and assess a given exchange according to risk factors identified on the basis of the verified grooming cases. Such tools are at the moment slightly lower in accuracy than tools for the automatic detection of known or new CSAM (see box 16 in section 5.2.4.) and would therefore require additional conditions and safeguards to avoid reports of false positives. The comparably higher invasiveness of text analysis tools and lower accuracy rate therefore has to be weighed against the interest in more effective protection of the child, particularly in calibrating the tool to avoid false positives at the expense of increasing the number of false negatives, as illustrated in the sketch below. In addition, where detection can be limited to parts of a service, determined on the basis of objective factors, this further contributes to ensuring the appropriate balance.
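The calibration trade-off mentioned above can be sketched as follows. The validation data, function name and precision target are hypothetical; real tools would be tuned on verified grooming conversations rather than toy scores.

from typing import List, Tuple

def pick_threshold(scored: List[Tuple[float, bool]], min_precision: float = 0.99) -> float:
    """Return the lowest decision threshold whose precision on validation data
    meets the target; raising it trades false positives for false negatives."""
    best = 1.1  # flag nothing if no candidate threshold reaches the target
    for candidate in sorted({score for score, _ in scored}):
        flagged = [label for score, label in scored if score >= candidate]
        if flagged and sum(flagged) / len(flagged) >= min_precision:
            best = candidate
            break
    return best

# Toy validation set of (risk score, verified grooming?) pairs.
validation = [(0.95, True), (0.90, True), (0.60, False), (0.85, True), (0.40, False)]
print(pick_threshold(validation))  # 0.85: every flagged exchange is a true positive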
6.1.2. Economic impact
The assessment of the economic impact of the different options focuses on the impact on service providers and public authorities concerned by the measures.
The quantitative assessment is included in section 6.2. For a detailed assessment of the
economic impact of establishing the Centre see annex 10.
6.1.2.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
Compared to the baseline scenario, the practical measures to enhance the voluntary detection, removal and reporting of CSAM would to some extent improve the quality of
procedures and the cooperation between the private and public sector. In particular, the
training of EU practitioners and the sharing of guidelines and best practices should have a
positive impact and generate efficiency savings both for providers and for public authorities.
The practical measures to enhance actions on prevention and assistance to victims,
including establishing an EU Centre as a hub without legal personality, would generate limited costs to the EU budget. They would have a potential to limit expenses on the side of the Member States, which could make use of existing research and expertise. The Centre's activities in the areas of prevention could lead to a reduction in relevant offences, while its victim support role could contribute to the recovery of victims, reducing the long-term impact of these crimes on victims and society. In all areas, the Centre's work could reduce
duplication of efforts. However, this positive impact would be limited and would depend on the willingness of actors to cooperate.
The practical measures addressed to authorities to improve cooperation with service
providers (training, standardised forms, online portal) would generate some moderate costs for them, but also improve the quality of reports and should therefore lead to a net reduction of costs for both service providers and public authorities. Likewise, the set-up of a feedback mechanism and communication channel would cause some moderate integration and maintenance costs but the benefits of such mechanism are expected to outweigh the expenses.
The practical measures addressed to service providers (streamlining of policies) would
similarly generate moderate costs for them, in particular if changes to procedures have to be
implemented, but public authorities would have a clear point of entry, reducing transaction
costs, and would not have to adapt to a variety of individual service providers' policies, leading to cost reductions for public authorities. The Application Programming Interfaces
(APIs) that public authorities could make available to allow service providers to remotely check hashed images and videos from their service against databases of hashes would
generate moderate integration and maintenance costs for relevant public entities. However, as mentioned above, using common APIs would reduce transaction costs and overall costs in the
long-run.
Supporting measures, technology and expertise sharing across platforms could limit potential economic burdens on relevant online service providers. Similar to service providers, the
public sector would also benefit from interoperable tools and increased cooperation. There will also be a positive economic impact on expenses related to victim support.
6.1.2.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and
3) expanding the EU Centre to also support detection, reporting and removal
The economic impacts of this option are the same as in option A, plus those of clarifying the
legal basis for the voluntary detection of CSA by relevant online service providers, a
reporting and removal obligation, and the cost of establishing and maintaining an EU Centre.
Reporting obligations under this option could lead to:
- additional costs to law enforcement authorities, to adequately respond to the likely increase in reports from service providers. Furthermore, if law enforcement receives more reports where action is required due to more extensive and reliable datasets provided by the Centre, additional costs could be expected concerning identification of victims and offenders, investigations, criminal proceedings and support to victims and their families;
- additional costs to service providers, e.g. in technological developments and/or acquisition and maintenance, infrastructure expenditure and expert staff recruitment and training, in particular with regard to SMEs.
For both the public and the private sector, administrative and compliance costs could arise from implementing new legislation. On the other hand, the economic impact of (voluntary) earlier detection of CSA would be expected to be significantly positive with regard to the
quality of life of survivors, their productivity, and reduced costs of lifelong victim support. In
addition, a positive effect on the Single Market could result from additional legal clarity and
certainty, thus limiting compliance costs.
Establishing an EU Centre would incur significant cost to the EU budget. However, the Centre would also contribute to limiting expenses for other stakeholders, including public authorities and service providers, by streamlining activities in an economic manner. The Centre's activities would support both law enforcement authorities and online service
providers in the detection and reporting of CSA online, leading to greater efficiencies. It would facilitate compliance and reduce the costs of complaints and associated judicial proceedings by making available reliable information on content that is illegal in the EU. The Centre would also help streamline and facilitate hotlines' efforts, including with regard to
proactive searches. In addition, more extensive and reliable datasets of e.g. hashes would help law enforcement prioritise their actions, reducing the time spent filtering out non-actionable
reports. The Centre's activities in the area of prevention could lead to a reduction in relevant offences, while its victim support role could contribute to the recovery of victims, reducing the long-term impact of these crimes on victims and society. In all areas, the Centre's work could reduce duplication of efforts. In the long run, the Centre's activities would therefore lead to a decrease in the economic costs of CSA.
6.1.2.3. Option C: option B + mandatory detection of known CSAM
The impacts of this option are those outlined for option B plus those derived from the
obligation to detect known material. For both the public and the private sector, administrative and compliance costs would arise from implementing new legislation.
For service providers, the introduction and maintenance of systems for the detection, where
applicable, and the new or increased generation of reports would result in costs, also in relation to follow-up requests for further relevant data from public authorities, and for
handling complaints and requests for review by affected users. However, they would benefit from the fact that this option would limit further fragmentation of the Internal Market with
regard to administrative procedures and obligations required from hosting service providers. A number of service providers could build on systems they already have in place. In addition, the Centre would provide important support in making available technologies that can then be
adapted to the needs of the providers. Technologies for the detection of known CSAM have been available free of charge for years and have proven their reliability.
SMEs offering hosting services are particularly vulnerable to exploitation through illegal activities, including CSA, not least since they tend to have limited capacity to deploy state-of-the-art technological solutions to detect CSAM or specialised staff. Therefore, while they should not be exempted from any rules and obligations, it is of particular importance to ensure that measures are proportionate and do not place an undue burden on them. The free
availability of reliable databases of known CSAM indicators as well as detection tools
(made available by the Centre) are important in this regard. Even though companies may have
unequal resources to integrate technologies for the detection of CSAM into their products, this
negative effect is outweighed by the fact that excluding them from this obligation would create a safe space for child sexual abuse and therefore defeat the purpose of the proposal. To further mitigate the economic impact on smaller companies, the verification of the illegality of the reported material could be left to the expertise of the EU Centre, in cooperation with the national authorities and the network of hotlines where needed and appropriate, which would inform the provider whether the material did in fact constitute CSAM. Therefore, these service providers would not be forced to invest in additional human resources for confirmation of suspected CSAM.
The expected increase in reports from service providers would result in significant additional costs to public authorities, in particular law enforcement and judicial authorities,
arising from the corresponding increase in investigations and prosecutions. However, this financial impact is expected to be outweighed by the positive economic impact on victim
support measures and survivor quality of life and productivity.
A positive effect on the Single Market could result from additional legal clarity and
certainty, thus limiting compliance costs. Furthermore, both the public and the private sector would benefit from a common framework creating more legal certainty and mutual trust between the public and the private sector.
6.1.2.4. Option D: option C + mandatory detection of new CSAM
The impacts of this option are those outlined for option C plus those derived from the
obligation to also detect new material. For both the public and the private sector, administrative and compliance costs would arise from implementing new legislation. However, all of the legislative options could reduce the fragmentation of the Internal Market and reduce compliance costs in the long term.
The expansion to new material could further increase the workload of law enforcement,
compared to the previous option. While the overall number of new materials detected is
expected to be lower than that of known CSAM, it will likely still be significant, considering that the cases require urgent and detailed attention, given the greater likelihood of ongoing abuse and the need for victim identification. Therefore, this increase in the workload will be
accompanied by additional costs to respond to reports, costs related to starting investigations as well as the criminal justice process.
As in option C, service providers could encounter additional costs related to the integration and maintenance of detection technology and follow-up requests from public authorities,
among others. Expanding the safety policy to new CSAM might require service providers to invest in adapting the available technologies to their individual products and possibly in
recruiting trained staff to verify new material before reporting it. This could affect smaller
providers in particular. To mitigate this effect, technologies would be made available free of
charge. In addition, in the case of SMEs the human review and verification would be left to the expertise of the EU Centre which, in cooperation with national authorities and the network
of hotlines where needed and appropriate, would inform the provider whether the material
constituted CSAM.
6.1.2.5. Option E: option D + mandatory detection of grooming
The impacts of this option are those outlined for option D plus those derived from the
obligation to also detect grooming.
Expanding the obligation to detection of grooming would require relevant service providers to invest in integrating additional tools to detect this type of abuse. These costs could be
mitigated by making available technologies free of charge via the EU Centre, limiting service
providers' expenses to the integration of such tools into their services, and by relying on the EU Centre for the confirmation of cases identified as suspected grooming. By contrast,
staffing costs for the Centre would increase as such cases require immediate reaction in order to ensure the protection of victims. Where the relevant service providers choose to rely on the Centre for verification before taking action, swift turnaround would have to be ensured in order to inform the provider about the need to intervene in an interaction and to protect a child.
Law enforcement would incur higher costs related to processing reports, compared to option D. The number of additional reports is expected to be lower compared to known CSAM, but as for new CSAM, swift action is required to protect the victim. The same considerations on administrative costs for the implementation of legislation as set out above apply. The positive economic impact when it comes to victim support and quality of life would increase, as the number of children that do not fall victim to hands-on child sexual abuse because of the
timely detection of grooming would increase. This could potentially reduce the impact on victim support systems, compared to the previous options, as well as having a decisive impact on the quality of life and future productivity of the children.
Stakeholders' views on economic impacts
Service providers and business associations expressed in the open public consultation and the inception impact assessment their concerns regarding the economic impact for SMEs of possible legal obligations and stressed that a 'one-size-fits-all' solution should be avoided. They also pointed out that the costs of deploying and maintaining technical solutions should not be underestimated.
Hotlines and public authorities indicated in the open public consultation and in the targeted consultations that increased reporting could result in increased costs for investigating, prosecuting, and managing offenders, and in
assistance and support to victims.
6.1.3. Fundamental rights impact
According to Article 52(1) of the Charter of Fundamental Rights, any limitation on the exercise of the rights and freedoms recognised by the Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of
proportionality, limitations may be made only if they are necessary and genuinely meet
objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others.
The objective pursued by the envisaged proposal, i.e. preventing and combating CSA, which is a particularly serious crime195, constitutes an objective of general interest within the
meaning of Article 52(1) of the Charter196. In addition, the proposal seeks to protect the rights
195 CSAM is also the only type of illegal content whose mere possession is illegal. 196 Cf. e.g. CJEU, Digital Rights Ireland, Joined Cases C-293/12 and C-594/12, para. 42.
of others, namely of children. It concerns in particular their fundamental rights to human dignity and to integrity of the person, the prohibition of inhuman or degrading treatment, as well as the rights of the child197. It takes into account the fact that in all actions relating to children, whether taken by public authorities or private institutions, the child's best interests must be a primary consideration. Furthermore, the types of CSA at issue here - notably, the exchange of photos or videos depicting the abuse - can also affect the children's rights to respect for private and family life and to protection of personal data198. In connection to combating criminal offences against minors, the European Court of Justice has noted that at least some of the fundamental rights mentioned can give rise to positive obligations of the relevant public authorities, requiring them to adopt legal measures to protect the rights in question199.
At the same time, the envisaged measures affect, in the first place, the exercise of the fundamental rights of the users of the services at issue. Those rights include, in particular, the fundamental rights to respect for privacy (including confidentiality of communications, as
part of the broader right to respect for private and family life), to protection of personal data and to freedom of expression and information200. Whilst of great importance, none of these
rights is absolute and they must be considered in relation to their function in society201. As indicated above, Article 52(1) of the Charter allows limitations to be placed on the exercise of those rights, subject to the conditions set out in that provision.
More specifically, the measures aim to achieve the aforementioned objective by regulating both 'public-facing' and 'private' services, including interpersonal communication services, which results in varying levels of intrusiveness regarding the fundamental rights of users. In the case of content that is accessible to the public, whilst there is an intrusion, the impact especially on the right to privacy is generally smaller given the role of these services as 'virtual public spaces' for expression and economic transactions. The impact on the right to privacy in relation to private communications will generally be greater. Such impact, where
necessary to achieve the aforementioned objective, must be necessary and proportionate and
be moderated by appropriate safeguards. The safeguards have to be differentiated and balanced in order to adapt inter alia to the varying level of intrusiveness depending on the nature of the communications services at issue.
Furthermore, the potential or actual removal of users' content, in particular erroneous removal
(on the mistaken assumption that it concerns CSAM), can potentially have a significant impact on users' fundamental rights, especially to freedom of expression and information where content is removed erroneously. Such impact can depend inter alia on the service provider's position in the Internet 'stack'. Services lower in the Internet stack include those
providing cloud infrastructure, web hosting, or content distribution network services. At the same time, content involving CSA that is left unremoved can have a significant negative impact on the aforementioned fundamental rights of the children, perpetuating harm for children and for society at large. Other factors to be taken into account in this regard include the nature of the user content in question (text, photos, videos), the accuracy of the technology concerned, as well as the 'absolute' nature of the prohibition to exchange CSAM (which is in
principle not subject to any exceptions and is not context-sensitive).
197 Art. 1, 3, 4 and 24 of the Charter, respectively. 198 Art. 7 and 8 of the Charter, respectively. 199 See in particular CJEU, La Quadrature du Net, Joined Cases C-511/18, C-512/18 and C-520/18, para. 126. 200 Art. 7, 8 and 11 of the Charter, respectively. 201 Cf. e.g. CJEU, Joined Cases C-511/18, C-512/18 and C-520/18, para. 120.
In addition, the freedom to conduct a business of the providers covered by the proposal
comes into play as well202. Broadly speaking, this fundamental right precludes economic
operators from being made subject to excessive burdens. It includes the freedom to choose with whom to do business and the freedom of contract. However, this right is not absolute
either; it allows for a broad range of interventions that may limit the exercise of economic activities in the public interest203.
The need to strike a fair balance between all of the fundamental rights at issue played an
important role in the consideration of the various options. The initiative may not affect the essence of, or affect in an unjustified and disproportionate manner, the abovementioned fundamental rights. The options were pre-selected accordingly, and the main differences between the options relate to the extent of their effectiveness in safeguarding and balancing the various fundamental rights, considering their various degrees of interference, and the
ability of the options to offer a more adequate response in light of both the current and the
evolving risks emerging in a highly dynamic digital environment.
6.1.3.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
Compared to the baseline scenario, a limited positive impact on fundamental rights may be expected with respect to better coordination of efforts on prevention and assistance to victims of child sexual abuse with the support and facilitation of a newly established EU Centre, and on enhancing the voluntary detection, removal and reporting of child sexual abuse online.
A very limited impact on fundamental rights may be expected with respect to the cooperation between private and public authorities. Practical measures would ensure confidentiality of data sets, which may have a positive effect on the protection of privacy and personal data
compared to the baseline scenario.
This option would furthermore increase transparency and accountability and would contribute to ensuring sound administration. There would be no change with regard to legal clarity and only a moderate impact on individuals' fundamental rights. This option would maintain the current framework of voluntary measures to address CSA and of cooperation with service providers. The rights and obligations of service providers would not be
substantially affected.
6.1.3.2. Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and
3) expanding the EU Centre to also support detection, reporting and removal.
Measures need to be effective, necessary and proportionate to tackle the crimes at issue and to protect the fundamental rights of children, including to give effect to the State's obligation to provide for the protection of children's rights and well-being, as a vulnerable group requiring particular care, and the effective application of its laws. In line with what was said
above, these rights and interests need to be balanced against the following rights in particular:
Users' rights: when data is processed for the purposes of detection, this affects users' rights to freedom of expression and information, to the protection of personal data, and, where
applicable depending on the type of service, to the confidentiality of their communications. While the rights to freedom of expression and information do not extend to protecting illegal
202 Art. 16 of the Charter.
203 Cf. e.g. CJEU, Sky Österreich, Case C-283/11, para. 45-46.
activities aimed at the destruction of any of the basic fundamental rights and freedoms, the detection would also need to check legal materials and exchanges for the presence of CSAM. As a result, a strong justification and strong safeguards would be needed to ensure an
appropriate balance of the different fundamental rights. The justification consists essentially in the particularly serious crimes that the envisaged measures aim to prevent and combat and the protection of children that it aims to ensure. As described in section 5.2.3., the safeguards could include requiring service providers to use technologies and procedures that ensure
accuracy, transparency and accountability, including supervision by designated authorities. In
addition, the database of child sexual abuse indicators provided by the EU Centre would ensure a reliable basis for determining which content is illegal. The transparency and
accountability that the Centre helps ensure could also help ensure that there are no erroneous takedowns or abuse of the search tools to detect legitimate content (including misuse of the tools for purposes other than the fight against child sexual abuse).
For interpersonal communications services, the users' fundamental right to privacy of communications is also concerned in particular. Therefore, supplementary safeguards would be required, including targeting the voluntary detection of new material and grooming to services where children may be at high risk, and providing clear information to users, as well as possible information once suspected abuse has been detected, including possibilities for redress. An additional safeguard lies in the anonymised processing by technologies204, which ensures that the impact on the fundamental rights of users whose communications are processed would remain within reasonable limits and not go beyond what is necessary, since no personal data deriving from their communications would be reviewed unless there is a justified suspicion of child sexual abuse (these technologies simply detect content like a virus scanner or spam filter, taking no records and not 'understanding' the substance of the communication; e.g. they answer the question 'does this image contain CSA patterns?' rather than 'what is this image about?').
Service providers' rights: This option would have no impact on the rights of service providers who choose to take no action to proactively detect child sexual abuse involving their services. On the other hand, service providers who choose to do so would be subject to new
requirements that have not applied previously, in addition to those arising from the DSA
proposal, such as requirements on the reliability and accuracy of technologies and on
reporting and removal. Such requirements however are important safeguards for the fundamental rights of users.
Regardless of whether service providers decide to take voluntary action to detect CSA, they would be subject to reporting and removal obligations in case they become aware of the existence of CSA online in their services. These obligations impact service providers' rights but are necessary to safeguard the fundamental rights of victims.
As an additional important safeguard, the EU Centre would help improve transparency and
accountability. The obligation to report would ensure that all instances of reported child sexual abuse online are independently verified, that action is taken to identify and rescue
children, and that offenders are investigated. In addition, its existence would facilitate reporting to a Centre in the EU, thus limiting international transfers of personal data of EU citizens. By facilitating Member States' action on prevention and supporting victims in
204 For example, hashing technologies automatically convert images into a 'hash', a code describing the image. This code cannot be converted back into an image and does not contain personal data. The company then compares the hash of the user's image to a database of hashes of known CSAM. Where the hash of the user's image matches a hash in the database, the image is flagged as potential CSAM. See annex 8, section 1.
removing CSAM, the Centre would have a significant positive impact on the fundamental
rights of victims and children who may become victims. The Centre itself would also be
subject to safeguards as described in section 5.2.3. to ensure that it carries out its
responsibilities fully and in a transparent way.
On the whole, provided appropriate limits and safeguards are ensured, this option would thus
fairly balance the various rights at stake.
6.1.3.3. Option C: option B + mandatory detection of known CSAM
The rights to be balanced are the same as in the previous option; the difference lies in the
greater impact on rights resulting from a) the mandatory nature of the detection of known CSAM and b) its application potentially regardless of the technology used in the online
exchanges.
This option, because of the expanded and more effective action against CSAM, would have a
significantly positive impact on fundamental rights of victims whose images are
circulating on the Internet, in particular on their right to the respect for private life, and on their rights as children.
At the same time, the mandatory nature of the detection has a notable impact on providers' freedom to conduct their business. This can only be justified in view of the fundamental
importance of tackling the particularly serious crimes at issue and more effective protection of children. Especially in the context of interpersonal communications, providers are the only ones that have visibility on the abuse taking place. Given that up to 80% of investigations in some Member S tates are possible only because of reports from providers, such a measure is
objectively necessar205. In addition, providers would have access to free and verified detection tools. The obligation to detect known CSAM would level the playing丘e ld and ensure the detection thereof where it is currently missing, with all the necessary safeguards. It would be targeted, risk-based, limited in time and would not impose an undue burden on
providers.
In addition, users' rights (in particular freedom of expression, privacy and data protection) are concerned to a greater extent than under the previous option. The availability of reliable and verified tools could ensure that the impact on their rights does not go beyond what is
strictly necessary, by limiting the interference and reducing the risk of false positives and the
possibility of misuse. In particular, there would be no human interaction with interpersonal communications of users beyond the communications that have been automatically identified as containing CSAM.
On the whole, provided appropriate limits and safeguards are ensured, this option would thus fairly balance the various rights at stake.
Box 19: risk of misuse of tools to detect CSA online for other purposes
There is a risk that the technologies intended to detect CSA online are repurposed and misused for other purposes. This risk is common across technologies and across technical
fields, including other technologies used in online services (e.g. the GPS or the camera of a mobile phone, which could be misused for surveillance). In fact, the underlying technologies behind the most common tools to detect CSA online are in themselves applications of
205 While the prohibition to impose an obligation of general monitoring or active fact-finding does not rank itself as a fundamental right, it serves as a safeguard to facilitate the appropriate balancing of rights and interests. As set out in more detail above in section 5.2.3, this obligation would be complied with.
technologies that were not originally developed for the exclusive purpose of detecting CSA
online. For example, hashing is an application of digital fingerprinting, which was already being used to detect malware when tools like PhotoDNA were first developed. Likewise, AI, the underlying technology to detect new CSAM and grooming, was not originally developed to detect CSA online. The possibility of repurposing a technology (and therefore the risk of
misuse) exists from the moment the technology is first developed. In the case of the tools to detect CSA
online, these have existed for over a decade (e.g. PhotoDNA) and there is so far no evidence of that risk having materialised; the tools have been made available under a licensing agreement limiting their use to the detection of child sexual abuse content, which appears to have been respected. The legislation would include safeguards on purpose limitation, the way they are deployed, and oversight by competent authorities and the EU Centre to keep the risk of misuse to the absolute minimum.
6.1.3.4. Option D: option C + mandatory detection of new CSAM
The rights to be balanced are the same as in the previous option; the difference lies in the
greater impact on rights resulting from the mandatory detection of new CSAM.
This option would represent a higher impact on providers' freedom to conduct a business and more interference into users' right to privacy, personal data protection and freedom of
expression. However, there is a corresponding increase in the types of CSA that are tackled
and, thus, to the achievement of the objective of combatting the particularly serious crimes at issue and protecting children. Moreover, stricter safeguards, remedies and transparency and
accountability measures would be provided for to safeguard users' rights.
Given the similar nature of the materials to be detected and the reliance on verified indicators to be provided by the EU Centre, the detection of new material would in principle have a
comparable level of intrusiveness as the detection of known CSAM. However, given that
accuracy levels of current tools, while still being well above 90%, are lower than for the detection of known CSAM, human confirmation is essential. This would add to the service
providers' burdens and increase intrusiveness, but is deemed necessary to avoid errors and the
negative consequences that such errors might have, including for users' rights. The need to
rely on human confirmation could decrease as the technology develops, partly as a
consequence of the obligations to detect new CSAM in this option. In addition, strict
requirements and safeguards would apply, including on the reliability of indicators and
independent supervision, and reliable detection tools made available free of charge.
Similarly to Option C, the identification of the specific providers in scope would be done through detection orders issued by Member States' national authorities. This ensures a case-
by-case, risk-based and time-limited approach, thus contributing to the proportionality of the
approach. For the detection of new CSAM a specific, higher threshold would apply (as compared to detection orders for known CSAM), i.e. only services at a high and objective risk of being misused for the exchange and dissemination of new CSAM would be subject to a detection obligation.
In light of the recent nature of most previously undetected CSAM, this option would have a
positive impact on victims of ongoing abuse and would significantly enhance the possibility of safeguarding victims from additional abuse. In addition, the early detection and confirmation of new CSAM and the swift addition thereof to the database of known CSAM can help limit the spreading of CSAM across service providers.
Overall, the measures in this option would therefore fairly balance the affected fundamental
rights while having a significantly greater positive effect on the rights of victims.
6.1.3.5. Option E: option D + mandatory detection of grooming
The impacts of this option are the same as in Option D, with the important difference of the additional impact caused by requiring service providers to also detect grooming. The introduction of this obligation would have a higher impact on fundamental rights, which would be balanced by stricter personal data protection and privacy safeguards while
providing redress, accountability and transparency.
Detecting grooming would have a positive impact on the fundamental rights of potential victims by contributing to the prevention of abuse. At the same time, the detection process would be the most intrusive one for users (compared to the detection of known and new
CSAM) since it would involve searching text, including in interpersonal communications, as the most important vector for grooming. On the one hand, such searches have to be considered as necessary to combat grooming since the service provider is the only entity able to detect it. Automatic detection tools have acquired a high degree of accuracy206, and indicators are becoming more reliable with time as the algorithms learn, following human review. On the other hand, the detection of patterns in text-based communications may be more invasive of users' rights than the analysis of an image or a video to detect CSAM,
given the difference in the types of communications at issue and the mandatory human review of the online exchanges flagged as possible grooming by the tool.
This obligation would be restricted to only certain specific service providers (identified, on a case-by-case basis, through the detection orders of Member States' national authorities), which are at high risk of being misused for grooming, which would further reduce the fundamental rights impact to only the users of those services and the providers concerned. This approach would contribute to ensuring the required level of proportionality.
In this option, detection obligations would apply to the three main types of CSA online
(known CSAM, new CSAM and grooming). Compared to voluntary detection, which leaves to private parties the decision of whether to detect, under this option the legislator is the one
taking the decision on whether to detect all three types, given the particularly serious
objective of public interest at stake, setting out the conditions and safeguards under which that detection should take place.
Overall, provided appropriate limits and safeguards are ensured, the measures in this option would therefore fairly balance the affected fundamental rights while having a significantly greater positive effect on the rights of victims.
6.1.4. UN SDG impact
6.1.4.1. Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims
Enhancing voluntary detection, removal and reporting of online CSA and the creation of the EU Centre on prevention and assistance would to some extent contribute to relevant SDGs. Notably, limiting the likelihood of girls and children in general falling victim to CSA would positively impact SDG 5.2 (eliminate all forms of violence against women and girls, as a majority of CSA victims are girls) and SDG 16.2 (end abuse, exploitation, trafficking and all forms of
206 For example, Microsoft reports that the accuracy of its grooming detection tool is 88%, see annex 8.
the order of magnitude of costs and benefits and therefore should not be taken as exact
forecasts.
6.2.1. Costs
All the policy options under consideration would result in costs for public authorities, service
providers, and the Centre. Each policy option includes measures relating to prevention, assistance to victims, and detection, reporting and removal of online child sexual abuse.
In the area of prevention, costs would be incurred by the Commission as a result of the
practical measures in Option A, under which the Commission would have responsibility for
managing the Centre as a knowledge hub without legal personality. Under all other options, costs related to prevention measures would be borne by the Centre itself.
Costs in the area of assistance to victims would similarly be borne by either the Commission or the Centre, depending on the option chosen. In addition, measures to improve prevention and assistance to victims would likely give rise to costs for Member States.
Measures relating to the detection, reporting and removal of online CSA would entail administrative costs for service providers and public authorities under all options. These relate to the expense for service providers to implement measures to detect, report and remove online CSA, whether on a voluntary or mandatory basis, as well as the cost to both service
providers and public authorities of processing each report. Under Options B to E, the Centre would also incur costs relating to the handling of reports, as well as costs for the creation and maintenance of an EU database of indicators of online child sexual abuse.
The cost model built to estimate the above costs first determined the composition of an
average report today, based on the total amount of known and new CSAM files and grooming reports made in 2020. Then it estimated the cost of this average report, based on the estimated time that service providers and public authorities require for processing and following up on it
(including investigations). It also estimated the number of reports in the coming years under the baseline scenario under voluntary detection, assuming that the number of reports would continue to grow in line with trends over recent years. It also assumed that the level of abuse detected and reported by Facebook, which is the top provider of reports to NCMEC, is indicative of the level of abuse that could potentially be detected and reported by other
providers under mandatory detection. Finally, the model estimated the costs of each policy measure by estimating how the policy measure would change the composition of the average report and/or the number of reports compared to the baseline.
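The mechanics of that model can be illustrated with a short Python sketch. All numbers below (the report mix, processing minutes and hourly rate) are invented placeholders rather than the inputs of annex 4; only the structure, the cost of an average report multiplied by the number of reports, follows the description above.

from dataclasses import dataclass

@dataclass
class ReportMix:
    known_csam: float  # share of an average report made up of known CSAM
    new_csam: float    # share made up of new CSAM
    grooming: float    # share made up of grooming reports

def cost_per_report(mix: ReportMix, minutes: dict, hourly_rate: float) -> float:
    """Weighted processing cost of one average report, in EUR."""
    total_minutes = (mix.known_csam * minutes["known"]
                     + mix.new_csam * minutes["new"]
                     + mix.grooming * minutes["grooming"])
    return total_minutes / 60.0 * hourly_rate

baseline_mix = ReportMix(known_csam=0.85, new_csam=0.12, grooming=0.03)  # assumed
minutes = {"known": 10, "new": 45, "grooming": 60}                       # assumed
annual_reports = 2_000_000                                               # assumed

total = annual_reports * cost_per_report(baseline_mix, minutes, hourly_rate=40.0)
print(f"Illustrative annual processing cost: EUR {total:,.0f}")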
The estimated costs of each measure and option are presented in Table 3 and Table 4 below.
Table 3: cost estimates for the retained policy measures (EUR million). [The table's cell values are illegible in this copy; it set out one-off and continuous (annual) costs for each retained policy measure, broken down between service providers and public authorities.]
Table 4: one-off and continuous costs estimates for the policy options (EUR million)

| POLICY OPTIONS | ONE-OFF COSTS: Public Authorities | ONE-OFF COSTS: Service Providers | CONTINUOUS (ANNUAL) COSTS: Public Authorities | CONTINUOUS (ANNUAL) COSTS: Service Providers |
| A | €0,4 | €13,9 | €2,8 | €0,2 |
| B | €5,4 | €43,6 | €11,4 | €158,4 |
| C | €5,4 | €547,3 | €470,9 | €466,… |
| D | €5,4 | €797,4 | €991,3 | €1.025,0 |
| E | €5,4 | €825,… | €1.463,3 | €1.595,3 |
6.2.2. Benefits
The main quantitative benefits derive from savings as a result of reduction of CSA associated
costs, i.e. savings relating to offenders (e.g. criminal proceedings), savings relating to victims
(e.g. short and long-term assistance), and savings relating to society at large (e.g. productivity losses).
To estimate the benefits, the first step is therefore to determine the total CSA costs in the EU. As indicated in section 5.1 on the baseline, the estimated annual costs of CSA in the EU are EUR 13.8 billion.
Box 20: estimation of annual costs of CSA in the EU
No published studies are known that estimate the total costs of CSA in the EU or in a Member State207.
Letourneau et al. estimated the total annual costs of CSA in the US, adjusted to the reference year 2015, in a paper that appeared in 2018 in the peer-reviewed journal Child Abuse & Neglect208. The paper estimated total costs including health care costs, productivity losses, child welfare costs, violence/crime costs, and special education costs, based on secondary data drawn from papers published in peer-reviewed journals. The paper indicates that its estimates of annual losses of USD 11 billion are conservative, minimum figures, since they could not include the economic impact of nonfatal CSA on male victims due to lack of data, and they relied on cases reported to child protection agencies, whereas it is widely recognised that a substantial proportion of CSA cases never comes to the attention of child protection agencies209.
For comparison, the other known study210 on CSA costs in the US (not peer-reviewed) estimated the annual costs at USD 23 billion. And the only other known peer-reviewed paper (in addition to Letourneau et al.'s) on CSA costs estimated the annual costs in Canada at approximately CAD 3.7 billion211, in a country with a population less than 10% that of the EU.
207 The lack of EU-specific studies is an important gap in knowledge in the fight against CSA in the EU. Such research could be facilitated through the prevention and assistance to victims functions of the Centre.
208 Letourneau et al., The economic burden of child sexual abuse in the United States, May 2018. 209 IOM, NRC, Child maltreatment research, policy, and practice for the next decade: Workshop summary, The
National Academies Press, Washington, DC (2012). 210 T.R. Miller, M.A. Cohen, B. Wiersema, Victim costs and consequences: a new look, 1996. 211 O. Hankivsky, D.A. Draker, The economic costs of child sexual abuse in Canada: a preliminary analysis,
Journal of Health & Social Policy, 17 (2) (2003), pp. 1-33.
Although Letourneau et al.'s paper concerns the US, studies on the economic cost of violence against children (including child sexual abuse) suggest that costs are comparable among high-income countries212. Therefore, the conservative estimates provided in the above-mentioned paper are assumed to be applicable in the EU context, when adjusted to take account of the larger population of the EU in 2021 compared to that of the US, the inflation rate 2015-2021 and the USD-EUR exchange rate in April 2021, resulting in a total of EUR 13.8 billion of annual CSA costs in the EU.
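The adjustment can be reproduced as a back-of-the-envelope calculation. The factors below are approximations supplied for illustration (rounded EU and US population figures, an assumed cumulative inflation factor and an assumed exchange rate), not the exact values used in the assessment, but they arrive at the same order of magnitude.

# Scaling the Letourneau et al. USD 11 billion (2015) estimate to the EU.
us_annual_cost_usd_2015 = 11e9    # conservative US estimate (Letourneau et al.)
population_ratio = 447e6 / 331e6  # approximate EU vs US population in 2021
inflation_2015_2021 = 1.12        # assumed cumulative US inflation factor
usd_to_eur_apr_2021 = 0.83        # assumed USD-EUR exchange rate, April 2021

eu_annual_cost_eur = (us_annual_cost_usd_2015 * population_ratio
                      * inflation_2015_2021 * usd_to_eur_apr_2021)
print(f"EUR {eu_annual_cost_eur / 1e9:.1f} billion")  # roughly EUR 13.8 billion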
The quantitative benefits originate mainly from two sources:
● savings from CSA crimes prevented: these result not only from the options that explicitly cover prevention but also from those that cause an increase in the number of reports (e.g. those imposing detection and reporting obligations on service providers). The increase in reports is likely to lead to an increase in victims rescued from ongoing and/or imminent abuse as well as to an increase in arrests, which in turn could lead to prevention of future crimes by those offenders. It could also lead to an increase in removal of CSAM, with the positive effects on prevention that it entails (see box 1). In addition, the prosecuted offenders would have (improved) access to prevention programmes during and after criminal proceedings (including during and after prison), which could decrease reoffending. Moreover, the increase in reports could also have a deterrence effect, and thereby prevent additional offences;
● savings from better assistance of victims: these would result from a better mitigation of the negative effects of these crimes on victims, e.g. by facilitating Member States' action in this area through the exchange of best practices and research, and supporting the takedown of images and videos (including at the victims' request).
It is not possible to determine exactly what the benefits caused by each of these two sources or each policy measure, such as the obligations on service providers or the Centre, would be. In addition, it is not possible to forecast with certainty the exact benefits of each policy measure. For example, the reduction of CSA due to prevention would depend to a large extent on the investments and efforts from Member States and the EU, which the policy options considered in this initiative could only help facilitate.

In light of the qualitative considerations above, it would be safe to estimate that the quantitative benefits could be up to 50% of the annual costs of CSA in the EU (remembering that the amount of EUR 13.8 billion was a conservative estimate).
The calculation of benefits for each of the options takes an even more conservative approach and assumes that the benefits would be in the middle of that range, i.e. a maximum of 25% of the total annual costs. This calculation also assumes that there is a direct correlation between the factor that can be best quantified, the increase in reports, and the estimated savings. This is of course an approximation, as the savings could also derive from other components not linked to the increase in reporting, as explained above, but it facilitates the comparison of options. The model therefore assumed a cost decrease of 25% for option E (the highest number of reports) and applied the same ratio of increase in reporting to decrease in costs from option E to the other options.
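The following minimal sketch reproduces that calculation from the figures in Table 5; it is a simplified rendering of the model, for which Annex 4 remains the authoritative source:

```python
# Sketch of the benefits model: option E's ~354% increase in reports is
# assumed to map to a 25% reduction in annual CSA costs; the same
# reporting-to-savings ratio is applied to the other options.
TOTAL_ANNUAL_CSA_COSTS = 13_800  # EUR million (conservative estimate)

reports = {  # estimated annual reports (Table 5)
    "baseline": 1_939_556, "A": 2_133_584, "B": 2_385_726,
    "C": 7_521_652, "D": 8_691_029, "E": 8_812_811,
}
baseline = reports["baseline"]
ratio = 0.25 / (reports["E"] / baseline - 1)  # cost decrease per unit of report increase

for option in "ABCDE":
    increase = reports[option] / baseline - 1
    benefits = ratio * increase * TOTAL_ANNUAL_CSA_COSTS
    print(f"{option}: +{increase:.0%} reports -> EUR {benefits:,.1f} million/year")
# -> A ~97, B ~224, C ~2,802, D ~3,389, E = 3,450 (Table 5 values, up to rounding)
```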
212 See, for example, Ferrara, P. et al., The Economic Burden of Child Maltreatment in High Income Countries, December 2015.
Table 5: estimated benefits for the policy options (EUR million)

| Policy option | Estimated number of reports | Estimated increase in reporting compared to the baseline | Estimated cost reduction | Benefits (millions per year) |
| Baseline | 1.939.556 | - | - | - |
| A | 2.133.584 | 10% | 0,7% | 97,3 |
| B | 2.385.726 | 23% | 1,6% | 223,8 |
| C | 7.521.652 | 288% | 20,3% | 2.800,3 |
| D | 8.691.029 | 348% | 24,5% | 3.386,9 |
| E | 8.812.811 | 354% | 25,0% | 3.448,0 |
See annex 4, sections 3 and 4 for further details on the model, the assumptions and the calculations.
7. HOW DO THE OPTIONS COMPARE?
7.1. Qualitative comparison
7.1.1. Criteria for the comparison
The following criteria are used in assessing how the five options would potentially perform, compared to the baseline:
● Effectiveness in achieving the specific objectives.
● Efficiency, i.e. cost-benefit assessment of each policy option in achieving the specific objectives.
● Coherence with all relevant policy instruments in the fight against CSA:
a. Legislation:
i. horizontal instruments (GDPR, ePrivacy Directive and its proposed revision, e-Commerce Directive and the proposed Digital Services Act, Victims' Rights Directive);
ii. sector-specific legislation (CSA Directive, Interim Regulation, Europol Regulation and its proposed revision);
b. Coordination: EU level cooperation in investigations, prevention and assistance to victims, as well as multi-stakeholder cooperation at EU and global level;
c. Funding.
● Proportionality, i.e. whether the options go beyond what is a necessary intervention at EU level in achieving the objectives.
7.1.2. Summary of the comparison
Table 6 below summarises the qualitative scores for each main assessment criterion and each option. The options are compared below through listing positive (+), negative (−) and 'no-change' (~) impacts compared to the baseline (> indicates higher costs compared to the baseline).

The detailed comparative assessment of all options can be found in annex 4, section 2:
Table 6: summary of the comparison of policy options

| Option | Effectiveness | Efficiency: costs | Efficiency: benefits | Coherence: Leg. | Coherence: Coord. | Coherence: Fund. | Proportionality |
| Baseline | ~ | ~ | ~ | ~ | ~ | ~ | ~ |
| Option A | + | > | + | + | + | + | + |
| Option B | ++ | >> | ++ | + | ++ | + | + |
| Option C | +++ | >>> | +++ | + | +++ | + | + |
| Option D | ++++ | >>>> | ++++ | + | +++ | + | + |
| Option E | +++++ | >>>>> | +++++ | + | +++ | + | + |
7.1.3. Effectiveness
The scores on effectiveness indicate the extent to which the impacts screened in section 6 contribute to the achievement of the specific objectives.
1. Ensure the effective detection, removal and reporting of online child sexual abuse where
they are currently missing
While options A and B could improve detection, removal and reporting of online child sexual
abuse, their effectiveness is significantly limited by their reliance on voluntary action by providers when it comes to detection, which has proven to be insufficient. Under option A, as under the baseline, many of these activities would be prohibited following the expiry of the Interim Regulation.
Options C to E are the only options which would ensure the effective detection and reporting of online CSA. In particular, Option E would have the highest effectiveness as it would ensure that all relevant online service providers detect known and new CSAM and grooming.
Whereas option C imposes obligations to detect only known CSAM, options D and E impose additional, cumulative obligations to detect new CSAM and grooming respectively. As described in Section 6.1.1, the detection of new CSAM and grooming, by their nature, provides greater added value in terms of the ability to identify and rescue children from ongoing or imminent abuse. As such, the effectiveness under options D and E is higher than in option C. The obligations to detect and report known and new CSAM and grooming are a significant step forward. Reliable tools for the detection of CSA online are already freely available and in use by a number of service providers. Extending their deployment to all relevant online services could greatly contribute to virtually eliminating the dissemination of known CSAM on such services and significantly reducing the dissemination of new CSAM, and the instances of grooming. The Centre would facilitate the detection, reporting and removal process, including by making available technology and possibly contributing to its development through its technical expertise213.
213 Researchers have acknowledged the need to continue developing technical tools to detect, report and remove CSA online. See, for example, Insoll, T., Ovaska, A. & Vaaranen-Valkonen, N. (Protect Children), CSAM Users in the Dark Web: Protecting Children Through Prevention, 2021.
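To illustrate the general principle behind the detection of known material, here is a deliberately simplified sketch. Production systems use perceptual hashes (such as PhotoDNA) that survive re-encoding, whereas this example uses exact cryptographic hashes, and the hash database is hypothetical:

```python
# Minimal sketch of hash-based detection of *known* material.
# Simplification: exact SHA-256 matching instead of perceptual hashing.
import hashlib

# Hypothetical database of hashes of already-verified known material.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # example entry
}

def is_known(file_bytes: bytes) -> bool:
    """Return True if the file's hash matches a known-material hash."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(is_known(b"test"))  # True: sha256(b"test") equals the example hash above
```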
2. Improve legal certainty, transparency and accountability and ensure protection of
fundamental rights
Option A, which consists of non-legislative measures, offers the least improvement in terms of legal certainty, protection of fundamental rights, transparency and accountability. Any such
improvements under Option A would be largely limited to legal advice and jurisprudence and the establishment of best practices to be adhered to on a voluntary basis.
Options B to E could all offer significant improvements in these areas. Under each of these options, the conditions for voluntary detection would be clarified and mandatory measures to detect, report and remove CSA online would be established, ensuring improved legal certainty for all stakeholders. In addition, each of these options would establish robust safeguards and accountability mechanisms to ensure strong protection of fundamental rights. These would include notably the designation of competent national authorities to assess the measures implemented by relevant online service providers, impose detection and removal orders, and impose sanctions on providers that do not meet their obligations. These options would also establish transparency obligations for both service providers and the authorities designated to receive reports from and supervise providers, as well as redress mechanisms for users, among other safeguards.
Neither the baseline scenario nor option A would address the current challenges, and the impact on children's fundamental rights would likely worsen over time.

Option B would increase legal certainty for detecting CSA voluntarily and would also create an obligation to report CSA once a provider becomes aware of it and to remove CSAM once confirmed to be illegal. In addition, the activities of the EU Centre would have a significant positive impact on the fundamental rights of victims and children who may become victims. The necessary safeguards would also be provided in order to balance the interference with the rights of the users and providers. However, the detection of CSA would remain voluntary, which would not ensure consistent protection for children who are or might become victims, while there would still be an impact on the privacy and data protection rights of all users. In sum, this option would have a certain negative impact on fundamental rights, particularly those of children.
Options C to E would render the detection of CSA mandatory and, especially since the systems used for detection can affect relevant fundamental rights, would include comprehensive safeguards. Furthermore, appropriate checks and balances are also to be set up, notably through sanctioning mechanisms, reporting and transparency requirements, and supervision by the competent national authorities, supported where relevant in the technical aspects by the EU Centre to prevent and counter child sexual abuse. These options would have overall small positive (Option C), significant positive (Option D) and significant positive (Option E) impacts on fundamental rights, particularly those of children.
The fundamental rights most clearly touched upon by the intervention are the following:
Rights to human dignity and integrity of the person, prohibition of inhuman and
degrading treatment and rights of the child (Articles 1, 3, 4 and 24 of the Charter). All five options would have a positive impact in protecting the safety and rights of children. Consistent with the analysis in section 6.1.3, the positive impact is strengthened with each subsequent option. Given the seriousness of the crimes at stake and of the
impact on children, being vulnerable persons entitled to protection by the public authorities, the objective pursued by the envisaged measures is capable of justifying a
significant interference with the fundamental rights of other parties involved (service
providers, users), provided that the interference respects the essence of those rights and
remains limited to what is necessary.
Rights to respect for private and family life, protection of personal data, and freedom of expression and information (Articles 7, 8 and 11 of the Charter).
Each of the options would have an impact on privacy and the protection of personal data, with regard to both the users of relevant online services and victims or potential victims of child sexual abuse. All options take into account the need to balance these impacts by including strong safeguards for voluntary/mandatory detection, reporting and removal of online CSA.
Evidently, the obligations imposed by Options C, D and E would have the greatest impact on users' rights overall, especially the rights to privacy and personal data protection, due to the data to be processed in the detection and the progressively increasing need for human review with each option. Furthermore, errors in the detection process could have additional negative consequences for users' rights, such as erroneous decisions to remove users' content or limit access, which would impact their freedom of expression and information. At the same time, the scope for erroneous decisions is likely to be limited,
especially when adequate safeguards are provided for, bearing in mind the 'absolute'
(non-context-specific) nature of the prohibition of distributing CSAM. That holds in
particular in respect of Options C and (to a somewhat lesser extent) Option D, considering the accuracy of the technologies which would need to be used.
On the other hand, the progressively increasing detection and number of reports of online child sexual abuse expected under each option would result in corresponding improvements to the rights of victims (and potential victims) to privacy and personal data. In particular, options C, D and E would contribute significantly to safeguarding rights of
victims, while robust safeguards would ensure proportionality and limit interference to what is strictly necessary.
Freedom to conduct a business (Article 16 of the Charter). Another important element of the overall balance that has to be struck is the balance between facilitating or mandating action against CSA online and the protection of
providers' freedom to conduct a business.
The options considered in the impact assessment take into account the need to ensure that
any impact upon these rights and freedoms would be strictly limited to what is necessary and proportionate, whilst leaving the essence of the freedom to conduct a business unaffected. While Options A and B would not directly or significantly affect the freedom to conduct a business, Options C, D and E would entail an interference with this freedom, while however minimising negative effects on this right by ensuring a level playing field for all providers offering services in the Union, regardless of their size or location. The interference with this right will be further mitigated by the strong support offered by the
Centre, the availability of the necessary technology at no or limited costs, as well as the benefits associated with operating under a clear and uniform legal framework.
3. Reduce the proliferation and effects of CSA through harmonisation of rules and increased coordination of efforts
The non-legislative measures of Option A are less effective than the rest of the options, which include the creation of the EU Centre to support prevention and assistance to victims, as well as detection, reporting and removal of CSA online. Practical measures can only lead to
limited improvements, and cannot replace a Centre as a reference entity in the EU and a
facilitator on all the aspects of the fight against child sexual abuse.
7.1.4. Efficiency
Except for the baseline, all options would generate some additional administrative costs for
public authorities as a result of the anticipated increase in reporting of CSA. Options C to E would lead to significant cost increases for public authorities due to the significant increase in the volume of reports of online CSA expected to arise from the obligations imposed on service providers under those options.
For service providers, all options will generate administrative and other costs, and may also result in savings when processes become more efficient. The extent of additional costs to service providers will, in part, depend upon the nature and size of their services, which is
expected to affect both the volume of data to be processed for the purposes of detection and
reporting, and the cost of integrating the relevant technologies.
Given the cumulative nature of the options, the costs also increase with each option, driven in
particular by the increased detection obligations. These will entail a progressive increase in
reports and therefore increased costs for both service providers and public authorities. On the other hand, these increased obligations would also lead to increased benefits derived from
savings as a result of reduction of CSA associated costs, i.e. savings relating to offenders
(e.g. criminal proceedings), savings relating to victims (e.g. short and long-term assistance), and savings relating to society at large (e.g. productivity losses).
7.1.5. Coherence
a) Legislation
Horizontal instruments
● GDPR
The proposed measures in Options B to E build on the GDPR. At the moment, various grounds for processing set out in the GDPR are invoked by service providers to carry out the processing of personal data inherent in voluntary detection and reporting of CSA online. Options B to E would specify the conditions for mandatory and voluntary detection, providing greater legal certainty for those activities.
Insofar as mandatory detection activities involving processing of personal data are concerned, options C to E would build on the GDPR's Article 6(1)(c), which provides a legal basis for the processing of personal data to comply with a legal obligation.
ePrivacy Directive and its proposed revision
The proposed measures in Options B to E would include service providers that offer
interpersonal electronic communications services and hence are subject to the provisions of the ePrivacy Directive and its proposed revision currently in negotiations. These measures
presuppose the need for a derogation from the relevant provisions of that Directive (akin to the Interim Regulation already in force, but then without limit in time and covering, where
relevant, also mandatory detection) and would provide specific conditions for the processing of certain types of data otherwise subject to the ePrivacy framework.
● e-Commerce Directive
The e-Commerce Directive prohibits Member States from imposing general monitoring obligations and from actively seeking facts or circumstances indicating illegal activity. The
DSA proposal confirms and restates this principle. The legislative proposal will include the
necessary elements (including on objectives pursued, type of material, scope and nature of
obligation, risk-based approach, limitation in time, assistance, safeguards and supervision) to ensure respect for the appropriate balancing of fundamental rights enshrined in this principle.
The proposed Digital Services Act
Options B to E would build on the DSA's horizontal framework, setting out a more specific framework where needed for the particular case of combating CSA online, akin to sectoral
legislation such as the Terrorist Content Online Regulation, relying on the baseline provided by the DSA where possible. As regards the prohibition of general monitoring and active fact-
finding obligations (which is also provided for in the DSA proposal), see the above point on the e-Commerce Directive.
Victims' Rights Directive
Options A to E would strengthen - to an increasing extent - support to victims, in coherence with the Victims' Rights Directive as a horizontal instrument to improve victims' access to their rights. Options B to E would establish an EU Centre that would carry out, in addition to its principal tasks, certain tasks relating to prevention and assistance to victims, and would thus ensure greater facilitation of the cooperation with Member States and exchange of best practices with regard to CSA victims. These options would also include measures to enhance the practical implementation of victims' rights to stop images and videos related to their abuse from circulating and hence give fuller impact to these rights.
Sector-specific legislation
● CSA Directive
The CSA Directive is a criminal law instrument, which none of the policy options considered would contradict. In fact, strengthening prevention, detection, reporting and victim support should positively influence the implementation of the Directive and cooperation between Member States.
● Interim Regulation
Option A would contribute through non-legislative measures to the voluntary efforts by online service providers under the Interim Regulation. Once the Interim Regulation expires on 3
August 2024, there would not be another legal instrument to replace it under this option.
Options B to E specify the conditions for voluntary detection, reporting and removal of CSA online and options C to E define obligations to detect CSA online. These options would
provide a long-term regulatory framework that would build on the Interim Regulation (including its safeguards) and replace it.
● Europol Regulation and its proposed revision
Under options B to E, the EU Centre would be the recipient of the reports by service providers; it would review them and eventually forward them to Europol for action. The processing and follow-up of these reports by Europol would be governed by the Europol Regulation and then by its proposed revision. This proposed revision could strengthen the fight against CSA by e.g. effectively supporting Member States and their investigations with the analysis of large and complex datasets, addressing the big data challenge for law enforcement authorities. The Centre would contribute to ensuring that the reports from service providers are actionable and usable for law enforcement authorities.
b) Coordination
EU level cooperation in investigations, prevention and assistance to victims
Option A would facilitate to a limited extent cooperation in investigations, prevention and assistance to victims. This cooperation would be greater in the case of Options B to E, thanks to the Centre, whose main purpose is to serve as a facilitator of efforts, including through increased cooperation in those three areas.
Multi-stakeholder cooperation at EU and global level
Likewise, the Centre in options B to E would also facilitate multi-stakeholder cooperation at EU and global level, in particular by facilitating the exchange of best practices on prevention and assistance to victims. Under options C to E, the obligations to detect CSA online would likely entail an increase in the number of reports in other jurisdictions, in particular the US. While these obligations would apply only to services offered in the EU, the cross-border nature of these crimes means that a significant number of reports will relate to activities which involve third countries (for example, a report of grooming where the suspect and victim are located in different jurisdictions). In addition, while technology to detect known CSAM is widely used by many providers, technologies for the detection of new CSAM and grooming are less widely deployed. It is expected that obligations to use such technologies in the EU could lead to increased voluntary use of the same technologies in relation to third countries, particularly as their distribution would be facilitated by the Centre to the relevant service providers offering their services in the EU (without imposing restrictions on use outside of the EU). The amount of CSAM detected globally would increase, and with it the possibilities to stop its circulation and prevent future abuses globally. The number of cross-border investigations and
opportunities to cooperate internationally, within the EU and globally, would increase.
Box 21: risk of duplication of reporting to the EU Centre and NCMEC
Mandatory reporting of CSA online to the EU Centre could lead to duplicating obligations for US service providers to make reports both in the EU and in the US. Some stakeholders have
suggested that, in order to avoid duplication of reporting, any obligation to report to an EU
organisation should include an exemption for providers that already report to NCMEC. This
exemption would have several negative consequences, notably:
● delays for European law enforcement authorities in receiving the reports, due to exclusive reporting to NCMEC, and losing the ability to 'de-conflict' reports by discovering reports having the same or similar content by cross-referencing the reports received by NCMEC, the EU Centre and Europol;
● unequal conditions and safeguards relating to the reporting obligations, since those existing under US law and those to be established under the present initiative would differ; and
● the processing of large volumes of EU user data outside the EU, by an entity not bound by EU law.
Such an exemption would therefore have a negative impact on the protection of fundamental
rights, another specific objective of the initiative, and potentially lead to negative effects on international relations. Where possible within the limits set by the applicable legislation, the
implementation of technical solutions to report could help ensure that there is no confusion or
unnecessary duplication of reports received by law enforcement agencies in the EU (e.g. by simply adding a tag in the report indicating whether it has been sent to the US or the EU).
In any event, the obligations under EU law would remain limited to the relevant services
offered in the EU. Therefore, those obligations would not extend to services offered elsewhere.
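To make the tagging and de-confliction idea mentioned above concrete, here is a minimal, hypothetical sketch; the field names, hashing choice and structure are illustrative assumptions, not part of the proposal:

```python
# Hypothetical sketch: providers attach a destination tag and a stable
# content hash to each report, so agencies can surface duplicates filed
# with both NCMEC and the EU Centre.
import hashlib

def make_report(content: bytes, destinations: list[str]) -> dict:
    return {
        # Same content -> same hash, regardless of where the report is sent.
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "sent_to": destinations,  # e.g. ["EU Centre"] or ["NCMEC"]
    }

def deduplicate(reports: list[dict]) -> dict:
    """Group reports by content hash so cross-filed duplicates surface."""
    seen: dict[str, dict] = {}
    for report in reports:
        key = report["content_sha256"]
        if key in seen:
            seen[key]["sent_to"] = sorted(set(seen[key]["sent_to"] + report["sent_to"]))
        else:
            seen[key] = dict(report)
    return seen

reports = [
    make_report(b"example material", ["NCMEC"]),
    make_report(b"example material", ["EU Centre"]),  # duplicate filing
]
print(deduplicate(reports))  # one entry, sent_to = ['EU Centre', 'NCMEC']
```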
c) Funding
The Centre under options B to E would serve as a facilitator of efforts, possibly including through signposting funding opportunities at EU and national level and maintaining an overview of past projects, to avoid duplication of efforts and ensure the most effective use of funds. The Centre would also facilitate research on prevention and assistance to victims,
possibly by managing its own research funding.
7.1.6. Proportionality
The five options follow the same principle of proportionality and necessity of an intervention at EU level: a fragmented approach across Member States is unable to ensure an appropriate level of protection to children across the Union, and the protection of fundamental rights of all online users. Whereas the level of effectiveness of the options is different, as they contain different measures and impose different obligations, all are proportionate, as none goes beyond what is a necessary intervention at EU level to achieve the specific objectives. In addition, the conditions of application and safeguards for each option are conceived to match its level of intrusion.
7.2. Quantitative comparison
7.2.1. Overall costs
For the purpose of comparing the options and calculating overall costs, the total combined cost (not discounted) to service providers and public authorities over a period of 10 years (2021-2030) was considered. The cost over this period was obtained by combining the one-off costs of the relevant policy measures with the sum of the annual costs for ten years. These include all costs directly arising from the measures as described in Annex 4, section 3, such as costs for the establishment of the Centre, implementation of technical measures for detection and reporting of CSA online, development of tools, processing of reports, etc.
The one-off and annual costs associated with each policy option are set out in detail in Annex 4, section 4.
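As an illustration, the 10-year totals in Table 7 below can be recomputed from the per-option figures in Table 4. This is a simplified sketch; the full model is in Annex 4:

```python
# Recomputing Table 7 from Table 4 (EUR million; illustrative sketch).
annual = {   # continuous (annual) costs: public authorities + service providers
    "A": 13.9 + 2.8,    "B": 43.6 + 11.4,   "C": 547.3 + 470.9,
    "D": 797.4 + 991.3, "E": 825.6 + 1463.3,
}
one_off = {  # one-off costs: public authorities + service providers
    "A": 0.4 + 0.2,     "B": 5.4 + 158.4,   "C": 5.4 + 466.9,
    "D": 5.4 + 1025.0,  "E": 5.4 + 1595.3,
}
for option in "ABCDE":
    total_million = one_off[option] + 10 * annual[option]  # 10-year horizon, not discounted
    print(f"{option}: EUR {total_million / 1000:.2f} billion")
# -> A: 0.17, B: 0.71, C: 10.65, D: 18.92, E: 24.49 (matching Tables 7 and 8)
```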
Over 10 years, the total of costs per option is the following:
Table 7: comparative costs of the policy options over 10 years (EUR billions)

| | A | B | C | D | E |
| Total costs (EUR billions) | 0.17 | 0.71 | 10.65 | 18.92 | 24.49 |
7.2.2. Overall benefits
The table below compares the estimated costs and benefits for the different options over ten
years:
Table 8: comparative quantitative assessment of the policy options over 10 years (EUR billions)

| | A | B | C | D | E |
| Overall costs | 0.17 | 0.71 | 10.65 | 18.92 | 24.49 |
| Overall benefits | 0.97 | 2.24 | 28.00 | 33.87 | 34.48 |
| Total (net benefits) | 0.81 | 1.52 | 17.35 | 14.95 | 9.99 |
The overall benefits (not discounted) assume a decrease of 25% in the total CSA costs per year. Annex 4 contains a sensitivity analysis on the % decrease in total CSA costs to determine the minimum values at which each of the options would produce net quantitative benefits. Table 9 summarises these results:
Table 9: minimum % decrease in total annual CSA costs to generate net benefits in each policy option

| Policy option | Minimum % decrease |
| A | 0,13% |
| B | 0,6% |
| C | 8% |
| D | 14% |
| E | 18% |
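A rough cross-check of these thresholds is possible under a simplifying assumption: each option's break-even point is its annualised 10-year cost divided by the EUR 13.8 billion of annual CSA costs. This is an approximation of the Annex 4 sensitivity analysis, not a reproduction of it:

```python
# Approximate break-even: annualised 10-year costs / annual CSA costs.
ten_year_costs_bn = {"A": 0.17, "B": 0.71, "C": 10.65, "D": 18.92, "E": 24.49}
for option, cost_bn in ten_year_costs_bn.items():
    print(f"{option}: {cost_bn / 10 / 13.8:.2%}")
# -> A 0.12%, B 0.51%, C 7.72%, D 13.71%, E 17.75%: broadly in line with Table 9
```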
8. PREFERRED OPTION
On the basis of the assessment, the preferred option is E, which notably includes:
● the creation of the EU Centre in the form of a decentralised EU agency;
● mandatory detection of known and new CSAM and grooming, based on detection orders;
● an obligation to report possible CSA online to the EU Centre; and
● an obligation to remove CSA online, once confirmed as illegal.
The preferred option is the one that most effectively addresses the problem drivers as well as the associated costs and impacts in other areas such as fundamental rights, and achieves the objectives of the initiative. While some of the other options are more economically convenient, the degree to which they would be less effective outweighs the financial savings. However, it should be noted that the report aims to make a recommendation for the preferred option, and the final policy choice is left to the political decision maker.
The annual estimated costs of Option E, based upon the analysis in Section 6.2.1, are summarised in Table 10 below. As noted in that section, the costs were estimated primarily for the purposes of comparing the policy options. The estimates provide an idea of the order of magnitude of costs and benefits and therefore should not be taken as exact forecasts.
Table 10: annual costs of the preferred option E (EUR million)

| Policy measure | Continuous (annual) costs: public authorities | Continuous (annual) costs: service providers | One-off costs: public authorities | One-off costs: service providers |
| 1 | 3,5 | 2,8 | 0,4 | 0,2 |
| 3 | 25,7 | 0,0 | 5,0 | 0,0 |
| 4²¹⁴ | 11,1 | 6,9 | 1,0 | 0,0 |
| 5 | 3,3 | 1,7 | 0,0 | 20,4 |
| 6 | 503,6 | 459,4 | 0,0 | 352,2 |
| 7 | 250,1 | 520,5 | 0,0 | 604,4 |
| 8 | 28,2 | 471,9 | 0,0 | 618,0 |
| Total | 825,6 | 1.463,3 | 5,4 | 1.595,3 |
8.1. Main advantages
Effectively achieves the general and specific objectives: Option E would bring strong improvements in identification, protection and support of victims of child sexual abuse, would ensure effective prevention and would facilitate investigations. In particular:
● The Centre would facilitate and support coordination of efforts of all relevant actors, which would in turn reduce the proliferation and effects of CSA. This includes carrying out certain tasks entailing support for victims, who could rely on the Centre to assist them in requesting removal of known CSAM depicting them.
● The Centre would help boost efforts (and their effectiveness) in the overall fight against child sexual abuse in the EU, focusing on CSA online but leading in that manner also to concrete results offline.
● The legislative provisions, in particular the obligations to detect known and new CSAM and grooming, combined with the support of the Centre on detection, reporting and removal efforts, would ensure the effective detection, removal and reporting of online CSA where they are currently missing.
● The safeguards to be included in the legislation, combined with the Centre's support to help ensure transparency and accountability in the detection, reporting and removal by online service providers, would improve overall legal certainty, protection of fundamental rights, transparency and accountability.
● The Centre is a fundamental component of the legislation. It serves as a key safeguard in the detection, reporting and removal process.
● The establishment of clear and uniform legal requirements at EU level, to the exclusion of diverging national rules on the issues covered, would improve the functioning of the internal market to the benefit of both providers and users. The present initiative will join other sector-specific initiatives such as the Terrorist Content Online Regulation and the Copyright Directive in providing more specific and stricter rules to address certain types of illegal content and activities.
Respects subsidiarity and proportionality
Subsidiarity: option E offers the highest added value of EU action described in section 3.3. In particular, it reduces legal fragmentation through the EU level legislation, and through
214 Adjusted to exclude one-off costs of measure 4 on voluntary detection, which would be covered by those of measures 6, 7 and 8 on mandatory detection.
the Centre it facilitates Member States' action, enables the exchange of best practices and
reduces dependence and increases cooperation with third countries.
Proportionality: Option E does not go beyond what is necessary to achieve the general and
specific objectives identified for EU intervention. In particular, the necessary measures would be taken to ensure respect for the fair balance principle underlying the prohibition to
impose general monitoring or active fact-finding obligations. Also, the legislation in this
option would have the legitimate purpose of more effectively tackling CS A online, including better protection of victims through more effective detection, reporting and removal, with the
necessary limits and safeguards to ensure a fair balance and proportionality.
Protects fundamental rights: All options have to strike a fair balance between different fundamental rights. Of the available options, option E protects fundamental rights to human
dignity and to the integrity of the person, the prohibition of inhuman or degrading treatment, and the rights of the child, among others, by boosting efforts to better prevent and protect children from sexual abuse and better support victims. In addition, option E also limits the
impact on fundamental rights of users of the online services concerned, notably to the
respect for private and family life, protection of personal data, and freedom of expression, among others, to the strictly necessary minimum, through the necessary limits and
safeguards in the legislation, including the functions of the Centre. These conditions also ensure increasing standards over time as technology evolves, by ensuring that tools
correspond to the state of the art. In particular, given the importance of the objective and the interference with the rights of users inherent in proactive detection, the decision on the limits and safeguards to detect CSA should be the legislator's, not the service provider's.
8.2. Main disadvantages
Implies more extensive implementation efforts and higher costs: the implementation efforts of the legislation imposing such obligations on service providers, and setting up the Centre, would likely require more time and effort and hence be more expensive than a less
comprehensive instrument. The establishment of the Centre as a decentralised EU agency requires higher initial and running costs than if the Centre were established as part of an
existing entity. Service providers will incur costs to comply with the legislation. Public authorities will also incur increased costs, notably to deal with the likely increase in child sexual abuse cases detected.
8.3. Trade-Offs
Better detection, reporting, prevention and victims' assistance imply new efforts and costs To achieve the general objective, the initiative proposes a new legislative framework for online service providers, which includes the creation of a Centre to facilitate existing and new efforts. Whereas the proposal would seek to minimise disruption, building as much as
possible on ongoing efforts, it is clear that additional human, technical, and financial efforts are required to improve prevention, support of victims, and the detection, reporting and removal mechanisms. The new efforts will likely lead to an increase of detected cases, at least in the near future, before prevention efforts decrease the prevalence of the crimes.
Although option C would have the highest net economic benefit, the overall benefits for
option C are still expected to be significantly lower than under option E. In addition, as set out in the qualitative comparison in section 7.1, option E appears as the best one in terms of overall qualitative scores, driven by higher effectiveness. Specifically, the detection of
grooming included in option E adds a significant prevention aspect to this option, which
determines its highest score on effectiveness compared to the other options. Child sexual abuse material depicts scenes of crimes already committed and, whereas its detection contains an important prevention aspect as described in box 1, the detection of grooming focuses on preventing crimes such as hands-on abuse or sexual extortion. This avoids short-term and long-term consequences for victims which cannot be fully quantified in numerical terms.
Improved detection and reporting imply a comprehensive set of conditions and safeguards Mandatory detection of known and new CSAM and grooming has an impact on fundamental rights of all users, in particular considering that online service providers would be processing personal data, in both public and non-public (interpersonal) communications. This is a sensitive issue that requires appropriate consideration to ensure that the conditions and safeguards put in place protect the fundamental rights of all users. Likewise, the
relationship with other acts of EU law (especially the e-Commerce Directive/DSA and the EU data protection acquis) is a point of particular attention. This will likely require substantial time to prepare (until the legislative proposal becomes EU law) and implement.
8.4. Application of the 'one in, one out' approach
The 'one in, one out' approach refers to the principle whereby each legislative proposal creating new burdens should relieve people and businesses of an equivalent existing burden at EU level in the same policy area.
The preferred option for this initiative entails direct adjustment costs for businesses (service providers) and administrations. These are costs of complying with and adjusting their
operating processes to the requirements of the proposed legislation. Examples of adjustment costs for service providers include the human and technical resources to comply with the
obligations to detect, report and remove CSA online. The preferred option will also generate direct adjustment costs for administrations (notably law enforcement), due to the increased workload to deal with the increase of CSA reports.
The preferred option also creates administrative costs for service providers and administrations. These are costs that result from administrative activities performed to comply with the administrative obligations included in the proposed legislation. They concern costs for providing information, notably on the preparation of annual transparency reports.
On the other hand, the proposed legislation will replace one existing legislative instrument: the Interim Regulation. This would generate savings on administrative costs for service
providers and public authorities. See Annexes 3 and 4 for additional details.
Furthermore, the initiative is expected to generate significant cost savings to society, derived from a reduction in CSA crimes (e.g. reduction in productivity losses, see section 6.2.2).
Also, the EU Centre will facilitate action of Member States and service providers in
preventing and combating CSA, and support victims. This will generate cost savings, by, e.g. helping avoid duplication of efforts and facilitating a more effective and efficient use of resources.
9. HOW WILL ACTUAL IMPACTS BE MONITORED AND EVALUATED?
The actual impacts of the preferred option, i.e. the actual progress in the fight against child sexual abuse offline and online, will be monitored and evaluated against the three specific objectives. The indicators would build on those of the Interim Regulation to minimise
disruption and costs.
The specific objectives basically aim to improve what is being done (specific objectives 1 and 3) and how it is being done (specific objective 2). The specific objectives have corresponding operational objectives, which would be monitored through indicators drawing on various data sources that different actors would be responsible for collecting and sharing.
Table 11: monitoring of general, specific and operational objectives

General objective: improve the functioning of the Internal Market by introducing clear, uniform and balanced EU rules to prevent and combat child sexual abuse.

Improve the "what":

Specific objective 1: ensure the effective detection, removal and reporting of online child sexual abuse where they are currently missing.
Specific objective 3: reduce the proliferation and effects of child sexual abuse through harmonisation of rules and increased coordination of efforts.

Operational objectives, output indicators and data sources:
● Prevention: reduce CSA prevalence; reduce duplication and blind spots of Member States' efforts. Indicators: prevalence rate in Member States (surveys); number, type and evaluation results (including best practices and lessons learned) of prevention programmes (public authorities in Member States).
● Assistance to victims: provide the required assistance; reduce duplication and blind spots of Member States' efforts. Indicators: number of victims assisted and level of satisfaction of victims with the assistance provided (surveys to survivors); number, type and evaluation results (including best practices and lessons learned) of victim assistance programmes (public authorities in Member States).
● Detection and reporting: detect, report and remove all CSAM, known and new, distributed online; increase detection and reporting of grooming. Indicators: number of reports by Member State, source (company, hotline, public), type of online service, and type of CSA online (i.e. number of images and videos, including unique/not unique and known/new, and grooming) (EU Centre); feedback on reports: if no action taken, why; if action taken, the outcome (number of victims identified/rescued, number of offenders convicted, and an anonymised and short description of the case) (public authorities in Member States).

Improve the "how":

Specific objective 2: improve legal certainty, transparency and accountability and ensure protection of fundamental rights.
Operational objective: make clear all relevant aspects of the detection, reporting and removal process by online service providers. Indicators: technologies used, including error rates, measures to limit the error rates and, if the technologies are new, measures taken to comply with written advice of competent authorities (service providers).

Who is responsible for collection:
● EU Centre: annual implementation report to the public and the Commission.
● Commission: report every 5 years (extended version); evaluation every 5 years, using as sources the annual reports from the EU Centre and from providers, among others.
● Service providers: annual reports to supervisory authorities, the EU Centre and the Commission.
Annexes

ANNEX 1: PROCEDURAL INFORMATION..............................................................119
ANNEX 2: STAKEHOLDER CONSULTATION........................................................127
ANNEX 3: WHO IS AFFECTED AND HOW?...........................................................173
ANNEX 4: ANALYTICAL METHODS......................................................................181
ANNEX 5: RELEVANT LEGISLATION AND POLICIES........................................236
ANNEX 6: ADDITIONAL INFORMATION ON THE PROBLEM...........................250
ANNEX 7: SAMPLE CASES OF CHILD SEXUAL ABUSE ONLINE IN THE EU..267
ANNEX 8: TECHNOLOGIES TO DETECT CHILD SEXUAL ABUSE ONLINE....278
ANNEX 9: ENCRYPTION AND THE FIGHT AGAINST CHILD SEXUAL ABUSE..284
ANNEX 10: EU CENTRE TO PREVENT AND COUNTER CHILD SEXUAL ABUSE..315
ANNEX 11: SME TEST...............................................................................................379
ANNEX 1: PROCEDURAL INFORMATION
Lead DG, Decide Planning/CWP references
This Staff Working Paper was prepared by the Directorate-General for Migration and Home Affairs (HOME).
The Decide reference of this initiative is PLAN/2020/8915.
This initiative appears in the 2021 Commission Work Programme under action 35,
'Follow-up to the EU security strategy': Legislation to effectively tackle child sexual abuse online (legislative, incl. impact assessment, Article 114 TFEU, Q2 2021).
Organisation and timing
Organisation
The Security Union Inter-Service Group (ISG), chaired by the Secretariat-General of the Commission, was consulted at all stages of the process to prepare the impact assessment, including the inception impact assessment, consultation strategy, questionnaire for the public consultation and the various drafts of the impact assessment.
The ISG included the following Commission services: DG EMPL (DG Employment, Social Affairs and Inclusion), DG GROW (DG Internal Market, Industry, Entrepreneurship and SMEs), DG RTD (DG Research and Innovation), SJ (Legal Service), DG SANTE (DG Health and Food Safety), DG TRADE, DG CNECT (DG Communications Networks, Content and Technology); DG EAC (DG Education and Culture); DG JUST (DG Justice and Consumers); DG NEAR (DG Neighbourhood and Enlargement Negotiations); ESTAT (Eurostat); DG DEFIS (DG Defence Industry and Space); DIGIT (Informatics); DG ECHO (DG Humanitarian Aid and Civil Protection); DG ENER (DG Energy); DG ENV (DG Environment); DG FISMA (DG Financial Stability, Financial Services and Capital Markets Union); FPI (Service for Foreign Policy Instruments); IDEA (Inspire, Debate, Engage and Accelerate Action); JRC (Joint Research Centre); DG MARE (DG Maritime Affairs and Fisheries); DG MOVE (DG Mobility and Transport); DG TAXUD (DG Taxation and Customs Union); DG REFORM (DG Structural Reform Support); OLAF (European Anti-Fraud Office); DG INTPA (DG International Partnerships); CERT-EU (Computer Emergency Response Team for the EU Institutions, bodies and agencies); DG BUDG (DG Budget) and DG REGIO (DG Regional Policy). It also included the EEAS (European External Action Service).
The last meeting of the ISG, chaired by the Secretariat-General, was held on 17 January 2022.
Timing - chronology of the IA
This initiative was first announced in the July 2020 EU strategy for a more effective
fight against child sexual abuse215, where the Commission notably committed to:
215 EU strategy for a more effective fight against child sexual abuse, COM(2020) 607 final.
119
propose the necessary legislation to tackle child sexual abuse online effectively
including by requiring relevant online services providers to detect child sexual abuse
on their services and to report any such abuse to relevant public authorities; and
work towards the possible creation of a European centre to prevent and counter child
sexual abuse to enable a comprehensive and effective EU response against child
sexual abuse online and offline, based on a thorough study and impact assessment.
The strategy also announced the proposal for the necessary legislation to ensure that
providers of electronic communications services could continue their current voluntary practices to detect child sexual abuse in their systems after December 2020. The Commission proposed this legislation ("the Interim Regulation") in September 2020216, and on 29 April 2021 there was a political agreement between the European Parliament and the Council on the text, which was then adopted by the two institutions in July 2021217.
The present initiative, once adopted, would replace this Interim Regulation, among other purposes.
The Commission published an inception impact assessment218 on 3 December 2020. The feedback period ran until 30 December 2020. A public consultation was launched on 11 February 2021, and stakeholders and citizens had the opportunity to express their views through an online questionnaire until 15 April 2021.
While work on various aspects of the measures considered has been going on for several
years, the drafting of the impact assessment itself started in October 2020 and continued until February 2022, after incorporating the feedback from the Regulatory Scrutiny Board.
Consultation of the Regulatory Scrutiny Board

The Regulatory Scrutiny Board received the draft version of the present impact assessment report on 25 May 2021. It issued an impact assessment quality checklist on 11 June 2021.
The Regulatory Scrutiny Board issued a first negative opinion on 17 June 2021 on the draft impact assessment report. To address the feedback given by the Regulatory Scrutiny Board, the following changes were made in the report and its annexes:
216 Proposal for a Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse of 10 September 2020, COM/2020/568 final.
217 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse, OJ L 274, 30.7.2021, pp. 41-51.
218 Inception Impact Assessment, 3 December 2020.
| Board's comments | How they were incorporated in the report and annexes |
| 1. The internal market dimension and the necessity for EU action in the area of prevention and victim support is not always clear | Changes were made throughout the report, in particular in sections 1, 2 and 3, notably to highlight that the central focus of the legislation is to harmonise rules for online service providers |
| 2. The report does not fully describe all the available policy choices and leaves a number of questions open. It does not discuss in a transparent and balanced manner the alternative implementation forms for a European centre | Addition of a dedicated section (5.2.2.1) discussing the implementation choices for the EU Centre |
| 3. The report does not clearly establish how safeguards will ensure fundamental rights, in particular regarding technologies to detect CSA in encrypted communications | Section 5 in particular was reviewed to detail the safeguards that could apply (see description of options). Section 6 was updated accordingly, including the analysis on fundamental rights |
| 4. The comparison of policy options does not comply with the standard assessment criteria and is not based on a clear and consistent ranking methodology | Section 7 was reviewed to notably include coherence as a comparison criterion, and a revised ranking methodology |
The Regulatory Scrutiny Board issued a second and final positive opinion on the impact assessment report. To address the feedback given by the Regulatory Scrutiny Board, the following changes were made in the report and its annexes:
| Board's comments | How they were incorporated in the report and annexes |
| 1. The role of the EU centre and associated costs are not sufficiently described. The implementation options for the EU centre are not presented in a sufficiently open, complete and balanced manner | Section 5.2.2. was restructured to present and analyse the options in an open, complete and balanced manner. Additional descriptions of the role of the Centre on prevention and assistance to victims added to Section 5.2.1. Additional clarifications on the role of the Centre added in sections 5.2.2., 5.2.3., 5.2.4., and 5.2.5. |
| 2. The report is not sufficiently clear on how the options that include the detection of new child sexual abuse material or grooming would respect the prohibition of general monitoring obligations | Further clarifications added in sections 5.2. and 5.2.3. |
| 3. The efficiency and proportionality of the preferred option is not sufficiently demonstrated | Further clarifications added in section 8.3., in particular in relation to the importance and added value of grooming detection |
| 4. The scope and quantification of the cost savings for the 'one in, one out' purposes are not clear | Clarifications added in section 8.4., in particular in relation to the costs and savings included in the quantification for 'one in, one out' purposes |
Evidence, sources and quality

When drafting the impact assessment report and annexes, particular attention has been given to properly referencing all the sources and reviewing their quality. The calculations of costs and benefits were limited by the lack of data. The Commission made significant efforts to collect data, or at least estimates, from public authorities and service providers through targeted surveys. Where this information was not available, assumptions were made in the model to calculate costs, which were discussed with experts from Member States and service providers.
The evidence base includes in particular:

● external studies prepared at the request of the European Commission:
ICF et al., Study on options for the creation of a European Centre to prevent and counter child sexual abuse, including the use of ICT for creation of a database of hashes of child sexual abuse material and connected data protection issues, 2021.
ICF et al., Study on framework of best practices to tackle child sexual abuse material online, 2020.
ICF, Grimaldi, Overview of the legal framework of notice-and-action procedures in Member States, SMART 2016/0039, 2018.
● selective list of relevant case law:

Court of Justice of the European Union:
C-236/08 to C-238/08, Google France SARL and Google Inc. v Louis Vuitton Malletier SA, ECLI:EU:C:2010:159.
C-380/03.
C-324/09, L'Oréal v eBay, ECLI:EU:C:2011:474.
C-70/10, Scarlet Extended SA v SABAM, ECLI:EU:C:2011:771.
C-360/10, SABAM v Netlog NV, ECLI:EU:C:2012:85.
C-314/12, UPC Telekabel Wien, ECLI:EU:C:2014:192.
C-484/14, McFadden, ECLI:EU:C:2016:689.
C-18/18, Glawischnig-Piesczek v Facebook Ireland, ECLI:EU:C:2019:821.

European Court of Human Rights:
Application no. 2872/02, K.U. v. Finland, judgment of 2 December 2008.
Application no. 5786/08, Söderman v. Sweden, judgment of 12 November 2013.
Application no. 24683/14, RIJ TV A/S against Denmark, decision of 24 May 2018.
Application no. 56867/15, Buturugă against Romania, judgment of 11 February 2020.

Decisions of national courts:
Antwerp Civil Court, A&M, judgment n. 2010/5-6 of 3 December 2009.
OLG Karlsruhe, judgment 6 U 2/15 of 14 December 2016.
Rome Court of Appeal, RTI v TMFT Enterprises LLC, judgment 8437/2016 of 27 April 2016.
Austrian Supreme Court (Oberster Gerichtshof), decision 6 Ob 178/04a of 21 December 2006.
Turin Court of First Instance, Delta TV v Google and YouTube, judgment No 1928, RG 38113/2013 of 7 April 2017.
Se lective Bibliography
Carnegie Endowment for International Peace,Mo villg the Ellc rpton Policy Conversation Forward, Encryption Working Group,S eptember 2019.
De Jong, R., Child S exua1 Abuse and Family Outcomes, Crime Sc ienc, 2
November 201 5一
123
Di Rua, R., Beslay, L., 'Fighting child sexual abuse-Prevention policies for
offenders, Publication ( fice of the EU, 3 0ctober 2018.
Fargo, J., Pathways to Adult S exua1 Revictimization: Direct and Indirect
Behavioural Risk Factors across the Lifespan, Journal of Interpersonal Violence,
160ctober2008.
Fanid, H., Reining in online abuses, Technology and Innovation, Vol.19, p. 593-
599, 2018.
Floridi, L., & Taddeo, M. (2017). The Responsibilities of Online S erice
Providers, 2017.
Gewirtz-Meydan, A., Finkelhor, D.,Se xua1 Abuse and Assault in a Large National S amp1e of Children and Adolescents, Child Maltreatment, 16 Se ptember 2019.
Kuhle, L., et al., Child S exua1 Abuse and the Use of Child Se xua1 Abuse Images, 9 March 2021.
Letourneau, E., The Economic Burden of Child S exua1 Abuse in the United
S ttes, Child Abuse & Neglect, Vol. 79, May 2018.
Madiega, T. (2020). Reform of the EU liability regime for online intermediaries.
Background on the forthcoming Digital Se rvices Act. European Parliamentary Research S ervice, PE 649.404, May 2020.
Martin E,S ilverstone P: How much child sexual abuse is "below the surface".
and can we help adults identify it early, Front Psychiatry, May 2013.
Niemi Pereda et al., 'The prevalence of child sexual abuse in community and
student samples: A meta-analysis', Clinical Psychology Review, Vol. 29, Issue 4
(2009).
Rosetizweig, P. (2020). The Law and Policy of ClientーSide S caming, Lawfare, 20
August 2020.
Ruzicka, A., Assini-Meytin, L., Schaeffer, C., Bradshaw, C., & Letourneau, E., Responsible Behavior with Younger Children: Examining the Feasibility of a Classroom-Based Program to Prevent Child Sexual Abuse Perpetration by Adolescents, Journal of Child Sexual Abuse, 8 February 2021.
Scherrer, A., van Ballegooij, W., Combating sexual abuse of children: Directive 2011/93/EU, European Implementation Assessment, European Parliamentary Research Service, PE 598.614, April 2017.
Schwemer, S.F., On domain registries and unlawful website content, International Journal of Law and Information Technology, Vol. 26, Issue 4, 12 October 2018.
Sluijs, J., et al., Cloud Computing in the EU Policy Sphere, 2012.
Smith, M., Enforcement and cooperation between Member States - E-Commerce and the future Digital Services Act, Study for the IMCO committee, PE 648.780, April 2020.
Stalla-Bourdillon, S., Internet Intermediaries as Responsible Actors? Why It Is Time to Rethink the E-Commerce Directive as Well, in The Responsibilities of Online Service Providers, 2017.
Truyens, M., & van Eecke, P., Liability of Domain Name Registries: Don't Shoot the Messenger, Computer Law & Security Review, Vol. 32, Issue 2, 19 January 2016.
Urban, J., et al., Notice and Takedown in Everyday Practice, UC Berkeley Public Law Research Paper No. 2755628, 22 March 2017.
Van Hoboken, J., et al., Hosting intermediary services and illegal content online: An analysis of the scope of Article 14 ECD in light of developments in the online service landscape, final report prepared for the European Commission, Publications Office of the EU, 29 January 2019.
Wagner, B., Rozgonyi, K., et al., Regulating Transparency? Facebook, Twitter and the German Network Enforcement Act, January 2020.
Wilman, F., The responsibility of online intermediaries for illegal user content in the EU and in the US, 20 November 2020.
Related Impact Assessments
Impact Assessment accompanying the Proposal on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, SWD(2020) 348 final, 15 December 2020.
Impact Assessment accompanying the document Regulation of the European Parliament and of the Council amending Regulation (EU) 2016/794, as regards Europol's cooperation with private parties, the processing of personal data by Europol in support of criminal investigations, and Europol's role on research and innovation, SWD(2020) 543 final, 9 December 2020.
Targeted substitute Impact Assessment on the Commission proposal on the temporary derogation from the e-Privacy Directive for the purpose of fighting online child sexual abuse, European Parliamentary Research Service, PE 662.598, February 2021.
Impact Assessment accompanying the Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online, SWD(2018) 408 final, 12 September 2018.
Impact Assessment accompanying the Proposal for a Regulation of the European Parliament and of the Council on European Production and Preservation Orders for electronic evidence in criminal matters and Proposal for a Directive of the European Parliament and of the Council laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings, SWD(2018) 118 final, 17 April 2018.
Additional external expertise was gathered through the stakeholder consultation, as explained in detail in Annex 2.
ANNEX 2: STAKEHOLDER CONSULTATION
This annex is the synopsis report of all stakeholder consultation activities undertaken in the context of this impact assessment.
1) Consultation strategy
In order to ensure that the general public interest of the EU is properly considered in the Commission's approach to the fight against child sexual abuse, the Commission has consulted as widely as possible. The consultation aimed to enable an evidence-based preparation of the future Commission initiative for a more effective fight against child sexual abuse with the help of stakeholders and had four main objectives:
to identify current best practice, as well as challenges and gaps, and the relevant needs of all stakeholders;
to identify ways forward that would best address those needs;
to ensure that stakeholders (including citizens and those who would be directly affected by this initiative) can provide their views and input on the possible options for the way forward; and
to improve the overall evidence base underpinning the initiative.
To do this, the Commission services identified relevant stakeholders and consulted them throughout the development of the draft proposal. The Commission services sought views from a wide range of subject matter experts, service providers, business associations, national authorities, civil society organisations, and from members of the public on their expectations and concerns relating to the issue of child sexual abuse and possible initiatives to prevent and combat it. These included in particular the responsibilities of relevant online service providers and possible requirements to detect child sexual abuse online and to report that material to public authorities, as well as the possible creation of a European centre to prevent and counter child sexual abuse.
During the consultation process, the Commission services applied a variety of methods and forms of consultation. They included:
the consultation on the Inception Impact Assessment and the Open Public Consultation, which sought views from all interested parties;
targeted stakeholder consultation by way of dedicated questionnaires;
a series of workshops, conferences, expert groups, as well as bilateral meetings;
inviting position papers and analytical papers from organizations, industry representatives, civil society and academia.
Taking into account the technicalities and specificities of the subject, the Commission services focused on targeted consultations, addressing a broad range of stakeholders at national and EU level.
2) The consultation was structured as follows:
1. Who - stakeholders consulted:
citizens;
service providers:
individual companies;
professional and business associations;
public authorities from Member States and relevant non-EU countries:
Ministry of Justice officials;
Ministry of Interior officials;
law enforcement representatives;
legal practitioners (lawyers, prosecutors, judges);
non-governmental organisations (NGOs);
inter-governmental organisations (IGOs);
EU institutions and agencies; and
academia.
2. How - methods and tools used:
Surveys:
Open public consultations:
o Survey, open to feedback from any interested party, from 11 February 2021 to 15 April 2021; included a link to the Commission website on the fight against child sexual abuse219 to provide further information and context.
o Consultation on the Inception Impact Assessment, open to feedback from any interested party from 2 December to 30 December 2020.
Targeted surveys:
o Survey for law enforcement authorities in Member States to collect information regarding the origin, quality and use of reports of child sexual abuse online that law enforcement authorities receive.
o Survey for law enforcement authorities in Member States to collect information regarding the costs associated with reports of child sexual abuse online received by law enforcement authorities (LEAs); how the quality of reports can be improved; and the impact of encryption on investigations.
Meetings:220
Expert group meetings and bilateral meetings organised by the Commission;
Participation in conferences and workshops organised by third parties.
In total, the dedicated consultation activities lasted two years, from February 2020 to January 2022.
The consultation was designed to follow the same logical sequence as the impact assessment, starting with the problem definition and allowing for a gradual development of the possible options and scenarios and their impacts, gradually increasing the number of stakeholders involved.
219 EU strategy for a more effective fight against child sexual abuse, COM(2020) 607 final.
220 For a list of meetings and conferences, please see Section 3 below.
3. What - the consultation gathered feedback on the problem definition, options and impacts of these options, focused on the legislation to tackle child sexual abuse online effectively and the possible creation of a European centre to prevent and counter child sexual abuse. The diversity of perspectives proved valuable in supporting the Commission to ensure that its proposal addresses the needs, and takes account of the concerns, of a wide range of stakeholders. Moreover, it allowed the Commission to gather necessary and indispensable data, facts and views on the relevance, effectiveness, efficiency, coherence and EU added value of the proposal. Taking into consideration the Covid-19 pandemic and the related restrictions and inability to interact with relevant stakeholders in physical settings, the consultation activities focused on applicable alternatives such as online surveys as well as meetings via video conference. The table below summarises the structure of the consultation:
Table 1: Consultation strategy for a more effective fight against child sexual abuse
How (columns): surveys (open public consultation; targeted survey 1; targeted survey 2), meetings (group; bilateral) and conferences.
Who (rows): citizens; service providers; public authorities; practitioners; NGOs; IGOs; EU institutions and agencies; academia.
What: the open public consultation and the meetings and conferences covered the problem definition, options and impacts; targeted survey 1 covered the origin, quality and use of reports; targeted survey 2 covered the costs and quality of reports.
1. Consultation activities - summary of results
The following sections present a summary of the main results of the consultation activities.
Open public consultation
The purpose of the open public consultation was to gather evidence from citizens and stakeholders and it was part of the data collection activities that the related inception
impact assessment announced in December 2020.
In total, 603 responses were submitted by a diverse group of stakeholders. It was addressed to a broad range of interested stakeholders, including public authorities, EU institutions and agencies, international organisations, private companies, professional and business associations, NGOs, academics and the general public.
Most feedback was received from citizens (77.93% from EU citizens, 1.84% from non-EU citizens), NGOs (10.37%), public authorities (3.51%) and companies/business organisations (2.68%). This was followed by others (1.84%), business associations (0.84%), academic/research institutions (0.67%), as well as consumer organisations (0.33%). Additionally, around 45 position papers were received in the context of the open public consultation.
In terms of geographical distribution, most of the respondents are located in the EU, with a majority of contributions coming from Germany (45.15%), Ireland (16.22%), Belgium (4.18%) and Italy (4.18%). Internationally, the highest share of respondents came from the UK (1.84%) and the US (2.51%)221.
Summary
Its results, as far as current practices and identified gaps, legislative solutions and the possible creation of a European centre to prevent and counter child sexual abuse are concerned, can be summarised as follows:
The public consultation revealed broad support for EU action (among all categories of respondents).
More specifically, it revealed strong support for legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations), for future-proof legislation, for effective cooperation between stakeholders and for additional coordination and support at EU level in the fight against child sexual abuse online and offline.
What is the current situation and where are the gaps?
54.01% of the respondents state that the new legislation should aim to enable a swift
takedown of child sexual abuse material after reporting.
221 Countries with ≤ 15 submissions include Austria, Bulgaria, Croatia, Cyprus, Czechia, Finland, France, Greece, Hungary, Kosovo, Luxembourg, Netherlands, Poland, Portugal, Russia, Slovenia, Spain, Sweden, Switzerland, Thailand, Venezuela, Zimbabwe.
The new legislation should further aim to reduce the number of instances of online
grooming of children, based on the feedback provided by 49.67%.
The areas of prevention and assistance to victims of child sexual abuse should be tackled in priority according to 61.54% and 65.05% of respondents, respectively.
Law enforcement reflected on the main challenges they face in their work
investigating child sexual abuse cases.
85.71% raised concerns regarding the increase in child sexual abuse material in the last decade and the lack of resources (i.e. human, technical). This was followed by concerns about the underreporting of child sexual abuse cases and difficulties accessing evidence during investigations linked to the introduction of end-to-end encryption (38.1% and 47.62%). 14.29% referred to gaps in national and/or EU laws as one of the main issues.
NGOs cooperate with law enforcement authorities in the fight against child sexual
abuse, including by forwarding reports of child sexual abuse online received from the
public or from service providers. 74.19% of the respondents see a need for
improvement in the cooperation.
NGOs also cooperate with service providers. Among other things, NGOs advise them on policies to fight child sexual abuse online and they also send notice-and-takedown requests to service providers. However, based on 72.58% of the replies, there is still room for improvement.
9.68% of the NGO respondents consider that current efforts to tackle child sexual abuse online strike an appropriate balance between the rights of victims and the rights of all users (e.g. privacy of communications), while 56.45% considered that the current efforts put too much emphasis on the rights of all users and not enough emphasis on victims' rights.
Legislative solution: what should it include to tackle the above gaps effectively?
If online service providers were to be subject to a legal obligation to detect, remove and report child sexual abuse online in their services, most of the respondents to the public consultation agreed that providers of social media (33.11%), image hosting (29.10%), web hosting (25.75%), message boards (23.75%), video streaming (23.58%) and online gaming (21.40%) services should be subject to such a legal obligation.
In addition, if legislation were to explicitly allow online service providers to take
voluntary measures to detect, remove and report child sexual abuse online in their
services, providers of the following services should be included: social media
(38.96%), image hosting (35.79%), video streaming (30.43%), message boards
(29.10%), online gaming (26.76%). The respondents further reflected on the types of child sexual abuse online that the
possible legislation should cover as well as on the best possible ways to achieve that
as follows:
Known child sexual abuse material (i.e. material previously confirmed as constituting child sexual abuse): mandatory detection and removal 26.92%; mandatory reporting 12.04%; voluntary detection and removal 14.21%; voluntary reporting 7.53%; no need to cover this in the legislation 26.92%.
New (unknown) child sexual abuse material: mandatory detection and removal 20.07%; mandatory reporting 14.55%; voluntary detection and removal 15.22%; voluntary reporting 10.03%; no need to cover this in the legislation 28.26%.
Online grooming: mandatory detection and removal 17.89%; mandatory reporting 17.89%; voluntary detection and removal 14.05%; voluntary reporting 10.20%; no need to cover this in the legislation 27.09%.
Live-streaming of child sexual abuse: mandatory detection and removal 26.09%; mandatory reporting 16.05%; voluntary detection and removal 12.88%; voluntary reporting 7.69%; no need to cover this in the legislation 25.08%.
To be able to detect, remove and report child sexual abuse online, service providers need to carry out a series of actions. The respondents to the public consultation were asked to share their views concerning the proportionality of the following actions, when subject to all necessary safeguards; a brief technical sketch of the first of these actions follows the list:
To check whether images or videos uploaded online (e.g. to a social media platform, or a file hosting service) are copies of known child sexual abuse material: 32.94% / 8.36% / 16.89% / 30.77%.
To assess whether images or videos uploaded online (e.g. to a social media platform, or a file hosting service) constitute new (previously unknown) child sexual abuse material: 37.96% / 13.04% / 15.05% / 22.07%.
To check whether images or videos sent in a private communication are copies of known child sexual abuse material: 60.20% / 6.69% / 6.52% / 14.38%.
To assess whether the images or videos sent in a private communication constitute new child sexual abuse material: 63.38% / 6.02% / 6.86% / 12.21%.
To assess whether the contents of a text-based communication constitute grooming: 54.85% / 9.03% / 9.70% / 13.04%.
To assess, based on data other than content data (e.g. metadata), whether the user may be abusing the online service for the purpose of child sexual abuse: 50.33% / 8.86% / 11.54% / 14.55%.
The actions to detect, remove and report child sexual abuse online may require safeguards to ensure the respect of fundamental rights of all users, prevent abuses, and ensure proportionality. According to the submitted replies, the legislation should put in place safeguards to ensure the following (a sketch of the human-review safeguard follows the table):
Safeguards to ensure the respect of fundamental rights of all users, prevent abuses, and ensure proportionality (partially disagree / disagree / partially agree / fully agree):
The tools used to detect, report and remove child sexual abuse online reduce the error rate to the maximum extent possible: 13.04% / 4.18% / 12.21% / 41.30%.
The tools used to detect, report and remove child sexual abuse online are the least privacy intrusive: 13.04% / 1.67% / 9.20% / 49.50%.
The tools used to detect, report and remove child sexual abuse online comply with the data minimisation principle and rely on anonymised data, where this is possible: 12.71% / 2.51% / 8.36% / 48.16%.
The tools used to detect, report and remove child sexual abuse online comply with the purpose limitation principle, and use the data exclusively for the purpose of detecting, reporting and removing child sexual abuse online: 11.20% / 1.17% / 4.85% / 54.52%.
The tools used to detect, report and remove child sexual abuse online comply with the storage limitation principle, and delete personal data as soon as the purpose is fulfilled: 10.70% / 1.84% / 7.86% / 51.67%.
The online service provider conducts a data protection impact assessment and consults the supervisory authority, if necessary: 11.87% / 3.85% / 10.37% / 38.13%.
Online service providers are subject to the oversight of a supervisory body to assess their compliance with legal requirements: 16.22% / 5.18% / 10.70% / 36.12%.
Reports containing new material or grooming are systematically subject to human review before the reports are sent to law enforcement or organisations acting in the public interest against child sexual abuse: 11.20% / 6.19% / 13.71% / 38.13%.
All reports (including those containing only previously known child sexual abuse material) are systematically subject to human review before the reports are sent to law enforcement or organisations acting in the public interest against child sexual abuse: 13.55% / 8.53% / 14.88% / 32.61%.
A clear complaint mechanism is available to users: 6.19% / 1.00% / 5.69% / 61.37%.
Effective remedies should be available to users that have been erroneously affected by the actions of the service provider to detect, report and remove child sexual abuse online: 4.85% / 1.00% / 4.68% / 62.37%.
Providers should make clear in the Terms and Conditions that they are taking measures to detect, report and remove child sexual abuse online: 5.02% / 1.51% / 5.18% / 60.87%.
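One of the most supported safeguards above, systematic human review of reports containing new material or grooming before they are forwarded, can be pictured as a gate in the reporting pipeline. The following is a minimal sketch under that assumption, with hypothetical names, not a prescribed design:

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Report:
    report_id: str
    detection_type: str  # "known_material", "new_material" or "grooming"
    human_reviewed: bool = False

# Reports held back for a human moderator before any onward transmission.
review_queue: "Queue[Report]" = Queue()

def forward_to_authorities(report: Report) -> None:
    # Placeholder for transmission to law enforcement or an organisation
    # acting in the public interest against child sexual abuse.
    print(f"forwarding report {report.report_id}")

def route_report(report: Report) -> None:
    # Known-material matches may be forwarded directly; reports produced
    # by classifiers (new material, grooming) wait for human review.
    if report.detection_type in {"new_material", "grooming"} and not report.human_reviewed:
        review_queue.put(report)
    else:
        forward_to_authorities(report)
```

The design choice the consultation points at is simply where the human sits in the pipeline: before transmission for the error-prone classifier outputs, and optionally for hash matches as well.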
In the context of possible future legislation allowing/obliging relevant online service providers to detect, report and remove child sexual abuse online in their services, 39.97% of the respondents believe that companies should be subject to financial sanctions if they fail to meet the legal obligations (including safeguards) related to the detection, reporting and removal of child sexual abuse online, while 27.09% opposed this.
Concerning criminal sanctions, opinions were almost equally divided between those in favour of such a measure (35.96%) and those against (30.43%). It is further noted that there is no difference between the percentage of respondents who would agree (32.61%) and of those who would not (32.61%) that companies that erroneously detect, remove or report child sexual abuse online in good faith should not be subject to the relevant sanctions.
41.64% of the respondents participating in the survey stressed that there should be no sanctions for failure to meet the legal obligations (including safeguards) related to the detection, reporting and removal of child sexual abuse online. At the same time, 22.57% of the replies were in favour of such a measure.
Transparency reports could refer to periodic reports by service providers on the measures they take to detect, report and remove child sexual abuse online. These transparency reports should be:
Obligatory, to ensure transparency and accountability: 46.15% / 17.39%.
Voluntary, as an obligation would incur an additional burden on the online service providers, especially when they are small and medium enterprises: 25.92% / 31.77%.
Evaluated by an independent entity: 47.99% / 11.37%.
Standardised, to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online: 50.17% / 11.54%.
In addition, transparency reports should include the following information (a sketch of a possible standardised record follows the list):
Number of reports of instances of child sexual abuse online reported, by type of service: 48.49%.
Number of child sexual abuse material images and videos reported, by type of service: 44.98%.
Time required to take down child sexual abuse material after it has been flagged to/by the service provider: 44.31% (n=265).
Types of data processed to detect, report and remove child sexual abuse online: 47.66% (n=285).
Legal basis for the processing to detect, report and remove child sexual abuse online: 46.66% (n=279).
Whether data are shared with any third party and on which legal basis: 53.01% (n=317).
Number of complaints made by users through the available mechanisms and the outcome of those proceedings: 48.66% (n=291).
Number and ratio of false positives (an online event is mistakenly flagged as child sexual abuse online) of the different technologies used: 53.34% (n=319).
Measures applied to remove online child sexual abuse material in line with the online service provider's policy (e.g. number of accounts blocked): 46.15% (n=276).
Policies on retention of data processed for the detecting, reporting and removal of child sexual abuse online and data protection safeguards applied: 49.33% (n=295).
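A standardised transparency report, the option favoured by 50.17% of respondents above, implies a common machine-readable structure. The record below is a hypothetical sketch assembled from the items listed above; all field names and types are assumptions for illustration, not a schema from the proposal:

```python
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    # Hypothetical standardised record mirroring the items listed above.
    service_type: str
    reports_filed: int                 # instances of CSA online reported
    items_reported: int                # CSAM images and videos reported
    median_takedown_hours: float       # time to take down after flagging
    data_types_processed: list[str] = field(default_factory=list)
    legal_basis: str = ""
    third_party_recipients: list[str] = field(default_factory=list)
    user_complaints: int = 0
    total_flagged: int = 0
    false_positives: int = 0           # events mistakenly flagged as CSA online
    accounts_blocked: int = 0          # example of a removal measure applied
    retention_policy: str = ""

    @property
    def false_positive_ratio(self) -> float:
        # The "number and ratio of false positives" item, expressed as
        # the share of all flagged events that were mistaken.
        return self.false_positives / self.total_flagged if self.total_flagged else 0.0
```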
To measure the success of the possible legislation, a series of performance indicators should be monitored. In particular:
Number of reports of child sexual abuse online reported by company and type of service (33.78%);
Number of child sexual abuse material images and videos reported by company and type of service (32.78%);
Time required to take down child sexual abuse material after it has been flagged to/by the service provider (34.78%);
Number of children identified and rescued as a result of a report, by company and type of service (44.31%);
Number of perpetrators investigated and prosecuted as a result of a report, by company and type of service (44.31%);
Number of related user complaints as a result of a report, by company and type of service (33.28%).
Views were particularly divided over (i) the legal obligation of online service providers that offer their services within the EU, even when the providers themselves are located outside the EU, and (ii) the legal obligation of online service providers who offer encrypted services to detect, remove and report child sexual abuse online in their services.
Possible European centre to prevent and counter child sexual abuse
44.65% of the respondents see a need for additional coordination and support at EU level in the fight against child sexual abuse online and/or offline to maximize the efficient use of resources and avoid duplication of efforts.
This could help to address existing challenges related to law enforcement action
(up to 30% of the replies), preventive measures (up to 45%) as well as in the field
of assistance to victims (up to 41%).
Concerning relevant functions to support law enforcement action in the fight against child sexual abuse in the EU, survey respondents supported that a possible Centre could:
o Receive reports in relation to child sexual abuse to ensure the relevance of such reports, determine jurisdiction(s), and forward them to law enforcement for action (45.82%);
o Maintain a single EU database of known child sexual abuse material to facilitate its detection in companies' systems (39.96%);
o Coordinate and facilitate the takedown of child sexual abuse material identified through hotlines (43.98%);
o Monitor the takedown of child sexual abuse material by different stakeholders (38.96%).
In order to ensure transparency and accountability regarding actions of service providers to detect, report and remove child sexual abuse online in their services, the EU Centre should:
o Ensure that the tools employed are not misused for purposes other than the fight against child sexual abuse (59.53%);
o Ensure that the tools employed are sufficiently accurate (55.69%);
o Ensure that online service providers implement robust technical and procedural safeguards (44.15%);
o Draft model codes of conduct for service providers' measures to detect, report and remove child sexual abuse online (37.46%);
o Sanction service providers whose measures to detect, report and remove child sexual abuse online, including associated technical and procedural safeguards, do not meet legal requirements (30.6%);
o Receive complaints from users who feel that their content was mistakenly removed by a service provider (50%);
o Publish aggregated statistics regarding the number and types of reports of child sexual abuse online received (46.49%).
The EU centre would support prevention efforts in the fight against child sexual abuse in the EU:
o Support Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU (51%);
o Serve as a hub for connecting, developing and disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers (54.85%);
o Help develop state-of-the-art research and knowledge, including better prevention-related data (51.17%);
o Provide input to policy makers at national and EU level on prevention gaps and possible solutions to address them (49%).
In addition, the respondents reflected on the possible functions of the Centre which would be relevant to support efforts to assist victims of child sexual abuse in the EU:
o Support implementation of EU law in relation to assistance to child victims of sexual abuse (56.35%);
o Support the exchange of best practices on protection measures for victims (58.03%);
o Carry out research and serve as a hub of expertise on assistance to victims of child sexual abuse (56.59%);
o Support evidence-based policy on assistance and support to victims (58.03%);
o Support victims in removing their images and videos to safeguard their privacy (57.36%);
o Ensure that the perspective of victims is taken into account in policymaking at EU and national level (54.18%).
With regards to the most appropriate type of organisation for the possible Centre, 34.78% of the respondents would welcome the creation of an EU body. Smaller percentages identified public-private partnerships (5.18%) and non-for-profit organisations (20.90%) as the most appropriate types of organisation for the possible Centre.
More than half of the respondents (53.51%) consider that the possible Centre should be funded directly from the Union budget, while almost 1 in 5 support the idea of mandatory levies on industry (18.73%) or voluntary contributions from industry (19.90%), and 22.74% see contributions from non-for-profit organisations as the most appropriate type of funding.
Problem description [current gaps and possible outcomes]
The majority of the public survey respondents, all categories included, acknowledged the online grooming of children as the most concerning type of child sexual abuse online which needs to be tackled in priority.
Public authorities
Practitioners from law enforcement and other public authorities stressed that the new legislation should reduce the number of instances of online grooming of children and enable a swift takedown of child sexual abuse material after reporting222. The respondents further expect the initiative to reduce the amount of unknown child sexual abuse material distributed in the open web223 or via messaging applications224, as well as to reduce the amount of self-generated sexual material by children distributed online225. According to 52.38%, the new legislation should aim to ensure that child sexual abuse material stays down (i.e. that it is not redistributed online); in addition, 71.43% of the respondents highlighted the need to improve prevention as one of the main goals of the new legislation. It should further provide legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations)226, and be future-proof227. The new legislation could also serve to improve transparency and accountability of the measures to fight against child sexual abuse online (23.81% of the respondents). Practitioners furthermore expressed concerns regarding the increased volume of child sexual abuse material detected online in the last decade and the insufficient human and technical resources to deal with it228.
Companies
Online grooming is perceived as a challenge and should be tackled in priority according to 56.25% of the public survey respondents representing companies, who further identified the need to enable swift takedown of child sexual abuse material after reporting229. They further stressed that the new legislation should provide legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations)230, as well as ensure that legislation is future-proof. Improving prevention and assistance to victims of child sexual abuse was also identified as a key concern. 18.75% stressed the need to enable a swift start and development of investigations, while 25% flagged that it should also ensure a victim-centric approach in investigations, taking the best interests of the child as a primary consideration.
Non-governmental organisations
More than half of the respondents from non-governmental organisations stated that the current efforts to tackle child sexual abuse online place too much emphasis on the rights of all users and not enough emphasis on victims' rights231. 4.84% believe that the current efforts do not place enough emphasis on the rights of the users.
In their view, the new legislation should aim to reduce the number of instances of online
grooming and to enable a swift takedown of child sexual abuse material after
222 80.95% (n=17) of the respondents from law enforcement or other public authorities.
223 71.43% (n=15) of the respondents from law enforcement or other public authorities.
224 71.43% (n=15) of the respondents from law enforcement or other public authorities.
225 66.67% (n=14) of respondents from law enforcement or other public authorities.
226 61.9% (n=13) of the respondents from law enforcement or other public authorities.
227 76.19% (n=16) of the respondents from law enforcement or other public authorities.
228 85.71% (n=18) of the respondents from law enforcement or other public authorities.
229 43.75% (n=7) of the respondents from companies.
230 56.25% (n=9) of the respondents from companies or business organisations.
231 56.45% (n=35) of the respondents from non-governmental organisations.
reporting232, while ensuring that child sexual abuse material stays down (i.e. that it is not redistributed online) and reducing the amount of new child sexual abuse material uploaded in the open web233. It should further provide legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations)234 and improve transparency and accountability of the measures to fight against child sexual abuse online235. Legislation should not overlook the importance of prevention and assistance to victims.
General public
Nearly half of the individuals participating in the survey flagged online grooming of children as the most concerning type of child sexual abuse online, which needed to be tackled as a matter of priority.236 The distribution of known and new child sexual abuse material by uploading it to the open web (e.g. posting it in social media or other websites,
uploading it to image lockers, etc.)237, and the distribution of new child sexual abuse material via darknets238 were next on their list.
Among the possible outcomes that the new legislation should aim to achieve, the general public referred to the need to enable swift takedown of child sexual abuse material after reporting239 and to reduce the number of instances of online grooming of children240. The new legislation should further aim to reduce the amount of sexual material self-generated by children distributed online (23.27%). Two thirds of the respondents stated that the new legislation should aim to improve assistance to victims of child sexual abuse, while close to half flagged the need for a victim-centric approach in investigations, taking the best interests of the child as a primary consideration. Prevention efforts should further be improved241.
Cooperation between stakeholders
Public authorities referred to the inefficiencies (such as lack of resources) in public- private cooperation between service providers and public authorities as one of the main
challenges while investigating child sexual abuse cases242. 33.33% of the respondents further expressed concerns regarding the lack of uniform reporting procedures, resulting in variable quality of reports from service providers.
Almost 50% of the civil society organisations taking part in the survey reported that their organisations cooperate with law enforcement authorities by forwarding reports of child sexual abuse online received from the public243. 13 out of 62 forward reports from service providers to law enforcement authorities, while some of them provide technology or hash lists for the detection of child sexual abuse online (7 and 4 out of 62, respectively). They also cooperate with service providers in the fight against child sexual
232 77.42% (n=48) of the respondents from non-governmental organisations.
233 67.74% (n=42) of the respondents from non-governmental organisations.
234 74.19% (n=46) of the respondents from non-governmental organisations.
235 70.97% (n=44) of the respondents from non-governmental organisations.
236 48.43% (n=231) of the general public.
237 32.91% (n=157) of the general public.
238 33.12% (n=158) of the general public.
239 49.69% (n=237) of the general public.
240 45.49% (n=217) of the general public.
241 58.91% (n=291) of the general public.
242 19.05% (n=4) of the respondents from law enforcement or other public authorities.
243 51.61% (n=32) of the respondents from non-governmental organisations.
abuse online by advising them on policies to fight child sexual abuse online244, and by sending notice-and-takedown requests to service providers245. However, they saw room for improvement in the area of cooperation in the fight against child sexual abuse both between civil society organisations and law enforcement authorities246 and between civil society organisations and service providers247.
Legislative solutions
Voluntary measures
More than 75% of public authorities stated that social media, online gaming and video
streaming should fall within the scope of legislation on voluntary measures to detect, remove and report child sexual abuse online.
50% of the participants representing companies were in favour of voluntary measures to
detect, remove and report child sexual abuse online in social media, instant messaging, text-based chat (other than instant messaging) and message boards, among others.
Concerning voluntary detection, removal and reporting of known and new (unknown) material, 25% of the replies to the open public consultation questionnaire suggested that these measures should be covered by the possible legislation. Online grooming and live-streaming of child sexual abuse should also be covered by rules on voluntary measures248.
More than 55% of the representatives from non-governmental organisations suggested that social media, online gaming, web and image hosting providers should be included in legislation which would explicitly allow voluntary detection, removal and reporting of child sexual abuse online. A smaller percentage (6.45%) supported that no service provider should be legally enabled to take such voluntary measures. Some respondents called for legislation which would cover not only the voluntary detection and removal of known and new (unknown) child sexual abuse material but also voluntary measures to detect and remove online grooming and live-streaming of child sexual abuse.
Over 50% of the respondents from the general public stated that no service provider should be legally enabled to take voluntary measures to detect, remove and report child sexual abuse. Around 1 in 6 (15%) individuals suggested that the possible legislation should cover the voluntary detection and removal of known and new (unknown) child sexual abuse material, online grooming and live-streaming of child sexual abuse. With regards to voluntary reporting of all types of child sexual abuse online, around 1 in 10 (10%) of the respondents believe that it needs to be covered by the new legislation.
Mandatory detection and removal of known and unknown child sexual abuse material
Law enforcement and other public authorities, non-governmental organisations, academic249 and research institutions as well as other entities agreed that the new legislation should impose mandatory detection and removal of known and new (unknown) material, online grooming and live-streaming of child sexual abuse. One third of the
244 43.55% (n=27) of the respondents from non-governmental organisations.
245 30.65% (n=19) of the respondents from non-governmental organisations.
246 74.19% (n=46) of the respondents from non-governmental organisations.
247 72.58% (n=45) of the respondents from non-governmental organisations.
248 12.5% (n=2) in favour of voluntary detection and removal, and 12.5% (n=2) in favour of voluntary reporting.
249 100% (n=4) of the respondents from academic and research institutions.
replies coming from companies suggested the mandatory reporting of different types of
child sexual abuse250.
Public authorities
The majority of law enforcement and other public authorities considered that social media251, online gaming, video streaming, and instant messaging252 should be subject to obligatory detection, removal and reporting of known child sexual abuse material253. More than half of the respondents (57%) thought mandatory detection and removal should also extend to new (unknown) child sexual abuse material and live-streaming.
Companies
While some companies considered that mandatory detection, removal and reporting should encompass known254 and unknown child sexual abuse material as well as online
grooming255, a majority disagreed. 31.25% of respondents suggested that no service
provider should be subject to a legal obligation to detect, remove and report child sexual abuse online. They were particularly concerned about the costs for small businesses.
Business associations, whose input has to be treated with particular caution given the
very small sample size, overall identified a need for legal certainty for all stakeholders involved in the fight against child sexual abuse online (e.g. service providers, law enforcement and child protection organisations)256. Two of three respondents thought that service providers should not be subject to a legal obligation to detect, remove and report child sexual abuse online. They proposed a more flexible reporting scheme for small and medium-sized enterprises and law enforcement authorities, always with respect to
privacy efforts and principles.
Non-governmental organisations
The majority of non-governmental organisation representatives suggested that online service providers should be subject to a legal obligation to perform those actions in their services, with a particular focus on social media257, online gaming and video streaming258, among others. On the other hand, 12.9% stressed that no service provider should be subject to such a legal obligation. More than 50% of the respondents side with some other respondents in giving priority to mandatory detection and removal of known material259, highlighting the importance of mandatory detection and removal of new (unknown) material260 and live-streaming of child sexual abuse261.
General public
250 31.25% (n=5) of the respondents from companies and business organisations.
251 95.24% (n=20) of respondents from law enforcement or other public authorities.
252 80.95% (n=17) of the respondents from law enforcement or other public authorities.
253 71.43% (n=15) of the respondents from law enforcement or other public authorities.
254 25% (n=4) of the respondents from companies.
255 31.25% (n=5) of the respondents from companies.
256 60% (n=3) of the respondents from business associations.
257 70.97% (n=44) of respondents from non-governmental organisations.
258 64.52% (n=40) of the respondents from non-governmental organisations.
259 59.68% (n=37) of the respondents from non-governmental organisations.
260 50% (n=31) of the respondents from non-governmental organisations.
261 53.23% (n=33) of the respondents from non-governmental organisations.
The majority of the individuals participating in the open public consultation argued that no service provider should be subject to such a legal obligation262. They also underlined that the legislation should not include the mandatory or voluntary detection, removal and reporting of any of the proposed types of child sexual abuse (known material, unknown material, online grooming, live-streaming).
Service providers located outside the EU
It was acknowledged that new legislation should apply to service providers that offer services within the EU, even when the providers themselves are located outside the EU. The idea has been widely accepted by public authorities263, companies264 and civil society organisations265. On the other hand, more than 50% of the general public opposed the idea of legislation which would be applicable to service providers that offer services within the EU, when the providers themselves are located outside the EU266.
Encrypted environments
Opinions are divided on the question of whether online service providers who offer
encrypted services should be obliged to detect, remove and report child sexual abuse online in their services. A large majority of the respondents representing public authorities267 would support it, as would a majority of the respondents representing NGOs268. They highlighted the importance of ensuring that any action of detection, removal and reporting should be in line with applicable human rights and privacy laws.
47.62% of the respondents from public authorities identified the introduction of end-to-end encryption as a challenge in their investigative work, because it results in difficulties in accessing evidence of child sexual abuse. 80.95% also considered that relevant online service providers who offer encrypted services should be obliged to maintain a technical
capability to proactively detect, remove and report child sexual abuse online in their services and platforms.
However, other stakeholders, such as civil society organisations dealing with privacy and
digital rights, consumer organisations, telecommunication operators, and technology companies, raised concerns, flagging the need to preserve the balance between privacy and security; fundamental rights must be preserved, especially the right to privacy and digital privacy of correspondence. Privacy and digital rights organisations also underlined the need to preserve strong encryption.
Like other groups, business associations and individuals expressed their concerns in relation to privacy of communications. According to business associations, new legislation should put in place safeguards to limit the monitoring of correspondence to known suspects and require judicial authorisation, rather than legally mandate it as the default position of online service providers. Business associations also expressed concerns about the potential harm to marginalized groups and urged the need for effective encryption to ensure the online safety of groups at risk (including children, members of the LGBTQ+ community, and survivors of domestic abuse).
262 62.68% (n=299) of the individuals.
263 95.24% (n=20) of the respondents from law enforcement or other public authorities.
264 62.5% (n=10) of the respondents from companies and business organisations.
265 80.65% (n=50) of respondents from non-governmental organisations.
266 55.65% (n=265) disagree, and 38.36% (n=183) agree.
267 95.24% (n=20) of the respondents from law enforcement or other public authorities.
268 69.35% (n=43) of the respondents from non-governmental organisations.
Service providers and the digital technology industry highlighted the need to distinguish services which host and serve public, user-generated content from private messaging services, and warned not to undermine, prohibit or weaken end-to-end encryption. The new legislation should take into account the key role of encryption in providing and ensuring private and secure communications to users, including children, and its integrity should be safeguarded and not weakened.
Individuals stressed that service providers should not be obliged to enforce such measures (detection, removal, reporting) in encrypted services269. Searching encrypted communications, in their view, would require adding backdoors to encryption technology and thus threaten to weaken the security of communications in general, which many citizens, businesses and governments rely on.
Safeguards
The actions to detect, remove and report child sexual abuse online may require safeguards to ensure the respect of fundamental rights of all users, prevent abuses, and ensure proportionality.
Public authorities
Public authorities agreed that the legislation should put into place safeguards to ensure the respect of fundamental rights of all users, prevent abuses and ensure proportionality. In particular, the tools used to detect, report and remove child sexual abuse online needed to comply with the data minimization principle and rely on anonymised data where this is possible270. The tools should further comply with the purpose limitation principle, and use the data exclusively for the purpose of detecting, reporting and removing child sexual abuse online271. Some respondents warned as to the challenges relating to the data retention period and the legislative compliance assessment of online service providers.
Companies
About half of company respondents also highlighted that the tools used to detect, report and remove child sexual abuse online should be the least privacy intrusive, comply with the data minimization principle and rely on anonymised data where possible272. Close to half stated that the new legislation should also include safeguards to ensure that reports containing new material or grooming are systematically subject to human review before the reports are sent to law enforcement or organisations acting in the public interest
against child sexual abuse273. Data should be used exclusively for the purpose of
detecting, reporting and removing child sexual abuse online and the tools used should
comply with the storage limitation principle.
Non-governmental organisations
269 89.73% (n=428) of the respondents from the general public.
270 57.14% (n=12) fully agree and 9.52% (n=2) partially agree, of the respondents from law enforcement or other public authorities.
271 76.19% (n=16) of the respondents from law enforcement or other public authorities.
272 37.5% (n=6) fully agree and 12.5% (n=2) partially agree, of the respondents from companies.
273 31.25% (n=5) fully agree and 12.5% (n=2) partially agree, of the respondents from companies.
Service providers' actions to detect, remove and report child sexual abuse online need to
be proportionate and subject to safeguards, according to NGO respondents. Most of the respondents agreed on the need for a clear complaint mechanism for users274. A significant majority stressed that effective remedies should be provided to users275 that have been erroneously affected by the actions of the service provider to detect, report and remove child sexual abuse online. Furthermore, most deemed it essential that service
providers would make clear in the Terms and Conditions that they are taking measures to
detect, report and remove child sexual abuse online276.
General public
Concerning safeguards, more than half of individual respondents flagged the need to ensure the availability of a clear complaint mechanism277 and effective remedies278 for users that have been erroneously affected. Slightly more than half also thought it was important that providers made clear in the Terms and Conditions that they are taking measures to detect, report and remove child sexual abuse online279, as well as to ensure that the tools used to detect, report and remove child sexual abuse online are the least privacy intrusive280.
Sanctions
The majority of the respondents from law enforcement and other public authorities281 and from non-governmental organisations282 would support both financial and criminal sanctions if companies have been found not to meet their legal obligations related to the detection, reporting and removal of child sexual abuse. However, 4.84% of the respondents from NGOs partially disagree with imposing financial sanctions, while 9.67% would further disagree with imposing criminal sanctions on online service providers283.
50% of the respondents from companies and 60% from business associations stated that online service providers that erroneously detect, report or remove child sexual abuse online in good faith should not be subject to financial or criminal sanctions. 60% of the respondents from business associations disagree with imposing criminal sanctions on companies if they fail to meet the legal obligations related to detection, reporting and removal of child sexual abuse online. Detection and removal, in their view, were best placed as part of voluntary requirements to encourage innovation to further develop and deploy technology in this area, while it was also seen as crucial to support national law enforcement authorities responsible for pursuing and prosecuting crimes related to CSAM.
274 83.87% (n=52) of the respondents from non-governmental organisations.
275 75.81% (n=47) of the respondents from non-governmental organisations.
276 82.26% (n=51) of the respondents from non-governmental organisations.
277 59.54% (n=284) of the respondents from the general public.
278 61.64% (n=294) of the respondents from the general public.
279 57.23% (n=273) of the respondents from the general public.
280 51.78% (n=247) of the respondents from the general public.
281 33.33% (n=7) fully agree and 52.38% (n=11) partially agree on criminal sanctions; 80.95% (n=17) fully agree and 14.29% (n=3) partially agree on financial sanctions. At the same time, 9.52% (n=2) would partially disagree with such measures.
282 38.71% (n=24) fully agree and 22.58% (n=14) partially agree on criminal sanctions; 54.84% (n=34) fully agree and 16.13% (n=10) partially agree on financial sanctions.
283 8.06% (n=5) partially disagree and 1.61% (n=1) fully disagree with imposing criminal sanctions.
General public
Around 26% of the respondents suggested that companies should not be subject to any financial or criminal sanctions284, while 19.92% and 15.72% believe that companies should be subject to financial and criminal sanctions, respectively.
Transparency reports and performance indicators
Three quarters of public authorities and non-governmental organisations underlined that transparency reports should be obligatory285,286 and standardized287,288 in order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online.
Public authorities
More than 80% of law enforcement and other public authorities expect transparency reports to include information on the number of reports of instances of child sexual abuse online reported, by type of service289. They also highlighted that reports, as well as the number of perpetrators investigated and prosecuted as a result of a report, by company and type of service, should be taken into account in assessing the success of the possible legislation. The number and ratio of false positives (an online event is mistakenly flagged as child sexual abuse online) of the different technologies used should also be included, based on 38% of the replies.
Companies and business associations
Close to half of respondents thought that transparency reports should include information on whether data are shared with any third party and on which legal basis, as well as information related to the policies on retention of data processed for the detecting, reporting and removal of child sexual abuse online and the data protection safeguards applied290. The number and ratio of false positives (an online event is mistakenly flagged as child sexual abuse online) of the different technologies used should also be taken into account291. The size of each organisation and enterprise should be taken into account to ensure that they have the necessary infrastructure in place to respond to any regulatory and/or supervisory requirements.
Non-governmental organisations
82.26% of the replies coming from non-governmental organisations flagged that reports should include information about the time required to take down child sexual abuse material after it has been flagged to/by the service provider, while the measures applied to remove online child sexual abuse material in line with the online service provider's policy (e.g. number of accounts blocked) were identified as an important element of a transparency report by 80.65% of the respondents.
284 25.79% (n=123) fully disagree (on financial sanctions) and 26.62% (n=127) fully disagree (on criminal sanctions), of the respondents from the general public.
285 76.19% (n=16) of the respondents from law enforcement or other public authorities.
286 75.81% (n=47) of the respondents from non-governmental organisations.
287 80.95% (n=17) of the respondents from law enforcement or other public authorities.
288 74.19% (n=46) of the respondents from non-governmental organisations.
289 85.71% (n=18) of the respondents from law enforcement or other public authorities.
290 43.75% (n=7) of the respondents from companies and business organisations.
291 43.75% (n=7) of the respondents from companies and business organisations.
General public
According to individuals, the success of the possible legislation should be monitored based on the number of victims identified and rescued292 and the number of perpetrators investigated and prosecuted as a result of a report293, by company and type of service.
Academia
75% of academic and research institutions supported the idea of transparency reports which would be obligatory and evaluated by an independent entity. They further stated294 that these reports need to be standardized in order to provide uniform quantitative and qualitative information to improve the understanding of the effectiveness of the technologies used as well as the scale of child sexual abuse online.
European centre to prevent and counter child sexual abuse
There is broad consensus among all respondents on the need for additional coordination and support at EU level in the fight against child sexual abuse online and offline. Stakeholders further emphasized the need to avoid duplication of efforts.
In the area of prevention, overall, respondents supported an EU initiative to create an EU Centre to stimulate the exchange of best practices and research and cooperate with non-
governmental organizations, law enforcement authorities, educational institutions and
academia, and experts, with a view of facilitating the coordination of actions undertaken
by competent authorities and relevant stakeholders.
The majority of the respondents, all categories included, reflected that a possible EU Centre would serve to support Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU295.
Public authorities
Law enforcement and other public authorities confirmed almost unanimously the need for additional coordination and support at EU level in the fight against child sexual abuse online and offline296, to maximize efficiency and avoid duplication. A coordinated
response at EU level (and beyond) could deal with challenges related to law enforcement,
prevention and assistance to victims.
Among the most widely supported functions of the EU Centre, to support law
enforcement, respondents acknowledged the need to maintain a single EU database of known child sexual abuse material to facilitate its detection in companies' systems297. The EU Centre would further help ensure the relevance of the received reports, determine
jurisdiction(s), and forward them to law enforcement for action298. In addition, the EU Centre would support law enforcement authorities to coordinate and facilitate the take
292 41.93% (n=200) of the general public.
293 41.51% (n=198) of the general public.
294 100% (n=4) of the respondents from academic and research institutions.
295 85.71% (n=18) from public authorities; 37.5% (n=6) from companies; 83.87% (n=52) of the respondents from non-governmental organisations; 40% (n=2) from business associations; 37.53% (n=179) from the general public; and 100% (n=4) from academic and research institutions.
296 85.71% (n=18) of the law enforcement authorities or public authorities.
297 76.19% (n=16) of the respondents from law enforcement or other public authorities.
298 66.67% (n=14) of the respondents from law enforcement or other public authorities.
down of child sexual abuse material identified through hotlines299. Regarding the implementation of robust technical and procedural safeguards, respondents flagged that it is critical in order to ensure transparency and accountability as regards the actions of service providers300. Coordinated actions on a global level, law enforcement cooperation, and exchange of best practices, as well as proper resource distribution and support, were noted as key actions to stop the cycle of abuse.
Practitioners from law enforcement or other public authorities301 acknowledged the key role of the implementation of EU law in relation to assistance to victims of sexual abuse while highlighting the importance of cooperation with different stakeholders in the area of victim protection, assistance and support302. Identification of possible legislative gaps, research, victims' participation, awareness-raising campaigns, proper education and training were further listed amongst the suggested measures and good practices. A majority of the respondents would welcome the creation of an EU body303. 4.76% identified public-private partnerships and not-for-profit organisations as the most appropriate types of organisation for the possible Centre. The Centre should be funded directly from the Union budget (90.48% of the replies) or receive funding from voluntary contributions from industry or not-for-profit organisations (28.57% and 23.81% of the replies, respectively).
Companies
37.5% of the survey participants representing companies and business organisations confirmed the need for additional coordination and support at EU level in the fight against child sexual abuse online and offline, to maximize the efficient use of resources and to avoid duplication of efforts. Companies and business organisations representatives reflected that the Centre should serve as a hub for connecting, developing and
disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers304, to support prevention efforts.
Furthermore, the role of the Centre would be relevant to support efforts to assist victims of child sexual abuse. The Centre could further support the exchange of best practices on
protection measures for victims and further support victims in removing their images and videos to safeguard their privacy. At the same time, it is crucial to ensure that the
perspective of victims is taken into account in policymaking at EU and national level.
Like other groups, most of the respondents305 considered that the possible Centre should be funded directly from the Union budget, while 18.75% support voluntary contributions from industry or not-for-profit organisations as the most appropriate type of funding.
The idea of the creation of an EU Centre to prevent and counter child sexual abuse found broad support from business associations. The EU Centre can play a key role in the fight against child sexual abuse and exploitation if designed to complement and build
299 61.9% (n=13) of the respondents from law enforcement or other public authorities.
300 57.14% (n=12) of the respondents from law enforcement or other public authorities.
301 80.95% (n=17) of the law enforcement authorities or other public authorities.
302 Civil society organisations, non-governmental organisations, child protection associations and victim protection institutions, law enforcement authorities, lawyers, doctors, experts and academia.
303 76.19% (n=16) of the law enforcement authorities or other public authorities.
304 37.5% (n=6) of the respondents from companies.
305 56.25% (n=9) of the respondents from companies.
upon the existing infrastructure. The EU Centre should remain in full harmony and
cooperation with other bodies to avoid duplication of efforts and a conflict of reporting obligations, so as to avoid an impact on the efficiency of the system. Additional coordination and support at EU level is needed to improve communication and the exchange of best practices between practitioners and researchers in the area of prevention306. In parallel, it was seen as critical to publish aggregated statistics regarding the number and types of reports of child sexual abuse online received, in order to ensure transparency and accountability regarding actions of service providers307.
Non-governmental organisations
The majority of respondents308 confirmed the need for additional coordination and
support at EU level in the fight against CSA online and offline. Most of the participants from non-governmental organisations identified as main challenges in the fight against child sexual abuse that could benefit from additional support and coordination at EU
level, the lack of evaluation of the effectiveness of prevention programmes309 as well as the insufficient communication and exchange of best practices between practitioners (e.g. public authorities in charge of prevention programmes, health professionals, NGOs) and
researchers310, both in the area of prevention and in relation to the assistance to victims.
Respondents from non-governmental organisations acknowledged, as the most relevant functions of the EU Centre to support law enforcement, the need to monitor the take-down of child sexual abuse material by different stakeholders311 as well as to maintain a
single EU database of known child sexual abuse material to facilitate its detection in
companies' systems312. In parallel, they agreed that it is critical, amongst others, to ensure that the tools employed are sufficiently accurate313 and are not misused314 for purposes other than the fight against child sexual abuse. Non-governmental organisations further acknowledged the key role of the implementation of EU law in relation to assistance to victims of sexual abuse while highlighting the need for supporting the exchange of best practices on protection measures for victims and the importance of an evidence-based policy on assistance and support to victims315. Supporting victims in removing their images and videos to safeguard their privacy and ensuring that the perspective of victims is taken into account in policymaking at EU and national level were also identified as key functions of the future Centre in the area of assistance to victims.
Amid the respondents from non-governmental organisations, 22 welcomed the idea of an EU body316 as the most appropriate type for the possible Centre. That was followed by public-private partnership (11.29%) and not-for-profit organisation (12.9%). 79.03% welcomed the idea of an EU Centre which will receive EU funding. Mandatory levies on
306 60% (n=3) of the respondents from business associations.
307 40% (n=2) of the respondents from business associations.
308 83.87% (n=52) of the respondents from non-governmental organisations.
309 66.13% (n=41) of the respondents from non-governmental organisations.
310 69.35% (n=43) of the respondents from non-governmental organisations.
311 51.61% (n=32) of the respondents from non-governmental organisations.
312 61.29% (n=38) of the respondents from non-governmental organisations.
313 48.39% (n=30) of the respondents from non-governmental organisations.
314 48.39% (n=30) of the respondents from non-governmental organisations.
315 83.87% (n=52) of the respondents from non-governmental organisations.
316 35.48% (n=22) of the respondents from non-governmental organisations.
industry (33.87%), voluntary contributions from industry (20.97%) or from not-for-profit organisations (17.74%) were also included in the list.
General public
Additional coordination and support at EU level could be beneficial in the context of prevention and assistance to victims, in particular to tackle the lack of evaluation of the effectiveness of prevention programmes in place317 as well as the effectiveness of programmes to assist victims318. Individuals further identified the lack of an EU approach (i.e. based on EU rules and/or mechanisms) to detect child sexual abuse online, and in particular the lack of a single EU database to detect known child sexual abuse material (24.11%) and the lack of an EU approach to determine relevant jurisdiction(s) of the instances of child sexual abuse online and to facilitate investigations (28.93%), as main challenges.
In order to ensure accountability and transparency regarding the actions of service providers to detect, report and remove child sexual abuse online in their services, the Centre should ensure that the tools employed are not misused for purposes other than the fight against child sexual abuse319. 42.77% of the individuals consider that the Centre could receive complaints of users who feel that their content was mistakenly removed by a service provider, and ensure that the tools employed are sufficiently accurate.
In the area of prevention, the Centre could serve as a hub for connecting, developing and
disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers320. The Centre could further carry out research and serve as a hub of expertise on assistance to victims of child sexual abuse
as well as support the exchange of best practices on protection measures for victims321. Supporting victims in removing their images and videos to safeguard their privacy and ensuring that the perspective of victims is taken into account in policymaking at EU and national level were also identified as key functions of the future Centre in the area of assistance to victims. Almost 50% of the respondents agreed that the new Centre should receive direct funding from the Union budget. Voluntary contributions from not-for-
profit organisations (24.11%) or from industry (19.71%) and mandatory levies on
industry (17.61%) were next on the list.
Academia
Academics and researchers fully support the idea of the creation of an EU Centre to face the challenges in the area of prevention. The Centre could support Member States in
putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU. Providing help to
develop state-of-the-art research and knowledge, including better prevention-related data to monitor the take down of child sexual abuse material by different stakeholders could also be a key function of the possible Centre. It could further serve as a hub for
connecting, developing and disseminating research and expertise, facilitating the communication and exchange of best practices between practitioners and researchers322,
317 47.17% (n=225) of the respondents from the general public.
318 46.54% (n=222) of the respondents from the general public.
319 55.14% (n=263) of the respondents from the general public.
320 50.95% (n=243) of the respondents from the general public.
321 39.41% (n=188) of the respondents from the general public.
322 100% (n=4) of the respondents from academic and research institutions.
and providing input to policy makers at national and EU level on prevention gaps and
possible solutions to address them.
Practitioners from academic and research institutions further acknowledged the key role of the implementation of EU law in relation to assistance to victims of sexual abuse323 while highlighting the importance of cooperation with different stakeholders in the area of victim protection, assistance and support. All the respondents from academic and research institutions would welcome the creation of an EU body which should be directly funded from the Union budget.
Inception Impact Assessment324
In total, 41 replies were submitted: 13 by non-governmental organisations, 11 by companies and business organisations, 2 by public authorities, 2 by EU citizens, 1 by academia/research institutions, 2 by business associations, and 10 by other entities (e.g. UNICEF, Global Partnership to End Violence against Children, etc.). Interested stakeholders could provide feedback to the Inception Impact Assessment from 2 to 30 December 2020.
The Inception Impact Assessment aimed to inform citizens and stakeholders about the Commission's plans in order to allow them to provide feedback on the intended initiative and to participate effectively in future consultation activities.
The feedback gathered in reaction to the Inception Impact Assessment shows that, in
summary, the initiative enjoys significant support as the stakeholders welcome the Commission's efforts to tackle child sexual abuse online. Providing legal clarity and
certainty as well as the holistic approach of the proposed Centre are seen as the main
positive attributes of the proposal. Some concerns regarding mandatory reporting, however, arise amongst different actors. The business representatives are primarily concerned about the duplication of reports and the disadvantageous impacts on SMEs. Furthermore, some believe the legislation should be future-proofed given the dynamic development of technology.
323 75% (n=3) of the respondents from academic and research institutions.
324 The Inception Impact Assessment consultation is available here. All contributions received are publicly available.
Overall feedback by category of respondent
Table 17: By category of respondent
Non-governmental organisation (NGO): 13 (31.71%)
Company/business organisation: 11 (26.83%)
Other: 10 (24.39%)
EU citizen: 2 (4.88%)
Public authority: 2 (4.88%)
Business association: 2 (4.88%)
Academic/research institution: 1 (2.44%)
Voluntary measures
Companies
Companies and business organisations call for an EU framework allowing continuing voluntary measures to detect report and remove CSAM on their platforms. Many efforts undertaken by companies to tackle CSAM have already been successful on a voluntary basis e.g. the development of tools such as PhotoDNA. Mandatory detection of known and new CSAM could have serious consequences. A legal requirement to apply such tools risks incentivizing companies towards prioritizing removal over accuracy, and could effectively amount to an obligation to screen all content. Taking into account the limited capability of small and medium-sized companies (sIE), voluntary measures to detect CSAM online should be given preference. Reporting mechanisms should be flexible to avoid burdensome requirements for SM Es and overburden LEA. A harmonized approach across the EU, including definitional clarity and exchange of best
practices will increase the effectiveness of online platforms' voluntary efforts.
Legal certainty regarding the detection of child sexual abuse material is fundamental.
Any new EU legal instrument needs to provide a sufficient legal basis for online platforms to continue to operate their detection efforts.
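For context, tools such as PhotoDNA work by comparing hashes of uploaded content against a database of hashes of known material. The sketch below is a deliberately simplified illustration of that matching pattern, assuming a hypothetical KNOWN_HASHES set and using an exact SHA-256 digest in place of the perceptual hashing such tools actually rely on:

```python
import hashlib

# Hypothetical set of hex digests of known material, standing in for a
# vetted hash database such as the single EU database discussed above.
KNOWN_HASHES = {
    "0" * 64,  # placeholder entry, not a real digest
}

def is_known_content(image_bytes: bytes) -> bool:
    """Return True if the image's SHA-256 digest is in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(is_known_content(b"example image bytes"))  # False for this placeholder set
```

An exact digest only matches bit-identical copies; deployed tools use perceptual hashes precisely so that resized or re-encoded copies of known material still match.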
Other entities/stakeholders
Most of the contributions from business associations illustrated that any legislation should take into account the limited capability of small and medium-sized enterprises (SMEs). Thus, voluntary measures to detect CSAM online should be given preference. The different (technical and financial) capabilities of SMEs could not be taken into consideration within a legislative framework that imposes mandatory measures.
Companies could be safeguarded by creating a legal framework allowing voluntary proactive measures under clear conditions securing compliance with fundamental rights.
Obligation to detect known CSAM
An obligation to detect known CSAM is expected to have a significant impact on SMEs in terms of capacity, resources and economics. SMEs especially do not always have
access to essential tools to detect CSAM, as well as resources to develop such tools. Using external tools or services can be challenging for small operators, as are the understandable legal restrictions on the ability to access CSAM.
Companies
Some of the contributions from companies and business associations urge the Commission to take into consideration the potential financial and technical burden that would be placed on smaller companies as a result of the adoption of binding legislative measures. Data privacy and customer security issues were also highlighted as
important among companies.
On the other hand, it was flagged that a legal framework which would create a binding obligation for relevant service providers to detect, report and remove known child sexual abuse material from their services could encourage improvement and provide legal certainty. Simple and streamlined reporting obligations that avoid duplication and confusion in a well-functioning system are essential. Participants further underlined the need for transparency reporting obligations to be reasonable, proportionate, and based on clear metrics.
Other entities/stakeholders
The detection, removal and reporting of child sexual abuse online is a necessary element in the broader fight against the exploitation of children and the protection of their fundamental rights. Any legal framework that is put in place in pursuit of these objectives will need to encompass binding obligations for relevant service providers, on a
proportionate basis, and including necessary safeguards. It should ensure legal certainty, transparency and accountability.
Obligation to detect new and known CSAM
As already mentioned above, the legislative option to detect new and known CSAM would have a significant impact on SMEs. Such a proposal to mandate the detection and removal of 'new' material must consider technical realities.
Companies
The responding companies and business associations said there is a need to formulate
requirements in terms of best reasonable efforts at the current state of technology. In
addition, obligations could be differentiated on the basis of the size and capability of small and medium enterprises (SMEs) to avoid putting excessive burdens on them. It was further stated that a legal obligation for relevant service providers to detect, report and remove child sexual abuse from their services, applicable to both known and new
material, and to text-based threats such as grooming would currently be in contravention of existing EU law (and the proposed DSA) regarding the prohibition of general monitoring efforts, and would also be a more difficult and costly implementation, especially for the smallest platforms.
Participants further underlined the need for transparency reporting obligations to be reasonable and proportionate. Simple and streamlined reporting obligations that avoid duplication and confusion in a well-functioning system are essential.
Non-governmental organisations
Non-governmental organisations called for long-term legislation that makes reporting and removal of child sexual abuse material and grooming on their platforms mandatory for
service providers. Mandatory detection, reporting and removal requires a holistic
approach with close cooperation between relevant service providers and stakeholders. As it was further flagged, it is vital that the objectives and obligations are consistent and
compatible with the measures set out in the Digital S ervices Act, particularly around
transparency and reporting mechanisms. Any policy and legislative options shall
incorporate the strongest available safeguards and address the need for greater transparency and accountability within the industry. The Commission needs to provide legal clarity and certainty as well as to adopt a victim-centred approach. The new
legislation must be flexible and future-proof.
Among others, it was stressed that voluntary measures do not meet the overall
objectives of the initiative, which means that efforts to counteract child sexual abuse will continue to be fragmented and insufficient.
Other entities/stakeholders The contributions recognised the importance of legal certainty, transparency and
accountability. Any legal framework that is put in place in pursuit of these objectives (detection, removal and reporting of child sexual abuse online) will need to encompass binding obligations for relevant service providers, on a proportionate basis, and including necessary safeguards. In addition, any new initiative should take into account the best interest of the child as well as ensure that functional prevention measures and victim
support services are in place.
Encryption
Public authorities
The great importance of balancing the protection of privacy and the confidentiality of communication with the legal interests concerned was specifically highlighted among public authorities.
Companies
Companies' representatives urged for legal certainty for the processing of personal data for the purpose of detecting child sexual abuse material. They further stressed that end-to-end encryption must be preserved; any framework should not undermine, prohibit or weaken end-to-end encryption.
Several parties further advised against requirements to weaken and break encryption and recommended instead that appropriate measures are taken so that content can be detected at the endpoints of encrypted communications, whenever appropriate. It was of utmost importance that the legislative solution chosen remains proportionate to the very purpose of the fight against CSAM.
It was also stressed that any new EU framework should define adequate safeguards efficiently balancing the digital safety interests with users' privacy rights.
Non-governmental organisations
A few stakeholders have shared views on encryption. Specifically, it was recommended that the regulation include a requirement for service providers of encrypted services to at a minimum facilitate reporting of CSAM and CSE online, including self-
generated material, and prompt action to remove confirmed materials upon request from hotlines and law enforcement authorities.
The need for clear legislative frameworks that allow online CSEA to be detected,
removed and reported efficiently in order to safeguard the rights of existing victims, but also to prevent abuse from occurring in the first place, protecting the privacy of some of the most vulnerable users of online services, was further underlined. Appropriate and realistic rules should be adopted to ensure the roll-out of tools scanning text for potential CSE and CSA in line with the GDPR.
European centre to prevent and counter child sexual abuse
Public authorities
The possible creation of a European Centre would create a common front for the harmonization of European legislation in order to prevent and protect children.
Companies
Overall, representatives from companies and business organisations recognised the
importance of the role of an EU Centre to prevent and counter child sexual abuse.
Among the objectives identified are the role of the Centre as a hub to provide information regarding programmes, services and legislation that could benefit exploited children, as well as to develop and disseminate programmes and information to law enforcement agencies, non-governmental organisations, schools, local educational agencies, child-serving organisations, and the general public on the prevention of child sexual abuse and exploitation, and on internet safety, including tips for social media. The Centre should also provide adequate assistance and support to victims (and their families) as well as specialized training to law enforcement authorities, civil society organisations and the general public.
Non-governmental organisations
Non-governmental organisations welcomed the idea of a European centre to prevent and counter child sexual abuse, which could play an important role in strengthening the
global effort to combat child sexual abuse online. Participants pointed out that the existence of a European Centre would help to ensure continued and improved implementation of the European Directive on combating the sexual abuse and
exploitation of children as well as to share and promote learning and best practice, and
provide rigorous evaluation of existing responses to child sexual abuse.
Addressing early intervention and prevention of predatory behaviour, as a complement to the detection and identification of perpetrators and child victims, is key.
They also flagged the need to enhance global and multi-stakeholder cooperation and enable a coherent approach to tackle child sexual abuse, online and offline. The Centre's functions could include initiatives to improve victim support, law enforcement and
prevention. This must be against a wider background of support for children's rights. Legislation and regulations that may be overseen by the Centre have to prioritize these
rights.
Other entities/stakeholders
Respondents noted that the proposed European centre to prevent and counter child sexual abuse may address some of the challenges relating to coordination and/or duplication of efforts among different stakeholders. The European centre to prevent and counter child sexual abuse and exploitation could also play a critical role in promoting enhanced cross-sector collaboration and engagement modalities, particularly with industry players.
Focusing on the legal framework, clear rules should be developed to empower and protect hotlines engaged in handling and accessing illegal material. For effective investigations and prosecutions, law enforcement authorities need adequate staffing and technical solutions. Currently, there seems to be a lack of resources, resulting in delays in analysing hard disks etc. after house searches, and in the identification of victims and offenders. In addition, it should be taken into account that citizens are often afraid or reluctant to report CSAM to law enforcement authorities directly.
There is an additional need to ensure that the new Regulation and the possible EU centre are fully aligned with relevant EU initiatives as well as legislation, policies and regulations addressing related matters such as other forms of violence.
The EU Centre could further enable improved educational opportunities in schools within the framework of media literacy for both children and parents. Increased attention to the prevention of offending and victimization of children was also highlighted as an important element in the fight against child sexual abuse, being the best approach to achieve sustainable results at scale and ultimately ensure that children are safe in digital environments. Respondents also called to ensure that the views of children are heard and to facilitate appropriate ways for meaningful child participation throughout the consultation, decision-making and implementation processes.
Academic / research institutions
Academic and research institutions welcome an effort to establish an EU centre to
support the effective prevention of child sexual abuse and to help ensure coordinated
post-abuse reporting, detection and intervention efforts.
Targeted survey 1 - Law enforcement authorities
The replies to Targeted Survey 1 revealed that:
Origin of reports:
o For most EU law enforcement authorities responding (61%), reports received
from service providers, either through NCMEC or directly, constitute the
single largest source of reports of child sexual abuse online.
o In the case of 45% of EU law enforcement authorities responding, NCMEC
reports amounted to more than half of all reports received.
Participants were asked several questions regarding the origin and quality of reports of child sexual abuse online received by their organisation. Participants were asked to
provide data in respect of several possible sources of reports:
NCMEC;
Members of the public;
The respondent's own organisation (e.g., based upon a lead arising in another
investigation);
Other public authorities (including law enforcement authorities) in the same country;
Public authorities (including law enforcement authorities) in another country;
National hotlines in the same country;
National hotlines in another country;
Directly from service providers; and
Other sources.
EU law enforcement authorities were invited to participate via EMPACT. Following the validation of data after the survey closed, there were responses from 49 law enforcement authorities in 16 Member States.
Origin of reports
Participants were asked to respond to the following survey question:
'To understand the various sources of child sexual abuse reports that you receive, please estimate the percentage of reports from each of the sources (the total should be around
100%)'
For each of the possible sources, participants were required to select the percentage range corresponding to the approximate percentage of reports received from that source.
Quality of reports
Participants were asked to respond to the following survey question:
Question: 'To understand the quality of the child sexual abuse reports that your organisation receives, please estimate the percentage of reports that are actionable (i.e. that can be used to start an investigation) for each of the different sources'
For each of the possible sources, participants were required to select the percentage range corresponding to the approximate percentage of reports from that source that are
typically actionable.
Table 2 shows, for each source, the number of EU law enforcement authorities that estimated that the percentage of reports received by their organisation falls into each of the percentage ranges.
Table 2: Number of respondents answering that a given percentage of reports of CSA online are received from each source
[The values of this table could not be recovered from the source document. It showed, for each of the following sources, the number of respondents whose estimated share of received reports fell into each percentage range: NCMEC; public; own organisation; other public authorities (same country); other public authorities (different country); hotline (same country); hotline (different country); service providers (directly); and other.]
Table 3: Percentage of respondents answering that more than 50% and 70% of reports received from a given source are actionable
[Chart values could not be recovered from the source document. The chart compared, for each source (NCMEC; public; own organisation; other public authorities (same country); other public authorities (different country); hotline (same country); hotline (different country); service providers (directly); other), the percentage of respondents indicating that more than 50% and that more than 70% of reports from that source are actionable.]
Participants were also asked to respond to the following survey question:
'What are the main reasons that make a report non-actionable?'
For each of the possible sources, participants were required to select the typical reasons which lead to a report from that source being non-actionable. There was no limit on the number of reasons that could be selected for each source. Reasons were to be selected from the following list, with the option for respondents to specify other reasons:
Reported content is not illegal under national law;
Insufficient information contained in report;
Report relates to reappearance of known content;
Insufficient resources;
Investigation not promising;
Other (please specify).
Use of reports (investigations)
Participants were asked to respond to the following survey question:
'To understand how investigations of child sexual abuse typically start, please estimate the percentage of investigations that start with a lead from each of the sources below (the total should be around 100%)'
For each of the possible sources, participants were required to select the percentage range
corresponding to the approximate percentage of reports received from that source.
Targeted survey 2 - Data regarding reports of child sexual abuse online received by law enforcement authorities
Time required to process reports
Participants were asked to estimate the average time taken to process a report. For the
purposes of this survey, the time to process a report was interpreted as meaning the total number of hours of work required to prioritise an incoming report, to investigate the
report, and to report back on the outcome of any resulting investigation.
Table 4 shows the average time required for each of these tasks.
Table 4: Time required for processing of reports of child sexual abuse online by law enforcement authorities

| Task (hours of work per report) | | | |
| Prioritisation of reports (time per report) | 0.47 | 0.47 | 0.47 |
| Investigation | 57.75 | 102.27 | 89.82 |
| Reporting on the outcome of the investigation | 0.32 | 0.32 | 0.32 |
| Total | 58.54 | 103.06 | 90.61 |
| Total (rounded to nearest 10 hours) | 60 | 100 | 90 |

(The headers of the three value columns could not be recovered from the source document.)
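The totals in Table 4 are straightforward sums of the per-task hours, rounded to the nearest 10 hours in the last row; as a purely illustrative check of that arithmetic:

```python
# Per-task hours from Table 4; each tuple holds the three reported columns.
prioritisation = (0.47, 0.47, 0.47)
investigation = (57.75, 102.27, 89.82)
reporting = (0.32, 0.32, 0.32)

totals = [round(p + i + r, 2)
          for p, i, r in zip(prioritisation, investigation, reporting)]
rounded = [round(total / 10) * 10 for total in totals]  # nearest 10 hours

print(totals)   # [58.54, 103.06, 90.61]
print(rounded)  # [60, 100, 90]
```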
Information to be included in reports
In order to determine the information that a report should contain to make it actionable to law enforcement, participants were asked to indicate the importance of several types of information by categorising them under the following possible options:
Critical - the report cannot be actioned without this information.
Useful - the report can be actioned without this information, but it should be included
if it is available.
Not relevant - there is no need to include this information in a report.
Participants were also given the option to specify other relevant information.
Table 5 shows the percentage of respondents that categorised each type of information as critical, useful or not relevant (excluding participants who did not select an option for a given type of information).
Table 5: Percentage of respondents indicating that each type of information is critical, useful or not relevant in order to ensure that a report is actionable

| Information to be included in report | Critical | Useful | Not relevant |
Information relating to the provider making the report:
| Name of the provider | 81% | 19% | 0% |
| Point of contact in service provider | 33% | 57% | 10% |
| Jurisdiction in which the service provider is located | 25% | 50% | 25% |
| Other information (please specify) | 40% | 20% | 40% |
General information relating to the report:
| Indication of whether the report is urgent (child in imminent danger of actual sexual abuse) or not | 62% | 38% | 0% |
| More detailed indication of level of urgency (please specify) | 35% | 41% | 24% |
| Nature of report (e.g., CSAM images/videos, grooming, live-streaming of abuse) | 48% | 52% | 0% |
| Copy of reported content | 95% | 5% | 0% |
| Additional relevant content data (please specify) | 46% | 38% | 15% |
| Type of service on which reported content was detected | 67% | 33% | 0% |
| Date/time the reported content was detected | 76% | 24% | 0% |
| Languages used in the reported content | 29% | 57% | 14% |
| Technology which detected the abuse | 14% | 62% | 24% |
| Traffic data | 60% | 40% | 0% |
| Other information (please specify) | 33% | 33% | 33% |
Information relating to child victim(s) related to reported content:
| Actual age of child victim(s) | 48% | 48% | 5% |
| Estimated age of child victim(s) (if actual age unknown) | 20% | 75% | 5% |
| Name of child victim(s) | 48% | 43% | 10% |
| Contact information of child victim(s) | 43% | 52% | 5% |
| Jurisdiction(s) in which child victim(s) are located | 43% | 52% | 5% |
| Relationship between child victim and suspect | 33% | 67% | 0% |
| Injuries displayed by child | 24% | 76% | 0% |
| Psychological state of child | 14% | 71% | 14% |
| Other information (please specify) | 33% | 22% | 44% |
Information relating to suspect(s) related to reported content:
| Name of suspect(s) | 71% | 29% | 0% |
| Contact information of suspect(s) | 65% | 35% | 0% |
| Jurisdiction(s) in which suspect(s) are located | 35% | 65% | 0% |
| Other information (please specify) | 42% | 25% | 33% |
Impact of encryption on investigations into child sexual abuse
In order to obtain further insight into the manifestation of encryption in criminal
investigations relating to child sexual abuse and the level of challenge this poses to law
enforcement, participants were asked to estimate the proportion of investigations in
which encryption had an impact.
Participants were asked to consider the proportion of investigations of child sexual abuse
where encryption (at rest / in motion):
Appeared;
Delayed the course of an investigation, having a negative impact on safeguarding
victims;
Resulted in an inability to achieve prosecution and/or conviction; and
Resulted in investigations being altogether stopped.
In each case, participants were asked to indicate which of the following categories applied:
None - very few (0%-25% of investigations affected in this way);
Very few - half of my workload (25%-50% of investigations);
Half of my workload - very often (50%-75% of investigations); or
Very often - all the time (75%-100% of investigations).
Table 6 shows the percentage of respondents that indicated that the proportion of cases impacted fell into each category.
Table 6: Proportion of cases impacted by encryption (percentage of respondents selecting each category)

[The values of this table could not be reliably recovered from the source document. For encryption at rest and encryption in motion, it reported the proportion of cases where encryption: appears; delayed the course of a criminal investigation, having a negative impact on safeguarding victims; resulted in an inability to achieve prosecution and/or conviction; or resulted in investigations being altogether stopped - broken down across the four categories listed above (none - very few; very few - half; half - very often; very often - all the time).]
Participants were also asked to indicate where encryption 'at rest' is most commonly found in investigations, based on four options. The responses to this question are summarised in Table 7.
Table 7: Where do law enforcement authorities most commonly encounter encryption of data 'at rest'?

| Where do you most commonly encounter encryption of data 'at rest'? | Percentage of respondents |
| External hard-drives / thumb storage | 26% |
| Encrypted smartphones / laptops, password protected | 42% |
| File sharing / file hosting / cloud storage | 32% |
| Other (please specify) | 0% |
| Total | 100% |
2. Meetings
The meetings, and in particular the "expert process" organised by the Commission, were an integral part of the consultation activities and were instrumental in developing the
problem definition and the options described in the impact assessment.
The feedback received in the meetings was not limited to ideas presented by the Commission; on many occasions, it was the stakeholders themselves who produced ideas for discussion.
See Annex 2.3. for procedural information on the different meetings in which feedback from stakeholders was gathered.
3. Conferences
The conferences were an opportunity to present the Commission's work and gather feedback in person from stakeholders in a setting that allows a wider reach than the above meetings.
See Annex 2.3. for procedural information on the different conferences in which feedback from stakeholders was gathered.
2. Surveys
1) Open public consultation
The European Commission launched an open public consultation325 on 11 February 2021, which closed after 8 weeks, on 15 April 2021. The shorter consultation period, compared to the 12-week period usually applied by the Commission, was defined in order to ensure that its outcome could be used for the preparation of the Impact Assessment. To mitigate the impact that a reduced timeframe could have on participation in the consultation, the Commission disseminated the call for contributions widely, including through the targeted discussions and consultations. In addition, the Commission ran campaigns on mainstream social media. The purpose of the present open public consultation was to gather evidence from citizens and stakeholders to inform
325 Available here.
the preparation of the EU Strategy for a more effective fight against child sexual abuse initiatives, and it was part of the data collection activities announced in the related inception impact assessment in December 2020. It aimed to gather feedback on current practices as well as on practical and legal problems arising both at national and EU level from gaps and weaknesses of existing regulations. It also listed possible options to address shortcomings and provided an opportunity to indicate preferences for elements that should be included in a solution. It was addressed to a broad range of interested stakeholders, including public authorities, EU institutions and agencies, international organisations, private companies, professional and business associations, NGOs, academics and the general public.
The Open Public Consultation was conducted through an online questionnaire published on the internet in all EU official languages. It was advertised on the European Commission's website, through social media channels (DG HOME, DG CNECT and Europol's EC3 Twitter accounts326), through established networks of stakeholders (e.g. WePROTECT Global Alliance, public authorities, hotlines, academia, etc.) and at all relevant meetings.
603 responses were collected: 477 from individuals in the general public and 94 from practitioners in a professional capacity or on behalf of an organisation. Among the 477 responders from the general public, there was 1 person who has been a victim of child sexual abuse.
The members of the general public selected a range of countries of residence: AT, BE, BG, HR, CZ, DK, FI, FR, DE, EL, LU, IE, IT, LT, NL, PL, PT, RO, ES, SE, UK, RU, BW, XK, AL, IL, the Philippines, the US, Venezuela, and India.
63 practitioners were members of non-governmental organisations, which is the largest professional group among the 129 practitioners who submitted the questionnaire in their
professional capacity or on behalf of an organisation. Other responders included:
private companies (private sector);
international or national public authorities (e.g. law enforcement agencies,
Ministries, etc.)
business or professional associations (e.g. trade associations);
consumer organisations;
academic and research institutions;
other entities (e.g. Bar Associations, faith-based organisations, etc.)
They were based across 23 European countries (AT, BE, BG, CY, DK, FI, FR, DE, EL, IE, IT, LV, LU, MT, NL, NO, PT, RO, SI, ES, SE, CH, UK), as well as Thailand, AU, NZ, ZA, RU, BR, French Guiana, and the US.
The respondents could also upload a document in order to provide additional information or raise specific points which were not covered by the questionnaire. The following entities submitted additional information:
Leaseweb Global B.V. - EU-based IaaS cloud hosting provider, The Netherlands
326 Based on the latest Twitter analytics for the open public consultation on the fight against child sexual abuse, the total number of impressions on DG HOME's main tweet was over 110,000.
GISAD i.G. (Global Institute for Structure relevance, Anonymity and Decentralisation), Germany
University of Ljubljana, Faculty of Education, Slovenia
University of Hull, United Kingdom
Internet Society, United States of America
Ministry of Justice, Denmark
BT plc, United Kingdom
Bundesverband der Freien Berufe - BFB, Germany
German Bar Association (Deutscher Anwaltverein - DAV), Germany
EDRi, Belgium
DOT Europe, Belgium
Twitter, United States of America
TikTok Technology, Ireland
Match Group, United States of America
Secomba GmbH, Germany
Open-Xchange AG, Germany
Austrian Bar Association, Austria
Global Encryption Coalition, United States of America
COMECE (Commission of the Episcopates of the European Union), Belgium
International Justice Mission Netherlands
Electronic Frontier Foundation, United States of America
International Centre on Sexual Exploitation, United Kingdom
Thorn, United States of America
Terre des Hommes Netherlands, The Netherlands
Defence for Children - ECPAT the Netherlands
Defend Digital Me, United Kingdom
Google, United States of America
Victim Support Europe, Belgium
National Center on Sexual Exploitation / International Centre on Sexual Exploitation, United States of America
Irish Safer Internet Centre, Ireland
End FGM European Network, Belgium
Federation of Catholic Family Associations in Europe, Belgium
Facebook, United States of America
ETNO (European Telecommunications Network Operators' Association), Belgium
Norwegian authorities (Ministry of Justice, Ministry of Health and Care Services, Ministry of Children and Families, Ministry of Local Government and Modernisation, Ministry of Culture and Equality), Norway
Permanent Representation of France to the EU, Belgium
Digital Europe, Belgium
Bumble, United States of America
The Lego Group, Denmark
Ministry of Justice and Security, The Netherlands
In addition, two EU citizens submitted additional information.
Results of the public consultation are analysed and integrated in this annex as well as in the dedicated sections of the Impact Assessment.
Inception Impact Assessment
A call for feedback, seeking views from any interested stakeholders on the basis of the Inception Impact Assessment, was published. The consultation, which sought feedback from public authorities, businesses, civil society organisations and the public, was open for responses from 2 December 2020 to 30 December 2020. Participants in the consultation were able to provide online comments and submit short position papers, if they wished, to provide more background on their views.
2) Targeted surveys
Targeted Survey 1 - Online survey for law enforcement: Tackling child sexual abuse online
The purpose of this survey was to gather quantitative and qualitative information on the current state of play in Member States concerning the origin, quality and use of the reports of child sexual abuse online that law enforcement authorities receive.
The survey was addressed to law enforcement authorities in all Member States.
The Commission received replies from sixteen (16) Member States. The national replies were coordinated at national level amongst the different responsible ministries, the judiciary and law enforcement authorities.
The questionnaire was launched on 4 March 2021 and closed on 19 March 2021.
Targeted survey 2 - Data regarding reports of CSA online received by law enforcement authorities
The purpose of this targeted consultation was to gather data on:
the costs associated with reports of child sexual abuse online received by law
enforcement authorities (LEAs);
how the quality of reports can be improved; and
the impact of encryption on investigations.
The survey was addressed to law enforcement authorities in all Member States.
The questionnaire was launched on 26 April 2021 and closed on 10 May 2021.
3. Expert Groups, conferences and bilateral meetings
To gather feedback and data to support the evidence-based preparation of the new legislation to fight child sexual abuse, the Commission services organised and participated in various group meetings: with Member States, including the Presidency, but also with a number of private sector service providers and civil society organisations.
Group expert meetings
Expert group on the implementation of Article 25 of Directive 2011/93/EU
The Commission organised an expert workshop to support Member States in the implementation of Article 25 of Directive 2011/93/EU on the detection, taking down and blocking of online child sexual abuse material. Representatives of EU Member States, Europol, Interpol and the INHOPE hotlines took part. Participants discussed detection, removal of CSAM hosted in and outside of Member States' territories, and blocking of illegal content. Challenges included issues such as mandatory reporting, bulletproof hosting, and removing fast-moving content.
Expert workshop on current and future challenges in the fight against child sexual abuse
On 6 September 2020, representatives from the EU Member States, Europol, Interpol, the US Department of Homeland Security and US Department of Justice, and the WeProtect Global Alliance participated in an expert workshop organised by the Commission on current and future challenges in the fight against child sexual abuse. During the
workshop participants identified and suggested possible solutions to a number of existing and upcoming trends and challenges in the fight against child sexual abuse, both in its offline and online forms.
Meeting with civil society organisations on the upcoming legislation to fight against child sexual abuse
On 19 February 2021, a meeting took place with the participation of close to 100 representatives of civil society organisations focused on children's rights and in particular on the fight against child sexual abuse. The focus of the meeting was to give the floor to the civil society organisations to present their views on the key points of the upcoming legislation.
Plenary meeting of the Victims' Rights Platform
The first plenary meeting of the Victims' Rights Platform took place on 23 February 2021. The meeting brought together over 40 participants, including members of the Victims'
strategies adopted in the past months. DG HOME presented the state of play of the EU
strategy for a more effective fight against child sexual abuse focusing on victims' related
actions, such as the upcoming European Centre to prevent and counter child sexual abuse.
Meeting with privacy-focused civil society organisations on the upcoming legislation to fight child sexual abuse
On 26 February 2021, an online meeting took place with privacy-focused civil society organisations. The meeting was attended by six representatives of civil society organisations dealing with privacy and digital rights. Participants welcomed the
opportunity to share their views on key points that the upcoming legislation could address and to contribute to finding effective means to detect abuse and support victims, while avoiding interference with the fundamental rights of all internet users.
Meeting with the National Centre for Missing and Exploited Children
The Commission organised a targeted consultation meeting with experts from the National Centre for Missing and Exploited Children (NCMEC) on 4 March 2021. NCMEC welcomed the opportunity to share their views on the upcoming legislation and
to contribute to ensuring that any process set up within the legislation is effective and complementary to other ongoing efforts. The setting up of the Centre and a number of legislative and practical/operational concerns were discussed.
Meeting with industry stakeholders on the long-term instrument on the fight against child sexual abuse
On 5 March 2021, the Commission brought together a wide range of industry stakeholders with a total of 50 participants attending from 25 companies and
representative organisations. During this targeted consultation meeting, participants expressed their strong support for the creation of a European Centre to prevent and counter child sexual abuse. Several speakers emphasised the need to ensure that legislation has regard for the diverse nature of services, and many speakers argued that the initiative should avoid creating duplication of reporting obligations, in particular where companies are subject to obligations to report in the US.
Meeting with Member States' experts (experts from law enforcement, JHA counsellors)
On 8 March 2021, the Commission organised a meeting to hear the views of Member States' experts (experts from law enforcement, JHA counsellors) and to exchange on key points that the legislation should cover and any other considerations that would be useful for the Commission to take into account in the preparation of this legislative proposal. The meeting was attended by 70 representatives of Member States. Participants welcomed the opportunity to share their views and ask questions about the key points of the upcoming legislation. They described a number of problems law enforcement encounters in its actions against child sexual abuse.
Targeted consultation meeting with European Parliament staff
The Commission organised a targeted consultation meeting with European Parliament staff (APAs, advisors, etc.) on 10 March 2021, for a dedicated meeting on the long-term instrument on the fight against child sexual abuse. Participants stressed that the legislation should cover both online and offline CSA, and welcomed the possible European centre to prevent and counter child sexual abuse. Challenges such as mandatory reporting and encryption were discussed.
Network of prevention of child sexual abuse
On 12 March 2021, the Commission brought together the members of the network on
prevention of child sexual abuse, composed of researchers, academics and key NGOs
working in this field, for a dedicated meeting. The Commission presented the efforts on the upcoming legislation to address online child sexual abuse. Participants provided feedback on the efforts that industry could further undertake in this space and the
possible roles that an EU Centre to prevent and counter child sexual abuse could fulfil.
Technical meetings on end-to-end encryption and the fight against child sexual abuse
Several group meetings and bilateral meetings took place from February to December 2020 with technical experts to discuss possible technical solutions to detect child sexual abuse in end-to-end encrypted electronic communications. The paper summarising the outcome of that work is in Annex 9.
Technical meeting on safety by design
A technical meeting on safety by design took place under the umbrella of the EU Internet Forum on 21 October 2021, where industry and civil society stakeholders shared
experiences and views.
Bilateral meetings
In the course of the preparation of this Impact Assessment, the Commission held bilateral meetings with a wide range of stakeholders to gather feedback, including meetings with:
S ervice providers, including individual companies and industry associations;
Public authorities from Member States;
Europol;
UK, US and AU public authorities;
Members of the European Parliament;
NGOs;
Relevant ongoing EU-funded project consortia.
Conferences
Commission representatives also participated in various workshops and conferences to present the ongoing work and gather additional input. The list below contains the conferences and workshops in which the Commission participated to provide information on the ongoing work and
gather feedback from stakeholders:
ERA seminars on Preventing Child Sexual Abuse (multiple dates)
Meeting of the Committee of the Parties to the Council of Europe "Lanzarote"
Convention on the protection of children against sexual exploitation and sexual
abuse, 25 September 2020
Technology Coalition, 24 & 25 March 2021
RENEW webinar on children's rights in the digital world, 30 August 2021
S afr Internet Forum, Deep Dive on Child Se xual Abuse material (CSAM), 7
0ctober 2021
Ministerial videoconference on the prevention and investigation of child sexual
abuse, 12 November 2021
Council of Europe Octopus conference, Workshop 6 - Automated detection of child
sexual abuse materials, 17 November 2021
EU Internet Forum Ministerial, 8 December 2021
Letters from stakeholders
The list below contains letters and public statements from stakeholders expressing their views on the commitments in the EU strategy for a more effective fight against child sexual abuse, and on the interim Regulation in particular:
Joint letter signed by six non-governmental organisations (Save the Children, Denmark; MiudosSegurosNa.Net, Portugal; ArcFund, Bulgaria; ECPAT Sweden; e-Enfance, France; 5Rights, UK) on the EU Strategy for a more effective fight against child sexual abuse and the new Commission's proposal for a Regulation on Privacy and Electronic Communications (11 August 2020), Ares(2020)4231528
Computer & Communications Industry Association statement, 11 September 2020
Microsoft letter of 2 September 2020, Ares(2020)4589540
CSAM survivors' open letter (supported by 8 organizations including the Canadian Centre for Child Protection and NCMEC), 3 December 2020
Canadian Centre for Child Protection letter to LIBE, 6 October 2020
Canadian Centre for Child Protection letter to the Rapporteur of the European Parliament's Committee on Civil Liberties, Justice and Home Affairs, 9 October 2020
Letter to the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE) from supporters (signed by children's organizations in 21 EU Member States, 2 EU Associated Countries, 18 international children's organizations and nine academics or experts), 12 October 2020
EDRi open letter, 27 October 2020
Press releases, WeProtect Global Alliance, 30 October 2020 and 15 January 2021
Australian eSafety Commissioner to LIBE Chair and Vice-Chairs, 4 November 2020, Ares(2020)6329384
NCMEC letter to LIBE, CULT and FEMM, 17 and 27 November 2020
Europol - EUCTF statement, 23 November 2020
Match Group open statement, 5 December 2020
Missing Children Europe, open statement signed by 25 organisations, 23 December 2020
Missing Children Europe letter to Commissioners Johansson and Reynders, 17 December 2020, Ares(2020)7732402
Letter to the Rapporteur of the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE) signed by children's rights organisations, 22 January 2021
UNICEF paper, January 2021
ECPAT International statement, 22 December 2020
EP Intergroup on Children's Rights statement, 22 January 2021
Joint statement by the UN Special Representative of the Secretary-General on Violence against Children, the UN Special Rapporteur on the sale and sexual exploitation of children and the UN Special Rapporteur, 10 February 2021
PT Minister of Justice to Commissioner Johansson, 22 February 2021, Ares(2021)1424242
European Network of Ombudspersons for Children (ENOC) letter to the European Parliament and the Council of the European Union, 15 February 2021
US Senator Cotton announces Resolution urging the European Union to Protect Children from Online Exploitation, 3 December 2020
Other activities in relation to the interim derogation:
Canadian Centre for Child Protection website dedicated to the interim proposal
NCMEC website dedicated to the interim proposal (including data on the reduction of reports since December 2020)
NCMEC petition on Change.org327 (35 000 signatures)
4. How the results have been taken into account
The results of the consultation activities have been incorporated throughout the impact assessment in each of the sections in which feedback was received.
This impact assessment is built on the input of a large number of consultation activities in multiple forms and with a wide range of stakeholders, to whom the Commission is
grateful for their fundamental contributions.
The input has been incorporated in each of the dedicated sections of the impact assessment. In particular, the problem definition, the policy options and the impacts reflect the views of the relevant stakeholders that participated in the expert process as well as in other consultation activities. As repeatedly conveyed during the consultations, and at political level, the exponential development of the digital world will continue to
play a pivotal role in the worsening of the current challenges to addressing child sexual abuse. EU action to address these increasing challenges is keenly expected by stakeholders.
The general objective of the new legislation is to improve identification, protection and
support of victims of child sexual abuse, ensure effective prevention and facilitate
investigations, notably through a clarification of the role and responsibilities of online service providers when it comes to child sexual abuse. It would further pursue three specific objectives: to ensure the effective detection, removal and reporting of online child sexual abuse; to increase coordination of efforts; and to improve legal certainty, protection of fundamental rights, transparency and accountability.
In the determination of available policy options, the Commission took into account four criteria to assess the impacts of each policy option, namely effectiveness/social impact, efficiency, fundamental rights, and international relations. In particular, the effectiveness as well as the social impact of each policy option to improve identification, protection and support of victims of child sexual abuse, ensure effective prevention, and facilitate
investigations has been assessed. The Commission further measured the efficiency of each policy option, giving strong consideration to SMEs (i.e. focusing on the assessment of the economic impact of the different options on service providers and public authorities).
Given the significant impact on fundamental rights, the effectiveness of the measures and of these conditions and safeguards should be subject to dedicated monitoring mechanisms. The main differences between the options are rather linked to the extent of their effectiveness in safeguarding and balancing fundamental rights and their ability to
327 Change.org, 'We are in danger of losing the global battle of child safety' petition, accessed 17 May 2021.
offer a more adequate response in light of both the current and the evolving risks
emerging in a highly dynamic digital environment. The Commission services suggested that the proposed options have to strike the appropriate balance of interests between
ensuring an effective approach to illegal content and activities and the protection of children and their rights, on the one hand, and on the other hand the interests and rights of all users, including freedom of expression and privacy of communications.
In addition, the Commission services identified the significant risk that some providers may cease voluntary measures altogether. It was further acknowledged that increased detection and reporting would have several benefits, including increased identification of
suspects and victims in third countries, and reliable information on known CSAM which could be shared with competent authorities in third countries. Standards regarding the
quality of reports, safeguards and transparency obligations could positively influence
practices in third countries.
ANNEX 3: WHO IS AFFECTED AND HOW?
1. Practical implications of the initiative
For children, child victims and their environment
The initiative addresses children who may be at risk of becoming victims of sexual abuse or have experienced abuse. Since child sexual abuse has such severe consequences for children's physical and mental health, their family and social environment are also indirectly affected. The increasing documentation of abuse for online sharing has extended the impact of child sexual abuse far into the adult lives of some victims. The Canadian Centre for Child Protection found that 69% of victims fear being recognised as a result of their imagery online - and 30% have been recognised.328
From a purely financial perspective, the costs that arise as a consequence of child sexual abuse are significant. Victims of child sexual abuse require immediate and long-term assistance, which includes physical and mental health care (both in childhood and adulthood), social services and services addressing additional educational needs329. The total lifetime costs of assistance to victims arising from new substantiated cases of child sexual abuse in the United States in 2015 are estimated at 1.5 billion USD (approx. 1 billion EUR)330.
Even where measures for assistance to victims are in place, they do not fully mitigate the short- and long-term effects of child sexual abuse on victims' lives, resulting in additional costs such as a lifelong loss of potential earnings due to abuse during childhood331. These costs are believed to constitute the largest portion of the overall economic cost of child sexual abuse. The total lifetime cost of such losses in the United States in 2015 was estimated at 6.8 billion USD (approx. 4.7 billion EUR)332.
The initiative also addresses the environment of the child that provides support in cases of sexual abuse. The overall impact on them is expected to be positive, as set out here below for each group:
Victim Support Practitioners. They are the members of civil society that are in the first line of contact for victims and perpetrators of child sexual abuse, such as hotline employees or child rights NGOs. Increasing the impact of their work and giving them access to expertise and lessons learned is expected to have a positive impact on them, as is the initiative's creation of more effective measures to stem the flow of online child sexual abuse. At the same time, the identification of additional victims that is expected to result from increased detection efforts will put a strain on their resources; in the long term, however, it is hoped that the
328 Canadian Centre for Child Protection, Full Report 2017: Survivors' Survey, 2017.
329 Letourneau, E., The Economic Burden of Child Sexual Abuse in the United States, May 2018, pp. 413-422.
330 Ibid., based on combined estimated costs for child health care, adult health care, child welfare and special education.
331 Ibid.
332 Ibid., based on combined estimated productivity losses for non-fatal and fatal cases of child sexual abuse.
combined measures could eventually lead to an overall reduction in child sexual
abuse, particularly online.
Social services, providing support to child victims and their families based on the best interests of the child, would be expected to benefit from the exchange of best practices and ideas across Member States, which may provide opportunities to identify new and better solutions, or more effective approaches. As for other victim support providers, the detection of additional victims will lead to an increase in workload that may eventually level off and perhaps start declining again in the long run.
Health care professionals: they support victims and families, and deliver treatment to offenders and persons who fear they may offend. Here, the same considerations as for social services and NGOs apply when it comes to an increase in workload related to child victims. In the area of prevention measures targeting offenders, they should benefit from the facilitation of the exchange of best practices and lessons learnt, as well as of evidence-based approaches, which can help them to apply the best approaches in their personal practice.
Educators: they play an important role in prevention, in particular through awareness raising, and in detecting early signs of possible abuse. Giving them access to a greater array of tools and options for prevention, based on rigorous scientific analysis and evidence of effectiveness, may contribute to their ability to protect children from child sexual abuse, but also to detect its signs earlier. Their workload is not expected to be affected, but their interventions may become more effective, which they might welcome, given their natural interest in the well-being of the children entrusted to them.
Civil society organisations: they take action against child sexual abuse by, e.g., making public authorities aware of the crimes, assisting victims, and
contributing to preventing child sexual abuse through awareness raising
campaigns and programmes for offenders or persons who fear that they might offend. This initiative and especially its measures to support prevention and
victim support would help them in their work and facilitate their access to up-to- date and relevant information, as well as to similar initiatives in other Member
S tates or outside the EU. It would help them network and leverage their limited
resources more effectively, reducing the risk of inefficient or duplicate investment
of their resources.
Researchers. They contribute to expanding the knowledge about the nature and prevalence of the problem, and about possible solutions to address it. The
information exchange with practitioners is key to ensure that the research remains
relevant, is effectively used, and that the solutions proposed are properly evaluated. The initiative, and especially the creation of a centre, would enable
access to more data on the phenomenon and facilitate a rigorous analysis of the
effectiveness of measures, with a view to further improvements.
For digital service providers (businesses)
The initiative also addresses certain service providers (businesses) that are active on the EU market. The practical implications of this initiative on them are related to two areas:
non-legislative action, and legal obligations relating to the detection and reporting of child sexual abuse material. The legislative action focuses on mandatory detection of child sexual abuse material (known/unknown), potentially regardless of encryption.
The non-legislative actions considered would be voluntary, and thus compliance will
depend on the willingness and capabilities of service providers to take these actions. Under these voluntary measures, service providers are encouraged to increase their
transparency on how they fight child sexual abuse on their services through, e.g., standardised reports.
In addition, a number of measures considered relate to improved technical capabilities to make the detection and reporting of material more efficient. These measures (sharing of hash databases between service providers, Application Programming Interfaces (APIs) for remote checking of hashes, sharing of technologies between service providers)
providers) would generate integration and maintenance costs for them, especially if technical capabilities are inefficient or not available to date. However, if service
providers made use of the available technologies that are free of charge or had access to more reliable data on what is considered child sexual abuse in the EU, this could
significantly improve the detection process, speed up investigation processes and contribute to the identification and rescue of child victims. Law enforcement could act more swiftly, based on higher-quality, standardised reports.
As to the legal obligations for service providers, this initiative would introduce
significant changes for service providers and the way they operate. As not all service providers currently detect child sexual abuse material, or do so to the same extent, many will have to adapt to changing regulations and deal with increased costs. Significant changes are also expected for those services which currently offer encrypted exchanges between users. Especially for SMEs, there is a concern that this initiative could represent a practical and financial burden. However, the possibility for businesses to use detection technology free of charge somewhat limits the impact. In addition, an EU Centre making available databases of indicators of known material (e.g. hashes) can
significantly support businesses of any size in their practical operations, reduce costs of
implementation, limit the risk of false positives, and increase legal certainty. Also, shared databases could result in cumulative cost reductions for individual companies, as they would no longer have to compile their own databases and run them individually.
Users of online services
The initiative would also impact users of online services. While some service providers, including a number of social media providers and other platforms, already perform detection of child sexual abuse on their services, the present initiative would significantly expand these efforts. This has an impact on the rights of users to privacy of
communications, protection of personal data and freedom of expression and information, as detection efforts would need to perform a horizontal analysis of materials shared and of conversations in order to detect those where child sexual abuse materials are being shared or where children may be groomed into child sexual abuse.
Given that the detection would be obligatory in nature and would apply horizontally, users would face limitations in choosing services that do not perform detection of child
sexual abuse if they would prefer to avoid being subjected to such detection measures.
The impact on users is therefore significant.
At the same time, the specific category of content targeted by the measures - the sexual abuse of children - is illegal regardless of context and constitutes a particularly egregious violation of the fundamental rights of the child. Children, as a particularly vulnerable group, deserve special protection. Especially in the online environment, the existing protection is currently not sufficient to prevent them from being harmed, as has become more evident during the COVID-19 pandemic. As outlined above, the specific type of harm that lies in child sexual abuse has particularly negative and life-long consequences for children. While protection can never be expected to create full safety, these considerations have to be balanced against the impact on users outlined above.
Given the significant impact on users, the initiative includes a number of conditions and
safeguards to ensure respect for children's rights and all users' rights including the right to freedom of expression, right to private life and communications as well as to data
protection. These would notably include requiring service providers to use technologies and procedures that ensure accuracy, to limit the number of false positives to the greatest extent technically possible and therefore reduce the risk of an unwarranted suspicion of involvement in child sexual abuse. In addition, the initiative aims to create greater transparency of measures, to ensure that users are fully informed about the detection measures and their possible consequences in case child sexual abuse is found, and
accountability of processes, including supervision by designated authorities.
The initiative also proposes the creation of an EU Centre in the preferred form of an EU
agency, which would provide reliable information to service providers on what is illegal in the EU, and thus contribute to the limitation of false positives. It would also facilitate
transparency and accountability, by serving as an independent central point that can
publish information about tools used, cases launched, error rates, and, in a few years, possibly also the number of children identified and rescued based on these measures. The centre could help ensure that there is no erroneous takedown or abuse of the search tools to detect legitimate content (including misuse of the tools for purposes other than the
fight against child sexual abuse) and in facilitating complaints from users who feel that their content was mistakenly removed. These safeguards should help ensure that the
impact on users is limited to what is strictly necessary to achieve the legitimate objective and to achieve a fair balance between the important rights on both sides.
For Member States, law enforcement and judicial authorities
As some Member States struggle to put in place effective prevention programmes, coordination is lacking and the effectiveness of efforts is unclear, this initiative intends to offer them more structured support. This initiative would facilitate and streamline Member States' efforts in the fight against child sexual abuse and even facilitate their cooperation with non-EU countries. Areas which could benefit from a more structured approach are
prevention efforts concerning child victims and people who fear that they may offend or re-offend as well as research and exchange of best practices.
Law enforcement would also benefit from this initiative, as technologies used to detect child sexual abuse would become more reliable when making use of indicators provided by the Centre, reducing the time law enforcement authorities have to spend reviewing reports that turn out to contain materials that are not illegal in the EU. At the same time, the expected overall increase in the number of reports will significantly increase the need for law enforcement action and put law enforcement agencies under strain. To mitigate the additional burden, the EU Centre could also support law enforcement by providing reliable classification of
materials as illegal, especially where they have been previously detected. In addition, this
is one of the few administrative burdens that has to be categorised as positive overall as it would contribute to a more effective approach to a particularly egregious group of offences.
2. S ummary of costs and benefits
The following tables present systematically the average annual and one-off costs and
benefits which have been identified and assessed during the impact assessment process.
I. Overview of Benefits (total for all provisions) - Preferred Option (EUR million/year)

| Description | Amount | Comments |
| Direct benefits | | |
| Reduction of crime: child sexual abuse | 3 448.0 | Annual benefits from the reduction of crime. |
| Indirect benefits | | |
| Facilitation of efforts by the EU Centre | N/A | Cost savings due to a more effective and efficient use of resources (e.g. avoiding duplication of efforts in the EU). |
| Administrative cost savings related to the 'one in, one out' approach | | |
| Replacement of the Interim Regulation and Council Decision | 0.9 | Compliance of service providers and public authorities with the existing legislation. |
II. Overview of costs - Preferred Option (EUR million/year)

| Policy measure | Cost type | Citizens/Consumers, one-off | Citizens/Consumers, recurrent | Businesses, one-off | Businesses, recurrent | Administrations, one-off | Administrations, recurrent |
| 1 | Direct adjustment costs | - | - | 0,21 | 2,69 | 0,41 | 3,36 |
| 1 | Other costs | - | - | 0,01 | 0,14 | 0,02 | 0,18 |
| 3 | Direct adjustment costs | - | - | - | 0,00 | 4,75 | 24,42 |
| 3 | Other costs | - | - | - | 0,00 | 0,25 | 1,29 |
| 4*** | Direct adjustment costs | - | - | - | 6,55 | - | 10,58 |
| 4*** | Other costs | - | - | - | 0,34 | - | 0,56 |
| 5 | Direct adjustment costs | - | - | 19,43 | 1,62 | - | 3,09 |
| 5 | Other costs | - | - | 1,02 | 0,09 | - | 0,16 |
| 6 | Direct adjustment costs | - | - | 334,59 | 436,46 | - | 478,45 |
| 6 | Other costs | - | - | 17,61 | 22,97 | - | 25,18 |
| 7 | Direct adjustment costs | - | - | 237,62 | 494,45 | - | 574,18 |
| 7 | Other costs | - | - | 12,51 | 26,02 | - | 30,22 |
| 8 | Direct adjustment costs | - | - | 26,76 | 448,32 | - | 587,13 |
| 8 | Other costs | - | - | 1,41 | 23,60 | - | 30,90 |

Costs related to the 'one in, one out' approach (EUR million/year):
| Direct adjustment costs | 1.390,09 | 1.515,54 |
| Indirect adjustment costs | - | - |
| Administrative costs (for offsetting) | 73,16 | 79,77 |
| Total | | |
The preferred option E results from the combination of policy measures 1, 3, 4, 5, 6, 7 and 8. The one-off costs of policy measure 4 have been adjusted to take into account the synergies of combining it with measures 6, 7 and 8, which replace the voluntary detection in measure 4 with mandatory detection of known CSAM, new CSAM and grooming. See Annex 4 for more details.
It is estimated that the administrative costs amount to 5% of the total costs in each of the policy measures, with the rest of the costs being direct adjustment costs. For example, for measure 6 the recurrent business costs of EUR 436,46 million (direct adjustment) and EUR 22,97 million (other, i.e. administrative) total EUR 459,43 million, of which the administrative share is 5%.
The administrative cost savings related to the 'one in, one out' approach result from the replacement of the Interim Regulation. It could be assumed that the cost savings would be equivalent to the administrative costs estimated under measure 4 on voluntary detection (5% of the total costs). This is an approximation, given that the Interim Regulation enables voluntary practices to detect and report CSA online and remove CSAM for the online services that today generate most CSA reports (but not all; see Annex 6 on magnitude).
3. Relevant Sustainable Development Goals
This section describes the expected impacts on the most relevant Sustainable Development Goals (SDGs) identified in the impact assessment.
The two main SDGs which will be affected by Options B to E are SDG 16 on peace, justice and strong institutions, considering that one of its targets is to protect children from abuse, and SDG 5 on gender equality, considering the previously mentioned statistics which show how girls in particular are harmed by sexual offenders.
As the SDGs are interdependent and broad, there are also three other SDGs which will benefit indirectly from Options A to E. One of them is SDG 3 on health and well-being, because the Options will contribute to access to safe sexual care for children. Another is SDG 4 on quality education, seeing as the Options will ensure children have a safe environment to focus on education. In addition, SDG 9 on industry, innovation and infrastructure will be indirectly affected, as the Options, and in particular the creation of the Centre, will facilitate technological development.
III. Overview of relevant Sustainable Development Goals - Preferred Option(s)

SDG no. 1 - No poverty
Expected progress towards the Goal: An overall reduction of child sexual abuse could limit the risk of poverty and social exclusion of victims of CSA. It could limit the long-term consequences of CSA, which can affect the quality of life.
Comments: CSA has long-term consequences that may include, e.g., trauma leading to an inability to hold a job, which can lead to poverty and social exclusion. The creation of an EU Centre, which would serve as a hub for coordinating best practices, would ensure that research work and best practices are shared concerning countering the long-term economic consequences of CSA, and the link between poverty and CSA, thereby also contributing to SDG no. 1. Children from economically disadvantaged backgrounds are at risk of being forced into sexual abuse, e.g. to support their families. This includes online abuse, through the production and circulation of CSAM, but also livestreamed abuse, the victims of which can be located anywhere in the world. Options A to E would also contribute to locating victims of such abuse and ensuring that they are rescued and given appropriate support, including providing for such basic needs as food and shelter.

SDG no. 3 - Good health and well-being
Expected progress towards the Goal: Increase in promoting healthy lives and well-being for children, from both a physical and a mental point of view.
Comments: Considering that SDG 3 has 28 indicators to measure progress, Options A to E will certainly contribute to several of them. The Options, and in particular the creation of an EU Centre which focuses on prevention, will also lead to a promotion of mental health for both victims of CSA and potential perpetrators. Considering the psychological and physical impact which CSA has on its victims, as demonstrated in previous statistics, Options A to E will contribute to safeguarding and treating mental health issues for both children and potential victims. With regard to Option E in particular, the detection, reporting and removal of CSAM and grooming will foster sexual care and sexual health among both children and teenagers, because it could help prevent and report any related abuse, thereby diminishing the number of victims, as well as victims' risk of self-harm, depression, potential substance abuse, and other mental and physical health issues.

SDG no. 4 - Quality education
Expected progress towards the Goal: Expected increase in quality education; education on reproductive health and the risks of online and offline child sexual abuse might substantially prevent a number of potential victims in the future.
Comments: Options A to E will facilitate achieving SDG no. 4, as more children will be able to concentrate on their education instead of being affected by child sexual abuse. Also, the creation of an EU Centre, which will serve as a hub for coordinating best practices, will ensure that research work and Member State initiatives are shared concerning educational campaigns in schools, thereby also contributing to SDG no. 4.

SDG no. 5 - Gender equality
Expected progress towards the Goal: A majority of victims of child sexual abuse are girls. A reduction of child sexual abuse would contribute to reducing gender inequality.
Comments: Child sexual abuse leads to harmful psychological and mental consequences which, as mentioned in previous statistics, will diminish the possibility of the affected girls leading full, healthy lives. SDG 5 has nine targets, which also include adopting legislation to promote gender equality, ending all forms of discrimination against girls and ending violence against and exploitation of girls.

SDG no. 9 - Industry, innovation and infrastructure
Expected progress towards the Goal: The proposed legislation will lead to service providers exploring and developing new technologies which will allow for innovation across industry, both in the EU and globally.
Comments: Option E in particular, and the creation of the EU Centre, will strengthen the development of online tools to counter child sexual abuse, thereby contributing to technological innovation. While EU Member States gain and share new knowledge, best practices could be shared globally, including with developing countries, facilitated by the EU Centre.

SDG no. 16 - Peace, justice and strong institutions
Expected progress towards the Goal: Option E would have the strongest impact in protecting children from sexual abuse and sexual exploitation. Options A to E will increasingly support this SDG, as demonstrated in the assessment of the benefits throughout the options, which will have a positive impact on children.
Comments: The UN itself has recognized that the global pandemic has increased challenges in child protection and mental health services, and that common action is therefore necessary. The safeguards included in the legislation, including the increased transparency, will contribute to strengthening institutions involved in the fight against child sexual abuse, including on prevention, assistance to victims, and detection, reporting and removal of CSA online.
ANNEX 4: ANALYTICAL METHODS
1. Qualitative assessment of policy measures
The following process was applied to determine the policy measures and the policy options formed on the basis of these measures:
1) mapping of possible policy measures:
a. The mapping covered the full spectrum of possible EU intervention: no
action, non-legislative action and legislative action.
b. Given that the issue at hand is basically a regulatory failure, it was
important to lay out the full range of tools to determine the most
proportionate EU response. c. The mapping stage included a first filter to identify the policy measures to
discard at an early stage (section 5.3 of the main report and Annex 11). d. The outcome of the mapping stage was a set of policy measures retained
for further elaboration and analysis.
2) description of policy measures retained in the mapping stage (section 5.2 of the
main report)
3) analysis of the policy measures retained in the mapping stage (this Annex): a. This stage included a second filter to identify the policy measures to
discard.
b. It includes a qualitative analysis using the same assessment criteria as
those used to analyse the options. The policy measures retained are
therefore those that provide the alternatives that are most feasible (legally,
technically and politically), coherent with other EU instruments, effective, relevant and proportional to tackle the problem and its drivers analysed in
section 2 of the main report. c. The outcome of this stage was the final set of measures for the policy
options as set out in the overview diagram in section 5.2 of the main
report;
4) description of policy options, formed by combining the retained measures into
different groups: a. The formation of options follows a cumulative logic, with an increasing
level of EU legislative action (as set out in the overview diagram in
section 5.2 of the main report). b. The cumulative logic was followed not only because the measures are in
general not mutually exclusive and can be combined but also because they are complementary in a number of ways, presenting synergies that the
combined options can benefit from.
5) analysis of policy options: the options are analysed in sections 6 (impacts), 7
(comparison of options) and 8 (preferred option) of the main report, as well as in
the present annex in more detail.
Non-legislative action
Measure 1: Practical measures to enhance voluntary efforts
Standard code of conduct
Social impact
Developing a standard code of conduct for service providers to sign up to, setting out the
ways in which they will use technologies for the detection, removal and reporting of child sexual abuse online, and the standards and processes they will adhere to in doing so, would to some extent enhance prevention, detection and reporting and assistance to victims.
By establishing voluntary minimum standards, the code would lead to increased levels of detection and reporting of online child sexual abuse, enabling the provision of assistance to victims, and enabling interventions to prevent criminal offences. The code would also lead to improved transparency and possibly inspire safeguards regarding actions taken by service providers and their effect on users.
Economic impact
Compared to the baseline scenario, the development of a standard code of conduct would be expected to lead to an increase in the annual number of reports of online child sexual abuse received by EU law enforcement authorities.
There would also be an impact on non-EU countries, which would also experience an increase in the annual number of reports of online child sexual abuse. This increase would to some extent depend on the extent to which the code of conduct was adopted by service providers in relation to their operations outside the EU.
Fundamental rights impact
There would be a slight impact on fundamental rights compared to the baseline scenario. The absence of a clear legal framework for voluntary measures by service providers would not be remedied. Whilst such an absence of EU-level legislation would leave service providers flexibility, it would also mean a lack of clarity and possibly diverging obligations under national law. The impact on the fundamental rights of service providers (mainly the freedom to conduct a business) is therefore mixed. Increased adoption of voluntary measures by providers signing up to the code of conduct and increased transparency would affect the fundamental rights of users (especially the right to privacy and to the protection of personal data).
Voluntary action by online service providers to detect, report and remove online child sexual abuse would continue to be insufficient, and inefficiencies in public-private cooperation would be only partially addressed. The situation would therefore also still
negatively affect the fundamental rights of persons who are or may become victims of child sexual abuse (rights of the child, among others).
Standardised reporting forms
Social impact
Developing standardised forms for reports of online child sexual abuse from service providers to authorities would to some extent reduce inefficiencies in public-private cooperation between online service providers and public authorities. Standardised reporting forms would improve the quality of reports and facilitate investigations by ensuring that all relevant information is received by the relevant law enforcement authorities in a coherent manner, maximising the potential for efficient intake of information and for swift and therefore possibly more successful investigations. The impact would be mainly limited to providers not reporting to NCMEC, where
standardisation is already in place; for those reports that EU Member States' law
enforcement authorities receive via NCMEC, standardisation has been achieved to some extent. To ensure coherence, standardised forms should align with the standards set by NCMEC to the extent possible, to expand standardisation rather than to establish
competing standards.
Standardised reporting forms could also be used by service providers making reports to non-EU law enforcement authorities, improving the quality and relevance of reports in third countries.
Economic impact
The standardisation of reporting forms would create initial implementation costs and should afterwards reduce the costs of dealing with reports for both public authorities and service providers, by ensuring that all critical information is included in reports, facilitating law enforcement responses and reducing the need for follow-up requests for further information from service providers.
Fundamental rights impact
There would be a slight impact on the fundamental rights of victims compared to the baseline scenario, resulting from improved efficiencies in investigations. For providers, the voluntary standardisation provides a choice and therefore does not impact their freedom to conduct a business. The creation of standardised forms should not
significantly impact users' rights to privacy and data protection and freedom of
expression.
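Purely as an illustration of what such a standardised form could capture, the sketch below models a report as a small data record with a completeness check. All field names are hypothetical assumptions for illustration; they reproduce neither the forms envisaged by the legislation nor NCMEC's actual CyberTipline schema.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative sketch only: hypothetical fields, not an official schema.
@dataclass
class CSAReport:
    provider_name: str                  # reporting service provider
    service_type: str                   # e.g. "hosting" or "interpersonal communications"
    report_timestamp: str               # ISO 8601 time at which the report was made
    content_type: str                   # "known_csam", "new_csam" or "grooming"
    content_hash: Optional[str] = None  # hash of the reported material, if any
    member_state: Optional[str] = None  # EU Member State the report relates to
    user_identifiers: list[str] = field(default_factory=list)  # account data, where legally required
    supporting_context: str = ""        # free-text context for law enforcement

def missing_fields(report: CSAReport) -> list[str]:
    """List problems that would make a report non-actionable, so they can be
    caught before the report reaches a law enforcement authority."""
    problems = []
    if report.content_type not in {"known_csam", "new_csam", "grooming"}:
        problems.append("content_type must name one of the three CSA categories")
    if report.content_type != "grooming" and not report.content_hash:
        problems.append("reports of material should carry a content hash")
    return problems
```

A machine-checkable structure of this kind is what would allow a receiving authority to validate completeness automatically and reduce follow-up requests.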
Improved feedback mechanisms and communication channels
Social impact
Improved feedback mechanisms would ensure that relevant authorities provide meaningful and timely feedback to service providers regarding the quality of their reports and the nature of the materials or activity reported as illegal or legal. This feedback would serve to assist providers in improving the quality of their reports; in particular, providers could use the feedback to ensure that reports contained all relevant information available to them, and to avoid making reports of content that has been found not to be
illegal. Many service providers have requested feedback to help them improve and target their processes more accurately, and it is therefore expected to be welcomed by them. Feedback could help reduce the rate of false positives and therefore improve the accuracy of the whole process.
Economic impact
Improved feedback mechanisms would lead to a slight positive effect on the cost of
reports to public authorities and service providers by improving the quality and relevance of reports, and consequently reducing the need for follow-up requests for information from service providers, and reducing the amount of time spent by law enforcement authorities on reports relating to content that is not illegal. At the same time, the initial investment for authorities is likely to be important, as they will need to set up the
procedures for feedback, which will also require authorities to determine when and how
they can legally share meaningful information with the service provider. In addition, they will then incur ongoing costs in investing time to provide the feedback. It is to be
expected that the feedback should launch a virtuous cycle of improving quality of reports and reduced rates of false positives, which would over time reduce the need for feedback other than to confirm that the report was accurate.
Service providers would need to set up procedures to take into account feedback
provided, both on individual content detected and to improve their overall procedures,
which would create costs; however, the economic impact on them would be expected to
be a mere fraction of the impact on public authorities. It is also to be expected that there would be an economic benefit in the longer term resulting from more accurate detection, which could reduce the number of instances of follow-up on false positives.
Fundamental rights impact
There would be a slight positive impact on the fundamental rights of users compared to the baseline scenario, resulting from a decreased likelihood of reports erroneously being made to law enforcement authorities by service providers.
APIs for remote checking of hashes
Social impact
The provision of Application Programming Interfaces (APIs) by public authorities to allow service providers to remotely check hashed images and videos would possibly facilitate greater adoption of voluntary measures by service providers, and ensure that such measures can be based on reliable information about materials illegal in the EU. In
turn, this would be expected to lead to improved detection, reporting and removal of online child sexual abuse.
Such APIs would, in particular, facilitate the implementation of voluntary measures by smaller providers for whom lack of expertise or financial challenges would otherwise disincentivise action. It is to be expected that it would incentivise providers that have been reluctant to take measures against CSA because of costs to implement such
measures, and therefore increase the overall volume of content subject to detection measures. As a result, an increase in the volume of CSAM detected is likely, which would have a positive impact on the ability to detect and investigate crime.
Economic impact
This measure would necessarily entail costs for public authorities, including costs arising from the development of APIs and integration with existing databases of hashes.
Similarly, integration would result in costs for service providers choosing to make use of the APIs. These costs would be to some extent offset by savings to service providers resulting from the reduced need to implement their own technological solutions.
Fundamental rights impact
The expected increase in detection measures would impact users' rights, including those to privacy and data protection, and their freedom of expression. Detection measures require mitigating measures and safeguards to limit that impact to what is strictly necessary333. Service providers would be supported in taking measures against illegal content at low cost to them, where they so choose, which would have a slight positive impact on their freedom to conduct a business. The rights of the child would similarly experience a positive impact, as further instances of CSAM would likely be detected, allowing authorities to take action.
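As a rough sketch of how a provider-side client for such an API might look, the snippet below hashes an item locally and queries a hypothetical authority endpoint; the URL, payload and response format are invented for illustration, and a cryptographic hash stands in for the perceptual hashes (e.g. PhotoDNA) used in practice.

```python
import hashlib
import json
import urllib.request

# Hypothetical endpoint; no such API is specified in this document.
AUTHORITY_HASH_CHECK_URL = "https://authority.example/hash-check"

def matches_known_csam(image_bytes: bytes) -> bool:
    """Hash an image locally and ask the authority's API whether the hash
    matches a known-CSAM indicator. Only the hash leaves the provider."""
    digest = hashlib.sha256(image_bytes).hexdigest()  # stand-in for a perceptual hash
    request = urllib.request.Request(
        AUTHORITY_HASH_CHECK_URL,
        data=json.dumps({"hash": digest}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return bool(json.load(response).get("match", False))
```

Because only the hash is transmitted, a remote check of this kind would spare smaller providers the cost of hosting and maintaining a full indicator database themselves.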
Sharing of databases of hashes between service providers
Social impact
This practical measure to encourage the voluntary sharing of hashes between service providers would improve the ability of service providers to detect known CSAM in their services. However, service providers would continue to lack a centralised source of hashes of material reliably identified as constituting child sexual abuse material
333 For an overview of conditions and safeguards, please refer to section 5.2.2 of the main report.
throughout the Union, causing law enforcement authorities to continue to receive reports of material that is not illegal, and some material that is illegal to go unreported.
The improved ability to detect known CSAM would likely lead to an increase in reports to authorities, however, without any assurances as to an improvement in the quality of the
reports. Nonetheless, it is likely that the overall volume of CSAM detected and therefore of investigations would rise, resulting in a moderate positive impact on action to protect children and investigate and prosecute crime.
Economic impact
The voluntary sharing of hash databases between service providers would result in minor costs to service providers relating to the provision of hashes through a secure channel. No economic impact is expected on other stakeholders.
Fundamental rights impact
Service providers would be free to participate or not, and are therefore not impacted in their freedom to conduct a business.
The impact on users' rights would be more negative compared to the availability of an authoritative set of indicators, as there are no guarantees as to the quality of hash sets
shared, and as these are usually based on the national law at the place of main
establishment, which may be outside the EU. This could result in the inclusion of hashes of content that is not considered CSAM under EU and Member States' law. As a result, additional verification of any reports submitted based on this approach would be
required.
In parallel, the positive impact on children's rights resulting from an increased volume of CSAM detected is similarly more limited than in the previous measure, given the more limited benefits of a pure sharing approach without quality control mechanisms
compared to a centralised, vetted system of indicators.
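The quality-control concern can be made concrete with a minimal sketch: when hash sets are merged peer-to-peer, a match only establishes that some provider flagged the item under its own legal standard, not that the item is illegal in the EU. All names and values below are invented for illustration.

```python
# Hypothetical peer-shared hash sets; truncated placeholder values.
provider_sets = {
    "provider_a": {"9f2c...", "ab41..."},
    "provider_b": {"ab41...", "77e0..."},
}

# Record which providers contributed each hash.
merged: dict[str, set[str]] = {}
for provider, hashes in provider_sets.items():
    for h in hashes:
        merged.setdefault(h, set()).add(provider)

# A hash contributed by several providers may merit more confidence, but
# without a centrally vetted list, every match still needs verification
# against EU and Member State law before a report is actionable.
corroborated = {h for h, sources in merged.items() if len(sources) > 1}
```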
Sharing of technologies between service providers
Social impact
This practical measure to encourage the voluntary sharing of technologies between service providers would improve the availability of technologies for the detection of known CSAM, new CSAM and grooming. Detection, reporting and removal of all these forms of online child sexual abuse would increase as a consequence.
Economic impact
The voluntary sharing of technologies between service providers would result in minor costs to service providers relating to the provision of technologies through a secure channel.
costs cliann
Fundamental rights impact Service providers would be free to participate or not, and are therefore not directly impacted in their freedom to conduct a business. However, from a competition angle, cooperation between competitors has to respect certain limits in order to preclude or
mitigate possible antitrust concerns; a particular point of importance for service providers lies in the speed of detection tools, which are designed to avoid any friction or latency in the user experience and can be a source of competitive advantage. Therefore, such
sharing mechanisms would need to be carefully tailored and orchestrated in order to
preclude any impact on competition.
Technology sharing could have a positive impact on the freedom to conduct a business of service providers that currently have no tools in place, as they would be supported in taking measures against illegal content at low cost to them, where they so choose.
On the other hand, the expected increase in detection measures would impact users'
rights, including those to privacy and data protection, and their freedom of expression, especially in case of erroneous detection, and would therefore require mitigating measures and safeguards to limit that impact to what is strictly necessary.
Continued support to Member States on the implementation of the relevant provisions of the Child Sexual Abuse Directive
Social impact
This practical measure would imply action from the Commission: continuation of workshops and bilateral exchanges with Member States, and continued funding under ISF national programmes. Based on experience, this measure would lead to
improvements in the implementation of the Directive, but would not address any issues outside of the scope of the Directive.
Economic impact
Continued support to Member States would result in minor costs for the Commission budget; the funding under ISF programmes would remain unchanged. Member States would be encouraged to take further measures, in particular in the areas of prevention and
support to victims, which would likely come with increased costs to them. These increased costs would be offset to some extent by the availability of centralised expertise and materials through Commission support, in particular also under the following measure to facilitate research, exchange and coordination.
Fundamental rights impact
There would be no impact on fundamental rights compared to the baseline scenario; the impact of measures implemented by Member States would depend on the precise measures taken.
Facilitating research, exchange of best practices and coordination in the area of prevention and assistance to victims
Social impact
This practical measure to encourage research and the dissemination of good practices between relevant actors would improve cooperation and coordination between relevant actors. This measure would also help to develop evidence-based policy in prevention and assistance to victims. It is therefore expected to have a positive social impact.
Economic impact
This measure would result in minor costs for the Commission budget, as well as for Member States' authorities, practitioners and other stakeholders participating in the
exchange and possibly investing in additional measures on that basis.
Fundamental rights impact
While the measure itself would not have a direct fundamental rights impact, such impacts could result from the measures that Member States may take on the basis of lessons learnt from research and the exchange of best practice. In the long run, this measure should facilitate more impactful prevention efforts at Member State level. This would have a positive impact on the fundamental rights of children, who would stand a greater chance of not falling victim to child sexual abuse. Also for those who have fallen victim, even though they have already suffered significant disadvantages, more impactful measures to support them could have a moderate positive impact on their rights.
More effective prevention measures could also extend to running joint awareness-raising campaigns or joint work on online safety measures with providers. Where Member States mandate the participation of providers in such programmes, there would be an impact on the freedom to provide services, which Member States would have to take into account and mitigate, where applicable.
Where prevention and victim support measures are conducted in cooperation with service
providers, the overall impact on users' rights will depend on the precise measures taken and would need to be taken into account by Member States.
Measure 2: EU Centre on prevention and assistance to victims
This measure is analysed in detail in Annex 10.
Legislative action
Measure 3: EU Centre on prevention and assistance to victims and combating CSA online
This measure is analysed in detail in Annex 10.
Measure 4: Legislation specifying the conditions for voluntary detection of online child sexual abuse
Social impact
This legislative measure would establish for the first time an explicit legal basis
permitting service providers to take action to detect online child sexual abuse in their services. The creation of such a legal basis would remove existing legal uncertainties,
facilitating wider implementation of such measures by providers who do not currently do so.
As a result, a modest increase in the detection, reporting and removal of online child sexual abuse could be expected, which in turn would lead to a modest increase in victims
rescued, suspects detained, and offences prevented.
In addition to removing any existing legal uncertainty that may prevent providers from
taking voluntary action, this measure would also address the limitations of the interim
Regulation. Without a legal basis for voluntary action, once the interim Regulation ceases to apply three years after entering into force, providers of number-independent interpersonal communications services will be prohibited from using technologies to
detect, report and remove online child sexual abuse in their services. These services are estimated to account for more than two-thirds of all EU reports of online child sexual abuse made by providers334.
The creation of a clear legal basis would ensure that such providers are not prohibited from taking action against online child sexual abuse following the expiry of the interim
334 Data provided by NCMEC to the European Commission: 2019 CyberTipline Reports: Trends Seen in Chat and Messaging, October 2020, and 2020 CyberTipline Data: Reports Resolving to the European Union, March 2021.
Regulation, thereby avoiding the loss of the majority of reports from providers and
consequential impacts on assistance to victims, identification of suspects, and prevention of offences.
Economic impact
Compared to the baseline scenario, the creation of an explicit legal basis for providers' voluntary efforts against online child sexual abuse would, to some extent, lead to an increase in the implementation by service providers of measures to detect such abuse in their services. This would likely result in an increase in the overall volume of reports.
This would imply additional costs both for providers - where they choose to implement measures - and for public authorities in order to adequately process and respond to
reports.
Fundamental rights impact
This measure would have several impacts on fundamental rights, including the right to
protection of personal data; the right to respect for private life; the right to freedom of
expression and information; the right to security and the freedom to conduct a business.
Increased adoption of voluntary measures by service providers as a result of the enhanced
legal clarity provided by this measure would lead to safer services, increasing the likelihood of detection of online child sexual abuse. This would contribute to reducing the dissemination of child sexual abuse material (right to protection of personal data,
right to respect for private life), increased identification and rescue of victims from abuse
(right to security) and increased apprehension of offenders and prevention of future offences (right to security).
Processing of users' personal data under providers' voluntary measures to detect online child sexual abuse would affect users' rights to freedom of expression and information and the privacy of their communications.
While the rights to freedom of expression and information do not extend to protecting an
exchange of CSAM or other illegal activities, the detection would also need to check
legal materials and exchanges for the presence of CSAM. As a result, this measure would need to include strong safeguards to ensure an appropriate balance of the different fundamental rights. These safeguards could include requiring service providers to use
technologies and procedures that ensure accuracy, transparency and accountability, including supervision by designated authorities. In addition, a database of confirmed child sexual abuse indicators provided by a designated authority, such as the potential EU centre under Measure 3, would ensure a reliable basis for determining which content is
illegal. The transparency and accountability provided by reporting to a designated authority could also help ensure that there are no erroneous takedowns or abuse of the search tools to detect legitimate content (including misuse of the tools for purposes other than the fight against child sexual abuse). The centre could provide information on
possibilities for redress for users who consider that their content was mistakenly removed.
For interpersonal communications services, the users' fundamental right to privacy of communications will be impacted. Therefore, supplementary safeguards would be
required, including targeting the detection of grooming to services where children may be at risk, and providing clear information to users that a provider is using detection tools, as well as information once suspected abuse has been reported, as well as possibilities for
redress. An additional safeguard lies in the anonymised processing by technologies335,
which helps to ensure that the impact on the fundamental rights of users whose communications are scanned is limited to what is proportionate and strictly necessary, since no personal data deriving from their communications would be processed unless there is a suspicion of child sexual abuse.
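The anonymised matching flow described here, and in footnote 335, reduces in essence to comparing a one-way code against a list of such codes. The sketch below uses SHA-256 as a stand-in for a perceptual hash such as PhotoDNA, whose algorithm is not public; all names are illustrative.

```python
import hashlib

# In practice this set would be derived from indicators supplied by a
# vetted authority, such as the potential EU centre under Measure 3.
known_csam_hashes: set[str] = set()

def is_flagged(image_bytes: bytes) -> bool:
    """Compare only the image's hash against the indicator set; no personal
    data derived from the communication is examined unless a match raises
    a suspicion of child sexual abuse."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_csam_hashes
```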
This measure would have no impact on the rights of service providers who choose to take no action. On the other hand, service providers who choose to detect child sexual abuse would be subject to new requirements that have not applied previously, in addition to those arising from the DSA proposal, including with regard to the aforementioned safeguards, which would therefore have a moderate effect on their business decisions (freedom to conduct a business). Such requirements, however, are important safeguards for the fundamental rights of users, given the gravity of the accusation.
Measure 5: Legal obligation to report and remove all types of online child sexual abuse
Social impact
This measure would impose a legal obligation on service providers who become aware of online child sexual abuse in their services to report the abuse to a designated authority. The obligation would apply in relation to all forms of abuse within the scope of this
initiative, i.e., previously-known CSAM, new CSAM, and grooming. The reporting obligation would ensure both swift investigations to identify offenders and, where
possible, identify and rescue victims, as well as independent verification of the illegality of the content.
While US providers are currently subject to an obligation under US law to report online child sexual abuse to NCMEC, there is no comparable obligation under Union
legislation. Where abuse relating to an EU Member State is detected in a US provider's services, the relevant law enforcement authority receives a report via NCMEC, the US
Department of Homeland Security and Europol. Where abuse is detected in an EU
provider's services, reporting is typically not subject to any legal obligation, and no standardised reporting channels exist.
This measure would ensure that all reports of online child sexual abuse relating to EU Member States are reported directly to the authority designated in the legislation, improving efficiency in comparison to the current reporting channels. Through the
incorporation of definitions relating to child sexual abuse under EU/Member State law, this obligation would lead to improved quality of reports, reducing the number of non-actionable reports which relate to content that is not illegal in Member States. Similarly, this measure would ensure that an obligation to report applied in relation to content that is not illegal in a third country, but that is illegal under Union/Member State law.
Finally, this measure would ensure that those providers that currently choose not to
report online child sexual abuse in their services are obliged to do so.
335 For example, tools such as Microsoft's PhotoDNA software or other techniques to detect child sexual abuse materials. PhotoDNA and similar techniques automatically convert images into a "hash", a code describing the image. This code cannot be converted back into an image and does not contain any personal data. The company then compares the hash of the image to a database of hashes of known CSAM. Where the hash of the user's image matches a hash in the database, the image is flagged as potential CSAM.
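For illustration, the following is a minimal sketch of the hash-matching flow described in this footnote. It is not PhotoDNA, which is proprietary and, unlike the stand-in below, robust to resizing and re-encoding; the `perceptual_hash` placeholder and the set-based indicator database are assumptions for illustration only.

```python
# Illustrative sketch of the hash-matching flow described in footnote 335.
# PhotoDNA itself is proprietary; `perceptual_hash` below is a stand-in.
# A real perceptual hash is robust to re-encoding, unlike SHA-256, which
# is used here only so the example runs.
import hashlib
from typing import Set


def perceptual_hash(image_bytes: bytes) -> str:
    """Placeholder for a PhotoDNA-style robust image hash (assumption)."""
    return hashlib.sha256(image_bytes).hexdigest()


def is_known_csam(image_bytes: bytes, indicator_db: Set[str]) -> bool:
    """Flag the image if its hash matches the database of confirmed
    indicators supplied by a designated authority (e.g. the EU centre)."""
    return perceptual_hash(image_bytes) in indicator_db
```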
Economic impact
Compared to the baseline scenario, the imposition of a legal obligation for providers to
report online child sexual abuse where they become aware of such abuse could lead to an increase in the number of reports made by service providers. Nevertheless, it is assumed that where providers choose to voluntarily detect online child sexual abuse, those
providers are highly likely to report such abuse even in the absence of an obligation to do so. Furthermore, US service providers are already subject to an obligation to report child sexual abuse under US law.
This measure is therefore expected to result in only a slight increase in the number of
reports of online child sexual abuse, and only a slight increase in costs for service
providers and public authorities.
Fundamental rights impact
This measure would affect several fundamental rights, including the right to protection of
personal data; the right to freedom of expression and information; the right to security and the freedom to conduct a business.
The reporting of suspected online child sexual abuse would inherently involve the
processing of sensitive personal data, namely the transfer of the reported content to the
designated authority and ultimately (if different) to the relevant law enforcement
authority (right to protection of personal data, right to respect for private life). The
processing of reports by relevant law enforcement authorities would continue to be
subject to the Law Enforcement Directive336. Processing for the purpose of making a
report would be subject to safeguards to ensure transparency.
This measure would require service providers to take certain actions, incurring costs while doing so (freedom to conduct a business).
The extent of the impact of this measure on the above-mentioned rights is affected to a
significant extent by other measures which may be implemented in tandem. In particular, the magnitude of the impact of an obligation to report online child sexual abuse will
depend on the volume of abuse that is detected, which is strongly influenced by whether the detection is voluntary or mandatory.
Measure 6: Legal obligation to detect known CSAM
Social impact
This measure would impose a legal obligation on service providers to detect known child sexual abuse material in their services, regardless of whether those services are encrypted (depending on the availability of suitable technology).
The measure would ensure that the detection of known CSAM would no longer be
dependent on the voluntary action of providers. Implementation of this measure would
336 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119, 4.5.2016, p. 89-131.
require providers to have access to a reliable source of information on what constitutes
CSAM, in order to avoid an undue administrative burden on service providers, to allow for reliable identification of relevant content and ensure proportionality of requirements, in line with the prohibition on imposing general monitoring obligations.
This measure would have a positive social impact by preventing the recirculation of materials previously confirmed as constituting CSAM. Over time, the overall number of
images and videos depicting child sexual abuse available on services within scope should be reduced significantly, and, with it, the instances of secondary victimisation inherent in the continued viewing of the abuse. At the same time, it should entail a significant increase in the number of relevant service providers participating, in the volume of detection and reporting, and hence in the proportion of overall cases investigated and number of children identified and removed from abusive situations.
This would also have a positive impact on the overall confidence of users in services, as their exposure to CSAM would also be reduced. This positive impact would extend also to society's expectation that services do not facilitate the sharing of illegal content,
especially in the particularly egregious case of child sexual abuse. While targeting the obligation to specific services would somewhat reduce its overall effectiveness, which could be greater if more services were included in scope, this can be justified in light of the greater impact that such detection might have.
For the detection of known content, the availability of reliable indicators of what constitutes CSAM under EU law and of free-of-charge technologies facilitating automatic detection would support service providers in their identification of relevant content and ensure proportionality of requirements, in line with the prohibition on
imposing general monitoring obligations. Known child sexual abuse material is the most common type of child sexual abuse online. The tools to detect it (see Annex 8) have a
high accuracy rate and have been reliably used for over a decade. The obligation to detect known material would level the playing field and ensure the detection of that content where it is currently missing, with all the necessary safeguards. The EU centre would make available the database of indicators of known material (e.g. hashes) that providers should use. The mandatory detection would also encompass materials that victims have referred for detection and removal.
As a downside, such an obligation could result in occasional false positives, that is, in
images and videos erroneously identified as CSAM. As a first safeguard, the obligation could therefore be limited to detection and not extend to direct removal.
Given the impact on fundamental rights of all users, additional safeguards would need to
apply, building on and going beyond those set out above for voluntary detection
(Measure 4) and for the reliability of the database of indicators. These could include
independent expert auditing of the database of indicators and regular supervision and verification of the procedures of the centre, independent expert certification of tools for automated detection to ensure accuracy, as well as additional transparency and
accountability measures such as regular reporting. The legislation could also set out information rights of users and mechanisms for complaints and legal redress.
The question of how to deal with encryption is arguably its most complex aspect, given the high stakes on both sides. The inclusion of encrypted content within the scope of this measure ensures a comprehensive approach to combating known CSAM . Encryption, while beneficial in ensuring privacy and security of communications, also creates secure
spaces for perpetrators to hide their actions, such as trading images and videos, and
approaching and grooming children without fear of detection. Any solution to detect
child sexual abuse therefore needs to ensure both the privacy of electronic
communications and the protection of children from sexual abuse and sexual
exploitation, as well as the protection of the privacy of the children depicted in the child sexual abuse material. It would also need to ensure that comparable services are treated
equally, in line with the principle of equality before the law.
Economic impact
For both the public and the private sector, administrative and compliance costs would arise from implementing new legislation.
For service providers, the introduction of systems for the detection, where applicable, and the new or increased generation of reports would result in costs, also in relation to
follow-up requests for further relevant data from public authorities, and for handling complaints and requests for review by affected users. Service providers who are not
already investing in developing technologies that would allow the detection of child sexual abuse in encrypted environments will require additional dedicated resources to
implement feasible technical solutions that are a good fit for large-scale deployment. This burden may be considerably higher for smaller companies that may not have access to in-house resources. However, they would benefit from the fact that this option would limit further fragmentation of the internal market with regard to administrative procedures and
obligations required from hosting service providers. Technologies for the detection of known CSAM outside of end-to-end encrypted communications channels have been available free of charge for years and have proven their reliability.
SMEs offering hosting services are particularly vulnerable to exploitation of illegal activities, including child sexual abuse, not least since they tend to have limited capacity to deploy state-of-the-art technological solutions to child sexual abuse material or
specialised staff. Therefore, they should not be exempted from the rules and obligations; instead, the burden on them is mitigated by ensuring that measures are proportionate. The free availability of reliable hash databases and the requisite detection tools are important in this regard. Even
though companies may have unequal resources to integrate technologies for the detection of child sexual abuse material into their products, this negative effect is outweighed by the fact that excluding them from this obligation would create a safe space for child sexual abuse and therefore defeat the purpose of the proposal. To further mitigate the economic impact on smaller companies, there is no obligation to take action other than to
report the suspicion, and the verification could be left to the expertise of the relevant authorities which would inform the provider whether the material did in fact constitute CSAM. Therefore, service providers would not be forced to invest in additional human resources for confirmation of suspected CSAM. In addition, an obligation to detect child sexual abuse in encrypted spaces would only apply where reliable technologies exist and can be made available for adaptation to providers' products.
The expected increase in reports from service providers would result in significant additional costs to public authorities, in particular law enforcement and judicial authorities, arising from the corresponding increase in investigations and prosecutions. However, this financial impact is expected to be outweighed by the positive economic
impact on victim support measures and survivor quality of life and productivity.
A positive effect on the Single Market could result from additional legal clarity and
certainty, thus limiting compliance costs. Furthermore, both the public and the private sector would benefit from a common framework creating more legal certainty and mutual trust between the public and the private sector.
Fundamental rights impact
This measure would result in significantly expanded and more effective action against CSAM. It would therefore have a significantly positive impact on fundamental rights of victims whose images are circulating on the internet, in particular on their right to respect for private life. In addition, in creating a more effective approach to child sexual abuse, it is expected to have a positive effect on child rights more generally, including the rights to human dignity and to the integrity of the person.
At the same time, the mandatory nature of the detection has an important impact on
providers' freedom to conduct their business. This can only be justified in view of the
necessity of the measure to achieve an objective of fundamental importance, namely the more effective protection of children and their rights. The necessity of the measure is based on the experience that victims themselves are frequently unable to seek help, in view of their inherent vulnerability and the specific efforts by offenders to avoid disclosure of their offences. At the same time, offenders are increasingly likely to share evidence of abuse with others online, as is evident from the growing figures of new materials circulating online, as set out in the problem definition. Especially in the context of interpersonal communications, providers are therefore the only ones that have
visibility on the abuse taking place. Given that up to 80% of investigations in some Member States are possible only because of reports from providers, such a measure is
objectively necessary.337
Nonetheless, the impact itself needs to be limited to the maximum extent possible to ensure that it is limited to what is strictly necessary. For providers, this requires providing support for the implementation of the measures. Specifically, providers should have access to a reliable set of indicators of what is illegal in the EU to enable them to search for specific content. In addition, providers need to have access to free and verified detection tools, to reduce the burden on them.
In addition, users' rights are impacted to a greater extent than under the voluntary measures provided for under Measure 4. While some service providers, including a number of social media providers and other platforms, already perform detection of child sexual abuse on their services, the present measure would significantly expand these efforts. This has an impact on the rights of users to privacy and confidentiality of
communications, protection of personal data and freedom of expression and information, as detection efforts would need to perform a horizontal analysis of materials shared and of conversations in order to detect those where child sexual abuse materials are being shared or where children may be groomed into child sexual abuse.
Given that the detection would be obligatory in nature and would apply horizontally, users would face limitations in choosing services that do not perform detection of child sexual abuse if they would prefer to avoid being subjected to such detection measures. The impact on users is therefore significant.
At the same time, as set out above, the specific category of content targeted by the measures - the sexual abuse of children - is illegal regardless of context and constitutes a
particularly egregious violation of fundamental rights of the child. Children, as a
particularly vulnerable group, deserve special protection. Especially in the online
environment, the existing protection is currently not sufficient to prevent them from
337 While the prohibition to impose a general monitoring obligation does not rank as a fundamental right, it serves as a safeguard to facilitate the appropriate balancing of rights and interests. The option ensures compatibility with this principle through the provision of reliable indicators of CSAM and automated tools, as set out in more detail above in section 5.2.3.
being harmed, as has become more evident during the COVID-19 pandemic. As outlined
above, the specific type of harm that lies in child sexual abuse has particularly negative and life-long consequences for children. While protection can never be expected to create full safety, these considerations have to be balanced against the impact on users outlined above.
Given the significant impact on users, the initiative includes a number of conditions and
safeguards to ensure respect for children's rights and all users' rights, including the right to freedom of expression, right to private life and communications as well as to data
protection. These would notably include requiring service providers to use technologies and procedures that ensure accuracy, to limit the number of false positives to the greatest extent technically possible and therefore reduce the risk of an unwarranted suspicion of involvement in child sexual abuse. In addition, the initiative aims to create greater transparency of measures, to ensure that users are fully informed about the detection measures and their possible consequences in case child sexual abuse is found, and
accountability of processes, including supervision by designated authorities.
Where encryption is deployed, the detection of CSAM is compatible with most types of
encryption provided by the service provider, as both the service provider and the user retain access to the encrypted information.338 For the specific context of end-to-end
encryption in interpersonal communications, some providers have already developed proprietary approaches, and further technologies are under development. Safeguards would therefore also include not to generally weaken encryption and to ensure a high level of information security.
The initiative also proposes the creation of an independent EU Centre, preferably in the form of an EU Agency, which would provide reliable information to service providers on what is illegal in the EU, and thus contribute to the limitation of false positives. It would also facilitate transparency and accountability, by serving as an independent central point that can publish information about tools used, cases launched, error rates, and, in a few
years, possibly also the number of children identified and rescued based on these measures. The centre could help ensure that there is no erroneous takedown or abuse of the search tools to detect legitimate content (including misuse of the tools for purposes other than the fight against child sexual abuse) and in facilitating complaints from users who feel that their content was mistakenly removed. These safeguards should help ensure that the impact on users is limited to what is strictly necessary to achieve the legitimate objective and to achieve a fair balance between the important rights on both sides.
Measure 7: Legal obligation to detect new CSAM
Social impact
This measure would impose a legal obligation on service providers to detect previously-unknown child sexual abuse material in their services, regardless of whether those services are encrypted.
Whereas the detection of known CSAM reduces the re-victimisation of the children depicted in those images and videos and, at times, the investigation initiated with such a report may lead to uncovering ongoing abuses, this material depicts past abuse, which in some cases may be years old. By its nature, previously undetected CSAM is more recent and at times depicts still ongoing abuse, provides particularly valuable leads, and is therefore treated as highest priority by law enforcement. The added value of detecting
338 This applies, e.g. to the encryption in transit for international data transfers that the ECJ recommends.
"new" CSAM in terms of the ability to identify and rescue children is significant. The
positive social impact on children's welfare consequently is significantly higher than in the case of detection of known content, as in Measure 6.
The prompt detection of new CSAM also allows for prevention of its distribution,
reducing the possibility of it "going viral" in circles of abusers and being repeatedly recirculated in the future, by adding it to databases of known material. These databases are used both to feed the tools for the detection of known CSAM, and to train and
improve the tools for the automated detection of 'new' CSAM. The subsequent detection based on the comparison with these databases can also provide important information about the way in which CSAM is disseminated online and the circles of abusers,
facilitating detection and effective action against such groups, which would have a
significantly positive social impact of tackling the problem closer to its roots.
The reliability and efficacy of technologies to detect new CSAM is quite advanced,
ensuring error rates in the low percentages (0.01% in a recent benchmarking test of one of the key tools), yet the administrative burden on relevant service providers in ensuring the accuracy of efforts is higher and would require an additional degree of human
oversight and human confirmation of suspected CSAM.
The proportion of materials flagged as suspected new CSAM in a given year is naturally lower than that of known CSAM, where hashes reflect content created over many years, resulting in a much smaller number of materials requiring verification.
Nonetheless, it needs to be considered whether this additional burden can still be considered as proportionate and compatible with the general monitoring prohibition.
The same considerations on encryption mentioned in relation to Measure 6 apply to this measure.
Economic impact
The economic impact of the imposition of a legal obligation to detect new CSAM would, in some respects, be similar to the economic impact of a legal obligation to detect known CSAM (Measure 6).
As in the case of Measure 6, for service providers, the introduction of systems, increased volume of reports, follow-up requests and complaints would result in costs. Technologies for the detection of new CSAM outside of end-to-end encrypted communications channels have been available free of charge for years and have proven their reliability.
For public authorities, the expected increase in reports from service providers would result in significant additional costs to public authorities due to the increase in
investigations and prosecutions. While the overall number of new materials detected under this measure is expected to be much lower than that of known CSAM under Measure 6, cases of new CSAM require particularly urgent and detailed attention, given the greater likelihood of ongoing abuse and the need for victim identification. However, this financial impact is expected to be outweighed by the positive economic impact on victim support measures and survivor quality of life and productivity.
As in the case of Measure 6, a positive effect on the Single Market could result from additional legal clarity and certainty, thus limiting compliance costs. Furthermore, both the public and the private sector would benefit from a common framework creating more legal certainty and mutual trust between the public and the private sector.
Fundamental rights impact
The fundamental rights impacts of this measure are similar to those for Measure 6, yet are increased both in the positive and in the negative sense by virtue of the greater scope of the measure.
The mandatory detection of new CSAM would be based on verified indicators, to be
provided by a designated, trusted authority, such as the possible EU centre under Measure 3. In principle, this would lead to a comparable level of intrusiveness as the detection of previously known material under Measure 6. However, given that accuracy levels of current tools, while still being above 99% in recent testing, are lower than for the detection of known CSAM, human confirmation is essential (and is in any case
explicitly set out as a possible safeguard in case of automated decision-making with legal consequences). The impact on users' rights to privacy and confidentiality of communications and personal data protection would therefore be greater and would
require additional safeguards.
To limit the impact on providers' rights, especially for SMEs, they could choose to rely on confirmation by the EU Centre, which would in any case review all reports as a
safeguard. In addition, strict requirements would need to apply to the technologies deployed, including on the reliability of indicators used, and reliable detection tools would be made available free of charge.
In light of the very recent nature of most undetected CSAM, this option would have a
positive impact on the fundamental rights of victims of ongoing abuse and would
significantly enhance the possibility of safeguarding victims from additional abuse. In
addition, the early detection and swift addition of newly-detected materials to databases of verified CSAM can limit the spreading of content across platforms and hence serve to
protect victims' fundamental rights to privacy and data protection.
Measure 8: Legal obligation to detect grooming
Social impact
This measure would impose a legal obligation on service providers to detect grooming in their services, regardless of whether those services are encrypted.
The detection of grooming typically relies on tools for automatic text analysis, which are trained on verified grooming conversations and assess a given exchange according to risk factors identified on the basis of the verified grooming conversations. Such tools are lower in accuracy than tools for the automatic detection of known or new CSAM and would therefore require additional conditions and safeguards to avoid reports of false
positives. At the same time, existing figures show that the proportion of suspicious cases
flagged is significantly lower still than that of new content, limiting the administrative burden on providers to the verification of a few cases per month.
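As an illustration of the general kind of text-analysis pipeline described above (not any vendor's actual tool), the sketch below trains a simple classifier on labelled conversations and routes only high-scoring exchanges to human review; the training data, threshold and model choice are all assumptions.

```python
# Illustrative sketch of a grooming-risk text classifier of the general kind
# the report describes: trained on verified (labelled) conversations, scoring
# new exchanges, with human review above a threshold. All data and the
# threshold are placeholders; real tools are far more sophisticated.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data: 1 = verified grooming conversation, 0 = benign.
conversations = ["example risky exchange ...", "example benign exchange ..."]
labels = [1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(conversations, labels)

def needs_human_review(exchange: str, threshold: float = 0.9) -> bool:
    """Only exchanges scoring above the threshold are queued for human
    verification; nothing is reported automatically."""
    return model.predict_proba([exchange])[0][1] >= threshold
```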
At the same time, the detection of grooming is of particular relevance for the protection of victims and therefore arguably has the strongest positive impact. While the detection of both known and new CSAM is always detection of evidence of past abuse (but may nevertheless lead to the detection of ongoing abuse), the identification of grooming and
subsequent intervention is a measure that can ideally serve to protect children from
falling victim to in-person abuse, or to stop ongoing abuse. The comparably higher invasiveness of text analysis tools and lower accuracy rate therefore has to be weighed against the interest in more effective protection of the child.
The same considerations on encryption mentioned in relation to Measure 6 apply to this measure.
Economic impact
The economic impact of the imposition of a legal obligation to detect grooming would, in some respects, be similar to the economic impact of a legal obligation to detect known and new CSAM (measures 6 and 7).
As in Measures 6 and 7, for service providers, an obligation to detect grooming would
require investment in the integration of tools to detect grooming. As reports about
grooming are subject to human review in many cases, service providers could also incur
significant costs related to hiring trained staff. These costs could be mitigated by making available technologies free of charge, limiting service providers' expenses to the
integration of such tools into their services, and by allowing service providers to rely on
specialised competent authorities, such as the Centre under Measure 3, for the confirmation of cases identified as suspected grooming. By contrast, staffing costs for those authorities would increase as such cases require immediate reaction in order to ensure the protection of victims. Where service providers choose to rely on such authorities for verification before taking action, swift turnaround would have to be ensured in order to inform the provider about the need to intervene in an interaction and to protect a child.
Under this measure, law enforcement authorities would incur higher costs related to the
processing of additional reports. While the number of additional reports is expected to be
quite low compared to the number of additional reports under Measure 6, in the case of
reports of grooming, swift action is required in order to ensure protection of the victim, who may be at risk of imminent or ongoing abuse.
This measure would be expected to have a positive economic impact related to victim
support and quality of life, as some children would not fall victim to hands-on child sexual abuse because of the timely detection of grooming. This could potentially reduce the impact on victim support systems, as well as having a decisive impact on the quality of life and productivity of the children throughout their lifetime.
Fundamental rights impact
Mandatory detection of grooming would have a more positive impact on the fundamental
rights of children as potential victims, compared to Measures 6 and 7, by contributing to the prevention of abuse. At the same time, this obligation would be significantly more intrusive than obligations under Measures 6 and 7, since it would involve searching text in interpersonal communications as the most important vector for grooming.
On the one hand, such searches have to be considered as necessary since the service
provider is the only entity able to detect grooming. Automatic detection tools have
acquired a very high degree of accuracy (usually above 80%), and indicators are
becoming more reliable with time as the algorithms learn. At the same time, the scanning of text in conversations is inherently more invasive of users' rights than the identification of an image or a video as constituting CSAM and requires additional
safeguards. This is the case even where it is targeted to services where children might be at risk and subject to strict safeguards, as set out above for the voluntary detection of
grooming.
In addition, it is questionable whether the reliability of the indicators to be provided is
sufficiently high at present to justify the limitation of providers' right to conduct a business. In particular when it comes to avoiding a disproportionate burden as set out
notably in the prohibition of general monitoring obligations, it is doubtful whether a fair balance of rights could be achieved here. The assessment of whether a conversation
constitutes grooming of a child is less of a black-and-white assessment compared to
CSAM. After automatic flagging, it requires a careful analysis of the exchange and the context and is therefore both inherently more intrusive and requires a significant additional investment of resources of the service provider. At the same time, the
possibility to protect children from imminent harm and the significant negative impact of that harm can help justify this measure. Further increasing the quality of the indicators and hence the accuracy of the detection process is of key importance, and safeguards must include the need to deploy state-of-the-art technology in order to reflect
advancements, and a requirement for human verification.
2. Qualitative comparison of policy options
The options are compared below through listing positive (+), negative (-) and 'no-change' (~) impacts compared to the baseline (with > indicating more costs in relation to
baseline).
Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims

Effectiveness (score: +)
+ Improved prevention and assistance to victims through EU centre on prevention and assistance to victims
+ Slightly improved detection, reporting and removal of child sexual abuse online in short-term
+ Limited improvement through legal advice, jurisprudence and establishment of best practices to be adhered to on a voluntary basis
+ Limited improvement of protection of fundamental rights through better coordination of efforts on prevention and assistance to victims of child sexual abuse
- Limited impact of centre due to small scale and limited abilities of a non-legislative hub
--- Continued dependence on voluntary measures by providers
--- Continued inability for public authorities to investigate and prosecute many crimes
--- Providers of number-independent personal communications services would be prohibited from taking measures to detect, report and remove online child sexual abuse following the expiry of the Interim Regulation in 2024
--- Continued violation of rights of victims through failure to detect child sexual abuse offences, rescue victims from ongoing and imminent abuse and prevent crimes
-- Continued violation of rights of victims as a result of failure to detect online child sexual abuse, rescue victims from ongoing and imminent abuse and prevent crimes

Efficiency (costs: >; benefits: +)
+ Reduction in costs to service providers and public authorities arising from improved feedback mechanisms and standardised reporting forms
- Additional costs to service providers and public authorities arising from increased detection and reporting of known CSAM, new CSAM and grooming
- Costs to public authorities and service providers arising from development and implementation of practical measures (standard codes of conduct, standardised reporting forms, improved feedback mechanisms and communication channels, APIs for remote checking of hashes, sharing of databases of hashes, sharing of technologies, continued support to Member States on implementation of Directive 2011/93, facilitating research, exchange of best practices and coordination in the area of prevention and assistance to victims)
-- Fragmentation of Member States' laws on detection, removal and reporting of online child sexual abuse will likely increase
++ EU centre on prevention and assistance to victims would provide a degree of coordination and streamlining of activities and better use of resources

Coherence
Legislation: ~ No interference with legislation, as this is an option with non-legislative measures. + Coherent with the Victims Rights Directive through a greater facilitation of the cooperation with Member States with regards to CSA victims. Idem with the CSA Directive on the prevention and assistance to victims provisions.
Coordination: + EU centre could positively influence cooperation on prevention and assistance to victims
Funding: + The EU Centre can play a signposting role that could facilitate a more effective and efficient use of funding for CSA initiatives

Proportionality
+ The practical measures proposed do not go beyond what is necessary to achieve the specific objectives. As practical measures, they are limited to facilitating the work of Member States, without creating new obligations.
Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal

Effectiveness (score: ++)
++ Improvement in terms of decreasing the prevalence of CSA and providing assistance to victims thanks to the EU centre to prevent and counter child sexual abuse
+ Slightly improved detection, reporting and removal of child sexual abuse online in short-term
++ Clear legal framework for voluntary measures to detect known and new CSAM and grooming
--- Continued dependence on voluntary measures by providers
--- Continued inability for public authorities to investigate and prosecute many crimes
-- Continued violation of rights of children and child victims as a result of failure to detect a significant amount of online child sexual abuse, rescue victims from ongoing and imminent abuse and prevent crimes

Efficiency (costs: >>; benefits: ++)
+++ EU centre could facilitate a more effective use of resources
+ Reduction in costs to service providers and public authorities arising from improved feedback mechanisms and standardised reporting forms
- Additional costs to service providers and public authorities arising from increased detection and reporting of known CSAM, new CSAM and grooming
- Costs to public authorities and service providers arising from development and implementation of practical measures (standard codes of conduct, standardised reporting forms, improved feedback mechanisms and communication channels, APIs for remote checking of hashes, sharing of databases of hashes, sharing of technologies, continued support to Member States on implementation of Directive 2011/93, facilitating research, exchange of best practices and coordination in the area of prevention and assistance to victims)

Coherence
Legislation: + Coherent with relevant horizontal and sectorial legislation at EU level. + Coherent with the general monitoring obligation prohibition.
Coordination: +++ Facilitation of Member States' and service providers' efforts on prevention, and assistance to victims through the EU Centre
Funding: + The EU Centre can play a signposting role that could facilitate a more effective and efficient use of funding for CSA initiatives

Proportionality
+ The provisions do not go beyond what is necessary to achieve the specific objectives. In particular, they do not impose new obligations on Member States on prevention and assistance to victims and they are limited to facilitating their work on those areas. As for detection, reporting and removal obligations imposed on service providers, they are proportionate to the seriousness of the problem and the need to act at EU level to avoid legal fragmentation that affects the Single Market.
Option C: option B + mandatory detection of known CSAM

Effectiveness (score: +++)
++ Effective detection, removal and reporting of known CSAM
++ Clear legal basis for voluntary measures to detect known and new CSAM and grooming
+++ Strong safeguards and accountability mechanisms to ensure strong protection of fundamental rights
-- Dependent on voluntary action by providers for detection of new CSAM and grooming, which has proven insufficient
-- Continued violation of rights of victims as a result of failure to detect new CSAM and grooming, rescue victims from ongoing and imminent abuse and prevent crimes

Efficiency (costs: >>>; benefits: +++)
+++ EU centre could facilitate a more effective use of resources, including reducing law enforcement workload by reviewing the reports and filtering them to ensure that the reports are actionable
+ Reduction in costs to service providers and public authorities arising from improved feedback mechanisms and standardised reporting forms
--- Additional costs to service providers and public authorities arising from increased detection, reporting and removal of known CSAM
- Additional costs to service providers and public authorities arising from increased detection, reporting and removal of new CSAM and grooming

Coherence
Legislation: + Coherent with relevant horizontal and sectorial legislation at EU level. + Coherent with the general monitoring obligation prohibition.
Coordination: +++ Facilitation of Member States' and service providers' efforts on prevention, assistance to victims and detection, reporting and removal of CSA online through the EU Centre
Funding: + The EU Centre can play a signposting role that could facilitate a more effective and efficient use of funding for CSA initiatives

Proportionality
+ The provisions do not go beyond what is necessary to achieve the specific objectives. In particular, they do not impose new obligations on Member States on prevention and assistance to victims and they are limited to facilitating their work on those areas. As for detection, reporting and removal obligations imposed on service providers, they are proportionate to the seriousness of the problem and the need to act at EU level to avoid legal fragmentation that affects the Single Market.
Option D: option C + mandatory detection of new CSAM

Effectiveness (score: ++++)
++++ Effective detection, removal and reporting of known and new CSAM
++++ Strong safeguards and accountability mechanisms to ensure strong protection of fundamental rights
-- Dependence on voluntary action by providers for detection of grooming, which has proven insufficient
-- Continued violation of rights of victims as a result of failure to detect grooming, rescue victims from ongoing and imminent abuse and prevent crimes

Efficiency (costs: >>>>; benefits: ++++)
++++ EU centre could facilitate a more effective use of resources, including reducing law enforcement workload by reviewing the reports and filtering them to ensure that the reports are actionable
+ Reduction in costs to service providers and public authorities arising from improved feedback mechanisms and standardised reporting forms
-- Additional costs to service providers and public authorities arising from increased detection, reporting and removal of known and new CSAM
- Additional costs to service providers and public authorities arising from increased detection and reporting of grooming

Coherence
Legislation: + Coherent with relevant horizontal and sectorial legislation at EU level. + Coherent with the general monitoring obligation prohibition.
Coordination: +++ Facilitation of Member States' and service providers' efforts on prevention, assistance to victims and detection, reporting and removal of CSA online through the EU Centre
Funding: + The EU Centre can play a signposting role that could facilitate a more effective and efficient use of funding for CSA initiatives

Proportionality
+ The provisions do not go beyond what is necessary to achieve the specific objectives. In particular, they do not impose new obligations on Member States on prevention and assistance to victims and they are limited to facilitating their work on those areas. As for detection, reporting and removal obligations imposed on service providers, they are proportionate to the seriousness of the problem and the need to act at EU level to avoid legal fragmentation that affects the Single Market.
Option E: option D + mandatory detection of grooming

Effectiveness (score: +++++)
+++++ Effective detection, removal and reporting of known and new CSAM and grooming with a clear legal basis
+++++ Strong safeguards and accountability mechanisms to ensure strong protection of fundamental rights

Efficiency (costs: >>>>>; benefits: +++++)
+++++ EU centre could facilitate a more effective use of resources, including reducing law enforcement workload by reviewing the reports and filtering them to ensure that the reports are actionable
+ Reduction in costs to service providers and public authorities arising from improved feedback mechanisms and standardised reporting forms
----- Additional costs to service providers and public authorities arising from increased detection, reporting and removal of known CSAM and grooming

Coherence
Legislation: + Coherent with relevant horizontal and sectorial legislation at EU level. + Coherent with the general monitoring obligation prohibition.
Coordination: +++ Facilitation of Member States' and service providers' efforts on prevention, assistance to victims and detection, reporting and removal of CSA online through the EU Centre
Funding: + The EU Centre can play a signposting role that could facilitate a more effective and efficient use of funding for CSA initiatives

Proportionality
+ The provisions do not go beyond what is necessary to achieve the specific objectives. In particular, they do not impose new obligations on Member States on prevention and assistance to victims and they are limited to facilitating their work on those areas. As for detection, reporting and removal obligations imposed on service providers, they are proportionate to the seriousness of the problem and the need to act at EU level to avoid legal fragmentation that affects the Single Market.
3. Quantitative assessment of policy measures
This section describes how the model to estimate the costs works, the assumptions used
and the limitations.
How the model works
Box 1. How the model estimates costs related to reports of online child sexual abuse
The model estimates the cost of each of the policy measures using the concept of an
'average or typical report' of online child sexual abuse.
The composition of an average/typical report and the number of reports expected annually are used to estimate the costs to public authorities and service providers in the baseline scenario. For each measure, modifiers are used to estimate the expected changes to the composition of an average report and number of reports. This allows the net costs of the measure relative to the baseline to be estimated. The baseline scenario naturally leads to no net costs.
The measures considered under this initiative would give rise to costs to three groups of stakeholders: the possible European Centre to prevent and counter child sexual abuse,
public authorities, and service providers. The model attempts to estimate both one-off and continuous (annual) costs. Typically, these costs have two components: the
salary/hour and the hours it takes to do the tasks:
Costs = cost/hour of the person doing the tasks x hours required to do the tasks.
o Cost/hour:
Labour cost per hour for service providers: In the case of service providers, the labour cost per hour is based on
the average of the salaries in the EU of workers whose activities are
classified under Section J (information and communications)339 in the
NACE Rev. 2 statistical classification of economic activities in the
European Community340.
This cost includes compensation of employees, plus taxes, minus
subsidies.
An additional 25% is added to account for overheads (i.e. expenses not
related to direct labour, such as the cost of office equipment). The value is 49.25 EUR/hour, including the overheads described
above.
Where the options include a legal obligation to detect child sexual
abuse, the costs include an estimate for the deployment (one-off cost) and maintenance (continuous/annual costs) of pre-existing
technologies and infrastructure, and the cost of initial and ongoing
training.
The costs for such options assume a total of 34 600 providers affected by such obligations341. It is also assumed that costs will be
339 Eurostat, Labour cost levels by NACE Rev. 2 activity, accessed 9 April 2021.
340 Eurostat, NACE Rev. 2 - Statistical classification of economic activities, accessed 26 April 2021.
341 Eurostat, Annual enterprise statistics for special aggregates of activities (NACE Rev. 2), accessed 12 May 2021. As clear data for the number of relevant online service providers are not available, this analysis uses data on the number of enterprises in industries J61 (Telecommunications) and J63 (Information Service Activities). In addition to providers falling within the scope of the definition of 'relevant online service providers' for the purposes of this initiative, J61 and J63 include many enterprises falling outside the scope. Therefore, for the purpose of this analysis, it is assumed that 20%
comparatively higher for the 20 providers with the largest market
share, due to the need for specialised infrastructure to handle the high volume of expected reports, and the need to integrate the obligations into larger and more complex ecosystems.
Developing new technical solutions to detect child sexual abuse online (e.g.
in encrypted spaces), only for measures 6-8 below:
The cost includes the development, deployment and maintenance of
technical solutions by a small number of service providers possibly in
partnership with public authorities. The technology would
subsequently be made available to other relevant online service
providers through the Centre.
In order to achieve a realistic estimate, market wages that experienced software engineers and testers342 are expected to earn working for a
large technology company were taken as a baseline where this was
possible. The estimates are prepared utilising yearly wage figures as
available instead of cost per hour.
The average salary was taken as 148 000 EUR/year for experienced software engineers and 49 000 EUR/year for testers.
Labour cost per hour for public authorities:
In the case of public authorities, the labour cost per hour is based on
the average of the salaries in the EU of workers whose activities are classified
under Section O (public administration)343 in the NACE Rev. 2
statistical classification of economic activities in the European
Community. This cost includes compensation of employees, plus taxes, minus
subsidies.
An additional 50% is added to account for overheads (i.e. expenses not
related to direct labour, such as the cost of equipment). The value is 46.20 EUR/hour, including the overheads described
above.
It is assumed that this value remains constant for all options and over time.
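The two loaded hourly rates can be reproduced from the stated overhead mark-ups; the base rates in the sketch below are not quoted in the text but are implied by dividing the loaded values by the mark-up factors (an assumption for illustration).

```python
# Reproducing the loaded hourly rates from the stated overhead mark-ups.
# The base (pre-overhead) rates are implied, not quoted in the text.
base_provider = 49.25 / 1.25    # Section J average, implied: 39.40 EUR/hour
base_authority = 46.20 / 1.50   # Section O average, implied: 30.80 EUR/hour

loaded_provider = base_provider * 1.25    # +25% overheads -> 49.25 EUR/hour
loaded_authority = base_authority * 1.50  # +50% overheads -> 46.20 EUR/hour
assert round(loaded_provider, 2) == 49.25 and round(loaded_authority, 2) == 46.20
```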
o Hours required to do a task:
Since the salary/hour is assumed to be constant, the model focuses on
estimating the hours required to do the tasks.
These hours required to do the tasks can change if:
the time required to do one task changes, or
of enterprises in these industries are relevant online service providers, and that others do not provide relevant online services.
342 Levels.fyi and Payscale provide information on salary levels with popular technology companies to help prospective job candidates make decisions.
343 As this category is not included in the source cited above in 339, this data was calculated using the
following sources:
Eurostat, Labour cost, wages and salaries (including apprentices) by NACE Rev. 2 activity - LCS
surveys 2008, 2012 and 2016, accessed 13 April 2021; Eurostat, Labour cost index by NACE Rev. 2 activity - nominal value, annual data, accessed 13 April 2021.
the total number of tasks changes.
Taking into account the proportion of reports of each type (known CSAM, new CSAM and grooming) under the baseline scenario, and the number of hours required to process a
report by service providers and public authorities344, the baseline cost of processing a
typical report of online child sexual abuse was estimated.
The one-off costs were calculated using estimates of the time it takes to carry out the tasks (e.g. development or integration of technologies).
The continuous costs were calculated in comparison with the baseline:
1. First, the costs of the baseline were calculated, including the time taken by service
providers and public authorities to process a typical report of online child
sexual abuse, and average number of annual reports expected for the years 2021-
2025 based upon the number of reports received in previous years. The number of
reports processed by public authorities was adjusted to reflect the percentage of
reports received by public authorities that are typically actionable345. The model
assumes that the costs of public authorities derive from taking action on the
actionable reports. The costs to public authorities of processing all the reports and discarding the non-actionable ones have been incorporated as part of the costs for
taking action on actionable reports for the purposes of the model.
2. Second, it was estimated how each of the options changed the time required for a
provider or public authority to process a report of online child sexual abuse of
each type (known CSAM, new CSAM, grooming) and the number of reports: a. For measures involving voluntary detection, these changes were
estimated as percentages of deviation from the baseline parameters, i.e.,
percentages by which the number of reports or the time required to
process a report increased or decreased as a result of the measure. These
percentages are called "modifiers" in the explanations below, and
are tabled for each of the options.
b. For measures imposing obligations on service providers to detect specific
forms of online child sexual abuse, changes in the number of reports were
estimated by modelling the potential number of reports (see Table 1):
i. The number of reports per user account of online child sexual
abuse in 2020 was estimated for the service provider which
currently makes the overwhelming majority of reports to NCMEC
(Facebook)346. Facebook was responsible for 95% of service
provider reports to NCMEC globally in 2020. Assuming that for
EU reports in 2020, Facebook was also responsible for 95%
(990 000 reports), and assuming that there were 203 million
Facebook accounts in the EU347, approximately 0.005 reports were
made to NCMEC for each EU user account.
344 Based on discussions with service providers and a targeted survey to law enforcement authorities (see Annex 2).
345 Targeted surveys of law enforcement authorities (see Annex 2).
346 Ibid.
347 WeAreSocial, 'Digital 2020' country reports, accessed 9 April 2021.
ii. The total number of EU user accounts was estimated by
combining data on the number of users of social media and
messaging services in the EU (252 million) with data on the
typical number of accounts held by each user (7.23)348. This leads
to an estimated total of 1.8 billion EU accounts on social media
and messaging services.
iii. It was assumed for the purposes of this model that the number of
cases of detected online child sexual abuse per user account
estimated for the service in (i) above, is typical of the number of
cases that would be detected by the services in (ii) under
mandatory detection.
iv. The data and assumptions in (i), (ii) and (iii) above were combined
to produce an estimate for the potential number of reports under an
obligation to detect online child sexual abuse. The total number
of potential EU reports of all types of online child sexual abuse is
approximately 8.8 million per year according to this model.
v. Based on the assumption that 70% of such reports are
actionable349, this leads to a potential of 6.6 million actionable
reports per year.
3. Finally, the continuous costs for that option resulted from applying the modifiers
to the baseline values or substituting the modelled number of potential reports to
obtain the time/attempt and the number of attempts for each option's scenario.
In the case of measures 6-8, the continuous costs include the maintenance and
development of technical solutions to detect child sexual abuse regardless of the
technology used in the online exchanges (e.g. encrypted environments) and costs
relating to implementation and training arising from obligations to detect each form of online child sexual abuse.
Table 1. Estimation of number of potential EU reports of online child sexual abuse (all figures are estimates and refer to 2020)

Number of EU reports: 1,046,350
Percentage of global reports from Facebook: 95%
Number of EU reports from Facebook: 990,706
EU Facebook accounts: 203,610,000
Number of reports per EU Facebook account: 0.0049
Number of EU social media and messaging users: 252,057,500
Social media and messaging accounts per EU user: 7.23
Number of EU social media and messaging accounts: 1,822,476,819
Number of potential EU reports: 8,812,811
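The chain of estimates behind Table 1 can be reproduced directly; the sketch below uses the table's own figures, and small mismatches against the published totals reflect rounding in the source (for instance, 0.70 times 8.8 million is closer to 6.2 million than the 6.6 million quoted in step v).

```python
# Worked reproduction of the Table 1 estimation chain (figures from the table).
eu_reports = 1_046_350
fb_eu_reports = eu_reports * 0.95              # ~994,000 (table: 990,706)
fb_eu_accounts = 203_610_000
reports_per_account = fb_eu_reports / fb_eu_accounts    # ~0.0049

eu_users = 252_057_500
accounts_per_user = 7.23
eu_accounts = eu_users * accounts_per_user     # ~1.82 billion (table: 1,822,476,819)

potential_reports = eu_accounts * reports_per_account   # ~8.9 million (table: 8,812,811)
actionable = potential_reports * 0.70          # ~6.2 million (text quotes 6.6 million)
```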
In summary, to calculate the costs of each option, the following questions were analysed for each of the measures:
1. Are there any one-off costs?
2. Does the measure change the time required for a service provider or public authority to process a typical report of online child sexual abuse? That is, does the measure change the proportion of reports that are of each type, or the time required to process reports of each type?
3. Does the measure change the total number of typical reports of child sexual abuse online?
4. Combining the above, does the measure change the total continuous costs for a provider to detect, report and remove child sexual abuse online, or for a public authority to investigate and prosecute a report of child sexual abuse online?

348 We Are Social, 'Digital 2020' country reports, accessed 9 April 2021.
349 Targeted surveys of law enforcement authorities (see Annex 2).
The following general assumptions were made:
o The cost per hour (EUR 49.25/hour for service providers, EUR 46.20/hour for public authorities) remains constant for all options and over time.
o The time required to process a typical report is an estimated average, taking into account the proportion of reports of each type (known CSAM, new CSAM and grooming), and the number of hours required to process a report by service providers and public authorities350. This time is updated for each of the measures based upon their effect on the composition of a typical report, i.e., based upon how each measure affects the percentage of reports of each type. The differentiation between different forms of online child sexual abuse is based upon the assumption that a greater level of human oversight and consideration is necessary for certain types of content, such as grooming.
o The cost of handling a typical report under each measure is obtained by combining the cost per hour with the overall number of typical reports expected under that measure.
o Measures 6-8 also include costs relating to implementation and training arising from obligations to detect each form of online child sexual abuse. These costs include an estimate for the deployment (one-off cost) and maintenance (continuous/annual cost) of pre-existing technologies and infrastructure, and the cost of initial and ongoing training.
o The costs for such measures assume a total of 34 600 providers affected by such obligations351. It is also assumed that costs will be comparatively higher for the 20 providers with the largest market share, due to the need for specialised infrastructure to handle the high volume of expected reports, and the need to integrate the obligations into larger and more complex ecosystems. A sketch of how these assumptions combine into a cost per typical report follows the footnotes below.
350 Based on discussions with service providers and a targeted survey to law enforcement authorities (see Annex 2).
351 Eurostat, Annual enterprise statistics for special aggregates of activities (NACE Rev. 2), accessed 12 May 2021. As clear data for the number of relevant online service providers are not available, this analysis uses data on the number of enterprises in industries J 61 (Telecommunications) and J 63 (Information Service Activities). In addition to providers falling within the scope of the definition of 'relevant online service providers' for the purposes of this initiative, J 61 and J 63 include many enterprises falling outside the scope. Therefore, for the purpose of this analysis, it is assumed that 20% of enterprises in these industries are relevant online service providers, and that others do not provide relevant online services.
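The following sketch shows how these assumptions combine into a cost per typical report. It uses the baseline processing times and report composition given later in this annex; the function is an illustration of the model's mechanics, not code from the study.

```python
# Cost of a "typical" report = hourly rate x average processing time,
# where the average time is weighted by the report composition
# (known CSAM / new CSAM / grooming).

RATE_PROVIDER = 49.25   # EUR/hour, service providers
RATE_AUTHORITY = 46.20  # EUR/hour, public authorities

def cost_per_typical_report(rate: float, hours: dict, shares: dict) -> float:
    """Weighted-average cost of processing one typical report."""
    avg_hours = sum(shares[t] * hours[t] for t in hours)
    return rate * avg_hours

# Baseline values from the "Measure 0" section: 45/60/90 minutes for
# providers, 2/4/4 hours for public authorities, and a report mix of
# roughly 89.73% known CSAM, 10.24% new CSAM, 0.04% grooming.
shares = {"known": 0.8973, "new": 0.1024, "grooming": 0.0004}
provider_hours = {"known": 0.75, "new": 1.00, "grooming": 1.50}
authority_hours = {"known": 2.00, "new": 4.00, "grooming": 4.00}

print(cost_per_typical_report(RATE_PROVIDER, provider_hours, shares))   # ~38.2 EUR
print(cost_per_typical_report(RATE_AUTHORITY, authority_hours, shares)) # ~101.9 EUR
```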
The next section describes the specific assumptions used to answer the above questions
for each of the measures, and presents the estimated costs.
Calculation of the cost estimates for each measure
Measure 0: Baseline
The analysis of the costs of the baseline serves as a reference to estimate the costs for
public authorities and service providers of the other options.
1) One-off costs.
There are logically no one-off costs in the baseline.
2) Time per typical report.
The time per typical report was estimated by first estimating the time taken by service
providers and public authorities to process a report of known CSAM, new CSAM, or
grooming. These times were then combined with the proportion of reports of each type in a typical report to estimate the time taken by service providers and public authorities to
process a typical report of online child sexual abuse under each measure.
The following tasks were considered:
Service providers:
o Human review of one case of content flagged as possible child sexual abuse online
o Preparation/completion of one report of child sexual abuse online
o Submission of one report of child sexual abuse online to relevant authorities
o Response to requests for further information/clarification
Public authorities:
o Prioritisation of reports received
o Decision on commencement of investigation (where applicable)
o Analysis/classification of reported content
o Investigation
o Rescue of victims
o Arrest of suspects
o Prosecution of suspects
o Feedback to person / organisation reporting child sexual abuse online
The estimated time for a service provider to process a report in the baseline scenario is 45 minutes, 60 minutes, and 90 minutes, respectively, for reports of known CSAM, new CSAM and grooming.
The estimated time for a public authority to process a report in the baseline scenario is 2 hours for reports of known CSAM, and 4 hours each for reports of new CSAM and grooming.
3) Total number of reports.
Reports of child sexual abuse online forwarded by NCMEC to EU law enforcement authorities have increased from 52 000 in 2014 to over 1 million in 2020.

quality of reporting, while ensuring the information provided is limited to what is feasible and strictly appropriate.
Development of improved communication channels:
o 30 working days x 27 Member States
o The setting up of a single point of contact system and ensuring appropriate security for communication channels requires conceiving, validating and implementing such a system for the whole Member State, involving multiple actors.
o Costs may differ depending on the nature of the system established. 30 working days is taken to represent an average figure.
Development and integration of APIs to allow for remote checking against hash databases:
o 100 000 EUR development cost for API; and
o 5 working days x 5 Member States
o Due to the complexity of establishing and maintaining databases of hashes, and the likely redundancy for providers of maintaining API connections to multiple databases, it is assumed that a small number of Member States would integrate such an API.
The total one-off costs to EU and Member States' public authorities under this measure are EUR 433 990.
Service providers:
Development of standard code of conduct:
o 5 working hours x 10 service providers
o As the adoption of a standard code of conduct is a voluntary measure, and the vast majority of reports of CSA online are currently made by a small number of service providers, it is estimated that consultations on the development of a code would involve a small number of providers, including both small and large companies.
o Training and implementation of the finalised code is assumed to be a part of service providers' ordinary activities, not incurring any additional costs.
Development of standardised reporting forms:
o 5 working hours x top 10 service providers
o As the vast majority of reports of CSA online from service providers are currently made by a small number of providers, consultations on development of standardised reporting forms can be most effective by focusing on the providers that are most active in this area, in order to determine the information that can and should be included in reports to ensure that they are actionable for law enforcement authorities.
Development of improved feedback mechanisms:
o 5 working hours x top 10 service providers
o Consultations on the development of improved feedback mechanisms can be most effective by focusing on the providers that are most active in this area, in order to determine the information gaps which currently prevent providers from improving the quality of their reports.
Development of improved communication channels:
o 30 working days x top 10 service providers
o As this is a voluntary measure, the development of improved communication channels is likely to be of most interest to the providers making the largest numbers of reports of CSA online.
o The setting up of a single point of contact system and ensuring appropriate
security for communication channels requires conceiving, validating and
implementing such a system, involving multiple actors.
o Costs may differ depending on the nature of the system established. 30
working days is taken to represent an average figure.
Development and integration of APIs to allow for remote checking against hash databases:
o 5 working days x 50 service providers
o Due to the complexity of establishing and maintaining databases of hashes, a significant number of service providers are expected to have an interest in the integration of APIs to allow for remote checking against hash databases operated by public authorities.
The total one-off costs to service providers under this measure are EUR 224 088 (see the arithmetic check below).
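These one-off figures can be verified directly from the unit assumptions above (an 8-hour working day and the EUR 49.25 hourly rate); a minimal check:

```python
# One-off costs to service providers under Measure 1 (EUR).
RATE = 49.25          # EUR/hour for service providers
DAY = 8               # hours per working day

code_of_conduct = 5 * 10 * RATE           # 5h x 10 providers
reporting_forms = 5 * 10 * RATE           # 5h x top 10 providers
feedback = 5 * 10 * RATE                  # 5h x top 10 providers
comm_channels = 30 * DAY * 10 * RATE      # 30 days x top 10 providers
hash_apis = 5 * DAY * 50 * RATE           # 5 days x 50 providers

total = code_of_conduct + reporting_forms + feedback + comm_channels + hash_apis
print(round(total))  # 224088, matching the EUR 224 088 in the text
```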
2) Time per report.
Known CSAM, new CSAM, and solicitation:
-5% in relation to the baseline.
Decreased cost per report for all types of CSA online due to improved efficiencies as a result of initiatives under this measure.
3) Total number of reports.
Known and new CSAM:
+10% in relation to the baseline.
Increased detection, reporting and removal of both known and new CSAM by relevant online service providers due to an increase in voluntary activities as a result of initiatives under this measure.
Solicitation:
+20% in relation to the baseline.
Increased detection and reporting of solicitation by relevant online service providers due to:
- an increase in voluntary activities as a result of initiatives under this measure;
- the current low level of adoption of relevant technologies.
Table 6 below summarises the above modifiers for this measure. Table 7 summarises the resulting changes to a typical report.

Table 6: Summary of modifiers under Measure 1

| | Known CSAM | New CSAM | Grooming |
| Time per report (hours) | -5% | -5% | -5% |
| Annual reports (average) | +10% | +10% | +20% |

Table 7: Composition, time and cost of a typical report under Measure 1

| Type | Number of reports | Proportion | Time per report, service providers (hours) | Cost per report, service providers | Time per report, public authorities (hours) | Cost per report, public authorities |
| Known CSAM | 1,914,323 | 89.72% | 0.71 | €35.09 | 1.90 | €87.77 |
| New CSAM | 218,391 | 10.24% | 0.95 | €46.79 | 3.80 | €175.54 |
| Grooming | 870 | 0.04% | 1.43 | €70.18 | 3.80 | €175.54 |
| Total | 2,133,584 | 100% | 0.74 | €36.30 | 2.10 | €96.79 |
4) Total continuous costs.
The change in continuous costs was calculated as the product of the increase in annual reports and the costs per report indicated above.
Table 8 below summarises the calculations of the total continuous costs per year under Measure 1; a sketch reproducing the calculation follows the table.

Table 8: Calculation of continuous costs per year under Measure 1

| | Public authorities | Service providers |
| Cost per average report | €96.79 | €36.30 |
| Annual reports (average) | 1,493,509 | 2,133,584 |
| Annual costs | €144,557,486 | €77,453,822 |
| Annual costs (baseline) | €141,016,361 | €74,627,445 |
| Net annual costs | €3,541,125 | €2,826,377 |
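The net-cost rows of Table 8 (and of the analogous Tables 11, 14, 19 and 27 for the other measures) all follow the same pattern; a minimal sketch, assuming the rounded per-report costs shown above (the published totals use unrounded intermediates, so the last digits differ slightly):

```python
def net_annual_cost(cost_per_report: float, annual_reports: int,
                    baseline_annual_cost: float) -> float:
    """Annual cost under a measure minus the baseline annual cost."""
    return cost_per_report * annual_reports - baseline_annual_cost

# Measure 1, using the values printed in Table 8.
print(net_annual_cost(96.79, 1_493_509, 141_016_361))  # ~3.5 million EUR (public authorities)
print(net_annual_cost(36.30, 2_133_584, 74_627_445))   # ~2.8 million EUR (service providers)
```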
Measure 2: EU Centre on prevention and assistance to victims
The quantitative assessment of this policy measure is described in detail in Annex 10.
Measure 3: EU Centre on prevention and assistance to victims and to combat CSA online
The quantitative assessment of this policy measure is described in detail in Annex 10.
Measure 4: Legislation specifying the conditions for voluntary detection of online child sexual abuse
1) One-off costs.
Public authorities:
Development of legislation:
o The one-off costs to public authorities in this measure concern the development of legislation specifying the conditions for voluntary detection of child sexual abuse online. Assuming that the instrument would be a Regulation, it would not require transposition by the Member States. However, some adaptations of national law may be needed to make it compliant with the instrument. In any case, it is assumed that these possible costs of developing the legislation and eventually implementing it at national level would be absorbed by existing budgets and resources in public authorities.
Service providers:
Infrastructure for the detection of online child sexual abuse:
o +3 460 (+10%) service providers implementing voluntary measures;
o 30 working days typically required to implement voluntary measures;
o 2 working days to train content moderators on detection of known CSAM, 4 days for new CSAM and 5 days for grooming.
The total one-off costs to service providers under this measure are EUR 137 687 240.
2) Time per report. There are no changes to the time per report under this Measure.
3) Total number of reports.
Known CSAM, new CSAM, and solicitation:
+10% in relation to the baseline.
Increased detection, reporting and removal of all forms of CSA online by relevant online service providers due to an increase in voluntary activities as a result of the increased legal certainty regarding processing activities provided by this measure.
Where this Measure is included in Policy Options which also include Measures 6, 7, or 8, costs under this measure relating to voluntary detection of the types of online child sexual abuse covered by those measures are omitted for the service providers subject to detection orders, as voluntary detection is redundant in that scenario.
Table 9 below summarises the above modifiers for this measure. Table 10 summarises the resulting changes to a typical report.
Table 9: Summary of modifiers under Measure 4

| | Known CSAM | New CSAM | Grooming |
| Time per report (hours) | 0% | 0% | 0% |
| Annual reports (average) | +10% | +10% | +10% |

Table 10: Composition, time and cost of a typical report under Measure 4

| Type | Number of reports | Proportion | Time per report, service providers (hours) | Cost per report, service providers | Time per report, public authorities (hours) | Cost per report, public authorities |
| Known CSAM | 1,914,323 | 89.73% | 0.75 | €36.94 | 2.00 | €92.39 |
| New CSAM | 218,391 | 10.24% | 1.00 | €49.25 | 4.00 | €184.78 |
| Grooming | 798 | 0.04% | 1.50 | €73.88 | 4.00 | €184.78 |
| Total | 2,133,512 | 100.00% | 0.78 | €38.21 | 2.21 | €101.88 |
4) Total continuous costs.
The change in continuous costs was calculated as the product of the increase in annual reports and the costs per report indicated above.
Table 11 summarises the calculations of the total continuous costs per year under this Measure.

Table 11: Calculation of continuous costs per year under Measure 4

| | Public authorities | Service providers |
| Cost per average report | €101.88 | €38.21 |
| Annual reports (average) | 1,493,458 | 2,133,512 |
| Annual costs | €152,156,397 | €81,524,982 |
| Annual costs (baseline) | €141,016,361 | €74,627,445 |
| Net annual costs | €11,140,035 | €6,897,538 |
Measure 5: Legal obligation to report and remove all types of online CSA
1) One-off costs.
Public authorities:
Development of legislation:
o The one-off costs to public authorities in this measure concern the development of legislation establishing an obligation to report and remove all types of child sexual abuse online. Assuming that the instrument would be a Regulation, it would not require transposition by the Member States. However, some adaptations of national law may be needed to make it compliant with the instrument. In any case, it is assumed that these possible costs of developing the legislation and eventually implementing it at national level would be absorbed by existing budgets and resources in public authorities.
Service providers:
Infrastructure for the reporting and removal of online child sexual abuse:
o +3 460 (+10%) service providers implementing additional infrastructure for reporting and removal;
o It is assumed that the majority of service providers have in place infrastructure that allows them to report instances of CSA online, and remove them once they have become aware of them.
o It is therefore assumed that the cost to put in place the necessary infrastructure for reporting and removal would be the equivalent of 15 working days for 10% of the total number of providers concerned by CSA online (EUR 49.25/hour x 3 460 providers x 120 h/provider).
The total one-off costs to service providers under this measure are EUR 20 448 600.
2) Time per report. There are no changes to the time per report under this Measure.
3) Total number of reports.
Known CSAM, new CSAM, and solicitation:
+3% in relation to the baseline.
Slightly increased detection, reporting and removal of all forms of CSA online by relevant online service providers due to an increase in voluntary activities as a result of the increased legal certainty regarding processing activities provided by this measure.
It is assumed that, where relevant online service providers carry out voluntary detection and removal of CSA online, the overwhelming majority of those providers will make reports on a voluntary basis, leading to only a slight increase under this measure.
Table 12 below summarises the above modifiers for this measure. Table 13 summarises the resulting changes to a typical report.

Table 12: Summary of modifiers under Measure 5

| | Known CSAM | New CSAM | Grooming |
| Time per report (hours) | 0% | 0% | 0% |
| Annual reports (average) | +3% | +3% | +3% |
Table 13: Composition, time and cost of a typical report under Measure 5

| Type | Number of reports | Proportion | Time per report, service providers (hours) | Cost per report, service providers | Time per report, public authorities (hours) | Cost per report, public authorities |
| Known CSAM | 1,792,502 | 89.73% | 0.75 | €36.94 | 2.00 | €92.40 |
| New CSAM | 204,494 | 10.24% | 1.00 | €49.25 | 4.00 | €184.80 |
| Grooming | 747 | 0.037% | 1.50 | €73.88 | 4.00 | €184.80 |
| Total | 1,997,743 | 100.00% | 0.78 | €38.21 | 2.21 | €101.89 |
4) Change in continuous costs.
The change in continuous costs was calculated as the product of the increase in annual reports and the costs per report indicated above.
Table 14 summarises the calculations of the total continuous costs per year under this Measure.

Table 14: Calculation of continuous costs per year under Measure 5

| | Public authorities | Service providers |
| Cost per average report | €101.89 | €38.21 |
| Annual reports (average) | 1,415,876 | 1,997,743 |
| Annual costs | €144,267,579 | €76,337,029 |
| Annual costs (baseline) | €141,016,361 | €74,627,445 |
| Net annual costs | €3,251,217 | €1,709,584 |
Measure 6: Legal obligation to detect known CSAM
1) One-off costs.
Public authorities:
Development of legislation:
o The one-off costs in this measure concern the development of legislation establishing a legal obligation for relevant online service providers to detect known child sexual abuse material. Assuming that the instrument would be a Regulation, it would not require transposition by the Member States. However, some adaptations of national law may be needed to make it compliant with the instrument. In any case, it is assumed that these possible costs of developing the legislation and eventually implementing it at national level would be absorbed by existing budgets and resources in public authorities.
o Development and integration of tools to detect known CSAM regardless of the technology used in the online exchanges (e.g. in E2EE environments): the one-off costs for public authorities include contributing to the development of those tools. The tools should ideally be developed in partnership with service providers and be on par with solutions used to detect child sexual abuse in unencrypted environments in terms of effectiveness, and safeguard fundamental rights, including privacy and data protection.
Service providers:
The one-off costs for service providers include the following:
o implementation of infrastructure for the detection of known CSAM (120 hours for each of the 34 600 providers concerned);
o development of technical solutions that allow companies to detect child sexual abuse regardless of the technology used in the online exchanges (e.g. encryption). The solution should ideally be developed in partnership with public authorities, and should be tailored to the company's existing services, fit within their business model and be on par with solutions used to detect child sexual abuse in unencrypted environments and safeguard fundamental rights, including privacy and data protection (10% of the above);
o additional costs for the top 20 largest providers, derived from the need to ensure interoperability of different platforms, and additional costs due to the larger user base and/or volume of online exchanges (EUR 5 million per provider);
o training for the providers' content moderators in order to appropriately deal with content flagged as known CSAM (16 hours for each of the 34 600 providers).
The total one-off costs to service providers under this measure are EUR 352 199 400.
Table 15: Summary of one-off costs under Measure 6

| Description | Service providers | Public authorities |
| Integration of infrastructure to detect known CSAM | €204,486,000 | €0 |
| Integration of infrastructure to detect known CSAM (top 20 providers) | €100,000,000 | €0 |
| Integration of tools to detect known CSAM regardless of the technology used in the online exchanges | €20,448,600 | €0 |
| Training of content moderators | €27,264,800 | €0 |
| Total | €352,199,400 | €0 |
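Each line of Table 15 follows from the unit costs given above (EUR 49.25/hour, 34 600 providers, EUR 5 million for each of the top 20); a quick check:

```python
RATE = 49.25       # EUR/hour for service providers
PROVIDERS = 34_600

infra = 120 * PROVIDERS * RATE     # 120h per provider       -> 204,486,000
top20 = 20 * 5_000_000             # 5M per top-20 provider  -> 100,000,000
any_tech = 0.10 * infra            # e.g. E2EE tooling, 10%  ->  20,448,600
training = 16 * PROVIDERS * RATE   # 16h per provider        ->  27,264,800

print(round(infra + top20 + any_tech + training))  # 352199400
```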
2) Time per report. There are no changes to the time per report under this Measure.
3) Total number of reports.
Known CSAM:
To estimate the number of reports, the model assumes that under the obligation to detect known CSAM, the maximum number of reports containing known CSAM would be reached.
To estimate this maximum number, the model considers the maximum number of reports that could be achieved under all the obligations in the initiative, 8.8 million (see "How the model works" section).
Under this scenario, the proportions of reports of new CSAM (18%) and grooming (2%) would increase in relation to the current situation and the baseline (10.24% and 0.04% respectively); these are assumed to increase significantly because technologies for their detection are currently less widely deployed than those for known CSAM.
This means that the total maximum number of reports containing known CSAM would be 80% of 8.8 million (7.1 million). As discussed previously, the model assumes that each report contains just one type of CSA online.
Table 16: Distribution of reports under the baseline and all detection obligations scenarios

| | Baseline | All detection obligations combined |
| Known CSAM | 89.73% | 80% |
| New CSAM | 10.24% | 18% |
| Grooming | 0.04% | 2% |

Table 17 below summarises the above modifiers for this measure.

Table 17: Summary of modifiers under Measure 6

| | Known CSAM | New CSAM | Grooming |
| Time per report (hours) | 0% | 0% | 0% |
| Annual reports (average) | 7.1 million in total | 0% | 0% |
Due to the greater clarity and stricter legal rules regarding the detection of known CSAM under this Measure, it is assumed that the share of non-actionable reports made by providers falls to 5% (instead of the 30% assumed under voluntary reporting). For new CSAM and grooming the situation would remain the same in relation to non-actionable reports (i.e. 30%).
The large increase in the number of reports of known CSAM under this measure, while the number of reports of new CSAM and grooming is unaffected, results in a significant change to the composition of the average report. Table 18 summarises the resulting changes to the average report:
Table 18: Composition, time and cost of an average report under Measure 6

Service providers:

| Type | Number of reports | Proportion | Time per average report (hours) | Cost per average report |
| Known CSAM | 7,050,249 | 97.25% | 0.75 | €36.94 |
| New CSAM | 198,537 | 2.74% | 1.00 | €49.25 |
| Grooming | 725 | 0.01% | 1.50 | €73.88 |
| Total | 7,249,511 | 100% | 0.76 | €37.28 |

Public authorities:

| Type | Number of reports | Proportion | Time per average report (hours) | Cost per average report |
| Known CSAM | 6,697,736 | 97.96% | 2.00 | €92.40 |
| New CSAM | 138,976 | 2.03% | 4.00 | €184.80 |
| Grooming | 508 | 0.01% | 4.00 | €184.80 |
| Total | 6,837,220 | 100% | 2.04 | €94.29 |
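Table 18's report numbers can be derived from the 8.8 million potential-report estimate and the actionability assumptions stated above; a sketch (the baseline new CSAM and grooming volumes are taken from the earlier tables):

```python
POTENTIAL_EU_REPORTS = 8_812_811   # from Table 1

# Provider-side reports under Measure 6: known CSAM reaches its maximum
# (80% of all potential reports); new CSAM and grooming stay at baseline.
provider = {
    "known": round(0.80 * POTENTIAL_EU_REPORTS),  # 7,050,249
    "new": 198_537,                               # baseline volume
    "grooming": 725,                              # baseline volume
}

# Share of reports that is actionable and therefore reaches authorities:
# 95% for known CSAM under the stricter rules, 70% otherwise.
actionable = {"known": 0.95, "new": 0.70, "grooming": 0.70}

authority = {t: round(provider[t] * actionable[t]) for t in provider}
print(sum(provider.values()))   # 7,249,511
print(sum(authority.values()))  # 6,837,221; Table 18 shows 6,837,220 (rounding)
```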
4) Change in continuous costs.
The change in continuous costs was calculated as the product of the increase in annual reports and the costs per report indicated above.
In addition, continuous costs also include those of operating and maintaining the infrastructure and technologies to detect known CSAM, including:
- costs for all the providers concerned by this measure (40 hours/year for each of the 34 600 providers);
- additional costs related to the maintenance and rolling development of technical solutions that allow for detection of CSA online regardless of the technology used in the online exchanges (10% of the above);
- additional costs for the top 20 largest providers, derived from the need to ensure interoperability of different platforms, and additional costs due to the larger user base and/or volume of online exchanges (24 hours per day x 365 days = 8 760 hours/year, at an increased hourly rate of EUR 1 000);
- training of content moderators (8h per year).
Table 19 summarises the calculations of the total continuous costs per year under this Measure; a sketch of the build-up follows the table.
Table 19: Calculation of continuous costs per year under Measure 6

| | Public authorities | Service providers |
| Cost per average report | €94.29 | €37.28 |
| Annual reports (average) | 6,837,220 | 7,249,511 |
| Detection costs | €644,647,419 | €270,250,104 |
| Operation/maintenance of infrastructure to detect known CSAM | €0 | €68,162,000 |
| Operation/maintenance of infrastructure to detect known CSAM regardless of the technology used in the online exchanges | €0 | €6,816,200 |
| Operation/maintenance of infrastructure to detect known CSAM (top 20 providers) | €0 | €175,200,000 |
| Training of content moderators | €0 | €13,632,400 |
| Total | €644,647,419 | €534,060,690 |
| Annual costs (baseline) | €141,016,361 | €74,627,445 |
| Net annual costs | €503,631,058 | €459,433,246 |
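A sketch of the provider-side build-up in Table 19, using the maintenance assumptions listed above (totals agree to within the rounding of the published per-report cost):

```python
RATE = 49.25
PROVIDERS = 34_600

detection = 37.28 * 7_249_511        # cost/report x annual reports ~ 270.3M
maintenance = 40 * PROVIDERS * RATE  # 40h/year per provider       =  68,162,000
any_tech = 0.10 * maintenance        # E2EE-capable tooling, 10%   =   6,816,200
top20 = 8_760 * 1_000 * 20           # 8,760h/year at EUR 1,000/h  = 175,200,000
training = 8 * PROVIDERS * RATE      # 8h/year per provider        =  13,632,400

total = detection + maintenance + any_tech + top20 + training
print(round(total))               # ~534 million EUR (Table 19: 534,060,690)
print(round(total - 74_627_445))  # net of baseline, ~459.4 million EUR
```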
Measure 7: Legal obligation to detect new CSAM
1) One-off costs.
Public authorities:
Development of legislation:
o The one-off costs to public authorities in this measure concern the development of legislation establishing a legal obligation for relevant online service providers to detect, report and remove previously unknown child sexual abuse material. Assuming that the instrument would be a Regulation, it would not require transposition by the Member States. However, some adaptations of national law may be needed to make it compliant with the instrument. In any case, it is assumed that these possible costs of developing the legislation and eventually implementing it at national level would be absorbed by existing budgets and resources in public authorities.
o Development and integration of tools to detect new CSAM regardless of the technology used in the online exchanges (e.g. in E2EE environments): the one-off costs for public authorities include contributing to the development of those tools. The tools should ideally be developed in partnership with service providers and be on par with solutions used to detect child sexual abuse in unencrypted environments in terms of effectiveness, and safeguard fundamental rights, including privacy and data protection.
Service providers:
The one-off costs for service providers include the following:
o implementation of infrastructure for the detection of new CSAM (240 hours for each of the 34 600 providers concerned);
o development of technical solutions that allow companies to detect child sexual abuse regardless of the technology used in the online exchanges (e.g. encryption). The solution should ideally be developed in partnership with public authorities, and should be tailored to the company's existing services, fit within their business model and be on par with solutions used to detect child sexual abuse in unencrypted environments and safeguard fundamental rights, including privacy and data protection (10% of the above);
o additional costs for the top 20 largest providers, derived from the need to ensure interoperability of different platforms, and additional costs due to the larger user base and/or volume of online exchanges (EUR 5 million per provider);
o training for the providers' content moderators in order to appropriately deal with content flagged as new CSAM (32 hours for each of the 34 600 providers).
The total one-off costs to service providers under this measure are EUR 604 398 800.
Table 20: Summary of one-off costs under Measure 7

| Description | Service providers | Public authorities |
| Integration of infrastructure to detect new CSAM | €408,972,000 | €0 |
| Integration of infrastructure to detect new CSAM (top 20 providers) | €100,000,000 | €0 |
Measure 8: Legal obligation to detect grooming
1) One-off costs.
Public authorities:
Development of legislation:
o The one-off costs to public authorities in this measure concern the development of legislation establishing a legal obligation for relevant online service providers to detect, report and remove grooming. Assuming that the instrument would be a Regulation, it would not require transposition by the Member States. However, some adaptations of national law may be needed to make it compliant with the instrument. In any case, it is assumed that these possible costs of developing the legislation and eventually implementing it at national level would be absorbed by existing budgets and resources in public authorities.
o Development and integration of tools to detect grooming regardless of the technology used in the online exchanges (e.g. in E2EE environments): the one-off costs for public authorities include contributing to the development of those tools. The tools should ideally be developed in partnership with service providers and be on par with solutions used to detect child sexual abuse in unencrypted environments in terms of effectiveness, and safeguard fundamental rights, including privacy and data protection.
Service providers:
The one-off costs for service providers include the following:
o implementation of infrastructure for the detection of grooming (240 hours for each of the 34 600 providers concerned);
o development of technical solutions that allow companies to detect child sexual abuse regardless of the technology used in the online exchanges (e.g. encryption). The solution should ideally be developed in partnership with public authorities, and should be tailored to the company's existing services, fit within their business model and be on par with solutions used to detect child sexual abuse in unencrypted environments and safeguard fundamental rights, including privacy and data protection (10% of the above);
o additional costs for the top 20 largest providers, derived from the need to ensure interoperability of different platforms, and additional costs due to the larger user base and/or volume of online exchanges (EUR 5 million per provider);
o training for the providers' content moderators in order to appropriately deal with content flagged as grooming (40 hours for each of the 34 600 providers).
The total one-off costs to service providers under this measure are EUR 618 031 200.
The increase in the number of reports of grooming under this measure, while the number of reports of known and new CSAM is unaffected, results in a change to the composition of the average report. Table 26 summarises the resulting changes to the average report:

Table 26: Composition, time and cost of an average report under Measure 8

Service providers:

| Type | Number of reports | Proportion | Time per average report (hours) | Cost per average report |
| Known CSAM | 1,740,293 | 82.28% | 0.75 | €36.94 |
| New CSAM | 198,537 | 9.39% | 1.00 | €49.25 |
| Grooming | 176,256 | 8.33% | 1.50 | €73.88 |
| Total | 2,115,087 | 100% | 0.84 | €41.17 |

Public authorities:

| Type | Number of reports | Proportion | Time per average report (hours) | Cost per average report |
| Known CSAM | 1,218,205 | 79.90% | 2.00 | €92.40 |
| New CSAM | 138,976 | 9.12% | 4.00 | €184.80 |
| Grooming | 167,443 | 10.98% | 4.00 | €184.80 |
| Total | 1,524,625 | 100% | 2.40 | €110.97 |
4) Change in continuous costs.
The change in continuous costs was calculated as the product of the increase in annual reports and the costs per report indicated above. The same considerations as those of Measure 6 apply, with the following changes:
- additional costs for the top 20 largest providers, derived from the need to ensure interoperability of different platforms, and additional costs due to the larger user base and/or volume of online exchanges (24 hours per day x 365 days = 8 760 hours/year, at an increased hourly rate of EUR 2 000);
- training of content moderators (20h per year).
Table 27 summarises the calculations of the total continuous costs per year under this Measure; a short check of the changed components follows the table.
Table 27: Calculation of continuous costs per year under Measure 8

| | Public authorities | Service providers |
| Cost per average report | €110.97 | €41.17 |
| Annual reports (average) | 1,524,625 | 2,115,087 |
| Detection costs | €169,188,523 | €87,080,985 |
| Operation/maintenance of infrastructure to detect grooming | €0 | €68,162,000 |
| Operation/maintenance of infrastructure to detect grooming regardless of the technology used in the online exchanges | €0 | €6,816,200 |
| Operation/maintenance of infrastructure to detect grooming (top 20 providers) | €0 | €350,400,000 |
| Training of content moderators | €0 | €34,081,000 |
| Total | €169,188,523 | €546,540,185 |
| Annual costs (baseline) | €141,016,361 | €74,627,445 |
| Net annual costs | €28,172,162 | €471,912,741 |
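The two components that differ from Measure 6 can be checked directly:

```python
RATE = 49.25
PROVIDERS = 34_600

top20 = 8_760 * 2_000 * 20         # 8,760h/year at EUR 2,000/h for the top 20 providers
training = 20 * PROVIDERS * RATE   # 20h/year of moderator training per provider

print(top20)     # 350400000 (Table 27)
print(training)  # 34081000.0 (Table 27)
```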
4. Quantitative assessment of policy options
Calculation of the cost estimates for each policy option
Given the cumulative nature of the options, the total costs are the sum of the costs of each of the measures. For options C, D and E, which combine voluntary and mandatory detection, the model takes into account the synergies between measure 4 and measures 6, 7 and 8 respectively, to consider either the costs of voluntary measures or of mandatory ones, depending on the option.
Option A: practical measures to enhance prevention, detection, reporting and removal, and assistance to victims, and establishing an EU Centre on prevention and assistance to victims

Table 28: Calculation of total costs under Option A (EUR million)

| Policy measures | One-off costs, public authorities | One-off costs, service providers | Continuous (annual) costs, public authorities | Continuous (annual) costs, service providers |
| 1 | 0.4 | 0.2 | 3.5 | 2.8 |
| 2 | 0.0 | 0.0 | 10.3 | 0.0 |
| Total | 0.4 | 0.2 | 13.9 | 2.8 |
Option B: option A + legislation 1) specifying the conditions for voluntary detection, 2) requiring mandatory reporting and removal of online child sexual abuse, and 3) expanding the EU Centre to also support detection, reporting and removal

Table 29: Calculation of total costs under Option B (EUR million)

| Policy measures | One-off costs, public authorities | One-off costs, service providers | Continuous (annual) costs, public authorities | Continuous (annual) costs, service providers |
| 1 | 0.4 | 0.2 | 3.5 | 2.8 |
| 3 | 5.0 | 0.0 | 25.7 | 0.0 |
| 4 | 0.0 | 137.7 | 11.1 | 6.9 |
| 5 | 0.0 | 20.4 | 3.3 | 1.7 |
| Total | 5.4 | 158.4 | 43.6 | 11.4 |
Option C: option B + mandatory detection of known CSAM
In this option, a number of service providers will be subject to mandatory detection of known CSAM. Therefore, the one-off costs of voluntary detection of known CSAM under measure 4 should be deducted (i.e. training of content moderators and integration of infrastructure to detect known CSAM). These are taken into account in measure 4*.
The continuous costs would eventually be lower than the combination of measures 4 and 6, but they have been left in the calculations to maintain a conservative estimate of the costs. This also allows taking into account the transition period before the detection order is imposed on the service provider, during which it may choose to start or continue detecting voluntarily.
Table 30: Calculation of total costs under Option C (EUR million)

| Policy measures | One-off costs, public authorities | One-off costs, service providers | Continuous (annual) costs, public authorities | Continuous (annual) costs, service providers |
| 1 | 0.4 | 0.2 | 3.5 | 2.8 |
| 3 | 5.0 | 0.0 | 25.7 | 0.0 |
| 4* | 0.0 | 94.1 | 11.1 | 6.9 |
| 5 | 0.0 | 20.4 | 3.3 | 1.7 |
| 6 | 0.0 | 352.2 | 503.6 | 459.4 |
| Total | 5.4 | 466.9 | 547.3 | 470.9 |
Option D: option C + mandatory detection of new CSAM
The same considerations in relation to one-off costs under measure 4 made in option C apply. In this case, measure 4** should exclude the one-off costs related to training of content moderators and integration of infrastructure to detect new CSAM, in addition to those of known CSAM. Therefore, the only one-off costs under measure 4** are those related to training of content moderators and integration of infrastructure to detect grooming on a voluntary basis. The same considerations in relation to continuous costs under measure 4 made in option C apply.
Table 31: Calculation of total costs under Option D (EUR million)

| Policy measures | One-off costs, public authorities | One-off costs, service providers | Continuous (annual) costs, public authorities | Continuous (annual) costs, service providers |
| 1 | 0.4 | 0.2 | 3.5 | 2.8 |
| 3 | 5.0 | 0.0 | 25.7 | 0.0 |
| 4** | 0.0 | 47.7 | 11.1 | 6.9 |
| 5 | 0.0 | 20.4 | 3.3 | 1.7 |
| 6 | 0.0 | 352.2 | 503.6 | 459.4 |
| 7 | 0.0 | 604.4 | 250.1 | 520.5 |
| Total | 5.4 | 1,025.0 | 797.4 | 991.3 |
Option E: option D + mandatory detection of grooming
The same considerations in relation to one-off costs under measure 4 made in option C apply. In this case, there would be no one-off costs under measure 4***, since those are included in the mandatory measures to detect known and new CSAM and grooming. The same considerations in relation to continuous costs under measure 4 made in option C apply.

Table 32: Calculation of total costs under Option E (EUR million)

| Policy measures | One-off costs, public authorities | One-off costs, service providers | Continuous (annual) costs, public authorities | Continuous (annual) costs, service providers |
| 1 | 0.4 | 0.2 | 3.5 | 2.8 |
| 3 | 5.0 | 0.0 | 25.7 | 0.0 |
| 4*** | 0.0 | 0.0 | 11.1 | 6.9 |
| 5 | 0.0 | 20.4 | 3.3 | 1.7 |
| 6 | 0.0 | 352.2 | 503.6 | 459.4 |
| 7 | 0.0 | 604.4 | 250.1 | 520.5 |
| 8 | 0.0 | 618.0 | 28.2 | 471.9 |
| Total | 5.4 | 1,595.3 | 825.6 | 1,463.3 |
Calculation of the benefit estimates for each policy option
As discussed in the benefits section in the main report, the total costs of child sexual abuse in the EU are EUR 13.5 billion.
This estimate is derived from a paper which estimated the total economic burden of child sexual abuse in the United States, which appeared in the peer-reviewed journal Child Abuse & Neglect356. The paper estimates total costs including health care costs, productivity losses, child welfare costs, violence/crime costs, and special education costs, based on secondary data drawn from peer-reviewed journals.
Regrettably, similar studies relating to the EU do not appear to have been published to date. However, studies on the economic cost of violence against children (including child sexual abuse) suggest that costs are comparable among high-income countries357. Therefore, the estimates provided in the above-mentioned paper are assumed to be applicable in the EU context, when adjusted to take account of the differences between the sizes of the US and EU populations.
The benefits derive from savings as a result of avoided CSA-associated costs, i.e. savings relating to offenders (e.g. criminal proceedings), savings relating to victims (e.g. short- and long-term assistance), and savings relating to society at large (e.g. productivity losses).

356 Letourneau et al., The economic burden of child sexual abuse in the United States, May 2018.
357 See, for example, Ferrara, P. et al., The Economic Burden of Child Maltreatment in High Income Countries, December 2015.
The calculation of benefits assumes that there is a direct correlation between the only factor that can be quantified, the increase in reports358, and the estimated savings. Specifically, the model assumed a cost decrease of 25% for option E (highest number of reports) and applied the same ratio of increase in reporting vs. decrease in costs from option E to the other options.
To calculate the number of reports under each option, the following was taken into account:
- Option A (measures 1 + 2): the number of reports for this option is the same as in measure 1, since measure 2 (EU Centre on prevention and assistance to victims) would not lead per se to an increase in the number of reports.
- Option B (measures 1+3+4+5): the number of reports for this option is that of measure 1 + the net number of reports in measures 4 and 5 (i.e. the number of reports in the measure minus those of the baseline). Measure 3, on the fully fledged EU Centre, would not lead per se to an increase in the number of reports.
- Option C (measures 1+3+4+5+6): the number of reports for this option is that of option B + the number of reports of known material under measure 6 on mandatory detection, minus the number of reports for known material under measure 4 on voluntary detection.
- Option D (measures 1+3+4+5+6+7): the number of reports for this option is that of option C + the number of reports of new material under measure 7 on mandatory detection, minus the number of reports for new material under measure 4 on voluntary detection and measure 6 (under which detection of new CSAM is also voluntary).
- Option E (measures 1+3+4+5+6+7+8): the number of reports for this option is the potential number of reports that could be detected, as described in Table 1 in section 3 of this annex, "How the model works".
Costs over 10 years
For the purpose of comparing the options and calculating overall costs, the total combined cost (not discounted) to service providers and public authorities over a period of 10 years (2021-2030) was considered, equal to one-off costs + 10 x annual costs for both public authorities and service providers combined:

358 For simplicity, in the internal calculations the model uses the number of reports from service providers rather than the number of reports reaching public authorities. This has no impact on the comparison of options.
Table 33: Total costs over 10 years (EUR million)

| Policy options | One-off costs, public authorities | One-off costs, service providers | Continuous (annual) costs, public authorities | Continuous (annual) costs, service providers | Total costs, 10 years |
| A | 0.4 | 0.2 | 13.9 | 2.8 | 167.5 |
| B | 5.4 | 158.4 | 43.6 | 11.4 | 714.5 |
| C | 5.4 | 466.9 | 547.3 | 470.9 | 10,653.7 |
| D | 5.4 | 1,025.0 | 797.4 | 991.3 | 18,917.8 |
| E | 5.4 | 1,595.3 | 825.6 | 1,463.3 | 24,489.0 |
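The last column follows mechanically from the other four; a minimal check (values in EUR million):

```python
# Total 10-year cost = one-off + 10 x annual, providers and authorities combined.
options = {  # (one-off PA, one-off SP, annual PA, annual SP), EUR million
    "A": (0.4, 0.2, 13.9, 2.8),
    "B": (5.4, 158.4, 43.6, 11.4),
    "C": (5.4, 466.9, 547.3, 470.9),
    "D": (5.4, 1_025.0, 797.4, 991.3),
    "E": (5.4, 1_595.3, 825.6, 1_463.3),
}
for name, (po, so, pa, sa) in options.items():
    print(name, round(po + so + 10 * (pa + sa), 1))
# A 167.6, B 713.8, C 10654.3, D 18917.4, E 24489.7 - matching Table 33
# to within the rounding of the published (EUR million) inputs.
```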
Sensitivity analysis
As explained in the report, it would be safe to estimate that the quantitative benefits could be up to 50% of the annual costs of CSA in the EU (considering that the amount of EUR 13.8 billion was a conservative estimate), and it would be even safer to assume that the benefits could be 25% of the annual costs of CSA in the EU. For comparison purposes, it seems useful to conduct a sensitivity analysis to determine how the benefits would change under various assumptions of decrease of annual costs of CSA in the EU: 50%, 40%, 30%, 20%, 10% and 5%.
Table 34: Estimated annual benefits for the policy options (EUR billion), 50% decrease in annual CSA costs

| Policy options | Estimated increase in reporting (%) | Estimated cost reduction | Benefits (billions per year) |
| A | 10% | 1% | €0.18 |
| B | 23% | 3% | €0.38 |
| C | 288% | 41% | €4.54 |
| D | 348% | 49% | €4.88 |
| E | 354% | 50% | €4.45 |
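The pattern behind these tables appears to be: each option's cost reduction is scaled from option E's by the ratio of reporting increases, and net benefits are the implied savings minus that option's annual costs. A sketch under that assumption (it reproduces the published figures only approximately, as the exact rounding used in the tables is not stated):

```python
ANNUAL_CSA_COST = 13.5  # EUR billion, total annual cost of CSA in the EU

increase = {"A": 0.10, "B": 0.23, "C": 2.88, "D": 3.48, "E": 3.54}
annual_costs = {  # EUR billion, authorities + providers (Tables 28-32)
    "A": 0.0167, "B": 0.055, "C": 1.0182, "D": 1.7887, "E": 2.2889}

def benefits(reduction_e: float) -> dict:
    """Net annual benefits per option for a given cost reduction under option E."""
    return {o: ANNUAL_CSA_COST * reduction_e * increase[o] / increase["E"]
               - annual_costs[o] for o in increase}

print(benefits(0.50))  # ~{A: 0.17, B: 0.38, C: 4.47, D: 4.85, E: 4.46}
```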
Table 35: Estimated annual benefits for the policy options (EUR billion), 40% decrease in annual CSA costs

| Policy options | Estimated increase in reporting (%) | Estimated cost reduction | Benefits (billions per year) |
| A | 10% | 1% | €0.14 |
| B | 23% | 3% | €0.29 |
| C | 288% | 32% | €3.42 |
| D | 348% | 39% | €3.53 |
| E | 354% | 40% | €3.07 |
Table 36: Estimated annual benefits for the policy options (EUR billion), 30% decrease in annual CSA costs

| Policy options | Estimated increase in reporting (%) | Estimated cost reduction | Benefits (billions per year) |
| A | 10% | 1% | €0.10 |
| B | 23% | 2% | €0.20 |
| C | 288% | 24% | €2.29 |
| D | 348% | 29% | €2.17 |
| E | 354% | 30% | €1.69 |
Table 37: Estimated annual benefits for the policy options (EUR billion), 20% decrease in annual CSA costs

| Policy options | Estimated increase in reporting (%) | Estimated cost reduction | Benefits (billions per year) |
| A | 10% | 1% | €0.05 |
| B | 23% | 1% | €0.09 |
| C | 288% | 15% | €0.99 |
| D | 348% | 18% | €0.59 |
| E | 354% | 20% | €0.31 |
Table 38: Estimated annual benefits for the policy options (EUR billion), 15% decrease in annual CSA costs

| Policy options | Estimated increase in reporting (%) | Estimated cost reduction | Benefits (billions per year) |
| A | 10% | 0.4% | €0.04 |
| B | 23% | 1.0% | €0.06 |
| C | 288% | 12% | €0.61 |
| D | 348% | 15% | €0.14 |
| E | 354% | 15% | -€0.38 |
Table 39: Estimated annual benefits for the policy options (EUR billion), 10% decrease in annual CSA costs

| Policy options | Estimated increase in reporting (%) | Estimated cost reduction | Benefits (billions per year) |
| A | 10% | 0.3% | €0.02 |
| B | 23% | 1% | €0.02 |
| C | 288% | 8% | €0.05 |
| D | 348% | 10% | -€0.54 |
| E | 354% | 10% | -€1.07 |
Table 40: Estimated annual benefits for the policy options (EUR billion), 5% decrease in annual CSA costs

| Policy options | Estimated increase in reporting (%) | Estimated cost reduction | Benefits (billions per year) |
| A | 10% | 0.1% | €0.00 |
| B | 23% | 0.3% | -€0.03 |
| C | 288% | 4% | -€0.51 |
| D | 348% | 5% | -€1.21 |
| E | 354% | 5% | -€1.76 |
From the above sensitivity analysis it is possible to determine the minimum decrease in annual CSA costs needed for a given option to produce net benefits:

Table 41: Minimum % decrease in total annual CSA costs needed to generate net benefits under each policy option

| A | 0.13% |
| B | 0.6% |
| C | 8% |
| D | 14% |
| E | 18% |
ANNEX 5: RELEVANT LEGISLATION AND POLICIES
The following legislative instruments and policies at EU, national and international level are relevant for fighting against child sexual abuse (online and offline):
1. EU law
EU Charter of Fundamental Rights359, which recognises that children have the
right to such protection and care as is necessary for their well-being, among other
provisions.
EU data protection and privacy legislation: The legislation resulting from the data protection reform360 is of critical importance in the fight against child sexual abuse online:
o Regulation (EU) 2016/679 on the protection of natural persons with regard to
the processing of personal data and on the free movement of such data361
(General Data Protection Regulation, GDPR). o Directive (EU) 2016/680 on the protection of natural persons with regard to the
processing of personal data by competent authorities for the purposes of the
prevention, investigation, detection or prosecution of criminal offences or the
execution of criminal penalties, and on the free movement of such data362
(Police Directive).
o The 2002 ePrivacy Directive363 ensures the protection of fundamental rights and freedoms, and in particular the right to privacy, with respect to the processing of personal data in electronic communications over public networks. In particular, the Directive requires Member States to ensure the confidentiality of communications by prohibiting and limiting the processing of traffic and location data without the consent of the user concerned, except in specific circumstances, and sets out the conditions to be met where national law restricts those rights and obligations. In January 2017 the Commission adopted a proposal for a Regulation on Privacy and Electronic
359 Charter of Fundamental Rights of the European Union, OJ C 326, 26.10.2012.
360 See here for more information.
361 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC.
362 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.
363 Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 amending Directive 2002/22/EC on universal service and users' rights relating to electronic communications networks and services, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector and Regulation (EC) No 2006/2004 on cooperation between national authorities responsible for the enforcement of consumer protection laws.
Communications364 to replace the 2002 Directive. This proposal aims at enhancing the protection of rights for users of all electronic communications services and ensuring protection of their terminal equipment. It will complete and further harmonise the privacy rules in the European single market and overcome fragmented implementation of the Directive. It will create a level playing field and reduce compliance costs for businesses. It also aims to enhance consistency with the General Data Protection Regulation, and it will strengthen enforcement powers. This proposal is still under negotiation. In 2017, the European Parliament adopted a report365 and gave the mandate to the rapporteur to begin inter-institutional negotiations. In February 2021, the Council agreed on a negotiating mandate366. At the time of writing, the inter-institutional negotiations between the Council, the European Parliament and the Commission had started on 20 May 2021.
EU legislation on the digital single market:
The E-Commerce Directive367 establishes the free provision of information society services inside the EU. These service providers should be subject only to the rules applicable in their country of establishment, and Member States cannot restrict the provision of such services in the coordinated field. However, this 'home state control' principle is subject to certain exceptions, including for effectively tackling criminal offences. The E-Commerce Directive also exempts, subject to certain conditions, certain online service providers from liability for user content that they transmit or store.
The proposed Digital Services Act package368 (comprising the proposed Digital Services Act369 and Digital Markets Act370). The Digital Services Act (DSA), proposed on 15 December 2020, aims to clarify and upgrade liability and safety rules for digital services, including new procedures for faster removal of illegal content. The DSA proposes to clarify that intermediary service providers can continue to benefit from the exemptions from liability if they are conducting voluntary own-initiative investigations or other activities aimed at addressing illegal content. It also proposes to require providers to establish notice and action mechanisms, prioritise reports received from
364 Proposal for a Regulation concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), COM(2017) 10 final.
365 Report on the proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC.
366 Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC - Mandate for negotiations with the European Parliament, 6087/21.
367 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce'), OJ L 178, 17.7.2000, p. 1-16.
368 'The Digital Services Act package', accessed 8 April 2021.
369 Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM/2020/825 final.
370 Proposal for a Regulation of the European Parliament and of the Council on contestable and fair markets in the digital sector (Digital Markets Act), COM/2020/842 final.
trusted flaggers, suspend the provision of the services for users frequently providing manifestly illegal content, and to promptly inform the relevant authorities if they become aware of suspicions of any serious criminal offence involving a threat to the life or safety of persons. This proposal is still under negotiation.
Proposal to amend the Europol Regulation371: it aims at strengthening Europol's mandate, among others by enabling Europol to cooperate effectively with private parties, in particular by allowing Europol to exchange data directly with private parties for purposes other than simply identifying the competent authority in Member States. It also proposes to clarify Europol's capacity to process personal data in support of financial or criminal intelligence operations and criminal investigations for crimes falling within Europol's mandate. This proposal is still under negotiation.
The 2011 Child Sexual Abuse Directive372 contains provisions harmonising definitions and criminal offences covering both offline and online acts. It also addresses criminal procedure, administrative and policy measures in the areas of prevention, investigation and prosecution of offences, as well as assistance to and protection of victims. As a directive aiming to harmonise criminal law, it is based on Article 82(2) and Article 83(1) of the Treaty on the Functioning of the European Union (the TFEU)373 and is addressed to the Member States.
The Victims' Rights Directive374 ensures that all victims of crime receive
appropriate information, support and protection and are able to participate in
criminal proceedings. The Directive provides victims with a right to information, a right to understand and to be understood, a right to access support and
protection in accordance with their individual needs, as well as with a set of
procedural rights. For certain groups of victims, including child victims of sexual
exploitation, there are specific rules that respond more directly to the needs of
some victims, e.g. in view of protecting them from secondary victimisation, retaliation and intimidation.
The Regulation on preventing the dissemination of terrorist content online375 aims to ensure that online service providers play a more active role in addressing terrorist content online. In particular, it aims at reducing the accessibility of terrorist content online, in view of terrorists' misuse of the internet to groom and recruit
371 Proposal for a Regulation of the European Parliament and of the Council amending Regulation (EU) 2016/794, as regards Europol's cooperation with private parties, the processing of personal data by Europol in support of criminal investigations, and Europol's role on research and innovation, COM/2020/796 final.
372 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA, OJ L 335, 17.12.2011.
373 Treaty establishing the European Community (Consolidated version 2002), OJ C 325, 24.12.2002, p. 33-184.
374 Directive 2012/29/EU of the European Parliament and of the Council of 25 October 2012 establishing minimum standards on the rights, support and protection of victims of crime, and replacing Council Framework Decision 2001/220/JHA, OJ L 315, 14.11.2012.
375 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online, OJ L 172, 17.05.2021.
supporters, to prepare and facilitate terrorist activity, to glorify their atrocities and urge others to follow suit, and to instil fear in the general public. The Regulation creates a system of binding removal orders, with a requirement that terrorist content identified in the removal order is removed or access to it is disabled within one hour. It also imposes an obligation on service providers, where appropriate, to take certain specific measures to protect their services against the dissemination of terrorist content. The Regulation also strengthens cooperation between national authorities and Europol to facilitate follow-up to removal orders.
The revised Audiovisual Media Services Directive (AVMSD)376 strengthens the protection of minors from harmful content and the protection of the general public from illegal content on video-sharing platforms. Concerning harmful content, the AVMSD focuses on user-generated videos which 'may impair minors' physical, mental or moral development'. Such content is allowed in on-demand services, but it may only be made available in such a way that minors will not normally hear or see it. This could be done by the use of PIN codes or other, more sophisticated age verification systems. Concerning illegal content, the AVMSD focuses on 'content the dissemination of which constitutes an activity which is a criminal offence under Union law', including offences concerning child pornography as set out in Directive 2011/93/EU.
2. EU policy
European Commission:
The EU strategy for a more effective fight against child sexual abuse377 sets out a comprehensive response to the growing threat of child sexual abuse both offline and online, which aims at improving prevention, investigation, and assistance to victims. The strategy aims to provide the EU with the right legal framework to protect children by ensuring that existing EU rules are fully implemented, and proposing new legislation where needed, particularly to clarify the role that online service providers can play to protect children. The strategy also sets out initiatives to boost coordination, including by examining the possibility to create a European Centre to prevent and counter child sexual abuse. The legislation to be proposed is one aspect of the strategy's aim to provide an effective response, at EU level, to the crimes of child sexual abuse.
The s ecurit Union strategy378 focuses on three main priority areas: fighting
organised crime, countering terrorism and radicalisation, and fighting crime in a
digital age. The objective of the S ecurit Union S trtegy is to create a
multidisciplinary, coordinated and integrated approach to security. This strategy sets out the iter-dependent s trtegic security Driorities to be taken forward at EU
376 Directive (EU) 2018/1808 of the European Parliament and of the Council 0f 14 November 2018
amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or adminisirative action in Member S tates concerning the provision of audiovisual media services
(Audiovisual Media S ervces Directive) in view of changing market realities, 0.IL 303, 28.11.2018. 377 EU strategy for a more effective fight against child sexual abuse, COM(2020)607 final. 378 EUS ecurit UnionS trategy, COM(2020)605 final.
level in 2020-2024. The EU strategy for a more effective fight against child sexual abuse was adopted as one of the first deliverables of the approach taken by the Security Union Strategy.
The Communication on Shaping Europe's digital future379 notably states that the "dissemination of illegal content must be tackled as effectively online as it is offline".
The EU strategy on victims' rights380 outlines actions that will be conducted by the European Commission, Member States and civil society to ensure that all victims of all crime can fully rely on their rights. The EU Strategy on victims' rights is based on a two-strand approach: empowering victims of crime and working together for victims' rights. The key priorities of the strategy are: (i) effective communication with victims and a safe environment for victims to report crime; (ii) improving support and protection to the most vulnerable victims; (iii) facilitating victims' access to compensation; (iv) strengthening cooperation and coordination among all relevant actors; and (v) strengthening the international dimension of victims' rights.
The EU strategy on the rights of the child381 addresses persisting and emerging challenges and proposes concrete actions to protect, promote and fulfil children's rights in today's ever-changing world. The EU Strategy on the Rights of the Child includes targeted actions across six thematic areas, each one defining the priorities for EU action. Under the thematic area 'Combating violence against children and ensuring child protection', the strategy announces actions to put forward a legislative proposal to combat gender-based violence against women and domestic violence, table a recommendation on the prevention of harmful practices against women and girls, and present an initiative aimed at supporting the development and strengthening of integrated child protection systems, which will encourage all relevant authorities and services to work better together.
The Organised crime strategy382 sets out actions to boost law enforcement cooperation, reinforce the effectiveness of investigations into organised crime structures and high-priority crimes, remove the profits of organised crime and prevent infiltration into the legal economy. It also presents actions to provide a modern law enforcement response to technological developments. The Strategy is accompanied by a new Strategy on Trafficking in Human Beings.
The EU strategy on Combatting Trafficking in Human Beings383 proposes concrete actions to identify and stop trafficking early on, to go after criminals by turning trafficking from a low-risk and high-return crime into a high-risk and low-return crime, and to protect the victims and help them rebuild their lives. The majority of the victims in the EU are women and girls trafficked for sexual exploitation.
379 Shaping Europe's digital future, COM(2020) 67 final.
380 EU strategy on victims' rights (2020-2025), COM(2020) 258 final.
381 EU Strategy on the rights of the child, COM(2021) 142 final.
382 EU Strategy to tackle Organised Crime 2021-2025, COM(2021) 170 final.
383 EU Strategy on Combatting Trafficking in Human Beings, COM(2021) 171 final.
The EU Gender Equality Strategy384 presents policy objectives and actions to make significant progress by 2025 towards a gender-equal Europe. The key objectives are ending gender-based violence; challenging gender stereotypes; closing gender gaps in the labour market; achieving equal participation across different sectors of the economy; addressing the gender pay and pension gaps; closing the gender care gap; and achieving gender balance in decision-making and in politics. The strategy makes a commitment to combat online violence targeting women by clarifying internet platforms' role in addressing illegal and harmful content.
As noted in section 2, this initiative responds to calls for further and concrete action made by the Council and the European Parliament.
Council of the EU. In its October 2019 conclusions on combatting the sexual abuse of children385, the Council notably:
reaffirmed "the EU's and Member States' commitment to protect the fundamental rights of children, and the rights of victims of crime, and to combat the sexual abuse and sexual exploitation of children, both offline and online, irrespective of the physical location or nationality of the child. Reducing the number of children who fall victim to sexual abuse and increasing the proportion of successful investigations remains a key political and operational priority.";
stated that it considered "industry, and in particular online platforms, to be a key contributor to preventing and eradicating child sexual abuse and exploitation,
including the swift removal of child sexual abuse material online. Notwithstanding current efforts, the Council notes that more must be done to counter technical, legal and human challenges that hamper the effective work of competent authorities.";
recognised "the necessity of setting out a multi-stakeholder approach, bringing together industry, civil society, law enforcement and governments (including through public-private partnerships) to coordinate prevention efforts and thus maximise their effectiveness."; and, among others,
invited "the EU and its Member States to assess periodically the effectiveness of legislation on combatting the sexual abuse and sexual exploitation of children to ensure that it is fit for purpose. Gender-sensitive assessments should address in particular the prevention, investigation and prosecution of crimes, including those committed in abuse of online platforms, as well as the provision of assistance and support to child victims during and after the investigation, and protection measures during criminal proceedings. Measures should however not be limited to the area of criminal law."
European Parliament. In its November 2019 resolution386, the European Parliament notably:
384 A Union of Equality: Gender Equality Strategy 2020-2025, COM(2020) 152 final.
385 Council conclusions on combatting the sexual abuse of children of 8 October 2019, No. 12862/19.
386 European Parliament resolution of 26 November 2019 on children's rights on the occasion of the 30th anniversary of the UN Convention on the Rights of the Child (2019/2876(RSP)).
called for the creation of an EU child protection centre that would ensure an
effective and coordinated approach on the protection of children's rights in all
internal and external EU policies and actions and give an effective and
coordinated response to child sexual abuse and all forms of violence against
children;
urged "the Commission and the Member States to work out a national strategy and put in place a holistic multi-stakeholder approach to eradicate sexual violence and child abuse both online and offline.";
called on the current and upcoming Council presidencies to step up efforts to ensure that Member States take concrete actions to better assist victims and work out effective preventive, investigative and prosecution measures to ensure that perpetrators are brought to justice;
urged "the Commission and the Member States to take concrete measures to end child sexual abuse by investing in preventive measures, identifying specific programmes for potential offenders and more effective support for victims."; and, among others, called on "ICT companies and online platforms to take their share of responsibility in the fight against child sexual abuse and exploitation online" and stressed "the need for more investment, in particular from industry and the private sector, in research and development and new technologies designed to detect CSAM online and expedite takedown and removal procedures".
In addition, in its December 2020 resolution on the Security Union Strategy387, the European Parliament notably:
reiterated the European Parliament's support for the creation of a European centre to prevent and counter child sexual abuse, as set out in the July 2020 EU strategy for a more effective fight against child sexual abuse;
stressed the importance of preventing, detecting and reporting child sexual abuse; and, among others,
noted that a growing number of children and teenagers are falling victim to online grooming.
European Economic and Social Committee. In its October 2020 opinion on combatting child sexual abuse online388, the Committee notably:
stated that it "believes that it is time for the EU to have its own European Centre to Prevent and Counter Child Sexual Abuse and calls on the Commission to urge that such a centre will be set up and developed. The centre should build on Europol's work, to work with companies and law enforcement bodies, to identify victims and bring offenders to justice.";
387 European Parliament resolution of 17 December 2020 on the EU Security Union Strategy (2020/2791(RSP)).
388 European Economic and Social Committee, Combatting child sexual abuse online, TEN/721, COM(2020) 568 final 2020/0259 (COD), 29 October 2020.
considers it would be useful to have a third party perform regular testing/auditing, using sample non-CSAM (Child Sexual Abuse Material) matches similar to the EICAR test files used in the anti-virus industry.
3. National law
EU Member States. Several Member States have either adopted or are in the process of adopting national provisions aimed at regulating online service providers with regard to illegal content and acts online. These include:
o Germany:
・ Network Enforcement Act (NetzDG),389 which aims at combating hate crime and criminally punishable fake news and at improving the enforcement of German criminal law online, notably in terms of deletion of content. Under the law, which came into effect on 1 January 2018, social networks - among other obligations - have to set up a complaints management system for reporting illegal content and must take down or block access to manifestly unlawful content within 24 hours of receiving a complaint. Social networks that fail to set up a complaints management system or do not set one up properly - especially where this means that they do not delete criminal content in full, on time or at all - face fines of up to EUR 50 million. In addition to complying with this operational provision, social media platforms are also obliged to publish twice-yearly reports. A revision was adopted in April 2021,390 providing inter alia for detailed reporting obligations in case of detection of child sexual abuse material.
・ Draft Act amending the Protection of Young Persons Act,391 which aims to regulate the dissemination of various forms of media harmful to minors. It provides for the establishment of an obligation for internet services relevant for children and minors to take appropriate and effective structural precautionary measures to protect them from dangerous content, protect their individual rights and their data, and further develop tools for strengthening media literacy. In order to enforce the implementation of the amendments, the draft also includes the restructuring of the Federal Review Board for Media Harmful to Minors into an authority, the Federal Agency for the Protection of Children and Young Persons in the Media.
o France:
・ Law aimed at combating hate content on the internet (Avia law):392 this law, which was adopted on 13 May 2020, obliges online service providers to remove within 24 hours any content which has been reported by any user (natural or legal person) or by the police as manifestly unlawful (for example, material containing incitement to hatred
389 Additional information on the NetzDG can be found here.
390 Gesetzespaket zur Bekämpfung der Hasskriminalität und des Rechtsextremismus.
391 Additional information on the Draft Act can be found here.
392 Additional information on the Avia law can be found here.
or violence). These obligations are addressed in particular to the big platforms such as Facebook, YouTube and Twitter, to search engines and to websites exceeding a visitor threshold to be determined by national law. The time-frame to carry out the removal obligations is reduced to one hour - and applies not only to platforms but to any website - where the content has been flagged as terrorist propaganda or child sexual abuse material. Any failure to delete the content or make it inaccessible within these time limits is punished under criminal law and triggers a fine of up to 250,000 euros. Moreover, the law requires the platforms to adopt the organisational and technological measures appropriate to ensure that flagged content is examined and removed within the due deadlines. The law grants the French media regulator extensive powers to systematically monitor the levels of compliance with the law. Services within the scope of the law would also be subject to reporting and transparency obligations on their content moderation activities and the technical and human means devoted to them. The French regulatory authority would also be granted broad powers of supervision and enforcement, including the issuing of binding guidelines. Where the regulator considers that the measures in place are not adequate to the purposes of the law and that the platform has not aligned with its recommendations to remedy non-conformity, it can issue fines of up to 20 million euros or 4% of annual turnover, whichever is higher. Although certain provisions of the law were deemed unconstitutional by the French Constitutional Council on 18 June 2020, particular concern has been voiced, across France's political spectrum, about the need to regulate online service providers more strictly. In the meantime, the French law that aims to regulate online hate speech entered into force on 1 July 2020393.
・ Draft law to regulate online platforms:394 would create a new national (administrative) authority equipped for fighting piracy, protecting minors (including combatting the commercial exploitation of the image of children under sixteen years of age on online platforms) and defending the public against disinformation and online hate speech. The new authority would be in charge of enforcing platform rules, including the Digital Services Act. The principal obligations established in the draft relate to (i) cooperation with judicial or administrative authorities, the retention of reported and withdrawn content, and the appointment of a point of contact; (ii) the transparency of the general conditions of use, the moderation system, the conditions for the suspension or termination of the account, and public reporting on their moderation policy; (iii) providing users with
393 Loi n° 2020-766 du 24 juin 2020 visant à lutter contre les contenus haineux sur internet, JORF n° 0156 du 25 juin 2020.
394 Additional information on the draft law can be found here and here.
a mechanism for reporting illegal content and processing said reports promptly; (iv) the establishment of internal processes to contest the withdrawal of content and the suspension of accounts; (v) the evaluation and mitigation of systemic risks associated with the service; (vi) an obligation to report periodically to the Conseil Supérieur de l'Audiovisuel (Higher Audio-visual Council); and (vii) possible formal notices and sanctions imposed by the same Conseil Supérieur de l'Audiovisuel in the event of non-compliance with these obligations. The draft aims to broaden the scope of the actors to whom the judicial authorities may prescribe any measures designed to prevent or stop damage caused by an illegal site or illegal content; the injunction of the judge would no longer be limited to hosting or internet service providers, but would extend to "any person" who may contribute to these preventive measures. Among the new tools of the new authority are blacklists of illegal websites (containing a list of so-called 'mirror sites' having content which is identical or equivalent to that of the service covered by a court ruling) and mechanisms to make it easier to block such websites.
o The Netherlands:
・ Draft law on fighting child sexual abuse:395 would impose a duty of care on companies to address illegal content proactively. It would also establish a new independent public-law administrative body in charge of enforcing the removal of terrorist and child sexual abuse content online. The authority would cooperate with hotlines and law enforcement; have a legal basis to search for child sexual abuse material proactively; and have the power to issue notices to hosting service providers and to apply fines in case of non-compliance (for example, if child sexual abuse material is not taken down within 24 hours). A range of administrative instruments will allow action to be taken against communication service providers whose services store or transmit child sexual abuse material but who fail to take (voluntary) measures to prevent this. This law will make it possible to issue these providers with a binding order. Hosting service providers would be required to take appropriate and proportionate measures to limit the storage and dissemination of child sexual abuse online. This law also serves to implement a number of motions calling for greater efforts to combat child sexual abuse material online and a stricter approach to providers who fail to cooperate in this or who do not make sufficient efforts.
o Austria:
395 The public consultation and the draft law are accessible here.
・ Draft law on measures to protect users on communication platforms (Communications Platform Act):396 on 1 January 2021, the Austrian Communication-Platforms-Act entered into force. Operators had until 1 April 2021 to implement it. The law applies to "communication platforms", which are defined as "information society service[s], the main purpose or an essential function of which is to enable the exchange of messages or presentations with intellectual content in written, oral or visual form between users and a larger group of other users by way of mass dissemination". In principle, all domestic and foreign providers would be affected, provided that they had more than 100,000 registered users in Austria in the previous quarter and more than 500,000 euros of revenue generated in Austria in the previous year. Certain communication platforms are exempt, such as certain media companies that are already covered by specific legal requirements, or online trading platforms and not-for-profit online encyclopedias, even if they have a commentary section. All regulated communication platforms would be required to appoint a "responsible representative" to ensure compliance with domestic law and for service of process and cooperation with law enforcement authorities. Depending on the severity of the violation, the financial strength of the platform, the number of registered users and the frequency/repetition of violations, different types of penalties will be imposed.
Third countries:
o US. Since many of the service providers whose cooperation is essential in the fight against child sexual abuse online are headquartered in the US, its national legal framework is also relevant in this context. It includes:
18 U.S. Code § 2258A397, which obliges online service providers to report to the National Centre for Missing and Exploited Children instances of child sexual abuse online in their systems that they become aware of.
The PROTECT Our Children Act of 2008398, introduced in 2008 by the current US President Biden, requires the Department of Justice to develop and implement a National Strategy for Child Exploitation Prevention and Interdiction, to improve the Internet Crimes Against Children Task Force, to increase resources for regional computer forensic labs, and to make other improvements to increase the ability of law enforcement agencies to investigate and prosecute child predators.
396 Additional information on the draft law can be found here.
397 18 U.S.C. § 2258A, Reporting requirements of providers.
398 Providing Resources, Officers, and Technology To Eradicate Cyber Threats to Our Children Act of 2008, S. 1738, 110th Congress, 2008.
o UK. The Online Harms White Paper399 covers both illegal and harmful content. It provides for a new regulatory framework establishing a duty of care on companies to improve the safety of their users online, overseen and enforced by an independent regulator. The regulator would have a suite of powers to take effective enforcement action against companies that have breached their statutory duty of care. This may include the powers to issue substantial fines and to impose liability on individual members of senior management. It sets out a programme of action to tackle content or activity that may not cross the criminal threshold but can be particularly damaging to children or other vulnerable users. This includes requiring companies to provide effective systems for child users, and their parents or carers, to report, remove and prevent further circulation of images of themselves which may fall below the illegal threshold but which leave them vulnerable to abuse. Following the consultation on the Online Harms White Paper, the draft Online Safety Bill400, which aims to establish a new regulatory framework to tackle harmful content online, was published on 12 May 2021.
4. International conventions and agreements
The 1989 UN Convention on the Rights of the Child, which establishes the right of the child to be protected from all forms of violence401.
UNCRC General comment No. 25 on children's rights in relation to the digital environment402, of 2 March 2021, makes explicit - for the first time - that children's rights apply in the digital world, including protection from child sexual abuse and exploitation. It sets out, among others, that States parties should take all appropriate measures to protect children from exploitation and abuse, including by legislating and enforcing business sector responsibility. It also states that digital service providers' compliance can be achieved through due diligence, in particular by means of child impact assessments. In particular, paragraphs 36-39 (section on children's rights and the business sector) provide the following:
36. States parties should take measures, including through the development, monitoring, implementation and evaluation of legislation, regulations and policies, to ensure compliance by businesses with their obligations to prevent their networks or online services from being used in ways that cause or contribute to violations or abuses of children's rights, including their rights to privacy and protection, and to provide children, parents and caregivers with prompt and effective remedies. They should also encourage businesses to provide public information and accessible and timely advice to support children's safe and beneficial digital activities.
37. States parties have a duty to protect children from infringements of their rights by business enterprises, including the right to be protected from all forms of violence in the digital environment. Although businesses may not be directly involved in perpetrating harmful acts, they can cause or contribute to violations of children's right to freedom from violence, including through the design and operation of digital services. States parties should put in place, monitor and enforce laws and regulations aimed at preventing violations of the right to protection from violence, as well as those aimed at investigating, adjudicating on and redressing violations as they occur in relation to the digital environment.
38. States parties should require the business sector to undertake child rights due diligence, in particular to carry out child rights impact assessments and disclose them to the public, with special consideration given to the differentiated and, at times, severe impacts of the digital environment on children. They should take appropriate steps to prevent, monitor, investigate and punish child rights abuses by businesses.
39. In addition to developing legislation and policies, States parties should require all businesses that affect children's rights in relation to the digital environment to implement regulatory frameworks, industry codes and terms of services that adhere to the highest standards of ethics, privacy and safety in relation to the design, engineering, development, operation, distribution and marketing of their products and services. That includes businesses that target children, have children as end users or otherwise affect children. They should require such businesses to maintain high standards of transparency and accountability and encourage them to take measures to innovate in the best interests of the child. They should also require the provision of age-appropriate explanations to children, or to parents and caregivers for very young children, of their terms of service.
The 2007 Council of Europe Convention on Protection of Children against Sexual Exploitation and Sexual Abuse (Lanzarote Convention)403, which served as an inspiration for the Child Sexual Abuse Directive.
The Council of Europe Convention on Cybercrime (Budapest Convention)404. This 2001 instrument obliges Parties to establish certain criminal offences relating to child sexual abuse material in their domestic law. In addition, the Convention also provides, among others, a framework for mutual legal assistance,
403 Council of Europe Convention on Protection of Children against Sexual Exploitation and Sexual Abuse, CETS No. 201, 01.07.2010.
404 Council of Europe Convention on Cybercrime, ETS No. 185, 01.07.2004.
and requires parties to ensure the availability of certain procedural powers in
relation to the detection, investigation and prosecution of cybercrime offences at
both the domestic and international levels. The Parties to the Convention are
engaged in negotiations for an additional Protocol to the Convention to enhance
existing rules to improve cross-border access to e-evidence405.
405 For more information see here.
ANNEX 6: ADDITIONAL INFORMATION ON THE PROBLEM
This annex presents additional information on the definition and magnitude of the problem.
1. Definition
The problem is that some child sexual abuse crimes are not adequately addressed in the EU due to insufficient prevention, challenges in their detection, reporting and action, and insufficient assistance to victims.
Figure 1 presents an overview of the different parts of the problem in its broadest form:
Figure 1: overview of the different parts of the problem
[Figure: diagram with branches for prevention, abuse not prevented and abuse prevented]
1.1. Prevention
There is consensus among practitioners in the fight against child sexual abuse (including law enforcement) that prevention is essential, because it is obviously best for children to protect them from falling victim to the crime rather than acting after the fact. Once the offender commits the crime the victim is harmed, and, even if law enforcement rescues them and stops the abuse, it is already too late to avoid the immediate and long-term negative consequences for victims of the abuse that has already taken place.
There are two main types of prevention efforts406:
1. Efforts focused on the child and his or her environment and on decreasing the likelihood that a child becomes a victim. Examples include awareness-raising
406 See here for an overview of international efforts to prevent child sexual abuse: Unicef, Action to end Child Sexual Abuse and Exploitation: A Review of the Evidence 2020, December 2020, p. 77, 143.
campaigns to help inform children, parents, carers and educators about risks and preventive mechanisms and procedures, as well as training.
2. Efforts focused on potential offenders and on decreasing the likelihood that a person offends. Examples include prevention programmes for persons who fear that they might offend, and for persons who have already offended, to prevent recidivism.
The Child Sexual Abuse Directive407 requires Member States to put in place effective prevention programmes. It requires Member States to ensure that persons who fear they may commit child sexual abuse offences have access to effective intervention programmes or measures designed to evaluate and prevent the risk of such offences being committed408. Similarly, Member States are obliged to make effective intervention programmes available at any time during criminal proceedings to prevent and minimise the risks of repeated offences409. The 2011 Directive also requires Member States to take action to discourage and reduce the demand that fosters all forms of sexual exploitation of children, and to raise awareness and reduce the risk of children becoming victims of sexual abuse or exploitation410.
The monitoring of transposition of this Directive into national law indicates that Member States struggle with putting in place such programmes411 of the two types above, where frequently multiple types of stakeholders need to take action. As a result, children and their environment are insufficiently aware of the risks and of means of limiting them, and persons who fear they may offend do not find avenues for support to try to avoid offending.
1.2. Detection, reporting and action
Where prevention fails, the first step to address these crimes is to detect them as early as possible and report them to law enforcement.
Despite the seriousness of these crimes, a long time often passes before they are detected412, if that ever happens. The lack of detection can have several reasons: frequently, the abuser establishes control over the victim, using secrecy, blame and threats to prevent the child from disclosing the abuse413. The child may also be unable to seek help due to an intellectual or physical disability, or because the child is afraid of the consequences of going against the abuser's will, as the abuser often belongs to the circle of trust of the child (four in five cases), i.e. people they know and trust or depend
407 Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography, OJ L 335, 17.12.2011, p. 1-14.
408 Ibid., Art. 22.
409 Ibid., Art. 24.
410 Ibid., Art. 23.
411 Member States struggle in particular with the implementation of Articles 22, 23 and 24 of the Directive, focused on prevention. For more details, see the Report from the Commission to the European Parliament and the Council assessing the extent to which the Member States have taken the necessary measures in order to comply with Directive 2011/93/EU of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, COM/2016/0871 final.
412 McElvaney, R., Disclosure of Child Sexual Abuse: Delays, Non-disclosure and Partial Disclosure. What the Research Tells Us and Implications for Practice, 26 June 2013, p. 159-169; see also The Irish Times, Historic sex abuse victims waiting average of 20 years to come forward, 17 April 2019.
413 Darkness to Light, Child Sexual Abuse Statistics, accessed on 20 April 2021; see also the research on Child Sexual Abuse Accommodation Syndrome, which may explain why children often do not report sexual abuse incidents or withdraw their complaints: Masumova, F., A Need for Improved Detection of Child and Adolescent Sexual Abuse, May 2017.
on414, including family members in one in five cases415. Or the child victims may simply be too young to recognise that what is happening to them is abuse416. As a consequence, the child may not tell anyone, and those close to the child either are not aware of the problem or are accomplices to the crimes417. For example, in a recent case at a campsite in Germany, two men sexually abused 32 children, aged between 3 and 14, over 10 years, including a foster girl who had been entrusted to one of the men418. In another recent case, a stepfather had been sexually abusing and raping his three stepchildren for 8 years until the mother found out by chance419.
Frequently the only way that these crimes come to the attention of public authorities is when the offenders exchange online the images and videos of the abuse or try to approach children online for sexual purposes. For example, in Germany the police rescued a 10-year-old boy and a 13-year-old girl who had been abused by their father 42 times before an online service provider detected the images of the abuse and reported them to public authorities420.
Even when the abuse does not occur in the circle of trust, such as in the case of online solicitation of children where an offender lures or extorts the child into sexual abuse, internet companies (more precisely referred to as online service providers) are often the only ones able to detect the crimes. In these cases, the child may not dare to tell anybody for fear of the offender, who often threatens the victims with sharing the images and videos of their abuse with their family and friends if they tell anyone. For example, in a recent case in the UK, an offender who pretended to be a girl online was convicted of abusing 52 children, ranging from ages 4 to 14, after targeting more than 5,000 children globally. He threatened the victims with sharing online the sexual images that he had lured them into producing and forced some of them to abuse younger children and record the abuses. Some victims later tried to kill themselves. The investigation only started after Facebook, the online service he mainly used to find victims, detected the abuse and reported it to public authorities421.
Reports of child sexual abuse online are both evidence of a crime, as the possession and dissemination of child sexual abuse materials and grooming of children into abuse are in themselves criminal offences, and at the same time often also a lead for uncovering further offences, including at times ongoing child sexual abuse.
Reports from online service providers can contain three main types of abuse: 1. past abuse, through the distribution of known material, i.e. images and videos
that have already been detected before and identified as child sexual abuse;
414 This includes in particular children with disabilities living in institutional care.
415 Gewirtz-Meydan, A., Finkelhor, D., Sexual Abuse and Assault in a Large National Sample of Children and Adolescents, 16 September 2019. See also Canadian Centre for Child Protection, Survivors' Survey Full Report 2017, July 2017; and ANAR, Sexual Abuse in Childhood and Adolescence according to the Victims and its Evolution in Spain (2008-2019), February 2021.
416 National Society for the Prevention of Cruelty to Children (NSPCC), What is sexual abuse?, accessed on 9 April 2021.
417 Pereda, N., Diaz-Faes, D.A., Family violence against children in the wake of COVID-19 pandemic: a review of current perspectives and risk factors, 20 October 2020.
418 The Guardian, Two men jailed for decades of child abuse at German campsite, 5 September 2019. DW, Germany: Long jail terms handed out in campsite sex abuse trial, 5 September 2019.
419 Süddeutsche Zeitung, Stiefvater wegen jahrelangen Kindesmissbrauchs vor Gericht, 20 January 2021.
420 Süddeutsche Zeitung, Solinger soll eigene Kinder missbraucht haben, 29 January 2021.
421 UK National Crime Agency, Paedophile who targeted more than 5,000 children globally in child sexual abuse case jailed for 25 years, 10 February 2021.
2. ongoing abuse, through the distribution of new material, i.e. images and videos of child sexual abuse which had not been detected before;
3. future abuse, through the detection of grooming (also referred to as online solicitation or enticement), i.e. text-based threats to children in which an adult, frequently hiding their true identity422, establishes online contact with a child for sexual purposes423.
These reports have been instrumental for years in rescuing children in the EU from ongoing abuse. They have led to, for example:
the rescue of 11 children, some as young as 2 years old, who were exploited by a network of abusers in Sweden424;
the single largest operation ever against child sexual abuse in Denmark425;
the rescue of a 9-year-old girl in Romania, who had been abused by her father for more than a year426;
the arrest of an offender in France who groomed 100 children to obtain child sexual abuse material from them427;
the rescue of 2 girls in Czechia, abused by a 52-year-old man, who recorded the abuse and distributed it online428.
These reports have also been instrumental in preventing the abuse of children in the EU, through the detection of online solicitation.
Annex 7 contains additional information on sample cases of child sexual abuse in the EU that started with a report from online service providers.
Law enforcement in the EU receives the vast majority of child sexual abuse reports from two sources: 1) service providers, through NCMEC; and 2) the public and hotlines, through hotlines429:
422 Craven, S., et al., Sexual grooming of children: Review of literature and theoretical considerations, November 2006.
423 Online solicitation may also reflect ongoing abuse (e.g. when the child is extorted into producing new images and videos).
424 Swedish Cybercrime Centre SC3, Swedish Police.
425 Europol, Internet Organised Crime Threat Assessment, 18 September 2018, p. 32.
426 Știrile Kanal D, O femeie de 27 de ani, din Bacău și-a abuzat sexual fetița de doar 9 ani pentru a-și mulțumi iubitul, 9 November 2018.
427 As reported by the French police.
428 As reported by the Czech police.
429 Based upon the median percentage of reports received by law enforcement authorities from each source according to the targeted survey of law enforcement authorities (see Annex 2). Respondents indicated that about 45% of reports originate from providers through NCMEC, while about 10% of reports originate from hotlines in their own jurisdiction or another jurisdiction, representing the two largest external sources.
Figure 2: the two main sources of CSA reports for law enforcement in the EU
[Figure: flow diagram in which child sexual abuse online (known and new material and grooming) is detected either by service providers or by the public and hotlines, leading to reports, to law enforcement action to rescue victims and arrest offenders, and to action by service providers to remove content]
From service providers
In a survey addressed to public authorities in Member States, more than two thirds of respondents indicated that the largest proportion of leads in child sexual abuse cases were reports from online service providers about abuse discovered on their systems430.
1. Detection.
In the detection stage of the process, known CSAM, new CSAM or solicitation is detected by technologies used by the provider. Several types of technologies are currently used by providers and organisations in this stage, many of which are made freely available as a service to qualified users431. Technologies for the detection of known CSAM typically make use of a process known as hashing, which generates 'digital fingerprints' of files. By comparing these fingerprints with those of content that has been previously verified as CSAM, new copies of the content can be easily detected432.
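Purely as an illustration of the hash-matching principle described above, the sketch below compares a file's fingerprint against a list of fingerprints of previously verified material. All names and values are hypothetical and this is not any provider's actual implementation; deployed systems use perceptual hashing (such as Microsoft's PhotoDNA) rather than the plain cryptographic hash used here for simplicity, so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Hypothetical set of fingerprints of previously verified material, as would
# be supplied by an organisation maintaining a hash list (placeholder value).
KNOWN_FINGERPRINTS: set[str] = {
    "0" * 64,  # placeholder digest, not a real fingerprint
}

def fingerprint(file_bytes: bytes) -> str:
    # SHA-256 is used here purely for simplicity; deployed systems rely on
    # perceptual hashing, which also matches resized or re-encoded copies.
    return hashlib.sha256(file_bytes).hexdigest()

def matches_known_content(file_bytes: bytes) -> bool:
    # A match means this exact file was verified before, so the upload can
    # be flagged without any human needing to view it at this stage.
    return fingerprint(file_bytes) in KNOWN_FINGERPRINTS
```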
Technologies for the detection of new CSAM are commonly based on artificial intelligence. Using previously verified CSAM, these technologies are trained to identify whether new material is likely to constitute CSAM433. See Annex 8 for more details on the detection technologies.
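Again as an illustrative sketch only, and not any provider's actual system: a classifier-based detector essentially reduces to scoring content and applying a threshold, with anything above the threshold queued for human review. The model interface below is hypothetical.

```python
from dataclasses import dataclass
from typing import Protocol

class ImageClassifier(Protocol):
    # Hypothetical interface of an already-trained binary classifier.
    def predict_proba(self, image_bytes: bytes) -> float: ...

@dataclass
class DetectionResult:
    score: float   # model's confidence that the material is abusive
    flagged: bool  # queued for human review before any report is made

def classify_new_material(model: ImageClassifier, image_bytes: bytes,
                          threshold: float = 0.9) -> DetectionResult:
    score = model.predict_proba(image_bytes)
    # Content is only flagged, never auto-reported: a human reviewer
    # confirms the classification in the subsequent reporting stage.
    return DetectionResult(score=score, flagged=score >= threshold)
```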
2. Reporting.
430 See Annex 2.
431 H. Lee et al., Detecting child sexual abuse material: A comprehensive survey, Forensic Science International: Digital Investigation, Volume 34, September 2020, 301022.
432 Thorn, 'Introduction to Hashing: A Powerful Tool to Detect Child Sex Abuse Imagery Online', 12 April 2016.
433 Thorn, 'How Safer's detection technology stops the spread of CSAM', 13 August 2020.
In the reporting stage, content that has been flagged as possible CSA online is processed prior to receipt by the relevant law enforcement authorities. In this stage, the service provider may perform additional verification, such as human review, of flagged content to confirm that the content constitutes CSA online. In addition, the provider blocks access to the CSA online and makes a report to NCMEC. US law obliges service providers to report to NCMEC child sexual abuse online in their services where they become aware of the abuse (i.e. it does not make providers subject to an obligation to detect such abuse).
NCMEC verifies in some cases that the reported content constitutes CSA online, in accordance with the relevant definitions under US law, and attempts to determine the relevant jurisdiction(s). Where the report relates to an EU Member State, the report is forwarded to the US Department of Homeland Security Investigations (HSI) for onward transfer, either to Europol or directly to the relevant EU law enforcement authorities434. Europol cannot receive information directly from private parties, including NCMEC (or service providers)435, hence the intermediary role of US HSI.
Reports which are received by Europol are cross-checked and forwarded to the relevant Member States436.
3. Action.
In the 'action' stage, reports are received by the competent law enforcement authorities in Member States. Those authorities then review the reports in accordance with national law, confirming that the report relates to possible criminal activities and commencing a criminal investigation.
Based upon the information contained in the report, law enforcement authorities take steps to identify and rescue victims from ongoing or imminent abuse, and to identify, investigate and ultimately arrest suspects. Where necessary, authorities engage further with the service provider to obtain further information relevant to the investigation and, in limited cases, to provide feedback to providers on their reports in order to improve quality in future.
Box 1: challenges in cross-border access to electronic evidence
In many cases, additional information is needed by law enforcement authorities from service providers when investigating child sexual abuse, with those service providers often being located in another Member State or in a third country. Significant and longstanding challenges exist regarding processes to obtain access to e-evidence across borders. Indeed, e-evidence is relevant in about 85% of criminal investigations, and in two thirds of these investigations a request to service providers in other jurisdictions is needed437.
434 Europol channels NCMEC reports to 18 EU Member States. The rest of the Member States receive reports directly from NCMEC through a secure (VPN) channel set up by HSI.
435 Impact Assessment accompanying the document Regulation of the European Parliament and of the Council amending Regulation (EU) 2016/794, as regards Europol's cooperation with private parties, the processing of personal data by Europol in support of criminal investigations, and Europol's role in research and innovation of 9 December 2020, SWD(2020) 543 final.
436 The proposal for a revision of Europol's mandate includes the possibility for Europol to receive personal data from private parties. See Proposal for a Regulation amending Regulation (EU) 2016/794, as regards Europol's cooperation with private parties, the processing of personal data by Europol in support of criminal investigations, and Europol's role in research and innovation of 9 December 2020, COM(2020) 796 final.
437 See the Impact assessment for the proposals on cross-border access to e-evidence, SWD/2018/118.
While several mechanisms exist for access to such evidence, each has significant difficulties. Judicial cooperation between the public authorities of different countries (for example, through mutual legal assistance channels or European Investigation Orders) is typically slow and resource-intensive. Direct cooperation between service providers and public authorities is possible in some cases; however, in general it is unreliable, inconsistent and lacks transparency and accountability.
In general, less than half of all requests to service providers are fulfilled, and two thirds of crimes involving cross-border access to e-evidence cannot be effectively investigated or prosecuted438. There are currently several initiatives which seek to address challenges related to e-evidence at the Union level and internationally439.
From the public and hotlines
About 10% of the reports that law enforcement in the EU receives come from hotlines, which in turn receive reports from either the public or other hotlines440.
1. Detection.
In the detection stage, suspected child sexual abuse online is encountered either by a member of the public, who makes a report to the national hotline in their country, or by a hotline searching proactively for child sexual abuse online.
2. Reporting.
In the reporting stage, the hotline reviews the suspected child sexual abuse in accordance with national law. Where the reported content does not constitute CSAM, no further action is taken. Where the hotline concludes that the content does constitute CSAM, the hotline adds hashes to INHOPE's ICCAM database and attempts to determine the jurisdiction in which the content is hosted.
If the content is hosted in the same jurisdiction as the hotline, the hotline sends a report to the relevant law enforcement authorities for investigation. The hotline also sends a notice-and-takedown request to the relevant service provider, alerting the provider to the abusive content on their service and to their responsibility to remove the content under the eCommerce framework. The hotline then monitors and confirms the service provider's compliance.
If the content is determined to be located in another jurisdiction, the hotline forwards the report to the national hotline in that jurisdiction, if one exists. The hotline in the host
438 Ibid.
439 Proposal for a Regulation of the European Parliament and of the Council on European Production and Preservation Orders for electronic evidence in criminal matters of 17 April 2018, COM/2018/225 final; and Proposal for a Directive of the European Parliament and of the Council laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings of 17 April 2018, COM/2018/226 final.
Negotiations on an EU-US e-evidence agreement: Council Decision authorising the opening of negotiations in view of an agreement between the European Union and the United States of America on cross-border access to electronic evidence for judicial cooperation in criminal matters, 9114/19.
Negotiations on a Second Additional Protocol to the Convention on Cybercrime: Council Decision authorising the participation in negotiations on a second Additional Protocol to the Council of Europe Convention on Cybercrime (CETS No. 185), 9116/19.
440 Sum of median percentages of reports of child sexual abuse online received by law enforcement authorities from hotlines in their own jurisdiction or another jurisdiction, as a percentage of the total number of reports received. Source: Targeted survey of law enforcement authorities (see Annex 2).
country re-examines the reported content in accordance with the national law of that jurisdiction and, if the reported content is confirmed to constitute child sexual abuse under the applicable law, forwards the report to the relevant law enforcement authorities and service provider for action, as above.
In cases where the content is found to be hosted in another jurisdiction which does not have a national reporting hotline, the hotline forwards the report to Interpol for action.
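As an illustration only, the routing logic described in the paragraphs above can be condensed into a short sketch. The function and names below are hypothetical and do not correspond to any hotline's actual system; the point is simply the decision order: assessment under national law first, then routing by hosting jurisdiction.

```python
from enum import Enum, auto

class Route(Enum):
    NO_ACTION = auto()                 # content not assessed as CSAM
    DOMESTIC_LE_AND_TAKEDOWN = auto()  # report to national law enforcement,
                                       # plus notice-and-takedown to provider
    FOREIGN_HOTLINE = auto()           # forward to hotline in hosting country
    INTERPOL = auto()                  # hosting country has no hotline

def route_report(is_csam: bool, hosting_country: str, own_country: str,
                 countries_with_hotline: set[str]) -> Route:
    # Assessment under national law comes first; only confirmed CSAM
    # is routed onwards, according to the hosting jurisdiction.
    if not is_csam:
        return Route.NO_ACTION
    if hosting_country == own_country:
        return Route.DOMESTIC_LE_AND_TAKEDOWN
    if hosting_country in countries_with_hotline:
        return Route.FOREIGN_HOTLINE  # receiving hotline re-assesses under its own law
    return Route.INTERPOL
```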
3. Action.
In the action stage, reports are received by the competent law enforcement authorities in the jurisdiction where the reported content is hosted, and notice-and-takedown requests are received by the service providers hosting the content.
Under the eCommerce framework, providers' exemption from liability for illegal content ceases to apply if they do not act promptly once they are made aware of the content's presence on their services. Upon receipt of a notice-and-takedown request, the provider takes steps to remove the reported content from its services in accordance with its legal obligations, while the hotline monitors and confirms that the content is removed.
Reports received by law enforcement authorities are reviewed in accordance with national law in order to confirm that the report relates to possible criminal activities, and a criminal investigation is launched. Due to the nature of reports received from hotlines, which are sourced by members of the public and hotlines themselves from the open web,
reports typically contain only limited information.
Based upon the information contained in the report, law enforcement authorities take
steps to identify and rescue victims from ongoing or imminent abuse, and to identify, investigate and ultimately arrest suspects. Where necessary, authorities engage further with the service provider to obtain further information relevant to the investigation.
Box 2: regulatory challenges and the effectiveness of hotlines
The operation of hotlines is not explicitly provided for in Union law, and is provided for by national law in only five Member States. Hotlines also lack an explicit and uniform legal basis for the exchange of CSAM and data related to CSAM with other hotlines, service providers and law enforcement authorities441. EU hotlines usually operate based on co-regulation and self-regulation frameworks, leading to legal uncertainty with gaps in relation to the legality of processing of reports and related data. This, in turn, significantly restricts the activities that can be undertaken by EU hotlines442.
While the importance and effectiveness of proactive searches for CSAM by hotlines have been demonstrated, the lack of a clear legal basis for EU hotlines to undertake such searches means that currently just one EU hotline can do so, and only to a limited extent443.
Also, the lack of a clear and consistent legal framework for notice-and-takedown requests significantly complicates the work of hotlines. Many EU hotlines are unable to send notice-and-takedown requests to providers, while the lack of harmonised monitoring, sanctions and definitions of prompt removal undermines compliance444. Similarly to
441 Ibid.
442 European Commission, Study on framework of best practices to tackle child sexual abuse material online, 2020.
443 Ibid.
444 Ibid.
reports from US service providers, differences between definitions of CSAM in different jurisdictions, including between different Member States, can create difficulties: content that is classified as illegal by the hotline that receives a public report may not be illegal in the jurisdiction where the content is hosted. Consequently, such reports must be assessed by multiple hotlines, leading to delays or even resulting in the content being left online445.
1.3. Assistance to victims
Victims of child sexual abuse need tailored and comprehensive assistance446, immediately and in the long term447.
An example of immediate assistance is the support of victims during criminal investigations and proceedings, to prevent them from suffering additional trauma (e.g. by setting specific standards for interviews with child victims)448.
An example of long-term assistance is support for victims in stopping the online sharing and distribution of the images and videos depicting their abuse, which perpetuates the harm. Victims have to live with the knowledge that the images and videos showing the worst moments of their lives are circulating, and anyone, including their friends or relatives, may see them449.
The Child Sexual Abuse Directive introduced measures to support victims of child sexual abuse, including measures to prevent victims from suffering additional trauma through their involvement in criminal investigations and proceedings450, to ensure that assistance and support are available as soon as there are reasonable grounds to suspect an offence451, and that special protection is assured for children reporting abuse committed within the family452.
The monitoring of transposition of the Directive into national law indicates that Member States are incurring delays in fully implementing these articles concerning assistance and support to victims before, during and after criminal proceedings453. In addition, as noted in the EU strategy for a more effective fight against child sexual abuse, the efficiency and
445 Ibid.
446 Unicef, Action to end Child Sexual Abuse and Exploitation: A Review of the Evidence 2020, December 2020.
447 Victims' testimonies, which may help understand victims' need for assistance, are available at The Truth Project, Experiences Shared, accessed on 20 April 2021; Royal Commission into Institutional Responses to Child Sexual Abuse, Narratives, accessed on 20 April 2021.
448 Canadian Centre for Child Protection, Survivors' Survey Full Report 2017, July 2017; ANAR, Sexual Abuse in Childhood and Adolescence according to the Victims and its Evolution in Spain (2008-2019), February 2021.
449 See related victims' testimonies at The New York Times, 'If Those Were Pictures of You, You Would Understand', 9 November 2019.
450 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA, OJ L 335, 17.12.2011, p. 1-14, Art. 20.
451 Ibid., Art. 18.
452 Ibid., Art. 19.
453 For more details, see the Report from the Commission to the European Parliament and the Council assessing the extent to which the Member States have taken the necessary measures in order to comply with Directive 2011/93/EU of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography of 16 December 2016, COM/2016/0871 final.
effectiveness of efforts to assist victims are limited, as these do not systematically make use of existing best practices and lessons learned in other Member States or globally454.
Also, although not explicitly required by the Directive, no Member State has put in place measures to support victims in ensuring the removal of child sexual abuse materials circulating online. Victims are unable to take action themselves, as they would be committing a crime when searching for child sexual abuse images.
As figure 2 indicated, even if the abuse is detected and the investigation is successful, there are situations in which the victim does not receive the necessary assistance.
2. Magnitude
It is not possible to determine exactly the number of crimes that cannot be effectively addressed in the EU due to insufficient prevention, challenges in their detection, reporting and action, and insufficient assistance to victims. Data at this level of detail is not collected by public authorities.
In addition, these crimes appear to be significantly underreported. Studies show that whereas about one in five girls and one in ten boys become a victim of child sexual abuse455, one in three victims will never tell anyone and at least four in five child sexual abuse cases are not reported directly to public authorities456 (i.e. by the victims or people close to the victims).
There are indications that the COVID-19 crisis has exacerbated the problem457, especially for children who live with their abusers458. In addition, children are spending more time online than before, possibly unsupervised459. While this has allowed them to continue their education and stay in touch with their peers, there are signs of increased risk of children coming into contact with online predators460. With more offenders isolated at home, the demand for child sexual abuse material has increased461 (e.g. by 25% in some Member States462), which in turn leads to increased demand for new material, and therefore new abuses463.
454 EU Strategy for a more effective fight against child sexual abuse, COM(2020) 607 final.
455 M. Stoltenborgh, M.H. van IJzendoorn, E.M. Euser, M.J. Bakermans-Kranenburg, A global perspective on child sexual abuse: Meta-analysis of prevalence around the world, 2011, pp. 79-101. This study, based on 331 independent samples and almost 10 million individuals, found an overall prevalence rate of 13%, with the rate for girls being more than twice that of boys (18% vs. 8%, respectively). These numbers concur with those of another study involving more than 10,000 individuals, which found a prevalence of 7.9% for males and 19.7% for females: Pereda N, Guilera G, Forns M, Gómez-Benito J, The prevalence of child sexual abuse in community and student samples: a meta-analysis, 2009.
456 Gewirtz-Meydan, A., Finkelhor, D., Sexual Abuse and Assault in a Large National Sample of Children and Adolescents, 16 September 2019; Martin E, Silverstone P: How much child sexual abuse is "below the surface", and can we help adults identify it early, May 2013.
457Europol, Exploiting isolation: Offenders and victims of onlme child sexual abuse during the COVID 19 pandemic, 19 June 2020.
458 WePROTECT Global Alliance, World Childhood Foundation, Unicef, UNDOC, WHO, ITU, End Violence Against Children and UNESCO, COVID- 19 and its implications for protecting children online, April 2020.
459Europol, European Union serious and organised crime threat assessment, 12 April 2021. 460 Ibid. 461 NetClean,
'NetClean Report - COVID- 19 Impact 2020', accessed 14 April 2021. 462 Europol, Exploiting isolation: Offenders and victims of online child sexual abuse during the COVID
19 pandemic, 19 June 2020. 463 The number of child sexual abuse reports globally quadrupled in April 2020 (4.1 million reports)
compared to April 2019 (around 1 ntillion. as reported to the US National Centre for Missing and
With regard to the victims:
- a majority are female (girls are more than twice as likely to be abused as boys)464;
- one in every seven victims of sexual violence reported to law enforcement agencies is under 6 years old465;
- three out of four victims depicted in the images and videos are younger than 13 years old466.
With regard to the offenders:
- although prevalence data is scarce, studies indicate that around 3% of the male population could have a paedophilic disorder467;
- estimates suggest that only 50% of child sexual abusers have a sexual orientation towards children (paedophilia or hebephilia)468;
- studies suggest that up to 32% of high-risk offenders who view child pornography may re-offend469;
- 99.6% of people convicted in the US in 2019 for non-production CSAM offences (e.g. distribution) were men, with an average age of 41470.
2.1. Data on reporting by online service providers
Amount of reports
The past few years have seen a strong increase in reports of child sexual abuse online submitted by online service providers globally: from 1 million reports in 2010 to over 21 million in 2020:
464 Collin-Vézina, D., et al., Lessons learned from child sexual abuse research: Prevalence, outcomes, and preventive strategies, 18 July 2012, p. 6. See also: SVSolutions - Preventing Sexual Violence Against Children - Together For Girls, which analysed available data from 24 countries (primarily high- and middle-income countries) and found that sexual violence in childhood ranged from 8% to 31% for girls and 3% to 17% for boys.
465 Gewirtz-Meydan, A., Finkelhor, D., Sexual Abuse and Assault in a Large National Sample of Children and Adolescents, 16 September 2019, p. 2.
466 INHOPE, Annual Report 2019, p. 31.
467 In a self-report survey with a sample of 1,978 young adult males from Sweden, 4.2% reported they had ever viewed child sexual abuse material (Seto et al., 2015). In another self-report survey, with a sample of 8,718 adult males in Germany, 2.4% of respondents reported using that material (Dombert et al., 2016). Not all offenders have a paedophilic disorder (other motivations to offend include exploitation for financial gain), and not everyone who has a paedophilic disorder ends up being an offender (some people seek support in dealing with their paedophilia).
468 Fast, E., Paedophilia and sexual offending against children: Theory, assessment and intervention by M. Seto, 2010.
469 Eke, A., Seto, M., Williams, J., Examining the criminal history and future offending of child pornography offenders, 2011. The link between those who view Internet child pornography and those who commit CSA is unclear. Nonetheless, for high-risk CSA offenders, pornography appears to increase the risk of offending, according to a study of 341 offenders: Kingston, D., Pornography Use and Sexual Aggression: The Impact of Frequency and Type of Pornography Use on Recidivism Among Sexual Offenders, 2008.
470 United States Sentencing Commission, Federal Sentencing of Child Pornography (non-production offences), June 2021.
Figure 3: total reports submitted by online service providers, 2010-2020
[Chart in millions: 0.22 (2010), 0.33 (2011), 0.42 (2012), 0.51 (2013), 1.11 (2014), 4.4 (2015), 8.3 (2016), 18.46 (2018), 16.99 (2019) and 21.45 (2020); an increase of approximately +9 750% over the period.]
These reports included more than 65 million images and videos471. A report can contain multiple files, of various types (e.g. images, videos and text), and can concern one or several types of abuse (e.g. known material, new material, and grooming).
A similarly stark increase has occurred with reports concerning the EU (e.g. images exchanged in the EU, victims in the EU, etc.): from 23 000 in 2010 to more than 1 million in 2020:
Figure 4: EU-related reports submitted by online service providers, 2010-2020
[Chart in thousands: 17.5 (2010), 20.34, 24.28, 28.38, 52.96, 142.58, 270.69, 461.3, 722.98, 725, and 1 046.35 (2020); an increase of approximately +5 980% over the period.]
These reports contained more than 4 million images and videos472.
Breakdown by company
A single company, Facebook, submitted 95% of the reports in 2020. Five companies (Facebook, Snapchat, Google, Microsoft and Twitter) submitted 99% of all reports in that year473.
471 As reported to the US National Centre for Missing and Exploited Children (NCMEC). Its CyberTipline received a total of 65,465,314 files within reports in 2020.
472 As reported to the US National Centre for Missing and Exploited Children (NCMEC). Its CyberTipline received 4,265,151 files in the reports that resolved to European Union Member States.
473 National Centre for Missing and Exploited Children (NCMEC), 2020 Reports by Electronic Service Provider (ESP), accessed 20 April 2021. In 2019 the number was similar, 94%.
Figure 5: breakdown of reports submitted by online service providers in 2020
[Chart in thousands: Facebook 20 307, Google 547, with smaller volumes (288, 97, 65) from the other main reporting companies.]
There are currently 1 630 companies registered to report to NCMEC. In 2020, NCMEC received reports from 167 service providers, meaning that approximately 90% of providers registered with NCMEC made no reports at all. Of these 167 providers, around 80% made fewer than 100 reports.
There is no evidence that 95% of the current cases of child sexual abuse on online services occur on Facebook. In fact, experts suggest that comparable levels of abuse occur in similar services of other companies, and that the difference in detection levels is rather due to the different intensity of detection efforts474. This means that a substantial amount of child sexual abuse online remains undetected.
Content of reports
The most reported content is known material, followed by new material and grooming475:
Table 1: content of EU-related reports from online service providers in 2020476
| Type of child sexual abuse online | 2020 |
| All material (images and videos) | 4 265 151 |
| Known material (images and videos) | 3 736 985 |
| New material - images | 436 754 |
| New material - videos | 91 412 |
| Grooming | 1 453 |
474 The New York Times, Tech Companies Detect A Surge in Online Videos of Child Sexual Abuse, 7 February 2020; The Verge, As platforms get better at detecting child sexual abuse videos, they're finding more of them, 7 February 2020.
475 It is not possible to determine the exact amount of new and known videos. The amount of known images can be estimated based on the number of hits against the database of hashes of known images, and through that number the amount of new images can be estimated. There is not yet a similar database of video hashes at NCMEC, and therefore it is only possible to estimate the total amount of videos (known and new) received.
476 National Centre for Missing and Exploited Children.
Table 1 above describes the content of reports. The number of reports is in general higher, because a report can contain multiple types of child sexual abuse online (e.g. known images mixed with new ones, etc.), and the same file can be reported multiple times. For example, a set of images of children abused by a group of offenders has been reported to NCMEC almost 900 000 times since 2005. In another example, images of a child abused by a family member have been reported to NCMEC over 1 million times since 2003477.
The amount of new images detected increased more than tenfold, and the amount of grooming reports more than fivefold, from 2019 to 2020478. The COVID pandemic may explain these dramatic increases: as both children and perpetrators spent more time at home, the possibilities for grooming and new abuses increased, including through the production of self-generated material.
Box 3: grooming and self-generated material involving children
Abuse relating to self-generated sexual content/material involving children is common and features increasingly in investigations479. This content includes material that has been created as a result of grooming (i.e. an offender lures or extorts the child into producing that material), as well as material which, while originally voluntarily produced, is used or distributed in an exploitative or abusive way480.
76% of law enforcement authorities report that self-produced material resulting from grooming is a common or very common feature in investigations481, while 65% indicate that this is the case for self-produced material resulting from sextortion482. 98% of authorities indicate that such material is increasing483.
75% of children surveyed in a study in Finland had been asked to send explicit pictures of themselves, while almost 80% had been sent explicit images, and more than 1 in 10 experienced grooming on a weekly basis484.
There are also indications that the COVID-19 pandemic has significantly affected the frequency of self-generated sexual content/material involving children. In 2020, the Internet Watch Foundation confirmed 68 000 cases of self-generated imagery,
477 See the NCMEC's presentation (in particular minute 45:20) at an online event organised by the European Parliament Intergroup on Children's Rights on EU legislation on the fight against child sexual abuse online, 15 October 2020.
478 Ibid. In 2019, the amount of new images in EU-related reports was 39 614; in 2020, it increased by 1 003%. The amount of grooming reports was 240, and it increased by 505% in 2020.
479 NetClean, NetClean Report 2018, accessed 26 April 2021.
480 Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse, 28 January 2016.
481 NetClean, NetClean Report 2018, accessed 26 April 2021. See also Europol, European Union serious and organised crime threat assessment, 12 April 2021; and Internet Watch Foundation, '"Grave threat" to children from predatory internet groomers as online child sexual abuse material soars to record levels', 12 January 2021.
482 NetClean, NetClean Report 2018, last accessed 26 April 2021.
483 Ibid. The same study indicates that live-streaming of self-produced content is also a significant issue: 57% of law enforcement authorities report that induced self-produced live-streamed content is common or very common in investigations, while two thirds (67%) report that captures of what appears to have been voluntarily self-produced content are common or very common. Some respondents noted the difficulty in many cases of determining whether an image has been produced voluntarily or as a result of grooming or sexual extortion (for example, 'an image which appears to be voluntarily self-produced can easily be that of sextortion').
484 Save the Children Finland, 'Osa lapsista saa aikuisilta seksuaalissävytteisiä viestejä viikoittain - ainutlaatuinen selvitys lasten ja nuorten kokemasta groomingista julki' [Some children receive sexually suggestive messages from adults weekly - unique report published on the grooming experienced by children and young people], 26 April 2021.
representing 44% of the imagery on which the IWF took action that year, and an increase of 77% in comparison to 2019485. In the vast majority of cases (80%), the victims were girls between 11 and 13 years of age486. In addition, some law enforcement authorities have seen an increase during the pandemic in young people sharing self-produced material in exchange for money487.
Breakdown by type of service
The vast majority of reports (more than 80% in 2020, up from 69% in 2019) originate in interpersonal communication services (e.g. messenger applications such as Facebook Messenger, and email):
Figure 6: breakdown of reports by type of service in 2019 and 2020488
[Chart in thousands; categories shown: chat/messaging, umbrella account, file sharing, email, forum or message board, online gaming and other.]
In the case of grooming, 31% of reports in 2020 originated from a chat or messaging service, whereas 68% originated in social media or online gaming platforms that had messaging or chat capability.
Figure 7: breakdown of grooming reports by type of service in 2019 and 2020489
[Chart in thousands for 2019 and 2020, broken down into social media, chat or messaging, and online gaming.]
485 Internet Watch Foundation, ''Sexual abusers have never been so social': self-generated child sexual abuse prevention campaign', last accessed 21 April 2021.
486 Ibid.
487 NetClean, NetClean Report 2020, last accessed 26 April 2021.
488 The term "Umbrella Account" refers to a company that submits reports on behalf of its multiple products or services (e.g., a company that has file sharing, search engine, and social media products may file all reports under the same name). The term "Other" includes: hosts/providers, marketplace, advertising, adult sites, safety solutions (companies who offer moderation or monitoring services for other platforms) or moderation apps.
489 The terms "Social media" and "Other" refer to platforms that have messaging or chat capability. The term "Other" includes safety solutions (companies who offer moderation or monitoring services for other platforms).
2.2. Data on reporting by the public and hotlines
The number of reports from the public and hotlines is significantly lower than the number of reports from service providers. For example, in 2020 the 47 members of the INHOPE network of hotlines processed over 1 million (1 038 268) URLs, of which 267 192 were unique URLs containing CSAM490. In contrast, in 2020 service providers made a total of almost 21.5 million reports to NCMEC491, as indicated earlier.
According to a 2018 Eurobarometer survey, 6% of EU internet users have encountered CSAM492. However, a majority (59%) of users who encountered illegal content online reported that they took no action, while those who did take action were most likely to bring the content to the attention of the provider493.
In addition to the comparatively low volume of reports made by members of the public, there is also significant variation in the quality of reports. For example, in 2020 Germany's eco Complaints Office found that only two in five (40%) public reports relating to child sexual abuse online were justified494. In the same year, the Internet Watch Foundation (IWF) found that just 14% of reports of suspected CSAM from members of the public actually constituted CSAM495. In 2019, Hotline.ie, the national hotline for Ireland, found that while 85% of reports received were flagged by the reporter as suspected CSAM, just 24% were determined by the hotline's analysts to constitute CSAM496.
While the majority of hotlines focus solely on receiving reports from members of the public, a small number have begun in recent years to proactively search for CSAM online497. Proactive searches by hotlines have proven to be highly effective, leading to a
490 INHOPE, Annual Report 2020, 4 May 2021.
491 National Center for Missing and Exploited Children, '2019 Reports by Electronic Service Providers (ESP)', accessed 21 April 2021.
492 European Commission, 'Flash Eurobarometer 469: Tackling Illegal Content Online', September 2018.
493 Ibid.
494 eco Complaints Office, Annual Report 2020, 13 April 2021.
495 Internet Watch Foundation, Annual Report 2020, accessed 4 May 2021.
496 Hotline.ie, Annual Report 2019, 19 October 2020.
497 Currently, four hotlines search proactively for CSAM: the Internet Watch Foundation (UK), the Canadian Centre for Child Protection, NCMEC (US, through a pilot project) and Švarus internetas (Lithuania, in limited form). See: European Commission, Study on framework of best practices to tackle child sexual abuse material online, 2020.
substantially higher quality of reporting than public reports. In 2020, for example, the IWF found that while less than one in five (14%) reports of CSAM from members of the public were actionable, over four in five (87%) reports resulting from their analysts' proactive searches were actionable498. As a result, the overwhelming majority (87%) of all reports actioned by the IWF in 2020 resulted from proactive search499.
***
Despite the increasing volumes of CSA online being reported, it is not possible to determine exactly the actual amount of CSA online taking place at the moment. Given the hidden nature of the crime, it is likely that the reported cases are just the tip of the iceberg. To give an indication of the amount of CSAM in circulation: during the arrest of just one child sexual offender in Germany in 2019, the police confiscated 14 terabytes of CSAM, including more than three million photos and 86 000 videos500. And the takedown of a single darkweb forum ("Boystown") dedicated to exchanging CSAM showed that it had more than 400 000 registered users501.
498 Internet Watch Foundation, Annual Report 2020, accessed 4 May 2021.
499 Ibid.
500 DW, Child sex abuse at German campsite: How authorities failed the victims, 5 September 2019.
501 Europol, 4 arrested in takedown of dark web child abuse platform with some half a million users, 19 November 2021.
ANNEX 7: SAMPLE CASES OF CHILD SEXUAL ABUSE ONLINE IN THE EU
Sample cases in the EU that started with detection of images and/or videos
The following are actual, anonymised sample cases shared by law enforcement agencies in the EU. All the cases started with the detection of child sexual abuse images or videos on online services.
Austria
Case #1:
o Austrian law enforcement received in 2019 a report from NCMEC submitted by Facebook alerting of the distribution via Facebook Messenger of images and
videos of minors performing sexual acts.
o The investigation led to the identification of a Slovak citizen living in Austria
who forced minors through the threat of violence to produce images and videos of
themselves performing sexual acts and to send them to him. The material was
also distributed online to other users.
o The report led to the identification of all 30 victims. The suspect was arrested and
sentenced to five years of imprisonment.
Case #2:
o Austrian law enforcement received in 2019 a report from KIK Messenger
alerting of the distribution of child sexual abuse material.
o The investigation led to the identification of an Austrian citizen.
o The search of his house and further investigations revealed that he sexually abused his 2 year old daughter, who was rescued.
Case #3:
o Austrian law enforcement received in 2019 a report from Snapchat alerting of
the distribution of child sexual abuse material.
o The investigation led to the identification of an Austrian citizen who had forced
several female minors to produce nude images of themselves and provide them to
him, under the threat of making publicly available images and videos he made in
the bathroom of a soccer field while acting as a referee.
o The report led to the identification of a large number of victims.
Bulgaria
Law enforcement in Bulgaria received in 2018 a report from the National Child
Exploitation Coordination Centre alerting of the distribution of child sexual abuse
material through KIK Messenger.
The report led to a criminal investigation in which two mobile phones from a suspect were seized, containing 517 video files with child sexual abuse material.
The material included videos with brutal scenes of child sexual abuse with a child
around 2 years old.
Czech Republic
Law enforcement in the Czech Republic received in 2017 a report from NCMEC
alerting of the distribution of child sexual abuse material by email, initiated by
Google.
The report led to a criminal investigation in which a 52 year old man was arrested
following a house search, where additional child sexual abuse material was found.
This person had abused 2 girls and recorded the abuse. The 2 girls were identified
and rescued.
Denmark
Case #1:
o Following reports from KIK alerting of the distribution of child sexual abuse material through KIK Messenger, Danish authorities arrested a Danish national in his forties with no criminal record.
o During preliminary examination of his mobile phone, Danish police found several recordings of himself abusing his 10 year old daughter.
o The 10 year old victim was rescued and the suspect is undergoing criminal proceedings.
Case #2 - Operation Umbrella502:
o Facebook reported to the National Center for Missing and Exploited Children (NCMEC) the distribution of videos via Facebook Messenger503 depicting a Danish boy and a girl engaged in sexual activity.
o NCMEC forwarded the case to Denmark via Europol.
o Over 1 000 people had distributed the videos to one or more people via Facebook Messenger and were charged with distribution of child pornography. This operation, still ongoing, is the single largest operation ever against child sexual abuse in Denmark.
Estonia
Law enforcement in Estonia received in 2017 a report from NCMEC alerting of the
distribution of child sexual abuse material by email.
The report led to a criminal investigation in which a person was arrested for
exchanging and possessing child sexual abuse material.
France
Case #1:
502 Europol, Internet Organised Crime Threat Assessment, 18 September 2018, p. 32.
503 The case was also reported in the media (in English).
o French police received in 2018 an NCMEC report submitted by Facebook alerting of the distribution of child sexual abuse material via Facebook Messenger.
o The investigation revealed that the offender provided PlayStation codes to young boys in exchange for child sexual abuse material.
o The offender was arrested. There were around 100 victims.
Case #2:
o French police have received a number of cases from NCMEC submitted by KIK alerting of the distribution of child sexual abuse material via KIK Messenger.
o The cases typically involve multiple offenders (up to 20 offenders per case).
o The cases have led to multiple arrests.
Germany
German Federal Police received a NCMEC report in July 2019 submitted by Facebook alerting of the distribution via Facebook Messenger of material showing the sexual abuse of a very young girl.
The NCMEC report also indicated that the material could have been recently
produced.
The report led to a criminal investigation and a house search in which a suspect was charged with abusing his 4 year old daughter and his 10 year old son, who were rescued and safeguarded.
Greece
Greek police received two NCMEC reports submitted by Yahoo! informing about a
user who exchanged child sexual abuse material via Yahoo!'s messenger service.
The house search of the offender revealed that he was also in contact, via Skype, with individuals (mothers of underage children) in the ASEAN region and was sending money to them so they would send him indecent pictures of their underage children.
The ASEAN authorities were notified of all the details.
Ireland504
Law enforcement in Ireland received in 2013 a report from NCMEC alerting of the
distribution of child sexual abuse material by email.
The material was detected by Microsoft when Matthew Horan used a Gmail account to send child sexual abuse material to an email address on Microsoft's platform.
The report led to an investigation in which it was discovered that Horan had been sexually exploiting children.
Irish police identified six victims in Ireland as a result of the investigation.
504 The case was also reported in the media.
Romania505
Romanian police received in 2016 a NCMEC report submitted by Facebook
concerning child sexual abuse material exchanged via Facebook Messenger.
The investigation revealed that a mother had been abusing her 9 year old daughter for more than a year and sent the material generated in the sexual abuse to her
boyfriend (not the father of the girl) in England.
The mother was arrested and her daughter was rescued.
Sweden
Case #1:
o Swedish police received a NCMEC report alerting that one person had shared on Facebook Messenger two child pornographic images of material known to the police.
o Swedish police carried out a search at the suspect's home and found child sexual abuse material on hard drives.
o The material included the suspect abusing his stepdaughter, who was rescued in the operation.
o The suspect was sentenced to nine years in prison for, among other things, gross rape against children.
Case #2:
o Swedish police received a report from the National Child Exploitation Coordination Centre in Canada in which a person was sharing child sexual abuse material through KIK Messenger.
o A house search was conducted in which child sexual abuse material was found.
o Thanks to the investigation, nine Swedish children were identified.
o The suspect was sentenced to four years in prison for different child
pornography offenses.
Case #3:
o Swedish police received a NCMEC report submitted by Facebook concerning child sexual abuse material exchanged via Facebook Messenger.
o The investigation revealed that a female suspect was producing child sexual
abuse material with the children of her romantic partners and sharing it with
another male.
o Further investigation revealed a network of two other female producers and
three male consumers of child sexual abuse material.
o 11 victims were identified and rescued, ranging from ages 2 to 14 when the
crimes occurred, out of more than 50 victims in total.
505 The case was reported in the media; see here and here.
Spain
Law enforcement in Spain received a report from NCMEC alerting of the distribution
of child sexual abuse material by email.
The investigation by law enforcement in Spain led to the arrest of one person, who
actively shared online with other child sex offenders the child sexual abuse material
he produced. The person arrested produced that material by abusing children within his family circle.
Given the gravity of the situation, law enforcement focused on locating the victims,
eventually rescuing 2 children within the family circle.
Sample cases in the EU that started with detection of online solicitation
The following are actual, anonymised sample cases of online solicitation in the EU that service providers reported to NCMEC.
Austria
An adult man enticed an 11-year-old female child via an online chat service to
produce and share sexually explicit images. An adult man enticed a 12-year-old female child via an online chat service to
produce and share sexually explicit images. Chat logs submitted with the report showed the man threatened the child he would notify police if she did not send
explicit images and videos. Fearing this threat, the child produced additional
content and sent it to her exploiter. A 45-year-old man enticed a 13-year-old male child via online private messaging to engage in sexual activity. Chat logs submitted with the report showed the man
was talking to the child about leaving the country and making plans to meet the
same weekend the report was made to NCMEC. The man was in a position of
authority as a coach and talked about wanting to adopt and marry the child.
Belgium
A 21-year-old man enticed a 14-year-old female child via an online private
messaging service to produce and share sexually explicit images. Chat logs submitted with the report indicated the man previously had enticed the child to
meet in person so that he could exploit her by engaging in sexual activity.
A 15-year-old used an online platform to traffic his 9-year-old girlfriend for sexual abuse and exploitation. His reported profile stated:
"I'm looking for a pedophile who wants to **** my 9 year old girlfriend and want her to..."
An adult man used an online chat feature to entice six female children and sent
them graphic images of himself engaged in sex acts. At least one of these children
was enticed to create and send an explicit image of herself to the man, who then demanded she produce and send more images. When she declined, the man threatened to harm her, saying he "knows where she lives".
A 51-year-old man used a messaging service to entice a 13-year-old male child to
produce and share sexually explicit content of himself. Chat logs submitted with
the report indicated the man was the child's uncle, had direct access to him, and
discussed specific sexual acts with the child. The chat also indicated the uncle was
offering the child money in exchange for sending sexually explicit files.
Croatia
A 48-year-old man used an online chat service to entice a 14-year-old female child
to produce and share sexually exploitative images of herself. The man also enticed
her to sexually abuse her 11-year-old sister and said he wanted to meet in person to
abuse her. Chat logs provided with the report show the child victim disclosing that
she used force to abuse her younger sister, specifically stating the following:
"I did, but I had to do it by force. She was fighting me... she cried"
"he screamed"
Cyprus
An adult man used the chat function on an online gaming platform to engage in
sexually exploitative conversation with another adult gamer about his 13-year-old
daughter. The man provided the other adult gamer with his daughter's screen name
on another chat platform so the other man could contact the child to "seduce" her.
A 41-year-old man from Cyprus enticed a 15-year-old child victim from Moldova
to produce and send sexually exploitative imagery of herself. Chat logs submitted
with the report indicated the man previously had enticed the child to travel to
Cyprus so he could exploit her through sexual activity.
Czech Republic
A 29-year-old man used a private messaging platform to entice a 14-year-old female victim to produce and share sexually exploitative images of herself. Chat
logs submitted with the report indicated the man previously had enticed the child to
meet in person so he could sexually exploit her. The man lived close to the child
and was making plans to meet her so he could continue to sexually abuse her.
A 29-year-old man enticed five child victims between the ages of 8 and 12 years old. The man enticed two of the children to engage in sex acts, including bestiality, with each other. He enticed another victim to sexually abuse her 3-year-old sibling. Chat logs submitted with the report indicated the man offered money or expensive
gifts to the victims to entice them into producing and sharing the sexually exploitative images.
Denmark
An adult man used a platform's chat function to send sexualized messages about
children to another adult. Chat logs submitted with the report indicated the man
planned to sexually abuse his 13-year-old daughter who was intoxicated at the time.
A 41-year-old man in the United States enticed multiple children under the age of
13 to produce and send sexually exploitative imagery of themselves. This man was
communicating online with a 20-year- old man from Denmark and the two men
discussed trading sexually exploitative images. At least one child, a 9-year-old female child, was coerced to engage in sexual activity over a video call after being threatened that she would be publicly exposed if she refused.
Estonia
An adult male created and used multiple online accounts to entice over 12 children, some as young as 9 years old, to produce and share sexually exploitative imagery. Chat logs submitted with the report indicated that in some cases the man offered to pay the children in exchange for initial images and then coerced them to send additional images by threatening to publicly expose their images online.
Finland
An adult enticed numerous child victims in Finland, Lithuania, Norway, the United
Kingdom, and the United States to produce and send sexually exploitative imagery
children by threatening to send the images to the children's families unless they continued producing and sending additional images. Chat logs submitted with the
report indicated the adult also was sharing child sexual abuse material with other
adults online.
An adult man used an online messaging service to engage in sexualized
conversations about children with another adult. The man made multiple statements indicating he had sexually abused his young daughter on multiple occasions and had shown her pornography since she was an infant. Chat logs submitted with the report detailed the man's plans to continue sexually abusing his
daughter.
France
A 46-year-old man enticed a 15-year-old female child to meet in person for sexual
activity. The man also disclosed he was sexually molesting his minor daughter. A 36-year-old man used a platform's messaging service to entice a 14-year-old female child to engage in sexual activity. Chat information provided with the report indicated the man was the child's uncle and had direct access to her.
A 38-year-old man in a position of trust as a youth football coach used a platform's
messaging service to entice a 13-year-old female child to meet for sexual activity. Chat logs submitted with the report indicated the man was a friend of the child's
father and had frequent access to her during weekend visits.
A 48-year-old man enticed a female child to meet for sexual activity. Chat
information submitted with the report indicated the man was the child's stepfather
and provided the child with a location where they could meet in secret so that he
could sexually exploit her.
A 28-year-old man enticed a 14-year-old female child to meet for sexual activity. Chat logs submitted with the report indicated the man was the child's half-brother
and had direct access to the child victim.
An adult man enticed several female children between the ages of 14 and 17 to
produce and share sexually explicit images. After the suspect coerced the children
to produce images, he blackmailed them to produce and send additional content by
threatening to publicly expose the initial images he had received. Chat logs
provided with the report included the following statements showing the severe
distress of the children as the man blackmailed them to produce increasingly
egregious content:
"...you really want to ruin my life"
"I've already tried to commit suicide please don't start again"
"...going to destroy my life"
"I want to die"
"I'm going to kill myself"
A 42-year-old man used a platform's private chat function to entice a 12-year-old female child to engage in sexual activity. Chat logs submitted with the report indicated the man was in a relationship with the child's mother, had direct access to the child, and already had exploited her by forcing her to engage in painful sexual activity:
"I can't anymore with your... your Mom and I are done ok"
"We should do it softer... it causes some bleeding usually the first time"
"Wait mom is up... erase everything"
A 36-year-old man used a platform's messaging service to entice a 14-year-old female child. Chat logs submitted with the report indicated the man was a school
teacher in a position of trust and with access to children. Chat logs submitted with
the report indicated the man already had met and sexually abused the child and was
trying to make plans for future meetings. A 46-year-old man used a platform's messaging service to entice a 13-year-old male child to produce and share sexually explicit content. Chat logs provided with
the report indicated the man was the child's uncle, had direct access to the child, and had sexually molested the child on multiple occasions. Chat logs also indicated
the man was coercing the child to meet him in isolated areas of the home so he
could sexually exploit him when no one else was home.
A 42-year old man used a private messaging service to entice a 13-year old female
child to engage in sexual activity. Chat logs submitted with the report indicated the
man had previously enticed the child to meet and had sexually abused her.
A 32-year-old man used a platform's messaging service to entice a 13-year-old male child to produce and share sexually explicit content. Chat logs submitted with
the report indicated the man had enticed the child to sexually abuse his 9-year old
brother and directed him to continue the abuse, as indicated by the following statements:
"Go to him in the room"
"Tell him he should open your pants"
"So you don't want to take the virginity of your brother"
"Tell him to give you a blowjob"
"Come on dare to take your brother's virginity and then you were the first who had"
A 32-year-old man used multiple online personas to entice female child victims to
engage in sadistic sexual conversations and produce and share sexually explicit
imagery of themselves. Chat logs provided with the report indicated the man also
was communicating with an 18-year-old woman whom he paid to produce imagery of her sexually abusing her infant child.
Greece
A 50-year-old man enticed a 14-year-old male child to produce and send sexually
exploitative imagery. Chat logs submitted with the report indicated the man had
enticed the child to meet in person on previous occasions and had sexually abused him.
The man also referred to having made videos of himself sexually abusing the child.
A 29-year-old man used a platform's messaging services to entice a 13-year-old female child to engage in sexual acts. Based on the report, it appeared the man had
previously enticed the child to meet and sexually abused her and the two lived in
close proximity to one another.
A 40-year-old man used a platform's messaging service to entice a minor female
child to meet for sexual activity. Information submitted with the report indicated
the man lived in close proximity to the child and knew the child as a friend of her
family. A 41-year-old man used a platform's messaging service to entice a 12-year-old female child to produce and share sexually explicit content. Chat logs submitted
with the report indicated that after coercing the child to send initial images, the
man began to blackmail her to produce and send additional content. The man
threatened to spread the child's images online if she did not comply and threatened
that she had no option but to send more images:
"I have already saved it on my phone so if you don't obey I post it on the web"
"If you do what I say I won't spread your photos on the internet"
"Oh and you can forget about threatening me with the police, I don't care"
"I'm not afraid of the police, I will upload your photos 1000 times by the time the hearings end"
Ireland
A 29-year-old man used a platform's messaging service to entice a 15-year-old female child to meet and engage in sexual activity. Chat logs submitted with the
report indicated the man lived in close proximity to the child and previously had
enticed her to meet in person and sexually abused her. The man also sent several
messages to the child urging her to keep their relationship secret because he would
go to jail if her parents found out.
Italy
A 27-year-old man enticed a 12-year-old female child to produce and share
sexually exploitative imagery. After the man obtained initial images from the child, he blackmailed her to create and send additional content by threatening to expose her images publicly. Information provided by the reporting company also indicated
the man had direct access to children, including his minor daughter.
Latvia
An adult used a platform's chat room service to entice three children between the
ages of 8 and 15 years old. Chat logs submitted with the report referred to the
victims appearing nude and the adult's desire to meet the children in person.
Lithuania
An adult male used a platform's chat feature to entice a 12-year-old male child
for sexual activity. Chat logs submitted with the report detailed the man pressuring the child to expose himself in various degrees of nudity and to engage in sexual
acts on camera for the man.
Luxembourg
The parent of a 15-year-old child in Luxembourg reported that their child was being enticed into a sexual relationship by an adult man in the United States using a social media platform's chat feature.
An adult used a platform's messaging service to entice a 15-year-old female child
to produce and share sexually explicit images of herself.
Malta
A 20-year-old man used a platform's chat service to entice a child to produce and
send sexually exploitative images. The child disclosed the following information:
"we started chatting, he pretended to be a girl, then he started send1lg live picsげ 訪is girl. he is actually a boy so琉is was allルlse. then he insisted I send him nudes
wi琉myルce and琉reating to release my o琉er nudes. I sent him one and now he
has my nudes is is琉reating to send琉em to everyonel肋ow. please he加me as
soon as Possible.'' A 30-year-old man used a platform's messaging services to entice a 15-year-old female child to produce and share sexually explicit content. The man threatened the
child:
"You have to do as 'sの"iかou don't want to get exposed" "Otherwise l will show everyone your nudes"
Netherlands
A 61-year-old man used a platform's messaging service to entice multiple male
children to produce and share sexually explicit imagery. Chat logs provided with
the report spanned several years and information provided in the report indicated
the man was a school teacher and therapist in a position of trust with direct access
to children. The man coerced the victims to engage in specific sexual acts,
including anally penetrating themselves with foreign objects and also asked several
victims if they had access to younger siblings. The man at times groomed the boys
by pretending to be a teenage girl or a football recruiter assessing the children's
physical fitness by requesting images:
"Do you see your brother of 12 ever naked?"
"1. Everything we talk about, so the fact that I'm going to scout you, stays between us. It stays between us as long as I or another scout is coming to visit you at a match. So no telling trainer, parents or friends. You have to promise that... 2. We try a cam session where I interview you and do a body check and different tests. You have to be in a room alone. Is that possible?"
"show semen in front of the cam"
Poland
An 18-year-old man used a platform's messaging services to entice an 11-year-old female child to create and share sexually exploitative images. After the man enticed the child to create the initial explicit images, he continued to coerce and threaten the child to create additional images by threatening to publicly expose her.
A 56-year-old male used a platform's messaging service to entice a 15-year-old female child. Chat logs submitted with the report indicated the man asked the child
if she enjoyed having sex and whether she performed oral, vaginal, and anal sex.
Additional information submitted with the report indicated the man lived in close
proximity to the child and had been trying to entice her over chat to meet in person so he could sexually abuse her.
A 43-year-old man used a platform's messaging service to entice a 16-year-old male child to produce and share sexually explicit content. Chat logs submitted with
the report indicated the man had enticed the child to sexually abuse and produce
exploitative images of his 12-year-old brother. Chat logs submitted with the
reports indicated the man was a success coach in a position of authority and with
direct access to children.
Romania
A 23-year-old woman in Romania used a platform's chat service to coordinate
transporting a 13-year-old child victim to an 83-year-old man in Germany so the
man could sexually abuse the child in exchange for financial compensation. Chat
logs submitted with the report indicated that the woman had access to multiple female children between the ages of 10 and 16 years old, but the 13-year-old child
victim was selected because she was still a virgin:
"parents to the 13-year-old virgin wants me to give them money before don't trust to give up the girl without giving them money"
"I have the virgin is the 13 year old girl her parents want 5000"
"5000 for the girl and you give us and new a credit ok because the girl is virgin you can do with her whatever you want"
Slovakia
A 21-year-old Austrian man enticed multiple female children in Slovakia to
produce and send sexually exploitative images of themselves over several years. After the man obtained initial images, he would threaten to publicly expose the
child to coerce them to create and send additional, and often more egregious, sexual images. One child was coerced to record video of her sexually abusing a
younger sister. Two other children expressed suicidal thoughts due to their severe
distress while being blackmailed. The man also threatened the children not to
disclose the exploitation to trusted adults or law enforcement by telling them he
would have them institutionalized or taken away from their families:
"Just so you know, I told them that you suffer from mental illness and that you offered me sexual services and that parents cannot take care of you, you will go into a kids shelter"
Slovenia
A Slovenian man used the chat service on an online gaming platform to send sexually exploitative messages regarding children, including that he had sexually molested a child and raped "little kids."
Spain
A 22-year-old Spanish man enticed a 14-year-old female child in Chile to produce and send sexually exploitative images of herself. After the man obtained the
images, he blackmailed the child to produce and send additional exploitative images by threatening to "ruin her life" and disseminate her sexually explicit images publicly. Chat logs submitted with the report indicated the enticement and blackmail caused the child severe distress, and she stated multiple times that she would kill herself if the images were released. Two apparent adult women used a platform's chat service to engage in sexualized
conversations about children. One of the women disclosed she had sexually molested her 10-year-old daughter on multiple occasions and provided details of
the abuse at the request of the woman she was chatting with.
Sweden
A 31-year-old man used a platform's private messaging service to entice a 14-year- old female child to engage in sexual activity. Chat logs submitted with the report indicated the man already had enticed the child to meet in person and had sexually abused her and also indicated the man had produced a child sexual abuse video by
recording his exploitation of her.
ANNEX 8: TECHNOLOGIES TO DETECT CHILD SEXUAL ABUSE ONLINE
This annex provides additional information on technologies to detect child sexual abuse online, i.e. known material, new material and grooming506.
The examples given below are some of the most widely used; this is not intended to be an exhaustive listing. Many of these tools are made available to service providers, law enforcement and other organisations where a legitimate interest can be shown. Typically, these tools are combined with human review to ensure the maximum possible accuracy.
General considerations
- These technologies answer the question "is this content likely to be child sexual abuse, yes or no?", not the question "what is this picture about? what is this conversation about?" In other words, the tools look for specific indicators of possible child sexual abuse.
- Error rates: given the costs (e.g. human moderation, legal redress) and the reputational risks for service providers, providers have an incentive to ensure that the error rate is as low as possible before they use these technologies. High error rates (e.g. incorrectly flagging as child sexual abuse content that is not) would be quickly detected in the current system by NCMEC and/or law enforcement in the EU as the ultimate recipients of the reports.
- Human moderation: human review reduces the error rate to close to zero. It is already typically in place even for the most accurate technologies such as hashing.
1. Known child sexual abuse material
Technologies used to detect known child sexual abuse material are typically based on hashing. Hashing technology is a type of digital fingerprinting. Many variations and implementations of hashing technology exist, including Microsoft's PhotoDNA507, which is the most widely used tool of this type.
PhotoDNA has been in use for more than 10 years. It was developed by academics at Dartmouth College in cooperation with Microsoft. While the original PhotoDNA detects known CSAM in images, a version for detecting CSAM in videos is also available508.
PhotoDNA works as follows509:
Detection:
1) The tool first identifies images above a certain size. The tool focuses on images only and ignores text, i.e. it does not read the body of the email or extract any other information transmitted in the one-to-one message (it does not recognise faces in the images, or other contextual information). In other words, it does not answer the question "what is this message about?" but the question "is this image known?"
2) The tool then creates a unique digital signature (known as a "hash") of the image (see figure below)510, through the following process:
- Convert a full-resolution color image (top) to grayscale and lower resolution (bottom left);
- Use a high-pass filter to highlight salient image features (bottom center); and
- Partition the high-pass image into quadrants, from which basic statistical measurements are extracted to form the PhotoDNA hash (bottom right).
This hash is unique and irreversible (the image itself cannot be re-created from the hash).
Figure 1: hashing process
507 Microsoft, PhotoDNA, accessed on 14 May 2021.
508 Microsoft, How PhotoDNA for Video is being used to fight online child exploitation, September 2018.
509 See here for a visual explanation of how these technologies work (minute 24:28) by Professor Hany Farid, who led or co-led the creation of Microsoft's PhotoDNA to detect known images and of Microsoft's grooming detection tool. See here for a visual explanation of how PhotoDNA works.
510 Farid, H., Reining in online abuses, Technology and Innovation, 2018.
3) Matching:
- The hash is compared with those in a database of hashes of known child sexual abuse material. If the image hash is not recognised, no information is kept.
- The main and largest database of hashes (around 1.5 million) is held by the National Center for Missing and Exploited Children, a public-interest, non-governmental organisation established by US Congress in 1984 to facilitate detection and reporting of child sexual abuse material.
- The criteria for an image to be converted into a hash and added to the database of the National Center for Missing and Exploited Children are the following:
  - Children (prepubescent or pubescent) engaged in sexual acts. The sexual contact may involve the genitals, mouth, or digits of a perpetrator; or it may involve contact with a foreign object.
  - An animal involved in some form of sexual behaviour with a pre-pubescent child.
  - Lewd or lascivious exhibition of the genitalia or anus of a pre-pubescent child.
  - Images depicting pubescent children contain children that have been identified by law enforcement (thereby ensuring that they are actually minors).
Every hash has been viewed and agreed upon as being child sexual abuse material by two different experts at the National Center before it is included in the database.
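Reusing the hypothetical perceptual_hash and is_match helpers from the sketch above, the matching step could then be expressed along these lines. The in-memory list stands in for NCMEC's vetted database; the review-and-report step is only indicated in a comment:

```python
from typing import List, Optional
import numpy as np

def match_against_database(upload_path: str,
                           known_hashes: List[np.ndarray]) -> Optional[int]:
    """Return the index of the matching known hash, or None if no match."""
    h = perceptual_hash(upload_path)        # helper from the sketch above
    for idx, known in enumerate(known_hashes):
        if is_match(h, known):
            return idx                      # match -> human review, then report
    # If the image hash is not recognised, no information is kept.
    return None
```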
PhotoDNA has a high level of accuracy511 and has been in use for more than 10 years by over 150 organisations globally512, including service providers (Microsoft, Facebook, Twitter, Apple513), NGOs (e.g. NCMEC, the Internet Watch Foundation) and law enforcement in the EU (e.g. Europol, DE, SE and others). In these 10 years, the tool has been used daily and has analysed hundreds of billions of images without any accuracy concerns being identified.
Other examples of hashing technology used for these purposes, and operating on similar principles, include YouTube CSAI Match514 and Facebook's PDQ and TMK+PDQF515. In addition to these implementations of hashing technology used specifically to detect known CSAM, other variations are used in a range of applications, including the detection of malware516 and copyrighted content517.
2. New child sexual abuse material
Technologies currently used for the detection of new CSAM include classifiers and artificial intelligence (AI). A classifier is any algorithm that sorts data into labelled classes, or categories of information, through pattern recognition.
Examples of classifiers include those that can detect nudity, shapes or colours. Classifiers need data to be trained on, and their accuracy improves the more data they are fed.
"llThe rate of false positives is estimated at no more than i in 50 billion, based on testing (Testimony of
Hany Fanid, PhotoDNA developer, to House Committee on Energy and Commerce Fostering a Healthier Internet to Protect Consumers, 16 0ctober 2019).
512 Microsoft provides PhotoDNA for free. Organisations wishing to use PhotoDNA must register and
follow a vetting process by Microsoft to ensure that the tool is used by the right organisations for the exclusive purpose of detecting child sexual abuse material. The tool can be used to detect child sexuai abuse material in various services (e.g. hosting, electronic communications) and devices (e.g. by law enforcement to detect known child sexual abuse material in a suspect's device).
513 More information is available here. 514 YouTube CSAI Match 515 0pen-Sourcing Photo- and Video-Matching Technology to Make the Intemet Sa fer 516 Si korsk, Michael and Honig, Andrew, Practical Malware Analysis, February 2012; Kapersky Daily,
'The Wonders of Hashing '
, 10 April 2014. 517 TechCrunch,
'How Dropbox Knows When You're sharing Copyrighted s tuff (Without Actually Lookina At YourS tff)'. 30 March 2014.
281
Thorn's Safer tool518 is one example of industry's ability to detect child sexual abuse material. Safer can be deployed by a company as a modular solution to identify, remove, and report child sexual abuse imagery. A company using Safer can utilize the tool's hash-matching technology to identify known CSAM, and can choose to expand detection by utilizing the tool's machine learning classification model, which can detect both known and potentially new, unreported CSAM. This classifier, developed by Thorn and integrated into Safer, returns a prediction for whether a file is CSAM and has been trained on datasets totalling hundreds of thousands of images. It can aid in the identification of potentially new and unknown CSAM.
Thorn's CSAM Classifier can be set at a 99.9% precision rate519. At that precision rate, 99.9% of the content that the classifier identifies as CSAM is CSAM, and it identifies 80% of the total CSAM in the data set. With this precision rate, only 0.1% of the content flagged as CSAM will end up being non-CSAM. These metrics are very likely to improve with increased utilization and feedback.
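As an illustration of the precision/recall trade-off these figures describe, the sketch below picks a decision threshold for a generic scoring classifier so that it meets a target precision on labelled validation data. The scores, labels and the 0.999 target are placeholders; this is not Thorn's implementation:

```python
import numpy as np

def threshold_for_precision(scores: np.ndarray, labels: np.ndarray,
                            target: float = 0.999) -> float:
    """Lowest score threshold whose flagged set still meets the target precision.

    scores: classifier outputs in [0, 1]; labels: 1 = positive, 0 = negative.
    """
    order = np.argsort(-scores)            # most confident items first
    tp = fp = 0
    threshold = 1.0
    for i in order:
        tp += int(labels[i] == 1)
        fp += int(labels[i] == 0)
        if tp / (tp + fp) < target:        # adding this item breaks precision
            break
        threshold = float(scores[i])       # still above target: lower the bar
    return threshold
```

At the stated operating point, the arithmetic works out as follows: of every 1 000 files the classifier flags, roughly 999 are CSAM (99.9% precision, i.e. 0.1% false positives among flags), while about 20% of the CSAM present in the data set remains unflagged (80% recall). Raising the threshold trades recall for precision, and vice versa.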
Other tools making use of classifier and AI technology to detect previously unknown CSAM include Google's Content Safety API520 and Facebook's AI technology521.
In some cases, the search for new CSAM is undertaken only if known CSAM has been found with that user. In this case, once the known CSAM is identified on an account, classifiers are used to assess the content of the account to identify whether it has a high probability of containing CSAM. In other cases, the search for new CSAM with classifiers is undertaken in parallel to the search for known CSAM522.
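The two deployment strategies just described can be sketched as follows. The helper callables matches_known_hash and classifier_score, and the 0.99 cut-off, are hypothetical stand-ins for a provider's actual components:

```python
from typing import Callable, Iterable, List, Tuple

THRESHOLD = 0.99  # hypothetical classifier cut-off

def scan_triggered(upload: str, account_content: Iterable[str],
                   matches_known_hash: Callable[[str], bool],
                   classifier_score: Callable[[str], float]) -> List[Tuple[str, str]]:
    """Strategy 1: run the classifier over an account only after a hash hit."""
    flagged: List[Tuple[str, str]] = []
    if matches_known_hash(upload):
        flagged.append((upload, "known"))
        for item in account_content:       # classify the rest of the account
            if classifier_score(item) > THRESHOLD:
                flagged.append((item, "suspected-new"))
    return flagged

def scan_parallel(upload: str,
                  matches_known_hash: Callable[[str], bool],
                  classifier_score: Callable[[str], float]) -> List[Tuple[str, str]]:
    """Strategy 2: run the hash match and the classifier on every upload."""
    if matches_known_hash(upload):
        return [(upload, "known")]
    if classifier_score(upload) > THRESHOLD:
        return [(upload, "suspected-new")]
    return []
```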
3. Grooming (solicitation of children for sexual purposes)
Tools for the detection of grooming in text-based communications make use of technologies solely to detect patterns that point to possible concrete elements of suspicion of online child sexual abuse, without being able to deduce the substance of the content. While not identical in function, these tools use technology similar to that used in spam filters523.
Tools of this type include the tool developed under Microsoft's Project Artemis524, developed in collaboration with The Meet Group, Roblox, Kik and Thorn.
The technique is applied to text-based chat conversations. Conversations are rated on a series of characteristics and assigned an overall probability rating, indicating the
Thorn's Sa fr tool.
Data from bench tests.
Fighting child sexual abuse online See ll壁旦 and ll壁旦 for more information on Facebook's tool to proactively detect child nudity and
previously unknown child exploitative content using artificial intelligence and machine learning. See for example, How WhatsApp Helps Fight Child Exploitation. Examples of behavioural classifiers used are the speed/amount of users that join and leave a group, the frequency of group name change, or whether the group contains members previously banned. For more information about content spar filters see here and here and for other spar filters see hn旦 hn旦 and here.S pam filters are usually run with the receiving end-user's consent.So m spar filters look only at the subject line of the email. Microsoti shares new technique to address online grooming of children for sexual purposes
m m
伽 m
m 切
524
282
be
釦r
its
estimated probability that the conversation constitutes grooming. These ratings can used as a determiner, set by individual companies, to address fiagged conversations additional review.
deployment of this tool in its services
283
Microsoft has reported that, in its
accuracy is 88%.
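As an illustration of the general approach, the sketch below trains a generic text classifier that assigns a conversation an overall probability rating. It is a minimal stand-in with placeholder training data, not Microsoft's actual Project Artemis technique, and the review threshold is an assumed parameter that each company would set for itself.

```python
# A minimal sketch of probability-rating conversations, assuming a generic
# TF-IDF + logistic regression pipeline; real systems rate many more
# characteristics and are trained on large labelled corpora.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data: 1 = previously flagged by moderators, 0 = benign.
conversations = [
    "synthetic example of a conversation pattern flagged by moderators",
    "another synthetic example of a flagged conversation pattern",
    "friends arranging to watch a football match tonight",
    "colleagues discussing tomorrow's project deadline",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(conversations, labels)

REVIEW_THRESHOLD = 0.8  # a determiner each company would choose
rating = model.predict_proba(["a new conversation to be rated"])[0][1]
if rating >= REVIEW_THRESHOLD:
    print(f"overall rating {rating:.2f}: route for additional review")
```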
ANNEX 9: ENCRYPTION AND THE FIGHT AGAINST CHILD SEXUAL ABUSE
Overview
This annex provides further information on the role of encryption in the dissemination of child sexual abuse material and the grooming of children, to explain the rationale behind the measure obliging companies to detect child sexual abuse (CSA) regardless of the technologies employed, including encryption. It outlines the different instances where encryption is encountered in the context of the fight against child sexual abuse, and the challenges it may pose to detecting instances of child sexual abuse and combating this crime. This annex reports on developments in the EU and more broadly, and gives an understanding of the Commission's work on the different aspects of the problem.
The shift towards greater interactions and activities in the online space has resulted in the widespread and increasing use of different forms of encryption to safeguard web browsing, interpersonal communications, live-streaming video chats and private messaging, and to safeguard data in online and offline storage solutions. Encryption has become an indispensable tool for the protection of fundamental rights, including privacy, confidentiality of communications and personal data525. It provides a secure means of communication for journalists, dissidents and vulnerable groups526 and is essential in securing digital systems and transactions527. All this puts encryption at the heart of digital security, fuelling developments in this area of technology and in others that rely on it.
However, if used for criminal purposes, encryption can mask the identity of offenders, hide the content of their communications, and create secure channels and storage where perpetrators can hide their actions, including the trading of images and videos of illegal content. During a high-level dialogue, law enforcement and the judiciary noted528 that encryption has pervaded the vast majority of their caseload and has impacted the ability to gain lawful access to electronic evidence in between 25% and 100% of their cases, depending on the crime area. They estimated that the use of encryption technology by criminals will continue to increase. Europol's 2020 Internet Organised Crime Threat Assessment (IOCTA)529 highlighted encrypted communication as the biggest issue that has frustrated police investigations in recent years.
Children are vulnerable to multiple risks whilst online, including being groomed, coerced into producing self-generated imagery for the abuser's consumption, and blackmailed into meeting abusers in person. The material produced is often re-shared and used as currency by perpetrators to join online abuser platforms.
525 Existing European Union legislation specifically refers to the use of encryption as a possible measure to ensure an appropriate level of security for the protection of fundamental rights and for strengthening cybersecurity: Articles 32(1), 34(3), 6(4e) and recital (83) of Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC; recital (60) and article 31(3a) of the Law Enforcement Directive; recital (20) in conjunction with article 4 of the ePrivacy Directive 2002/58/EC; recital (40) of Regulation (EU) 2019/881 (Cybersecurity Act).
526 Carnegie, Moving the Encryption Policy Conversation Forward, 10 September 2019.
527 EU Strategy to tackle Organised Crime 2021-2025.
528 Information gathered from a high-level stakeholder dialogue on encryption with prosecutors, held with the European Judicial Cybercrime Network (EJCN) at Eurojust on 30 November 2019.
529 Europol's Internet Organised Crime Threat Assessment (IOCTA) 2020, https://www.europol.europa.eu/activities-services/main-reports/internet-organised-crime-threat-assessment-iocta-2020
Offenders perpetrating crimes of child sexual abuse are generally quite sophisticated in their use of technology and technical capabilities, including effectively exploiting various types of encryption and anonymity530. Law enforcement has noted an increase and broader use of encryption to store and distribute child sexual abuse material (CSAM) with impunity. The increase in the use of digital technologies, including encryption, has allowed offenders from around the globe who would probably never have known each other in pre-Internet times to chat and exchange materials freely in digital safe havens. Perpetrators actively encourage offline abuse for the purpose of producing new 'high-value' material and normalise this crime.
Encryption of data "at rest" and data "in motion"
Encryption technology can be used to safeguard both data "at rest", i.e. data that is stored on devices, external hard drives and thumb drives in the offline environment and online (e.g. in cloud storage sites), and data "in motion", i.e. data that is safeguarded whilst being transferred from one device to another, normally with end-to-end encryption (E2EE). These two different facets of criminals' use of encryption raise their own unique concerns.
1. Encryption of data "at rest"
Encryption of data "at rest" is relevant for the purposes of the present initiative when it is offered by relevant service providers, such as cloud hosting providers. These providers may offer encrypted storage space to customers, either retaining access to the stored content or granting access only to the user. The use of encryption is particularly relevant for image and video storage, as it allows the storage of CSAM in an online folder which can be accessible to several individuals, making it a popular choice for sharing CSAM without having to send the material.
2. Encryption of data "in motion"
End-to-end encryption (E2EE) is used to safeguard data "in motion" and gives rise to a different set of challenges. A number of interpersonal communications service providers already make E2EE available, by default or by choice, on their services. The impact on the possibility to detect child sexual abuse is significant.
E2EE safeguards communications by preventing third parties, as well as the online service providers themselves, from having access to the messages. Messages are encrypted by the sender's device, sent to the recipient's device, and decoded by the recipient using a set of public and private cryptographic keys known only to the devices involved in the communication. It is possible to intercept messages, but they cannot be viewed or monitored by the service provider, law enforcement or any other third party. While E2EE implemented in communications services therefore provides increased privacy protections, as a consequence it may also prevent companies from effectively detecting conduct that goes against their terms of service, as well as illegal activities such as the sharing of CSAM among offenders and the grooming and coercion of children for the purpose of sexual abuse, including the self-generation of CSAM. The tools currently used by industry to reliably detect known child sexual abuse material do not work in E2EE electronic communications.
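The key mechanics can be illustrated with a small sketch using the PyNaCl library (libsodium bindings); this is a toy illustration of the public/private key principle only, not the protocol of any particular messaging service.

```python
# A minimal sketch of end-to-end encryption with PyNaCl (pip install pynacl).
# Real messengers layer far more on top (e.g. the Signal protocol).
from nacl.public import PrivateKey, Box

# Each device generates its own key pair; private keys never leave the device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts using its private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"hello")  # this is all the server relays

# The service provider sees only ciphertext and routing metadata; lacking a
# private key, it cannot read or scan the content. Only the recipient can:
receiving_box = Box(recipient_key, sender_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello"
```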
While some service providers have created other tools to attempt to limit CSA on their services, the use of E2EE limits the available evidence, so that even where a service provider suspects that CSA is happening on its services, sufficient evidence will usually not be available to report and thus allow law enforcement to investigate.
530 Europol, Internet Organised Crime Threat Assessment, 5 October 2020; Independent Inquiry into Child Sexual Abuse, The Internet Investigation Report, March 2020; Virtual Global Taskforce, Online Child Sexual Exploitation, accessed 29 April 2021.
This becomes evident when comparing two of the most widely used messaging services, Facebook Messenger and WhatsApp, which are both part of Facebook and subject to the same overall company policy of zero tolerance for child sexual abuse. Facebook Messenger, which is currently unencrypted, is at the origin of more than half of total reports to NCMEC in 2019, with more than 11 million reports.
WhatsApp, on the other hand, is end-to-end encrypted. Detection efforts on WhatsApp are therefore based on a triangulation of unencrypted metadata and behavioural analysis to detect anomalies that may signal harmful behaviour. This is supported by information extracted from unencrypted data (e.g. names of group chats and public profile pictures) and by users' reports. Facebook states that, using this approach, WhatsApp detects and bans over 300 000 accounts per month531 on suspicion of sharing child sexual abuse material (e.g. use of CSAM in profile pictures or a group name that references CSA). In most of these cases, there is insufficient suspicion of CSA to generate a report to NCMEC; as a result, in 2020 the company made only 400,000 reports to NCMEC, which amounts to 11% of instances of banned accounts532.
However, the mere banning of accounts leaves victims without any hope of receiving help, as no law enforcement organisation is informed of the abuse and the problem is simply pushed off the platform. This example shows that current solutions for the detection of child sexual abuse in encrypted environments are not yet up to the challenge of reliably detecting child sexual abuse in a way that can also result in support for the child victim.
This development has created concerns also beyond the European Union. The United Kingdom, United States, Australia, Canada, India, Japan and New Zealand raised concerns about the growing trend toward E2EE in electronic communications, and its impact on child safety, in an international statement on "end-to-end encryption and public safety"533. The statement mentions the significant challenges E2EE poses to public safety, especially to children who are vulnerable to exploitation, and called on companies to ensure that the deployment of E2EE is not done in a way that undermines companies' abilities to identify and respond to violations of their terms of service, including regarding the sexual abuse of children. They are supported by a coalition of child protection organisations who called for actions to ensure that measures to increase privacy, including the use of E2EE, do not come at the expense of children's safety534.
At the same time, finding solutions is not straightforward. The consultation process underpinning this impact assessment to prepare a proposal for a regulation on combating the sexual abuse and sexual exploitation of children yielded a variety of viewpoints with respect to the issue of encryption. Stakeholders warned against introducing flaws into the E2EE set-up that could create vulnerabilities jeopardising the privacy and security of communications for all citizens. They agreed that technology, including encryption, has an integral part to play in solutions that keep children safe. A number of stakeholders rejected the notion that there has to be a binary choice between maintaining privacy and protecting children, advocating for
531 Figures obtained from a position document that Facebook sent to the European Commission, in response to efforts taking place in a Commission-led expert process to identify technical solutions that could help companies detect child sexual abuse in end-to-end encrypted electronic communications. Also shared in WhatsApp's FAQ section: How WhatsApp Helps Fight Child Exploitation.
532 Wired report: Police caught one of the web's most dangerous paedophiles. Then everything went dark.
533 The United States Department of Justice, International Statement: End-to-End Encryption and Public Safety, 11 October 2020.
534 Letter to Facebook from a coalition of child protection organisations and experts on concerns regarding the company's proposals to implement E2EE across Facebook's messaging services, 6 February 2020.
privacy-preserving solutions that protect children in encrypted environments. Stakeholders saw the need for frameworks that are inclusive of both existing and emerging techniques to tackle abuse and that reflect the varied and dynamic nature of online communications, considering the different properties of the companies that offer such services535. In aligning these concerns, any measures taken must be rigorously tested and proven to be reliable and accurate. Their proportionality, necessity and limitation in scope must be guaranteed536.
To assess whether solutions were even technically feasible, the Commission set up a technical expert process under the EU Internet Forum, in line with the EU Strategy for a more effective fight against child sexual abuse. This process aimed to map and assess possible technical solutions which could allow companies to detect and report CSA in E2EE electronic communications, in full respect of fundamental rights and without creating new vulnerabilities that criminals could exploit. The process brought together technical experts from academia, industry, public authorities and civil society organisations.
The possible solutions considered would allow for the use of both existing technologies (e.g. matching of unique signatures of material, or hashes, against content that has been confirmed as CSAM) and upcoming technologies, to the extent known at present, to detect CSA whilst maintaining the same or comparable benefits of encryption. The approach used was purely technical, with each solution assessed from a technical point of view across five criteria: effectiveness, feasibility, privacy, security and transparency. A number of promising solutions were identified during this process that help to reconcile the specific safeguarding needs of children, through detection and reporting of CSA, with full respect for the fundamental rights of privacy and data protection.
The expert process and its outcomes were presented to Justice and Home Affairs Ministers at the EU Internet Forum Ministerial meeting of 25 January 2021537. Ministers taking part in the meeting agreed on the need for further efforts to overcome the challenges that E2EE poses to the detection of child sexual abuse on encrypted platforms and noted that this process is a first step in looking for feasible solutions that provide the right balance to help combat and eradicate CSA online and offline. The expert process complements the voluntary efforts that a number of technology companies have already been engaging in and attests to the importance of better alignment and collaborative efforts to safeguard children, whilst providing proof of concept of the existence of possible technical solutions.
The Commission has also announced that it will support research to identify which technical solutions are the most feasible and could be scaled up and lawfully implemented by companies, and that it will continue to engage with key players in the technology industry, who are best placed to pioneer new technologies that can contribute effectively to the fight against CSA.
The relevant sections from the paper summarising the findings of the expert process are reproduced in the following section. The paper summarises the technical views of the experts and has not been formally endorsed by the Commission.
535 Digital Europe, response to the open public consultation on upcoming legislation to fight child sexual abuse: detection, removal and reporting of illegal content online.
536 EDRi, general views, open public consultation on upcoming legislation to fight child sexual abuse: detection, removal and reporting of illegal content online.
537 Press Release: EU Internet Forum Ministerial: Towards a coordinated response to curbing terrorist and child sexual abuse content on the Internet, 26 January 2021.
2. Technical solutions to detect child sexual abuse in end-to-end encrypted
communications
Scope
This paper covers the proactive detection538 by companies of images, videos and text-based539 child sexual abuse, such as grooming or sextortion. The scope of the paper is limited to one specific type of online service, electronic communications, and one specific type of
illegal content, child sexual abuse (CSA).
The focus on electronic communications is due to the fact that a large proportion of reports to the National Centre for Missing and Exploited Children (NCMEC) of instances of CSA
(around 2/3 of the 16.9 million reports received in 2019, more than 700k of which concerned the EU) originate in this type of online service. These include one-to-one instant messaging services and email.
This paper:
- defines the problem of the detection of CSA content in end-to-end encrypted (E2EE) communications; and
- presents a number of possible technical solutions that could allow the detection of CSA in E2EE communications.
A possible solution is one that allows the detection of CSA in E2EE electronic communications using existing technologies (e.g. hashing), as well as upcoming technologies, to the extent that these may be known today.
The paper aims to provide a first technical assessment to help identify possible solutions. Substantial additional work, beyond the scope of this paper, is likely to be needed to further evaluate, and eventually develop and deploy, the technical solutions across companies' infrastructure.
Approach
The approach of the paper is purely technical. It aims to reflect in non-technical language the input from internationally recognised technical experts from academia, industry and
public authorities from around the world, who have kindly contributed with their time and
knowledge to help make progress on this matter.
538 The document focuses on detection as a first step to tackle this complex problem. The reporting of child sexual abuse after it has been detected is not covered in this document at the moment, but it is of course of utmost importance to ensure that actionable and valuable information is provided to law enforcement on a timely basis. Also, the document covers proactive detection by companies, not lawful access by law enforcement with a warrant. The document currently does not cover the process to develop the technical solutions (e.g. data to train and test the tools, the preparation and maintenance of the database of hashes, etc.), which is also of key importance. Finally, the document focuses on solutions that work on real-time detection, rather than detection of CSA in messages that have already been sent to the recipient.
539 The technologies and approaches required to detect text-based threats are in general different from those required to detect images and videos. At the moment, the detection of text-based threats is more difficult and presents a higher number of false positives than image and video detection. It is therefore not easy to bundle the assessment and recommendations for text, image and video detection. The assessment of the solutions and the recommendations presented in the paper focus mostly on image and video detection.
The paper maps possible technical solutions and assesses them from a technical point of view across five criteria (the order does not reflect any considerations of relative importance):
1. Effectiveness: how well does the solution detect and report known and unknown CSA (images, videos and text-based threats)?540
2. Feasibility: how ready is the solution and how easily can it be implemented, in terms of cost, time and scalability?541
3. Privacy: how well does the solution ensure the privacy of the communications?542
4. Security: how vulnerable is the solution to being misused for purposes other than the fight against CSA, including by companies, governments or individuals?543
5. Transparency: to what extent can the use of the solution be documented and publicly reported to facilitate accountability through ongoing evaluation and oversight by policymakers and the public?544
2. PROBLEM DEFINITION
The problem that this paper aims to address is the following: given an E2EE electronic communication, are there any technical solutions that allow the detection of CSA content while maintaining the same or comparable benefits of encryption (e.g. privacy)?
In addition to the technical aspects of the problem, which are the focus of this paper, the problem has important policy aspects, as it lies at the core of the debate over privacy, cybersecurity and safety implications and trade-offs. Some voices on the safety side of the debate push for forbidding E2EE altogether or for requiring the existence of generalised exceptional access mechanisms, whereas some voices on the privacy side would reject any solution that allows the detection of CSA in E2EE communications, as they would put the privacy of communications above anything else.
This document aims at mapping possible solutions that could ensure both the privacy of electronic communications (including the privacy of children) and the protection of children against sexual abuse and sexual exploitation. The solutions explored are purely technical in nature, and this paper does not take a position on the related policy aspects.
540 This includes the ability to report to law enforcement sufficient information to enable the rescue of children from ongoing abuse and the prosecution of the offenders, as well as the ability of companies to proactively stop the abuse of their infrastructure to commit CSA-related crimes. A solution is also considered more effective if it allows for the detection of CSAM through multiple technologies (e.g. image and video hashing, Artificial Intelligence based tools, etc.).
541 User experience (e.g. no reduction of performance) also determines how ready the solution is to be implemented.
542 This refers solely to the ability of the technical solution to ensure that neither the company nor any actor other than the sender and the receiver has access to the content of the communication.
543 This includes, e.g., misuse by companies to detect other types of content; misuse by governments for mass surveillance; misuse by individuals to cause damage by exploiting possible weaknesses that the solution may inadvertently introduce in the infrastructure; and misuse by individuals to compromise the integrity of the solution to detect CSA and modify it so that it would not work as intended. It is important to note that tech-savvy offenders (who may compromise the solution) are unlikely to use systems that allow the detection of CSA.
544 Carnegie Endowment for International Peace, Moving the Encryption Policy Conversation Forward, Encryption Working Group, September 2019, p. 14.
3. POSSIBLE SOLUTIONS
0) Baseline solutions
These are immediate solutions that require little or no technical development. They provide reference points for comparison to the other technical solutions.
a. Non-E2EE communications
In communications that are not end-to-end encrypted (but which may be encrypted with other client-to-server protocols such as HTTPS), the electronic service provider (ESP) has the ability to apply various tools to detect CSA (images, videos or text) on its server. The most common ones are:
- Hashing tools545: they convert the image (or video) into a unique alphanumeric sequence (hash), which is compared with a database of hashes of known images and videos identified as CSA material (CSAM).
- Machine-learning tools: they are trained to detect features indicating that an image or video is likely to constitute CSAM.
- Text-based tools: they detect keywords or text patterns that indicate possible CSA (e.g. grooming or sextortion).
If the tools identify possible CSA, the message is flagged for manual review by a content moderator or reported directly to the authorities.
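For illustration, the sketch below implements a simple perceptual "average hash" with Hamming-distance matching; PhotoDNA itself is proprietary and considerably more robust, so the hash design and match threshold here are assumptions that only convey the general idea.

```python
# A conceptual sketch of hash-based detection, assuming a simple perceptual
# "average hash" (requires Pillow: pip install Pillow). Unlike cryptographic
# hashes, perceptual hashes tolerate small edits such as re-compression.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce the image to an 8x8 grayscale grid, then emit one bit per
    pixel: 1 if brighter than the grid mean, else 0 (a 64-bit hash)."""
    grid = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(grid) / len(grid)
    h = 0
    for pixel in grid:
        h = (h << 1) | (1 if pixel > mean else 0)
    return h

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Matching: compare against a database of hashes of confirmed material
# (an empty placeholder here) and flag near-duplicates.
known_hashes: set[int] = set()

def is_match(h: int, threshold: int = 5) -> bool:
    return any(hamming(h, known) <= threshold for known in known_hashes)
```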
Figure 1: detection of CSA in communications that are not end-to-end encrypted
[Diagram: the sending device sends a non-E2EE message to the ESP server; the ESP server applies tools to detect CSA; (a) detected: the message is forwarded for review/reporting; (b) not detected: the message is forwarded to the recipient, whose device receives the non-E2EE message.]
Assessment:
> Effectiveness:
o High: highly effective in detecting and reporting known CSAM and text-based threats (i.e. as effective at detecting and reporting new CSAM as the current technology to detect it allows).
545 The most widely used hashing tool is PhotoDNA, developed by Microsoft and Professor Hany Farid in 2009. See here for more information on how PhotoDNA works.
> Feasibility:
o High: already in use, frequently as the default option.
> Privacy:
o Low: the content of the communication could in principle be accessed by the ESP at any point (from a technical point of view).
> Security:
o Medium/medium-low: the communication is relatively secure from unauthorised access by governments and individuals (given the use of e.g. client-server encryption). However, companies can access the content of the communication for purposes other than the detection of CSA.
> Transparency:
o Medium: whereas the use of tools to detect CSA can be publicly reported (i.e. reports sent to NCMEC), it is not always clear whether these or similar tools are used to detect other types of content, illegal or not, as oversight mechanisms do not always exist.
b. End-to-end encrypted communications546
In end-to-end encrypted communications, the sender and recipient utilize a public key protocol to agree on a secret session key, which no passive observer, including the ESP, can determine. As such, without additional mechanisms, the server is not able to apply the tools to detect CSA, since it does not have the private decryption key and thus has no access to the content in the clear.
Figure 2: detection of CSA in end-to-end encrypted communications
[Diagram: the sending device sends an E2EE message; the ESP server cannot apply existing tools to detect CSA at the server on E2EE messages; the recipient receives and decrypts the E2EE message.]
Assessment:
> Effectiveness:
o None, as it is not possible to detect at the server CSA (images, videos and text-based threats) included in the content of the communication.
> Feasibility:
o Not applicable (detection of CSA is not possible).
> Privacy:
o High: the content of the communication can only be accessed by the sender and the recipient of the message547.
> Security:
o Not applicable, since there is no solution to detect CSA that can be compromised548.
> Transparency:
o Not applicable, since the detection of CSA is not possible.
546 This baseline solution does not include device, server and encryption related solutions, which will be analysed in the rest of the document.
c. End-to-end encrypted communications with exceptional, targeted access
In this type of solution, the electronic communications system includes the possibility of exceptional access for the company and law enforcement (e.g. with a warrant), i.e. the possibility to decrypt the content of the communication, as the ESP has the encryption keys:
Figure 3: detection of CSA in E2EE communications with exceptional, targeted access
[Diagram: the sending device sends an E2EE message; the ESP server cannot apply existing tools to preventively detect CSA on E2EE messages and can only access the content of a specific communication via exceptional access; the recipient receives and decrypts the E2EE message.]
Assessment:
> Effectiveness:
o Low: preventive detection (i.e. to reduce the proliferation of CSA and report to law enforcement for action as needed) is not possible. Detection of CSA is only possible for a specific communication, via exceptional access.
> Feasibility:
547 The only part of the communication that is not private, as in all the other solutions discussed in this document, is the fact that the sender sent a message to the recipient (metadata/traffic data).
548 The 'not applicable' rating is in relation to the definition of security used in this paper, i.e. the security of solutions that allow for the detection of CSA in E2EE communications. End-to-end encryption in itself offers a high level of security to the communication.
1) Device related solutions
a. All detection done on-device
Figure 4: all detection done on-device
[Diagram: the sending device applies tools to detect CSA (e.g. hashing of images and matching with a database of known CSAM) before encryption. If CSA is not detected, the device encrypts the message end-to-end and sends it to the recipient; if detected, it sends the message for review and/or reporting. The ESP server cannot apply existing tools to detect child sexual abuse on end-to-end encrypted messages; the recipient receives and decrypts the message.]
Assessment:
> Effectiveness:
o Medium: it would allow the detection of known CSAM. Depending on the type of device, the list of hashes may need to be limited to work properly551. Updating the hash set with new hashes is slower and thus less effective than a model where the hash set is in the ESP's cloud.
> Feasibility:
o Medium-low: it could be implemented relatively easily, but it would require significant storage space on the device with current technology552. Updating the dataset regularly would also use computational capacity.
> Privacy:
o Medium: user data is not exposed to the ESP. The possible security issues (compromise and manipulation of detection tools) may introduce vulnerabilities that could decrease the privacy of the communication.
551 That said, in the case of PhotoDNA, the additional time needed to compare hash databases of increasing size scales logarithmically, not linearly. In other words, doubling the size of the database requires one extra comparison, not twice as many.
552 For example, PhotoDNA hashes could number between 1 and 4 million, which could take around 30MB. Adding video hashes would take even more storage space. Feasibility may be increased by limiting the hash database to include only hashes of the most commonly encountered content, or by managing the dataset at the device/operating-system level.
> Security:
o Low: the solution could be easily subverted and compromised/reverse engineered to not detect or report CSA (in particular on devices without trusted execution environments) or to detect content other than CSA. It could also be manipulated to introduce false positives to inundate the reporting systems (e.g. NCMEC) with them. The possible leak of detection tools (e.g. hashing algorithm, hash list, keywords list) could reduce the effectiveness of similar detection tools elsewhere.
> Transparency:
o Medium-low: the possible security issues could limit the reliability of public reporting on the use of the solution and therefore the accountability.
b. On-device full hashing with matching at server
In this solution, the device converts the images and videos in the message into hashes, encrypts the message, and sends the (client-to-server encrypted) hashes and the encrypted full message to the server. The server compares these hashes with those in the database of hashes of confirmed child sexual abuse (matching).
If there is a hit at the server, it instructs the app server to send the full image (or video) for manual review (or reporting). If there is no hit, the server forwards the E2EE message to the recipient.
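A minimal sketch of this division of labour follows; it uses a cryptographic hash for brevity (a deployment would use a perceptual hash such as PhotoDNA), and the function names and the assumed `e2ee_encrypt` helper are illustrative, not part of any real API.

```python
# A minimal sketch of solution 1.b: hashing on the device, matching at the
# server. SHA-256 stands in for a perceptual hash; `e2ee_encrypt` is assumed
# to be provided by the messaging client.
import hashlib

SERVER_HASH_DB: set[str] = set()  # hashes of confirmed CSAM (placeholder)

def device_prepare(attachment: bytes, e2ee_encrypt) -> dict:
    """Runs on the sender's device, before end-to-end encryption."""
    return {
        "hash": hashlib.sha256(attachment).hexdigest(),  # client-to-server encrypted in transit
        "ciphertext": e2ee_encrypt(attachment),          # opaque to the server
    }

def server_route(message: dict) -> str:
    """Runs at the ESP server: match the hash, never decrypt the payload."""
    if message["hash"] in SERVER_HASH_DB:
        return "hit: ask app server to send image/video for review/reporting"
    return "no hit: forward E2EE message to recipient"
```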
553 The security of all solutions that make use of a hashing algorithm could be increased if that algorithm is updated/modified periodically, to reduce the risk of reverse engineering. Ideally, an open-source hashing algorithm that is very difficult to hack would be best, but it remains to be developed.
Figure 5: on-device hashing with matching at server
[Diagram: 1. the device converts the images and videos into hashes before the message is encrypted, encrypts the full message, and sends the hashes (client-to-server encrypted) and the E2EE full message to the server; 2. the ESP server compares the hashes received from the device with those in the database of hashes of confirmed CSA (matching): if there is no match, it forwards the E2EE message to the recipient; if there is a match, it asks the app server to send the image/video for review and/or reporting; 3. the app server sends the full image/video for review and/or reporting if there is a match at the server.]
Assessment:
> Effectiveness:
o Medium-high: it would allow the detection of known CSAM only. It would not be applicable to text-based threats (not possible to detect with hashing). There is no need to limit the hash list, as it will be located at the server.
> Feasibility:
o High: it could be implemented relatively easily. An open-source version of the solution could be created for use by smaller companies which may not have enough resources to obtain and maintain a proprietary tool.
> Privacy:
o Medium-low: user data (hashes) is visible to the ESP. The possible security issues (compromise and manipulation of detection tools) may introduce vulnerabilities that could decrease the privacy of the communication.
> Security:
o Medium-low: the hashing algorithm in the device could be subverted and compromised/reverse engineered to not detect or report child sexual abuse (in particular on devices without trusted execution environments). It could also be manipulated to introduce false positives to inundate the reporting systems (e.g. NCMEC) with them.
c. On-device partial hashing with remaining hashing and matching at server
Figure 6: on-device partial hashing with remaining hashing and matching at server
[Diagram: 1. the device converts the images and videos into partial hashes before the message is encrypted, encrypts the full message, and sends the partial hashes (client-to-server encrypted) and the E2EE full message to the server; 2. the ESP server finalises the partial hashes received from the device and compares the now full hashes with those in the database of confirmed CSA (matching): if there is no match, it forwards the E2EE message to the recipient; if there is a match, it asks the app server to send the image/video for review and/or reporting; 3. the app server sends the full image/video for review and/or reporting if there is a match at the server.]
Assessment:
> Effectiveness:
o Medium-high: it would allow the detection of known CSAM only. It would not be applicable to text-based threats (not possible to detect with hashing). There is no need to limit the hash list, as it will be located at the server.
> Feasibility:
o Medium: a proof of concept has been done and it could already be in use. Depending on the size of the partial hash (which would determine the payload and upload time), this solution may be faster than 1.b, as it would lift some of the hashing burden from the device. The exact implementation details are important (e.g. to maximize performance) and remain to be defined.
> Privacy:
o Medium-low: user data (hashes) is visible to the ESP, and more information about the image is exposed to the ESP through the partial hash. The possible security issues (compromise and manipulation of detection tools), although mitigated by exposing the hashing algorithm to the device only partially, may still introduce vulnerabilities that could decrease the privacy of the communication.
> Security:
o Medium: the device contains only part of the hashing algorithm, which limits the risks of reverse engineering and manipulation. This risk could be further mitigated through obfuscation techniques that scramble pixels without affecting the creation of the hash, to ensure that the hash is not reversible.
> Transparency:
o Medium-low: the possible security issues could limit the reliability of public reporting on the use of the solution and therefore the accountability.
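The split can be pictured with the toy sketch below, which divides an average-hash-style pipeline between device and server; the split point and hash design are assumptions, since the expert paper leaves the implementation details open.

```python
# A toy sketch of solution 1.c: the device computes only a partial hash
# (a downscaled grayscale grid), and the server finalises and matches it.
# Requires Pillow (pip install Pillow); the scheme is illustrative only.
from PIL import Image

def device_partial_hash(path: str, size: int = 8) -> list[int]:
    """On-device first stage: raw grid values, not yet a usable hash."""
    img = Image.open(path).convert("L").resize((size, size))
    return list(img.getdata())

def server_finalise_and_match(grid: list[int], hash_db: set[int]) -> bool:
    """At the server: threshold the grid into the final 64-bit hash and
    compare it against the database of confirmed hashes."""
    mean = sum(grid) / len(grid)
    h = 0
    for pixel in grid:
        h = (h << 1) | (1 if pixel > mean else 0)
    return h in hash_db
```

Note that, consistent with the privacy assessment above, such a partial hash reveals more about the image to the ESP than a finished hash would.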
d. On-device use of classifiers
In this solution, the server produces classifiers to identify child sexual abuse (images, videos and/or text), using extensive labelled data of verified child sexual abuse and non-child sexual abuse to train the machine learning system. A classifier is a set of characteristics that can determine whether the contents of a message are child sexual abuse related. The classifiers are then fed to the sender's device, which uses them to determine whether a message should be sent for review or reporting.
Figure 7: use of machine learning classifiers
[Diagram: 1. the ESP server trains the machine learning algorithm; 2. it feeds the classifiers to the device and keeps them up to date; 3. the device applies the classifiers to detect child sexual abuse before the message is encrypted: (a) not detected, it encrypts the message end-to-end and sends it to the recipient; (b) detected, it sends the message for review and/or reporting.]
Assessment:
> Effectiveness:
o Medium-low: it is basically the only solution that allows the direct detection of unknown content555 (in addition to known content). That said, detecting child sexual abuse images and videos using machine learning is still not sufficiently developed and generates relatively high error rates (e.g. compared to hash matching). The machine learning algorithms require well-labelled data on an ongoing basis to make sure that the models are kept up to date. They also require constant feedback on the quality of the classification, which is particularly difficult to consistently provide in the detection of child sexual abuse in an end-to-end encrypted system. This may result in the algorithms getting outdated relatively soon if they are not updated regularly.
555 Hashing can also indirectly lead to the identification of new content, as the known images are usually found together with new ones, which are confirmed as CSA during the manual review of the detected content.
> Feasibility:
o Medium-low: image classifiers are already in use in cloud services by companies (e.g. to recognize commonly occurring faces in photos or to group images automatically) and they are also used to detect CSA. That said, significant development is still required, in particular for the detection of images and videos and on the possibility of running classifiers on the client side, given the size and complexity of the models and the need for frequent updates556. Classifiers for the detection of text-based threats (e.g. grooming) would be more feasible.
> Privacy:
o Medium-low: the possible security issues (compromise and manipulation of classifiers) may introduce vulnerabilities that could decrease the privacy of the communication. In the particular case of behavioural classifiers, which determine possible instances of child sexual abuse based on metadata from the user, the privacy intrusion is higher than with other tools such as hashing. In addition, a possibly higher rate of false positives could result in user data (not child sexual abuse) being reported/processed/reviewed. The classifiers could also be misused to identify a range of non-CSA activities.
> Security:
o Medium-low: the classifiers on the device could be compromised and manipulated to avoid detection (i.e. to introduce false negatives), to introduce false positives to inundate the reporting systems (e.g. NCMEC), or even be used by offenders to crawl the web in search of CSA. This kind of attack could be based on sophisticated adversarial machine learning techniques that could defeat any classifier. Being able to detect new child sexual abuse threats makes the system more vulnerable to adversarial attack.
> Transparency:
o Medium: the use of the solution could be documented and publicly reported to facilitate accountability, but how the solution works would be more difficult to document than, e.g., 1.c.
2) Server related solutions
This type of solution consists in moving some or all of the operations done at the ESP server in communications that are not end-to-end encrypted (e.g. client-to-server encrypted) to secure enclaves in the ESP server or to third-party servers.
556 Current image classifier models in their current state can range from 16 to 70MB, whereas the maximum acceptable size of an app running on the device would be 4-5MB. Implementation on the device would have an impact on the functionality and costs for persons using lower-end handsets or in lower-bandwidth/high data-cost environments.
a. Secure enclaves in the ESP server
In this solution, the ESP server contains a "secure enclave" that allows compute-intensive operations to happen in the cloud, for example in closed-off trusted execution environments. The enclave can decrypt the user info and perform the same operations and checks as done in communications that are not end-to-end encrypted (see figure 1), while protecting the sensitive information inside the enclave:
Figure 8: secure enclaves in the ESP server
[Diagram: the sending device sends an encrypted message to the secure enclave in the ESP server; the enclave decrypts the message and applies tools to detect child sexual abuse: (a) detected, it forwards the message for review and/or reporting; (b) not detected, it encrypts the message end-to-end and forwards it to the recipient, who receives and decrypts it.]
Assessment:
> Effectiveness:
o Medium-high: it could allow the detection of known and new CSAM. There is no need to limit the hash list, as it will be located at the server. This solution also opens up possibilities to develop new technologies to detect child sexual abuse.
> Feasibility:
o Medium-low: on one hand, it is a solution that simplifies the detection process, and similar systems are already in use today for other applications (e.g. Intel's SGX, or Software Guard Extensions, in Microsoft's cloud557, and other trusted execution environments). On the other hand, only a few companies currently have access to the hardware and software required for this solution, given its operational complexity558 (although the technology may become more accessible in a few years, in particular if it is offered as a service by the cloud providers).
557 Microsoft has recently announced the availability of Azure virtual machines running on SGX hardware that allow users to write their own code to run in a secure enclave to which the service provider does not have access.
558 For example, on SGX systems there is a cost every time data is moved from the main memory into the enclave memory, so it is necessary to consider the amount of data and the number of times that it goes back and forth in and out of the enclave.
Also, there are compatibility issues to address in the design of the solution (i.e. the processor on the client side needs to be able to communicate with that in the enclave, and the enclaves need to be able to communicate among themselves).
> Privacy:
o Medium-low: as the secure enclave would have access to the full content of communications, privacy would depend strongly on the ability to trust that the enclave, as implemented by the ESP, is secure and effective. User data (hashes or the message) is not visible to the ESP, nor are the operations to detect child sexual abuse. The possible security issues (e.g. compromise of the enclave by state actors) could affect the privacy of the communication.
> Security:
o Medium-low: the solution relies entirely on trusting that the secure enclave works as intended and has not been compromised (some vulnerabilities in this type of system have already been found). The company making the enclave would be the only one holding the key to its inner workings and could become a target of bad actors; a successful compromise would have a broad impact on the security of the system and would give bad actors access to the decryption keys for the communications between the sender and the recipient. That said, it could be possible to attest that the code running in the enclave has not been modified since it was deployed and that the user has connected to the right enclave, carrying out the right processes, although this feature has been compromised in the past559. In addition, such a check could remotely verify the code but not the hashes used.
> Transparency:
o Medium-low: it is unclear how the use of the secure enclave could be documented and publicly reported to facilitate accountability through ongoing evaluation and oversight by policymakers and the public. The wider user community would have to rely on a trustworthy and technically competent entity to confirm the workings of the secure enclave.
One possible way to mitigate some of the above concerns (in particular on security and transparency) could be to send the hashes, not end-to-end encrypted, to the secure enclave for matching. This would, e.g., eliminate the risk of leaking the private E2EE keys if the enclave is compromised. In this case, the trust in the secure enclave would be limited to protecting the hashing algorithm and its parameters.
b. Single third-party matching
This solution is the same as 1.b (on-device full hashing with matching done at server), but with the matching done at a trusted third-party server instead of at the ESP server:
559 See here.
Figure 9: single third-party matching
[Diagram: 1. the device converts the images and videos into hashes before the message is encrypted, encrypts the full message, and sends the hashes (client-to-server encrypted) and the E2EE full message to the third-party server; 2. the third-party server compares the hashes received from the device with those in the database of hashes of confirmed child sexual abuse (matching): if there is no match, the E2EE message is forwarded to the recipient; if there is a match, it asks the app server to send the image/video for review and/or reporting; 3. the app server sends the full image/video for review if there is a match at the third-party server.]
Assessment:
> Effectiveness:
o Medium-high: it could allow the detection of known CSAM560. There is no need to limit the hash list, as it will be located at the third-party servers.
> Feasibility:
o Low: scalability could be an issue, although this could be a service for smaller companies offered on top of the cloud infrastructure of larger ESPs. It requires a combination of code running on the sender's device and on the (third-party) server, and therefore a certain interdependence, which would influence e.g. the latency of message transmission.
> Privacy:
o Medium-low: user data (hashes) is not visible to the ESP and no operations to detect CSA would occur at the ESP server. The possible security issues (e.g. compromise of the third-party server by state actors) could decrease the privacy of the communication. That said, it is likely that the third party would have to work very closely with, or be effectively part of, the ESP that provides the communication service, which may raise privacy concerns. If the third party does not work in real time (i.e. analysing the message at the time it is sent) and instead analyses the
560 The use of classifiers is in principle possible with single third parties but it would be part of a different solution.
message after it has been sent, the dependence on the ESP could be lower561. Also, the third party could be part of the client provisioning, which could reduce the privacy concerns.
> Security:
o Medium-low: in addition to the security concerns of 1.b (on-device full hashing with matching at the server), e.g. the risk of manipulation of the hashing algorithm, the third-party server could be compromised by state or individual actors.
> Transparency:
o Medium-low: the possible security issues could limit the reliability of public reporting on the use of the solution and therefore the accountability.
c. Multiple third-parties matching
In this solution, based on multi-party computation (MPC), the device converts the image (or video) into a hash, breaks it into parts, encrypts them with the third-party keys, and sends these parts to multiple third parties for partial matching through the ESP server (which does not have access to the encrypted partial hashes). The app server compiles the responses from the third parties and determines whether a match has occurred. If there is a match, the app server sends the full image (or video) for review/reporting. If there is no match, the ESP server forwards the E2EE message to the recipient, as sketched below.
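The data flow can be pictured with the toy sketch below: the hash is broken into chunks, each third party holds only its own chunk of every database entry, and a match requires all parties to agree on the same database index. This illustrates only the routing; a real deployment would rely on proper multi-party computation so that the chunks themselves leak nothing.

```python
# A toy sketch of multiple third-party matching: split a 64-bit hash into
# four 16-bit chunks, one per party; declare a match only when every party
# reports a hit at the same database index. Illustrative, not a secure MPC.
def chunks(h: int, n: int = 4, bits: int = 64) -> list[int]:
    width = bits // n
    mask = (1 << width) - 1
    return [(h >> (i * width)) & mask for i in range(n)]

known_hashes = [0xDEADBEEFCAFEF00D]  # placeholder database of full hashes
# party_dbs[i][j] holds chunk i of database hash j; no party sees full hashes.
party_dbs = list(zip(*(chunks(k) for k in known_hashes)))

def app_server_match(h: int) -> bool:
    parts = chunks(h)
    # Each party i independently returns the indices its chunk matches.
    hits = [
        {j for j, chunk in enumerate(db) if chunk == parts[i]}
        for i, db in enumerate(party_dbs)
    ]
    # The app server compiles the responses: all parties must agree.
    return bool(set.intersection(*hits))

assert app_server_match(0xDEADBEEFCAFEF00D)
assert not app_server_match(0x0123456789ABCDEF)
```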
561 The processing of messages after they have been sent to the recipient (i.e. batch processing on a different timescale) could be applied to other solutions as well (see footnote 538 on the scope of the solutions).
Figure 10: multiple third-parties matching
[Diagram: 1. the device converts the images and videos into hashes before the message is encrypted, breaks them into parts, encrypts them with the third-party keys, and sends them through the ESP server (which takes no action beyond routing the hashes) to multiple third parties for partial matching, together with the E2EE full message; 2. the third-party servers do partial matching of the multiple hash parts and send the results back; 3. the app server compiles the responses from the third parties and determines whether a match has occurred: (a) no match, it asks the server to forward the E2EE message to the recipient; (b) match, it sends the message for review and/or reporting.]
Assessment:
> Effectiveness:
o Medium-high: it could allow the detection of known CSAM562. There is no need to limit the hash list, as it will be located at the third-party servers.
> Feasibility:
o Low/medium-low: the multiple round-trip requests between the device and the servers before the message can be sent could slow performance, in particular with slow internet connections. It requires a combination of code running on the
562 The use of classifiers is in principle possible with single third parties but it would be part of a different solution.
sender's device and the (third-party) server. A similar technology is already in use by Google and online merchants563, but further research would be required to see how it could be applied in this situation (in particular regarding scalability) and what the costs would be, including computational overhead.
> Privacy:
o Medium: user data (content and hashes) is not visible to the ESP and no operations to detect child sexual abuse would occur at the ESP server. The possible security issues (e.g. compromise of a third-party server by state actors) could decrease the privacy of the communication. That said, the solution could offer better privacy than solution 2.b (single third-party matching): if at least one of the parties is trustworthy, the hash will remain private. On the other hand, it is possible that the larger companies, which also offer electronic communication services, turn themselves into the third parties of this solution for the smaller companies, which may generate some privacy issues.
> Security:
o Medium: in addition to the security concerns of 1.b (on-device full hashing with matching at the server), e.g. the risk of manipulation of the hashing algorithm, the third-party servers could be compromised by state or individual actors. That said, compared to solution 2.b (single third-party matching), the risk will be lower, as bad actors would need to compromise multiple servers instead of one.
> Transparency:
o Medium: the possible security issues could limit the reliability of public reporting on the use of the solution and therefore the accountability.
**********
Another possible server related solution would be to use classifiers running on the server, feeding on metadata. This seems to be the approach taken by Facebook564 as it plans to switch to E2EE by default in its Messenger service565, but the technical details remain unclear.
3) Encryption related solutions
This type of solution consists in using encryption protocols that allow the detection of CSA in encrypted electronic communications.
a. On-device homomorphic encryption with server-side hashing and matching
In this solution, images are encrypted using a carefully chosen partially homomorphic encryption scheme (this enables an encrypted version of the hash to be computed from the encrypted image). The encrypted images are sent to the ESP server for hashing and matching
563 See here and here. The technology allows Google and online merchants to compute certain profile information on internet users (e.g. the average age of buyers of a certain watch) without sharing all the data they have about those users.
564 As indicated here.
565 As announced in March 2019.
exist. At the moment, the computational power required on the server would render this solution expensive.
> Privacy:
o Medium-low: hashes of user data are visible to the ESP. Similar privacy as solution 1.b.
> Security:
o Medium: there is no risk of leaking the hash database or the hashing and matching algorithm on the client side, as all these calculations would take place at the server. The solution does not prevent the possibility that the database of hashes could be tampered with at the server, as with the other solutions that keep hash lists on the server.
> Transparency:
o Medium-high: the use of the solution could be documented and publicly reported to facilitate accountability.
**********
Another possible encryption related solution would be to use machine learning and build classifiers to apply to homomorphically encrypted data for instant classification. Microsoft has been doing research on this, but the solution is still far from being functional569.
569 More information on Microsoft's work on homomorphic encryption is available here.
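The core property this family of solutions relies on, computing on data without decrypting it, can be demonstrated with the additively homomorphic Paillier scheme via the python-paillier library. This shows the principle only; deriving an image hash homomorphically is far harder and, as noted above, remains research.

```python
# A minimal sketch of partially (additively) homomorphic encryption using
# python-paillier (pip install phe). The server computes on ciphertexts it
# cannot read; only the private-key holder can decrypt the results.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# The device encrypts values (toy stand-ins for image data) and uploads them.
enc_a = public_key.encrypt(17)
enc_b = public_key.encrypt(25)

# The server, holding only the public key, computes without decrypting:
enc_sum = enc_a + enc_b   # encrypted 17 + 25
enc_scaled = enc_a * 3    # encrypted 17 * 3 (ciphertext times plaintext)

assert private_key.decrypt(enc_sum) == 42
assert private_key.decrypt(enc_scaled) == 51
```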
4. OVERVIEW
The table below summarises the above assessments and classifies the possible solutions into 3 groups: top 3 (i.e. most promising, although some research may be needed), needs research (i.e. it could be considered but substantial research is still needed), and to be discarded (i.e. currently not worth pursuing at this point if there is a need to prioritise, but could still be of interest in the future):
| Type | Solution | Overall |
| --- | --- | --- |
| 0. Baseline | a. Non-E2EE communications | N/A (reference point) |
| 0. Baseline | b. E2EE communications | N/A (reference point) |
| 0. Baseline | c. E2EE communications with exceptional, targeted access | N/A (reference point) |
| 1. Device related | a. All detection done on-device | Needs research |
| 1. Device related | b. On-device full hashing with matching at server | Top 3 |
| 1. Device related | c. On-device partial hashing with remaining hashing and matching at server | Top 3 |
| 1. Device related | d. On-device use of classifiers | Needs research |
| 2. Server related | a. Secure enclaves in ESP server | Top 3 |
| 2. Server related | b. Single third-party matching | Discard |
| 2. Server related | c. Multiple third-parties matching | Needs research |
| 3. Encryption related | a. On-device homomorphic encryption with server-side hashing and matching | Needs research |

The per-criterion ratings (effectiveness, feasibility, privacy, security and transparency) for each solution are set out in the individual assessments above.
5. RECOMMENDATIONS
On possible solutions:
> Immediate: on-device hashing with server-side matching (1b). Use a hashing algorithm other than PhotoDNA so as not to compromise it. If partial hashing is confirmed as not reversible, add that for improved security (1c).
> Long term:
- Invest in research on secure enclaves in the ESP server to make the technology more accessible (2a).
- Invest in research on multiple third-parties matching, leveraging existing applications (2c) and identifying possible third parties.
- Invest in research on classifiers to supplement hashing and matching, but not replace it (1d).
- Invest in homomorphic encryption research with regard to image matching (3a).
Other considerations:
> PhotoDNA update: PhotoDNA, the hashing technology most widely used, is more than 10 years old and may require an update now, and then periodically every few years, to keep up with the latest developments (and to make it less vulnerable to manipulation, including by modifying images to avoid detection).
> Quality and integrity of hash databases: a number of solutions rely on the detection of child sexual abuse through hashing technology. The quality of this detection (and therefore the effectiveness of those solutions) depends on the quality and integrity of those databases.
> Industry standards for detection: the creation of industry standards for the detection tools (e.g. image and video hashing) could facilitate the development and deployment of coherent and interoperable solutions across industry.
> Open source tools: open source tools could also facilitate the development and deployment of solutions across industry. However, substantial research may be required to produce open source tools that cannot be manipulated to reduce their effectiveness or be misused. At this moment, all solutions considered are based in part on "security by obscurity", that is, the security and effectiveness of the solution require that the opponent does not know the full details of the scheme. The scientific state of the art is not yet sufficiently mature for open tools.
> Open competition: an open competition with a substantial prize570 could encourage not only the development of open source tools and industry standards, but also the development of new possible solutions to detect and report child sexual abuse in end-to-end encrypted electronic communications.
> Reporting mechanisms: when describing the solutions, the paper does not analyse in detail what happens after child sexual abuse is detected, i.e. review and reporting mechanisms. These mechanisms depend on national legal obligations, and they can have an influence on the effectiveness of some solutions (e.g. the training of machine learning classifiers, which rely on a stream of well-labelled material to remain effective).
570 For example, similar to the open competitions organised by NIST on cryptography or by the EU-funded projects NESSIE and ECRYPT (eSTREAM).
> Industry standards for reporting and transparency: when using hash databases, it would be useful to know not only the total number of reports sent to relevant statutory bodies from matches, but also the matches not sent to statutory bodies but removed based on the terms of service, and the matches neither sent to statutory bodies nor removed. The effectiveness of a hash database is currently only known to the company using it. It could be useful to have a third party perform regular testing/auditing using a sample non-CSAM match, similar to the EICAR test file in the anti-virus industry.
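A minimal sketch of what such a test could look like, continuing the hypothetical matching pipeline from the earlier sketch: the auditor registers the hash of a benign, agreed-upon test image and verifies that the pipeline both matches it and counts it in the transparency statistics. The service interface here is entirely hypothetical.

import imagehash
from PIL import Image

def audit_matching_service(service, test_image_path: str) -> bool:
    # 'service' is a hypothetical object exposing the pipeline under audit.
    test_hash = imagehash.phash(Image.open(test_image_path))
    service.hash_database.add(test_hash)   # register the benign test hash
    service.scan(test_image_path)          # run the normal scanning pipeline
    # The audit passes if exactly one match was recorded and reported.
    return service.reported_matches.count(test_hash) == 1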
> Safety by design: the development of technical solutions that could strike a balance between ensuring the privacy of electronic communications (including the privacy of children) and the protection of children against sexual abuse and sexual exploitation is facilitated when that balance is aimed at from the start, from the design stage.
REFERENCES
1. Preneel, B., The Never-Ending Crypto Wars, presentation, imec-COSIC KU Leuven, 16/09/2019.
2. Snap Inc., Snap Inc. Response to Sen. Blackburn, 17/07/2019.
3. Weaver, N., Encryption and Combating Child Exploitation Imagery, Lawfare, 23/10/2019.
4. WhatsApp, WhatsApp Encryption Overview, Technical white paper, 4/4/2016.
5. Pfefferkorn, R., William Barr and Winnie The Pooh, Center for Internet and Society, 7/10/2019.
6. Stanford Internet Observatory, Balancing Trust and Safety on End-to-End Encrypted Platforms, 12/09/2019 (Stanford workshop).
7. Stanford Internet Observatory, Mitigating Abuse in an End-to-End World, 11/01/2020 (New York workshop).
8. Stanford Internet Observatory, Mitigating Abuse in an End-to-End World, 17/02/2020 (Brussels workshop).
9. Bursztein, E.; Bright, T.; DeLaune, M.; Eliff, D.; Hsu, N.; Olson, L.; Shehan, J.; Thakur, M.; Thomas, K.; Rethinking the Detection of Child Sexual Abuse Imagery on the Internet, Proceedings of the 2019 World Wide Web Conference (WWW '19), 13-17 May 2019, San Francisco, CA, USA.
10. Levy, I.; Robinson, C.; Principles for a More Informed Exceptional Access Debate; Lawfare, 29/11/2018.
11. Farid, H.; Facebook's plan for end-to-end encryption sacrifices a lot of security for just a little bit of privacy; Fox News, 16 June 2019.
12. Carnegie Endowment for International Peace; Moving the Encryption Policy Conversation Forward; Encryption Working Group, September 2019.
13. Millican, J.; E2EE for Messenger: goals, plans and thinking; Facebook; Real World Crypto 2020, January 8-10, 2020.
14. Dalins, J.; Wilson, C.; Boudry, D.; PDQ & TMK + PDQF - A Test Drive of Facebook's Perceptual Hashing Algorithms; Australian Federal Police and Monash University; December 2019.
15. Harold Abelson, Ross J. Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter, Daniel J. Weitzner, Keys under doormats. Commun. ACM 58(10): 24-26 (2015).
Device related solutions
1. Mayer, J., Content Moderation for End-to-End Encrypted Messaging; Princeton University; 6 October 2019.
2. Callas, J., Thoughts on Mitigating Abuse in an End-to-End World; 14 January 2020.
3. Portnoy, E., Why Adding Client-Side Scanning Breaks End-to-End Encryption, Electronic Frontier Foundation, 1 November 2019.
4. Green, M., Can end-to-end encrypted systems detect child sexual abuse imagery? - A Few Thoughts on Cryptographic Engineering, 8 December 2019.
5. Green, M., Client-side CSAM detection: technical issues and research directions, presentation at Stanford Internet Observatory event in New York, 11/01/2020.
6. Weaver, N., Some Thoughts on Client-Side Scanning for CSAM, presentation at Stanford Internet Observatory event in New York, 11/01/2020.
7. Stamos, A., Written testimony before U.S. House of Representatives Committee on Homeland Security on "Artificial Intelligence and Counterterrorism: Possibilities and Limitations", June 25, 2019.
Server related solutions
1. Makri, E., Rotaru, D., Smart, N.P., Vercauteren, F., EPIC: Efficient Private Image Classification (or: Learning from the Masters); KU Leuven, Belgium; Saxion University of Applied Sciences, The Netherlands; University of Bristol, UK; 2017.
2. Dowlin, N., Gilad-Bachrach, R., Laine, K., Lauter, K., Naehrig, M., Wernsing, J.; CryptoNets: Applying Neural Networks to Encrypted Data with High Throughput and Accuracy; Princeton University, Microsoft Research; 2016.
3. Liu, J., Lu, Y., Juuti, M., Asokan, N., Oblivious Neural Network Predictions via MiniONN transformations; Aalto University; 2017.
4. Juvekar, C., Vaikuntanathan, V., Chandrakasan, A., GAZELLE: A Low Latency Framework for Secure Neural Network Inference; MIT; 2018.
5. Riazi, M.S., Songhori, E.M., Weinert, C., Schneider, T., Tkachenko, O., Koushanfar, F., Chameleon: A Hybrid Secure Computation Framework for Machine Learning Applications; UC San Diego and TU Darmstadt, Germany; 2017.
6. Riazi, M.S., Samragh, M., Lauter, K., Chen, H., Koushanfar, F., Laine, K., XONN: XNOR-based Oblivious Deep Neural Network Inference; UC San Diego and Microsoft Research; 2019.
7. Portnoy, E., Azure Confidential Computing Heralds the Next Generation of Encryption in the Cloud; Electronic Frontier Foundation; 18 September 2017.
8. Frankle, J., et al.; Practical Accountability of Secret Processes; Massachusetts Institute of Technology; Proceedings of the 27th USENIX Security Symposium; August 2018.
9. Hastings, M.; General purpose frameworks for Secure Multi-Party Computation; University of Pennsylvania; Real World Crypto 2020, January 8-10, 2020.
10. Damgård, I., Nielsen, J.B., Cramer, R., Secure Multiparty Computation and Secret Sharing, Cambridge University Press, 2015.
Encryption related solutions
1. Farid, H., Singh, P., Robust Homomorphic Image Hashing, Dhirubhai Ambani Institute of Information and Communication Technology, Gandhinagar, Gujarat, India, and University of California, Berkeley, 2019.
2. Iliashenko, I., Optimisations of fully homomorphic encryption, KU Leuven, 2019.
3. Minelli, M., Fully homomorphic encryption for machine learning, PSL Research University, 2018.
4. European Commission, Putting privacy at the heart of biometric systems, 2011.
5. Yakoubov, S., A Gentle Introduction to Yao's Garbled Circuits, Boston University, 2017.
6. Tarek Ibn Ziad, M., et al., CryptoImg: Privacy Preserving Processing Over Encrypted Images, University of California, Los Angeles, 2019.
7. Gentry, C., Fully Homomorphic Encryption Using Ideal Lattices, in the 41st ACM Symposium on Theory of Computing (STOC), 2009.
ANNEX 10: EU CENTRE TO PREVENT AND COUNTER CHILD SEXUAL ABUSE
In the EU strategy for a more effective fight against child sexual abuse, the Commission committed to work towards the creation of a European centre to prevent and counter child sexual abuse, to enable a comprehensive and effective EU response against child sexual abuse online and offline571. The purpose of this annex is to comprehensively screen and assess in detail all possible options for the Centre, and determine the preferred one to be incorporated in the options of the report. It also provides additional information on the Centre as a measure.
First, it explains the part of the problem and the relevant problem drivers in the impact assessment that a centre would address, followed by its added-value and specific
objectives. Then, the annex analyses the various possibilities to set up the centre: what
are the implementation choices? (section 3); what are the impacts of each choice?
(section 4); how do the choices compare? (section 5).
Finally, the annex discusses the preferred implementation choice resulting from the
previous analysis: what are the advantages, disadvantages and trade-offs of this choice?
(section 6). The preferred choice is then integrated into the policy options considered in
the report.
1. RELEVANT PROBLEM DRIVERS
The Centre is relevant to all the problem drivers identified in the impact assessment:
1. Voluntary action by online service providers to detect online child sexual abuse has proven insufficient
Box X in section 2.1.1. of the report lays out the current system to detect and report CSA online in the EU, which relies on the action of a private entity in a third country (NCMEC in the US), and on US legislation requiring service providers to report CSA online that they may become aware of in their systems to NCMEC, rather than to law enforcement directly.
Section 2.2.1. describes how (the lack of) voluntary action by service providers in the EU drives part of the problem. At the moment, there are no obligations in EU law for service providers to detect, report or remove CSA online. Nor is there an EU Centre that would be the recipient of the reports from service providers or that could serve as a facilitator of the detection and removal processes.
2. Inefficiencies in public-private cooperation between online service providers, civil society organisations and public authorities hamper an effective fight against CSA
First, when setting up channels for the reporting of suspected child sexual abuse from service providers to Member States' authorities, and for the information about what is illegal from Member States' authorities to service providers, direct connections between each Member State and each provider are not efficient. It is more efficient to pass through a central point, as the following diagram illustrates:
[Diagram: direct connections between each Member State and each service provider, compared with connections routed through a central facilitator.]
Rather than requiring a number of connections equal to (Member States x Service Providers), as shown on the left, the creation of a central facilitator reduces the number of connections to (Member States + Service Providers), a significant efficiency gain. In addition, there are also security considerations to take into account here, given that the information to be transmitted is highly sensitive. A reduction in the number of connections and in complexity reduces the possibilities for data leaks and the attack surface.
571 EU strategy for a more effective fight against child sexual abuse, COM(2020) 607, 24 July 2020, p. 14.
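To make the gain concrete with hypothetical figures: with 27 Member States and, say, 200 service providers in scope, direct connections would require 27 x 200 = 5,400 links, whereas routing through a central facilitator requires only 27 + 200 = 227.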
Secondly, it would not be efficient for each Member State to provide its own information to each service provider about what is illegal on its territory. There are large overlaps across all Member States because of the harmonised definitions of child pornography created by Directive 2011/93/EU. There may be some additional aspects, such as the provocative posing of children or the criminalisation of drawings or cartoons depicting child sexual abuse, where Member States may differ, but these are very limited in number.
Third, while each provider and each Member State could be required to provide transparency reporting and make available its processes and data for auditing, it is much more difficult to create a robust overview of the entire system in this way. For example, "known" CSAM is often detected on several platforms, yet it would not be possible to trace the spread of one image across service providers without a central overview of which image is reported. Such information would be helpful both in devising effective responses and in learning about the functioning of the criminal networks behind the abuse.
These three issues are relevant here because they are difficult to address through measures solely targeting service providers or Member States and their authorities. There are limits to a pure "network approach", in particular when it comes to ensuring coordination and transparency.
While a US centre already exists which provides some of these services for US authorities (NCMEC), the EU cannot rely on it for support to its own authorities, especially when expanding detection and reporting within the EU, where it would not be appropriate to require reporting to a third-country entity.
services in the EU, ensure the accuracy and relevance of such reports,
and forward these to law enforcement for action.
o Removal: the centre would also support law enforcement by facilitating the work of hotlines on the notice and takedown of child sexual abuse material. In particular, it would notify service providers of the existence of child sexual abuse material in their services, and these would be required to remove it within a short time.
To be able to carry out these functions, the centre would need the appropriate legal basis to allow it to process personal data and child sexual abuse material, as well as the necessary human and technical resources. In particular, the centre must be able to implement strong security measures to avoid any data breaches. The legal basis should also allow it to cooperate closely with entities in the EU and beyond, in particular with regard to data exchanges.
The centre would ensure accountability and transparency in the process of detection, reporting and removal of child sexual abuse online. This would include collecting data for transparency reports, providing clear information about the use of tools and their effects, and supporting audits of data and processes. The centre would help to ensure that there is no erroneous takedown of legitimate content, or abuse of the search tools to report legitimate content (including misuse of the tools for purposes other than the fight against child sexual abuse), and could possibly support users who feel that their content was mistakenly removed. The roles of the centre in ensuring accountability and transparency of the detection and reporting process make it a fundamental component of the new legislation.
Box 1: independence of the centre
The centre would serve as a key facilitator of the work of service providers in detecting, reporting and removing the abuse (including by ensuring transparency and accountability), and of the work of law enforcement in receiving and investigating the reports from service providers.
To be able to play this facilitator role, it is essential that the centre be independent from potentially overriding private and political interests. Even a perception of partiality could undermine the goals the centre would set out to achieve. Therefore, it is crucial that the centre is not directly linked to:
service providers, as the centre would serve both as the source of reliable information about what constitutes CSA online, providing companies with the sets of indicators on the basis of which they should conduct the mandatory detection, and as a control mechanism to help ensure transparency and accountability of service providers, including possibly helping to resolve complaints from users; and
law enforcement, as the centre must be neutral to be able to play the role of facilitator and ensure that it maintains a fair and balanced view of all the rights at stake, in particular between the fundamental rights of children and those of the rest of internet users.
2. Support Member States in putting in place usable, rigorously evaluated and effective prevention measures to decrease the prevalence of child sexual abuse in the EU.
The results of the monitoring of the implementation of the Child Sexual Abuse Directive indicate that Member States face challenges in putting in place prevention measures. These challenges occur at all stages: before a person offends for the first time, in the course of or after criminal proceedings, and inside and outside prison. Building on the work of the prevention network of researchers and practitioners572, the centre would support Member States in putting in place usable, rigorously evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU, taking into account the differing vulnerabilities of children according to their age, gender, development and specific circumstances.
The centre would provide support and facilitate Member States' action on the various types of prevention efforts, both those focused on the child and his or her environment and on decreasing the likelihood that a child becomes a victim, as well as those focused on potential offenders and on decreasing the likelihood that a person offends.
It would facilitate coordination to support the most efficient use of the resources invested and the expertise available on prevention across the EU, avoiding duplication of efforts. As a hub for connecting, developing and disseminating research and expertise, it would facilitate the exchange of best practices from the EU and globally, encourage dialogue among all relevant stakeholders, and help develop state-of-the-art research and knowledge, including better data. To perform that hub function effectively, the centre would be able to cooperate closely in this area with entities in the EU and beyond, including through partnership agreements and joint initiatives. It would also provide input to policy makers at national and EU level on prevention gaps and possible solutions to address them.
3. Support Member States to ensure that victims have access to appropriate and holistic support, by facilitating efforts at EU level.
The centre would work closely with national authorities and global experts to help ensure that victims receive appropriate and holistic support, as the Child Sexual Abuse Directive and the Victims' Rights Directive573 require574. In particular, the centre would facilitate the exchange of best practices from the EU and globally on protection measures for child victims and serve as a hub of expertise to help coordinate better and avoid duplication of efforts. To perform that hub function effectively, the centre would be able to cooperate closely in this area with entities in the EU and beyond, including through partnership agreements and joint initiatives.
572 This prevention network is another initiative of the EU strategy for a more effective fight against child sexual abuse, COM(2020) 607, 24 July 2020, p. 9.
573 Directive 2012/29/EU of 25 October 2012 establishing minimum standards on the rights, support and protection of victims of crime, OJ L 315, 14.11.2012. This Directive complements with general victims' rights the specific provisions for victims of child sexual abuse contained in the Child Sexual Abuse Directive.
574 To ensure a coherent approach to EU victims' rights policy, the centre could also cooperate with the Victims' Rights Platform set up under the EU Strategy on victims' rights (2020-2025), COM/2020/258 final.
It would also carry out research (e.g. on the short- and long-term effects of child sexual abuse on victims) to support evidence-based policy on assistance and support to victims.
The centre would also support victims in removing their images and videos to safeguard their privacy, including through proactively searching for material online and notifying companies, in cooperation with civil society organisations such as the INHOPE hotlines.
The centre would also serve to ensure that the voices of child victims are heard and taken into account in policymaking at EU and national level, raising awareness of children's rights and of child victims' needs.
***
The specific objectives for the Centre are coherent with the intervention logic of the larger initiative that the impact assessment focuses on, including the problem, problem drivers, and the general and specific objectives. As set out above for the problem drivers, the related objectives are difficult to attain through measures targeting Member States and service providers alone, given the limits in efficiency, security, and accountability.
The problem drivers in the impact assessment essentially indicate that service providers are not doing enough (problem driver 1), Member States are not doing enough (problem driver 3), and that Member States and service providers (and NGOs) are not cooperating well in what they are doing (problem driver 2). There is therefore a clear need to 1) do more (i.e. enable and make new efforts), and 2) do it more efficiently (i.e. cooperate and coordinate better on existing efforts). The specific objectives for the centre are accordingly to help do more, and to help do it more efficiently, on detection, reporting and removal of child sexual abuse online and on prevention and assistance to victims.
Considering the above, Table 1 below shows the intervention logic for the centre as a retained measure within the intervention logic for the larger initiative. In particular, it shows the specific objectives for the centre and the implementation choices to achieve those objectives.
Table 1: Intervention logic for the Centre as a measure within the larger initiative
Problem: some child sexual abuse crimes are not adequately addressed in the EU due to challenges in their detection, reporting and action, as well as insufficient prevention and assistance to victims.
Problem drivers:
1. Voluntary action by online service providers to detect online child sexual abuse has proven insufficient.
2. Inefficiencies in public-private cooperation between online service providers, civil society organisations and public authorities hamper an effective fight against child sexual abuse.
3. Member States' efforts to prevent child sexual abuse and to assist victims are limited, lack coordination and are of unclear effectiveness.
General objective: improve identification, protection and support of victims of child sexual abuse, ensure effective prevention, and facilitate investigations.
Specific objectives:
4. Ensure the effective detection, removal and reporting of online child sexual abuse where they are currently missing.
5. Improve legal certainty, protection of fundamental rights, transparency and accountability.
6. Reduce the proliferation and effects of child sexual abuse through increased coordination of efforts.
Specific objectives (Centre):
1. Help ensure that victims are rescued and assisted as soon as possible and offenders are brought to justice, by facilitating detection, reporting and removal of CSA online.
2. Support Member States in putting in place usable, rigorously evaluated and effective prevention measures to decrease the prevalence of child sexual abuse in the EU.
3. Support Member States to ensure that victims have access to appropriate and holistic support, by facilitating efforts at EU level.
Implementation choices (Centre):
Non-legislative:
- Set up an EU Centre focused on prevention and assistance to victims through practical measures.
Legislative:
- Set up an EU Centre to prevent and counter child sexual abuse within FRA.
- Set up an EU Centre to prevent and counter child sexual abuse with some functions in Europol and others in an independent organisation under Member State law.
- Set up an EU Centre to prevent and counter child sexual abuse as an independent EU body (decentralised agency).
3. IMPLEMENTATION CHOICES
3.1. What is the baseline from which implementation choices are assessed?
The baseline from which implementation choices are assessed is the baseline that corresponds to the subset of issues outlined above, i.e. those problem drivers and objectives where measures targeting Member States or service providers alone would not prove efficient.
In this baseline scenario:
with regard to detection, reporting and removal of CSA online, the inefficiencies in the cooperation between public authorities, service providers and civil society organisations would likely continue, or increase, given the expected continued growth of online interactions. Even if legal obligations to detect, report and remove are imposed on service providers, it is unclear where they would need to report, what the conditions would be under which detection would take place, and whether there would be any entity helping to ensure transparency and accountability of the process. The ability of law enforcement authorities to investigate crimes and rescue victims would not significantly improve in the baseline scenario. In addition, legal fragmentation in the internal market would likely continue to increase as Member States take their own measures to deal with the increasing challenge;
with regard to prevention, the network announced in the EU Strategy for a more effective fight against child sexual abuse would continue to develop and expand. Its activities could contribute to fostering the exchange of good practices and lessons learned and enable coordination between initiatives in Member States and third countries. As the prevention network grows, it would become more and more difficult to manage without dedicated resources. At the moment the network is at an incipient stage and is managed as part of the activities of a policy unit in the European Commission (DG HOME, D4). This is not sustainable in the long run given the resources required to motivate, encourage, structure and support meaningful network exchanges. As the network grows, its management could be outsourced to a contractor, e.g. as in the Radicalisation Awareness Network, with periodic calls for proposals to renew the outsourcing contract. However, this would not ensure the long-term sustainability of the activities of the network. Furthermore, the scope of the activities that the network could carry out would be limited (including limited coordination of efforts, leading to potential gaps and duplication of efforts), given the limited dedicated resources that such a set-up would allow;
with regard to victims' assistance, Member States would continue to enforce or implement the corresponding provisions of the Child Sexual Abuse Directive and the Victims' Rights Directive575. In addition, the EU Strategy on victims' rights576 (2020-2025) will set up a Victims' Rights Platform bringing together all EU level actors relevant for victims' rights. However, it is unlikely that these measures would avoid the duplication of efforts and the existence of gaps across Member States in the support of victims of child sexual abuse. No additional EU-level action on victim support would mean that the quality and accessibility of victim support is not expected to improve significantly. In particular, it is unlikely that victims of child sexual abuse would be able to receive the necessary assistance to have the images and videos of their abuse removed swiftly to reduce the negative impact on their wellbeing.
575 Directive 2012/29/EU of 25 October 2012 establishing minimum standards on the rights, support and protection of victims of crime, OJ L 315, 14.11.2012. This Directive complements with general victims' rights the specific provisions for victims of child sexual abuse contained in the Child Sexual Abuse Directive.
576 EU Strategy on victims' rights, 24 June 2020, COM/2020/258 final.
Baseline costs
In the baseline scenario, no action would be taken and no new structures established. Hence, no additional costs would be incurred. However, no EU action also means that there would be no additional efforts to achieve greater efficiency and cost savings.
On prevention and assistance to victims, action would continue to be taken independently by Member States, academic institutions and civil society organisations. These actions may contribute to a reduction in relevant offences and better support for victims. However, if no action is taken to facilitate the flow of information between stakeholders, pool resources and avoid overlaps, these efforts are likely to remain fragmented, duplicating existing research and actions while insufficiently covering other areas.
The use of EU funds in the form of project grants (Union actions and national actions) would not be significantly improved. This could lead, for example, to duplication of projects across the EU.
The baseline option would not address the limited nature, lack of coordination and unclear effectiveness of Member States' current efforts to prevent child sexual abuse and assist victims. As a result, the overall negative economic impact of child sexual abuse is not expected to improve.
3.2. Overview of all choices analysed
Given the above considerations, it became evident that a central entity was needed, as the existing entities or networks thereof could not be expected to address the problem drivers and meet the specific objectives.
The process to determine the implementation choices started with a mapping of existing entities and their present functions in order to identify possibilities to build on existing structures and make use of existing entities, or simply use them as possible references or benchmarks. For the mapping purposes, the examples were divided into two main types, depending on whether they required specific legislation to be set up:
1) entities that do not require specific legislation to be set up:
a) Centre embedded in a unit in the European Commission (DG HOME, e.g. Radicalisation and Awareness Network, RAN).
b) Entity similar to the EU centre of expertise for victims of terrorism.
2) entities that require specific legislation to be set up:
a) Centre embedded in an existing entity:
o EU body:
Europol
FRA
o Other:
National or international entity (public or private, such as an NGO, e.g. a national hotline or the INHOPE network of hotlines).
b) Centre set up as a new entity:
o EU body:
Executive agency (e.g. European Research Executive Agency (REA), European Education and Culture Executive Agency (EACEA));
Decentralised agency (e.g. European Monitoring Centre for Drugs and Drug Addiction (EMCDDA), European Institute for Gender Equality (EIGE), European Union Intellectual Property Office (EUIPO)).
o Other:
National entity:
Foundation set up under national law (e.g. Academy of European Law (ERA), set up under German law);
Member State authority (e.g. new Dutch administrative authority to combat CSA and terrorist content online, under preparation);
International entity:
Inter-governmental organisation (e.g. European Space Agency (ESA), European Organisation for the Safety of Air Navigation (EUROCONTROL));
Joint undertaking (public-private partnership, e.g. Innovative Medicines Initiative, Clean Sky Joint Undertaking);
Non-governmental organisation (e.g. CEN/CENELEC, Eurochild).
The mapping also included combinations of the above, i.e. with some functions of the Centre under one entity and other functions under another, e.g. a combination of Europol and:
an independent entity set up under national law;
FRA;
a unit in the Commission; or
an NGO (e.g. a hotline).
Finally, the mapping also included three relevant entities outside of the EU, which carry out functions similar to those intended for the EU centre, and which could provide useful references in some areas (e.g. costs, organisational issues, etc.):
US National Centre for Missing and Exploited Children (NCMEC);
Canadian Centre for Child Protection (C3P); and
Australian Centre to Counter Child Exploitation (ACCCE).
The following section presents in detail the above mapping of existing examples of entities that could serve as a reference for the centre, and which will serve as the basis to determine the choices retained for further assessment (described in section 3.3.) and those discarded early (described in section 3.4.). Each reference is analysed in terms of its legal basis, funding, governance, operational capacity, location and estimated time to set up.
1) entities that do not require specific legislation to be set up:
a) Centre embedded in a unit in the European Commission (DG HOME, e.g. Radicalisation and Awareness Network, RAN577)
Legal basis: no legal personality; an administrative decision is required to integrate it in an existing unit or create a new unit. In the case of RAN, the Commission announced its creation under objective 2 of the Commission Communication on the Internal Security Strategy (COM(2010) 673).
Funding: operational expenditure supporting policy implementation, plus possibly a framework contract under ISF to support the activities of the unit (as in the case of RAN).
Governance: RAN is managed by DG HOME, with administration and logistics outsourced to a contractor.
Operational capacity: a unit in DG HOME could potentially serve as a hub of expertise, offer a platform for national authorities and experts to exchange knowledge and experience, and manage networks, projects and information exchange platforms. In the case of RAN, it organises thematic working groups for frontline practitioners to share their knowledge, experiences and approaches with one another, and to peer review each other's work. RAN also produces publications, which are shared with its network of frontline practitioners.
Location: RAN does not have a physical location; it is overseen by DG HOME, with the contractor based in Brussels.
b) Entity similar to the EU Centre of expertise for victims of terrorism578
Legal basis: no legal personality; pilot project set up by the Commission and run by a consortium of victim support associations led by Victim Support Europe.
Funding: two-year pilot project funded by the European Parliament, implemented by DG JUST under public procurement (EUR 1 million for 2 years).
Governance: Executive committee made up of a project manager from the consortium running the project, representatives from DG JUST and DG HOME, and representatives of victim support organisations (in practice the centre is governed like any project funded by the Commission).
Operational capacity: provides training and handbooks, and serves as a hub of expertise. It also offers a platform for national authorities and victim support organisations to exchange knowledge and experience, and maintains a database with information on experts in different fields.
Location: no physical location; overseen by DG JUST/HOME, with the project coordinator based in Brussels.
577 See here for more information.
578 See here for more information.
2) entities that require specific legislation to be set up:
a) Centre embedded in an existing entity:
o EU body - Europol:
Legal basis: the Europol Regulation (Regulation (EU) 2016/794) would need to be updated to cover all tasks assigned to the Centre. No separate legal personality for the Centre.
Funding: financed by the EU budget (around 200 M EUR/year). Funding would need to be topped up by around 25 M EUR/year.
Governance: through a Management Board with one representative from each EU Member State taking part in the Europol Regulation and one representative from the European Commission. Denmark has observer status.
Operational capacity:
o Around 1300 staff (including staff with employment contracts with Europol, liaison officers from Member States and third states and organisations, Seconded National Experts, trainees and contractors).
o Can host databases ensuring data protection.
o Has the capacity to create specialised centres that are able to create focused teams, develop good practices for crime prevention, provide training and capacity building measures at national level, and build specialised intelligence so that the centres act as a knowledge hub per type of crime.
o Europol provides notice and takedown services, collaborating with online service providers on terrorist content online.
o For the specialised centres, a Programming Board can be created allowing for collaboration with a specific set of stakeholders that know a certain type of crime best; Europol can cooperate with third countries and other inter-governmental organisations; Europol has the possibility to conclude memorandums of understanding (MoUs) for collaboration with other EU decentralised agencies.
Location: The Hague (the Netherlands).
o Fundamental Rights Agency (FRA):
Legal basis: FRA's founding regulation would need to be updated to cover all tasks assigned to the Centre. No separate legal personality.
Funding: financed by the EU budget (around 25 M EUR/year). Funding would need to be doubled (an increase of around 25 M EUR/year).
Governance: through a Management Board with one representative from each EU Member State and one representative from the European Commission. Denmark has observer status.
Operational capacity: FRA publishes policy briefs and research in the area of fundamental rights, and serves as an advisor in that area to EU institutions, Member States and other stakeholders.
Location: Vienna (Austria).
o Other: national or international entity (e.g. a national hotline or the INHOPE network of hotlines)
Legal basis: INHOPE is an association of hotlines from multiple countries, governed by Articles of Association and Rules and Regulations. As regards the original Dutch version of the Articles of Association (Deed of 25 May 2018), only the text of the Dutch notarial deed executed in the Dutch language prevails. Member hotlines have to comply with a Code of Practice.
Funding: financed by the Commission. Under CEF Telecom in the MFF 2014-2020: EUR 76.4 m, or approx. EUR 11 m per year. Funding goes via grants (EUR 63.3 m) to the Safer Internet Centres (composed of awareness raising, helpline and hotline), and via service contracts (EUR 13.1 m) to coordination and support providers.
Governance: Members vote to elect a President who leads an elected Executive Committee, also known as the Board. The Board, which currently consists of six people, is also charged with the management and administration of the Association. INHOPE's research role is focused on statistics about the reports it receives and its work.
Operational capacity: hotlines receive reports on instances of CSAI. If the content is qualified as illegal and is hosted in an EU country, hotlines notify Internet Service Providers for the swift removal of the content, and report the case to the relevant law enforcement agency for victim identification purposes.
Location: Netherlands. INHOPE brings together 50 hotlines from 46 countries.
b) Centre set up as a new entity:
o EU body: Executive agency (e.g. European Research Executive Agency (REA), European Education and Culture Executive Agency (EACEA)):
Legal basis: Regulation (EC) No 58/2003.
Funding: dedicated funding from the EU budget.
Governance: the policy of executive agencies is steered by parent DGs (according to their annual Work Programme). As an executive agency's sole task is to implement EU programmes, there is no need for direct policy steer/advice from Member States or other relevant actors.
Operational capacity: the functions of executive agencies are limited by Regulation (EC) No 58/2003.
Location: European Union.
o EU body: Decentralised agency
Possible examples: European Monitoring Centre for Drugs and Drug Addiction (EMCDDA)
Legal basis: Regulation (EC) 1920/2006.
Funding: receives stable funding under the Commission's budget (EUR 15 million/year). Has received funding from IPA for concrete actions (e.g. actions in Balkan countries). It can receive funding from other sources: payments for services rendered, contributions from organisations (international, NGOs, governmental) or third countries. Currently receives donations from Norway and Turkey.
Governance: supported by two statutory bodies (Management Board and Scientific Committee) to advise and assist in the decision-making process.
Operational capacity: staff of 100 people. Provides the EU and its Member States with a factual overview of European drug problems and a solid evidence base, and allows sharing of best practice and new areas of research. Focused on collecting, analysing and reporting - provides tools such as a publication database, statistics compilations, country reports, etc. Cooperates with relevant networks, third countries (candidates and potential candidates to the EU, European Neighbourhood Policy (ENP) area countries), and regional and international organisations, as well as with Europol on monitoring of the drugs problem.
Location: Lisbon, Portugal.
European Institute for Gender Equality (EIGE)
Legal basis: Regulation (EC) No 1922/2006.
Funding: receives stable funding under the Commission budget, EUR 8 m/year. It can receive funding from other sources: payments for services, contributions from organisations, third countries and Member States.
Governance: governed by a Management Board: Member States on a rotation basis and the European Commission. Supported by an Experts' Forum as advisory body.
Operational capacity: staff of 45 people. EIGE cooperates with EU, national and international institutions and organisations. It focuses on support to research and policy-making, and maintains a statistics database.
Location: Vilnius, Lithuania.
European Union Intellectual Property Office (EUIPO)
Legal basis: Regulation (EU) 2017/1001.
Funding: EUIPO does not appear in the EU budget, as its expenditure is covered by around EUR 240 m of revenue made yearly. Its main source of income comes from trade mark and design registrations.
Governance: the governance structure of the EUIPO consists of a Management Board and a Budget Committee, each composed of one representative from each Member State, two representatives from the Commission and one representative from the EP.
Operational capacity: the EUIPO is responsible for the observatory on the infringement of IP rights. It also assists enforcement authorities with training, awareness raising and tools. It cooperates with DG HOME and has formal service level agreements with 14 DGs and 7 agencies and bodies (Eurojust, Europol, CEPOL, TAXUD, and OLAF). It has seconded staff in Europol for this coordination.
Location: Alicante, Spain.
o Other legal forms
Private entity under the national law of an EU Member State: foundation - Academy of European Law (ERA)
Legal basis: foundation having legal personality under the German Civil Code, §§ 80 to 88. Established at the initiative of the European Parliament.
Funding:
o Public foundation supported by donations from EU Member States, regions (e.g. German Federal States), the city of Trier and private entities;
o Recipient of an operating grant under the Jean Monnet programme;
o Budget in 2019 - EUR 8.4 million (around EUR 5 million EU contribution).
Governance:
o Governing Board (2 members from EU institutions - EP and CJEU, 1 member per Member State and relevant associations), Board of Trustees (advisory body), Executive Board;
o The Commission is represented in the Board of Trustees, which provides advice to the main governance bodies of the institution;
o ERA includes Member States' governments as members of the Governing Board and the Executive Board (which represents the organisation at international fora).
Operational capacity:
o Delivers training in European law - organisation of conferences, seminars and summer courses;
o Permanent staff (83) deals mostly with organisation, administration, finance and communication. Cooperates with experts in the field to deliver training.
Location: Trier, Germany.
Member State authority (e.g. new Dutch administrative authority, under preparation)
Legal basis:
o Legislation establishing the authority is under preparation;
o Independent public-law administrative body;
o Established to enforce take-down of CSAM, in cooperation with hotlines and law enforcement.
Funding: provided by the Ministry of Justice.
Governance: to be confirmed, but the Ministry of Justice will have no influence over the management of the authority or the appointment of its director.
Operational capacity (envisaged):
o Receive notifications of CSAM, issue take-down notices to companies and follow up on the removal of CSAM;
o Enforce administrative fines, issue transparency reports;
o Conduct proactive searches for CSAM;
o Access a database of hashes.
Location: the Netherlands.
o Other: international entity.
Inter-governmental organisation (e.g. European Space Agency (ESA), European Organisation for the Safety of Air Navigation (EUROCONTROL))
Legal basis:
o ESA: ESA Convention.
o EUROCONTROL: International Convention, to which the EU acceded through a Protocol of accession.
Funding:
o Contributions from members and associates; EU funding possible (e.g. annual contribution to ESA);
o ESA's budget: EUR 6.49 billion; EUROCONTROL's budget: EUR 865 m.
Governance:
o ESA: each Member State is represented in the Council (governing body);
o EUROCONTROL: the Permanent Commission (high-level State representatives) and the Provisional Council.
Operational capacity:
o Can cooperate with a number of stakeholders including the EU;
o Can handle sensitive data (though not personal data); high level of security.
Location: European Union.
Joint undertaking (public-private partnership, e.g. Innovative Medicines Initiative, Clean Sky Joint Undertaking)
Legal basis: Council Regulation based on Article 187 TFEU or on Article 157 TEC (now Article 173 TFEU).
Funding: contributions from members and the EU (the EU contribution is set out in the founding regulation and paid from the general budget of the Union allocated to the relevant programme).
Governance: Governing Board consisting of the founding members.
Operational capacity: limited to publishing open calls for proposals and managing grants.
Location: European Union.
Non-governmental organisation (e.g. CEN/CENELEC, Eurochild)
Legal basis: registered as a non-profit/non-governmental organisation under Member State law (e.g. an AISBL - a non-profit international association with legal personality based on the Belgian Code of companies and associations).
Funding: donations, contributions, sale of goods and services, investments; EU funding possible through project grants.
Governance: Member States and the EU institutions could not be part of their governance. To avoid questions about their independence, NGOs are unlikely to add other stakeholders to their governance as well.
Operational capacity:
o Capacity to conduct reporting activities, e.g. annual data reports;
o Some NGOs have database hosting capacities.
Location: European Union.
Organisations outside the EU
US National Centre for Missing and Exploited Children (NCMEC)
Legal basis: not-for-profit corporation, with specific roles recognised under US federal law (18 U.S. Code § 2258A).
Funding: budget of approximately EUR 26 m/year, over half of it covered by US Federal Government funding, with the remainder coming from private contributions and other sources.
Governance: Board of Directors (including public representatives, industry members and former law enforcement).
Operational capacity:
o US companies are obliged by law to report instances of child sexual abuse to NCMEC;
o NCMEC serves as a clearinghouse, receiving, filtering and forwarding reports to relevant law enforcement in the US and globally.
Location: Alexandria (Washington, D.C. area), USA.
Canadian Centre for Child Protection (C3P)
Legal basis: registered as a national charity under Canadian law.
Funding: a large part of donations comes from international foundations (Oak Foundation, Children's Investment Fund Foundation). Some funding comes from private donors; very limited funding comes from the private sector, subject to strict conditions to avoid conflicts of interest.
Governance: the Board of Directors is composed of volunteers from a variety of disciplines, including law enforcement, education, psychology, medicine, law, finance, and public service.
Operational capacity:
o The Canadian Centre offers crisis assistance, works with survivors, and prepares educational and prevention materials;
o It receives reports of CSAM via its cybertipline, and runs Project Arachnid - a web crawler and platform to reduce the availability of CSAM.
Location: Winnipeg, Manitoba, Canada.
Australian Centre to Counter Child Exploitation (ACCCE)
Legal basis:
o The Australian Federal Police is tasked with creating a hub of the expertise and specialist skills needed to detect, disrupt, prevent and investigate child exploitation;
o Coordinates and supports Australia's law enforcement efforts and supports investigations. Brings together key stakeholders, allowing cross-pollination of resources, knowledge and skillsets between stakeholders.
Funding: funded from the federal government's budget - AUS$ 68.6 m (approx. EUR 44 m) over 2018-2022.
Governance: the Board of Management consists of representatives of federal and state police, the Office of the eSafety Commissioner, the Australian Criminal Intelligence Commission and the Department of Home Affairs.
Operational capacity:
o Reporting - provides a platform to report inappropriate, harmful or criminal activities that have occurred to children online, including CSAM and grooming, but also cyberbullying and other harms.
o Triage of reports of child exploitation.
o Intelligence inputs.
o Specialist investigative capability: victim identification, Covert Online Engagement Team (in this aspect it fulfils a role similar to Europol's Cybercrime Centre).
o Prevention and online child safety:
Research - can commission studies.
Prevention - online safety education resources.
Victim protection - resources on counselling and support for victims.
o Cooperates with government agencies (including law enforcement and relevant departments of the government), state authorities, victims' associations and foreign law enforcement.
Location: Brisbane, Australia.
3.3. Description of implementation choices
Following the mapping of all possible choices, these were analysed in detail to select the final choices to be retained for comparison.
The analysis considered in particular factors such as legal and operational capacity, governance, financial sustainability, independence, accountability and transparency, and operational synergies, structured along two groups of considerations:
Functions that the centre could take on, closely linked to its specific objectives:
o Support prevention efforts.
o Support victims.
o Contribute to the detection, reporting and removal of CSA online.
Forms that the centre could take to best fulfil the above functions, which are determined by:
o Legal status: both the legal basis to set up the centre (if any) and the legislation to allow it to perform its functions (e.g. processing of personal data).
o Funding: the sources that would allow the centre to ensure its long-term sustainability and independence, while avoiding conflicts of interest.
o Governance: it should ensure 1) proper oversight by the Commission and other relevant EU institutions and Member States; 2) participation of relevant stakeholders from civil society organisations, industry, academia and other public bodies, for example through advisory groups; and 3) neutrality of the centre from overriding private and political interests.
These two considerations are closely interlinked: the level of ambition for the functions, i.e. whether the centre should take on all three of them and to what degree, determines the choice of the optimal form to enable those functions. In turn, the choice of the form excludes or enables the centre to take on certain functions. The analysed implementation choices reflect different levels of ambition.
A: set up an EU Centre focused on prevention and assistance to victims through practical measures
This choice proposes a centre that is set up through non-legislative (practical) measures. It would take on functions mostly of prevention and assistance to victims, and the form of an EU-funded coordination hub, managed by the Commission with possible support from a contractor (similar to the Radicalisation and Awareness Network, RAN579). This choice constitutes policy measure 2 in the impact assessment.
Functions:
Prevention.
o Facilitate the implementation of the practical measures on prevention under measure 1, including supporting Member States on the implementation of the relevant provisions of the Child Sexual Abuse Directive (e.g. through expert workshops), and serving as a hub of expertise to support evidence-based policy in prevention. For example, it could develop and manage an online platform where professionals working on prevention could find information relevant to their work.
o Support the further development of the prevention network introduced in the EU strategy for a more effective fight against child sexual abuse. The centre would ensure the coordination and support for the network by e.g. facilitating the planning of its work, preparing future publications, creating and maintaining a database of good practices, and gathering statistics. This would allow the network to grow to its full potential.
o Help develop research on prevention, including on the effectiveness of prevention programmes.
o Facilitate dialogue among all relevant stakeholders, within and beyond the EU, on prevention efforts.
Victims' assistance.
o Similarly to its role in prevention, the centre could facilitate the implementation of the practical measures on assistance to victims under measure 1, including supporting Member States on the implementation of the relevant provisions of the Child Sexual Abuse Directive, and serving as a hub of expertise to support evidence-based policy development in assistance to victims, for example through an online platform where professionals working on assistance to victims could find information relevant to their work.
o Set up an online platform where victims can find information on support resources that are available to them in their area or online.
o Help develop research on assistance to victims, including on the effectiveness of short-term and long-term assistance programmes, as well as on victims' needs in the short and long term.
579 See here for more information about RAN.
o Facilitate dialogue among all relevant stakeholders, within and beyond the EU, on victims' assistance efforts.
o The lack of legislation underpinning the set-up of the centre would prevent it from conducting proactive searches of CSAM based on victims' requests for help to have their images and videos taken down.
Detection, reporting and removal.
o In addition to the main functions on prevention and assistance to victims, the centre could also facilitate the implementation of the practical measures on detection and reporting under measure 1.
o These practical measures could include developing codes of conduct and standardised reporting forms for service providers, improving feedback mechanisms and communication channels between public authorities and service providers, and facilitating, through funding and coordination, the sharing between service providers of databases of hashes and detection technologies. They could also include support to service providers to implement safety by design, e.g. by validating design features aimed at protecting children from sexual abuse, such as more sophisticated age verification or parental controls.
o The centre would not be able to take a more active role in the detection, reporting and removal process in the absence of a legal basis for the processing of personal data involved in the process.
lorm:
. Legal status.
o As a coordination hub managed by the Commission, the centre would not
have its own legal personality. ・ Funding.
o The centre in this form would be funded under the Internal S ecurity Fund.
The support to running the centre would require a framework contact, which
could be supplemented by project grants to relevant stakeholders on a case by case basis.
・ Governance.
o The centre would be under the direct responsibility of the Commission.
The Commission would steer the activities of the centre, while possibly
delegating the implementation of specific activities and the day-to-day
management to a contracted entity. In this scenario the contractor would take
on the administrative activities such as drafting work-plans, organising
meetings, maintaining an online platform and carrying out other support
activities as needed.
o This form would guarantee alignment between the centre's work and the
Commission's policies and actions. At the same time, while there could be a
possibility of input from stakeholders, e.g. through an advisory group, there
would be no formal governance structure in which they could participate.
B: set up an EU Centre to prevent and counter child sexual abuse as an independent EU
body (decentralised agency)
This choice proposes a centre as a new, independent EU body in the form of a
decentralised agency.
Functions:
Prevention.
o Facilitate the implementation of the practical measures on prevention of
measure 1, including supporting Member States on the implementation of the
relevant provisions of the Child Sexual Abuse Directive (e.g. through expert
workshops), and serving as a hub of expertise to support evidence-based
policy in prevention. For example, it could develop and manage an online
platform where professionals working on prevention could find information
relevant to their work.
o Support the further development of the prevention network introduced in the
EU strategy for a more effective fight against child sexual abuse. The centre
would ensure the coordination and support for the network by e.g. facilitating
the planning of its work, preparing future publications, creating and
maintaining a database of good practices, and gathering statistics. This would
allow the network to grow to its full potential.
o Help develop research on prevention, including on the effectiveness of
prevention programmes.
o Facilitate dialogue among all relevant stakeholders, within and beyond the
EU, on prevention efforts.
o Fund or help facilitate funding (e.g. improve the uptake of existing EU
funding) of prevention initiatives.
Victims' assistance.
o Similarly to its role in prevention, the Centre could facilitate the
implementation of the practical measures on assistance to victims of measure
1, including supporting Member States on the implementation of the relevant
provisions of the Child Sexual Abuse Directive, and serving as a hub of
expertise to support evidence-based policy development in assistance to
victims, for example through an online platform where professionals working
on assistance to victims could find information relevant to their work.
o Set up an online platform where victims can find information on support
resources that are available to them in their area or online.
o Help develop research on assistance to victims, including on the effectiveness
of short-term and long-term assistance programmes, as well as on victims'
needs in the short- and long-term.
o Facilitate dialogue among all relevant stakeholders, within and beyond the
EU, on victims' assistance efforts.
o Most agencies are limited to coordination and exchange of information;
however, it is possible to make specific provisions to allow them to process
personal data.
o The legal basis of the new agency could enable it to receive requests for
support from victims to have their images and videos taken down, and to
conduct proactive searches of CSAM following these requests, in cooperation
with hotlines where needed.
o The Centre would carry out the non-legislative actions on assistance to
victims.
Detection, reporting and removal of CSA online.
o An agency ensures independence from private influence and is well-placed to
take on the role of ensuring transparency and accountability of the detection,
reporting and removal process.
o The legislation establishing the Centre as a new, independent entity would
provide the legal basis to carry out all the functions described concerning the
detection, reporting and removal of CSA online, in particular with regard to
the processing of personal data and child sexual abuse material.
o In particular, the legal basis should contain provisions to allow the agency to
process personal data and host databases of indicators of CSA online. This
would notably allow it to prepare and maintain these databases, process the
reports from service providers, and contribute to the removal process by
searching for CSAM proactively.
o The new entity would seek to build on existing efforts and avoid unnecessary
disruption and duplication. It would focus on supporting what is working
well and on contributing to addressing the existing gaps in the process. This
means that the centre would cooperate closely with a wide range of
stakeholders active in the detection, reporting and removal process, including
service providers, public authorities and civil society organisations, in the EU
and in third countries.
o The Centre would work with a wide range of stakeholders including law
enforcement (Europol and national law enforcement agencies), NGOs (e.g.
hotlines), service providers, and academia. In particular, the centre would
work very closely with Europol, facilitating its current activities of analysis
and channelling of reports to Member States for action, and with the network
of hotlines, to facilitate removal and build on their expertise and experience,
especially when it comes to the specificities of the national context, e.g.
concerning what is considered illegal in their jurisdiction above and beyond
the core definitions in the CSA Directive, or concerning efforts by service
providers established in their jurisdiction.
o The new entity would also be able to effectively ensure accountability and
transparency, thanks to its fully independent status and its expertise in the
detection, reporting and removal process.
Form:
• Legal status.
o The Centre would have its own legal personality as a decentralised EU
agency, with a legal basis set up under this initiative.
• Funding.
o The Centre would be funded mostly by the Commission. As an EU agency it
would have its own budget line. The funding would come from the budget
managed by the Directorate-General for Migration and Home Affairs.
o To minimise the strain on the EU budget, the Centre may be able to receive
additional funding from other sources such as Member States, not-for-profit
donor organisations, and the private sector, under strict conditions to prevent
any conflict of interest or loss of independence, overseen by the governance
body.
• Governance.
o The Centre would be supervised by the Commission as part of its
management board. To ensure that the centre maintains its quality, and in
particular its neutrality and a balanced consideration of all the relevant rights
at stake, it would be subject to periodic reporting to the Commission and to
the public. The governance structure would also ensure participation of all
the relevant stakeholders representing the different interests and rights at
stake (including both children's rights and internet users' privacy rights),
while strictly avoiding conflicts of interest, for example through their
participation in management and advisory bodies.
o The Centre would be subject to the highest standards with regard to
cybersecurity and data protection, and would be under the supervision, inter
alia, of the data protection authorities of the Member State hosting it.
C: set up an EU centre to prevent and counter child sexual abuse with some functions in
Europol and others in an independent organisation under Member State law
This choice proposes a "hybrid" centre with a structure split in two: some functions in
Europol and other functions in a new, independent entity with its own legal personality.
Given its current expertise, Europol would retain the functions concerning detection,
reporting and removal of CSA online, and the new entity would focus on prevention and
assistance to victims.
Functions:
Prevention.
o The part of the centre located in a new independent body would carry out the
non-legislative actions on prevention described in choice A.
o The centre would be able to build on Europol's experience on prevention
activities focused on decreasing the likelihood that a child falls victim of
sexual abuse through awareness raising campaigns.
o For the other main type of prevention activities, focused on decreasing the
likelihood that a person offends, the Centre, as an entity separate from
Europol, would have more autonomy to develop new expertise currently not
existing in Europol. It would also be able to fund or help facilitate funding
of prevention initiatives.
Victims' assistance.
o The part of the centre located in a new independent entity would carry out the
non-legislative actions on assistance to victims described in choice A.
Detection, reporting and removal of CSA online.
o The part of the centre under Europol would carry out this function.
o Europol's legal basis would be expanded to enable it to support the
detection, reporting and removal process as described above (specific
objectives), with all the necessary conditions and safeguards.
o This could include enabling Europol to receive requests for support from
victims to have their images and videos taken down, and to conduct proactive
searches of CSAM following these requests, in cooperation with hotlines
where needed.
Form:
• Legal status.
o The part of the centre under Europol would operate under the Europol
Regulation. The Regulation would need to be modified so that Europol's
mandate can cover the additional functions concerning detection, reporting
and removal of CSA online.
o The part of the centre under a new entity would have its own legal personality
as an independent organisation established under a Member State's law.
• Funding.
o The part of the centre under Europol would be funded through Europol's
budget, which would need to increase to provide for extra staff and
equipment.
o The part of the centre under a new entity would operate under a Member
State's law, entailing that its employees would be employed under the
provisions of the national law of the host country (i.e. they would not be EU
staff).
o This new entity should be funded mostly by the Commission, to ensure the
centre's independence, in particular from potentially overriding private and
political interests. This would entail additional supervision by the
European Court of Auditors.
o The centre could be funded through the Internal Security Fund. Initially, the
centre could be launched as a specific action, and later through a national ISF
programme of the Member State where it is established580, or under a direct
grant if appropriate. This part of the centre would be able to receive additional
funding from other sources such as Member States, not-for-profit donor
organisations, and the private sector, under strict conditions to prevent any
conflict of interest. This would ensure financial sustainability without
compromising the functions of the centre, while minimising the strain on the
EU budget.
• Governance.
o The centre under Europol would be integrated in the current governance structure in Europol.
o The part of the centre under a new entity would have the governance
structure determined by its legal personality under a Member State's law. In
any case, it should allow the participation of the Commission in its governing
body and ideally also include key stakeholders such as public authorities, civil
society organisations and companies, to facilitate coordination and
cooperation, while strictly avoiding conflicts of interest.
D: set up an EU Centre to prevent and counter child sexual abuse within the Fundamental
Rights Agency (FRA)
This choice would require legislation to set it up, notably expanding FRA's legal basis to
cover the relevant aspects of the detection, removal and reporting process, and the
support to victims who wish to have their images and videos removed.
Functions:
Prevention.
o FRA could facilitate the implementation of the practical measures on
prevention of measure 1. It could collect and analyse data and provide advice
in the area of prevention; it could produce materials such as handbooks and
guidelines and possibly run awareness raising campaigns.
o FRA would also be well-equipped to perform the research-related roles of
the EU centre.
Victims' assistance.
o FRA could facilitate the implementation of the practical measures on
assistance to victims of measure 1.
o FRA has considerable expertise in the area of child rights and has contributed
to the EU Strategy on the rights of the child581. Its work also covers victims'
rights.
o Currently FRA addresses the areas of victims' and child rights on a project
basis. If the centre were to become a part of FRA, it would need to be a
permanent element of its structure.
580 In a similar way as the initial funding was provided for the Specific Action 'European Return and
Reintegration Network (ERRIN)' under the Asylum, Migration and Integration Fund (AMIF).
581 EU Strategy on the rights of the child, COM(2021) 142 final.
Detection, reporting and removal of CSA online.
o Facilitate the implementation of the practical measures on detection and
reporting of measure 1.
o The legal status of FRA would need to be modified to allow it to support the
processing of reports of child sexual abuse, to host a database of CSAM, and
to support victims who wish to have their images and videos removed from
the internet.
o With regard to ensuring transparency and accountability of efforts around
combating child sexual abuse, FRA would be able to build on its expertise in
ensuring the fundamental rights of citizens in policy making.
Form:
Legal status.
o The centre would operate under the FRA Regulation582. The Regulation would need to be modified so that FRA's mandate can also cover all the
centre functions.
Funding.
o The centre would be funded through FRA's budget, which would need to
increase to provide for extra staff and equipment.
Governance.
o The centre would be integrated in the current governance structure in FRA,
which includes Member States and the Commission. The involvement of the
Commission in the governance would obviously be less direct than in choice
A. FRA has a Scientific Committee which guarantees the scientific quality of
FRA's work. Additional mechanisms to involve relevant stakeholder groups
would need to be created, e.g. through new advisory groups.
3.4. Choices discarded following initial analysis
Entities that do not require specific legislation to be set up, e.g. a designated Unit in the
Commission (DG HOME)
The advantage of creating a dedicated unit for the fight against child sexual abuse in DG
HOME would be that no specific legal basis would be required. Considering the pressing
issue of child sexual abuse, such a unit could be set up relatively quickly and begin to
operate in the near future.
The main drawback of this type of implementation choice is that the extent to which it
could undertake the intended functions is limited due to the lack of a legal basis. It would
focus on the implementation of practical measures through facilitating coordination and
exchange of best practices. It could not support operational cooperation between
582 Council Regulation (EC) No 168/2007 of 15 February 2007 establishing a European Union Agency for Fundamental Rights.
service providers and law enforcement, nor the analysis of materials. The effectiveness
of this solution and its expected impact would therefore be low.
Creating a designated unit could also result in an increased logistical, financial and
policy coordination burden within the Directorate-General.
Another limitation would be that it could become quite difficult for external stakeholders
and interest groups to participate in the processes of such a unit, raising transparency
and participation issues. This would likely result in a limited buy-in from key actors in
the field, and limit the impact this choice for the centre could make.
Centre fully embedded in Europol
The centre would operate under the Europol Regulation governing its mandate. It would
require expanding Europol's legal basis so that Europol's mandate can also cover all the centre functions.
Functions:
Prevention.
o The centre under Europol would carry out the non-legislative actions on
prevention described in choice A.
o The centre would be able to build on Europol's experience on prevention
activities focused on decreasing the likelihood that a child falls victim of
sexual abuse through awareness raising campaigns583.
o For the other main type of prevention activities, focused on decreasing the
likelihood that a person offends, new expertise would need to be developed,
as Europol does not carry out such prevention activities.
Victims' assistance.
o The centre under Europol would carry out the non-legislative actions on
assistance to victims described in choice A. These are activities not directly
related to Europol's core mandate of supporting law enforcement in Member
States.
o Europol's legal basis (currently under revision584) would need to be expanded
to enable it to receive requests for support from victims to have their images
and videos taken down, and to conduct proactive searches of CSAM following
these requests, in cooperation with hotlines where needed.
Detection, reporting and removal of CSA online.
o Europol's legal basis would be expanded to enable it to support the
detection, reporting and removal process as described above (specific
objectives), with all the necessary conditions and safeguards.
o This would notably mean that Europol's legal basis would need to be changed
to enable it to receive reports directly from service providers, and to make
available to them databases of indicators on the basis of which they should
conduct the detection of child sexual abuse material (known and new).
o The centre would be able to build on the existing capacities and processes in
Europol's European Cybercrime Centre (EC3) to receive the reports of child
sexual abuse from online service providers585. EC3 also reviews the reports,
and eventually enriches them with intelligence before forwarding them to the
18 Member States that have chosen to not receive the reports directly from
NCMEC.
o The centre under Europol would not be able to take on the function of
ensuring accountability and transparency on efforts by companies to tackle
child sexual abuse online. This function, which is not directly linked to
supporting law enforcement operations, requires a high degree of independence.
Being part of a law enforcement agency, which would keep its current key role of processing the reports from service providers before forwarding them
to Member S tates for action, the centre may not be seen as a neutral party in
the process by online service providers and the public.
Funding.
o The centre would be funded through Europol's budget, which would need to
increase to provide for extra staff and equipment.
Governance.
o The centre would be integrated in the current governance structure in
Europol, which includes Member States and the Commission. The
involvement of the Commission in the governance would obviously be less
direct than in choice A. Currently, there is no mechanism to involve other,
working level).
Europol (provided that the Europol Regulation is modified so that its mandate can also
cover all the centre functions) would have the potential to take on the role linked to
detection, reporting and removal of CSA online, as it already takes part in the process of
handling the reports. The centre could facilitate the work of national law enforcement
agencies, alleviating their workload linked to the handling of the reports.
On the other hand, the creation of the centre as part of a law enforcement authority can
limit the impact of the actions taken on prevention and victim support. Some tasks would
be too far from Europol's core mandate: some of the envisaged functions within
prevention and assistance to victims are significantly different from the core law
enforcement mandate of Europol. This would require significant capacity building
efforts in Europol and the creation of teams that would work on very different tasks from
those
583 The Say NO! campaign covering all EU Member States as well as UK, Norway and […].
584 The Commission proposal was adopted in December 2020 and is currently under negotiation
between the European Parliament and the Council.
585 As explained in the problem definition (see also annex 6), online service providers currently send their
reports to NCMEC, which determines whether they concern the EU, and if so, forwards them to US law
enforcement (Homeland Security Investigations) for further transmission to Europol or Member States
directly. Europol's current legal basis does not allow it to receive the reports directly from NCMEC.
of the rest of the organisation. This notably includes research on prevention (e.g. on the
process by which a person with a sexual interest in children may end up offending) and assistance to victims (e.g. on the long-term effects of child sexual abuse).
Whereas the Centre would be able to build on the established procedures of Europol,
being part of a larger entity which covers multiple crime areas may limit the visibility of
EU efforts in the fight against CSA. Moreover, the imbalance created by inserting such
an entity in a law enforcement agency could create an obstacle to its smooth operation:
it would be difficult to justify that Europol expands its mandate to cover prevention and
assistance to victims only in the area of child sexual abuse. This could lead to Europol
gradually deviating from its core law-enforcement mandate and covering prevention and
assistance to victims in multiple crime areas, becoming a "mega centre" of excessive
complexity, unable to attend to the specificities of the different crime areas adequately.
A further disadvantage lies in the inherent conflict between Europol's mandate as an
organisation to support criminal law enforcement and the role it would need to play in
ensuring transparency and accountability of the whole process, including where service
providers and other actors are concerned. Service providers have expressed legal
concerns about a reporting obligation and exchanging data with law enforcement
directly. An example of such potentially problematic cooperation would be receiving the
database of indicators (e.g. hashes) from law enforcement on which to conduct the
mandatory detection of CSA online. Apart from legal concerns, there is a risk of a
perception of partiality, which can hinder open cooperation with the service providers,
but also with key stakeholders in the area of prevention and assistance to victims. Such
concerns are likely to limit the positive impact of this choice.
In addition, the risk of not appearing as a neutral facilitator could also affect the
prevention function when it comes to prevention programmes for offenders and people
who fear that they might offend. Europol's capacity to reach out to persons who fear that
they might offend could be limited by the distrust that its core law enforcement task
could generate among those people.
Centre partly in Europol and partly in another entity
Some of the choices analysed considered a hybrid option of establishing part of the
Centre in Europol and part in another (new or existing) organisation. This set-up would
allow using the advantage of Europol's expertise and current role in the fight against
child sexual abuse, and have another entity perform the functions for which Europol has
less or no experience or which are not part of Europol's mandate (i.e. assistance to
victims and prevention initiatives for offenders and people who fear that they might
offend).
Europol and Fundamental Rights Agency
Functions:
Prevention.
o Actions relating to prevention would be mostly performed by FRA in this
scenario (see section 3.3.4.), while coordinating actions already conducted
by Europol (e.g. some awareness-raising campaigns).
Victims' assistance.
o Actions relating to assistance to victims would be performed by FRA in
this scenario (see section 3.3.4.), except for the support to victims in
removing images, which would be carried out by Europol.
Detection, reporting and removal of CSA online.
o In this scenario, Europol's legal basis would be expanded to enable it to
support the detection, reporting and removal process. The Centre would be
able to build on existing capacities and processes in Europol's European
Cybercrime Centre (EC3) to receive the reports of child sexual abuse from
online service providers.
This set-up would have a number of drawbacks. First, splitting the centre between two
entities poses coordination risks, and a possible limitation of the synergies that would
otherwise occur if all the functions were under the same entity. Additionally, splitting
the roles of the centre is contrary to the concept of a holistic response set out in the EU
strategy for a more effective fight against child sexual abuse. In addition, both agencies
have specific mandates and missions, which are only partially compatible with the new
tasks they would be given, creating a risk of competition between the different and at
times mutually exclusive objectives the agencies have to accomplish, such as the tension
between providing independent expert advice (e.g. on fundamental rights) and taking on
an operational role.
The resources of both agencies would have to be increased, and additional expertise
would need to be brought in. As explained above in the case of FRA, a shift in the focus
of the agency would be needed. In both organisations, embedding parts of the centre in
their structure could cause a certain disruption while adapting to the new tasks.
Europol and a unit in the Commission
Functions:
Prevention.
o Actions relating to prevention would be mostly performed by the
Commission, including coordination of actions already conducted by
Europol.
Victims' assistance.
o Actions relating to assistance to victims would be mostly performed by the Commission.
o Europol would receive and process requests from victims to remove
images and videos pertaining to their sexual abuse from the internet,
provided its legal basis is expanded.
Detection, reporting and removal of CSA online.
o This option would build on Europol's experience and capacity to support the detection, reporting and removal process, requiring a significant
expansion of resources. In this scenario, the Commission would take on
the role of ensuring transparency and accountability in the efforts
against child sexual abuse.
This choice would suffer from potential coordination issues stemming from dividing the
work of the centre between two entities, which exist at different levels in the institutional
setup. It would also require a significant investment within the Commission to make available the necessary resources.
Europol and an NGO (e.g. hotline)
Functions:
Prevention.
o While both organisations would bring in value with regard to support to
the detection, reporting and removal process, neither of them is well-
suited to take on the role of a facilitation and knowledge-sharing hub on
prevention. None of the existing hotlines currently serves as a hub, and the
overall network structure of INHOPE has been kept light-weight. The
expertise and resources would need to be significantly expanded. As such
activities are out of the normal scope of the organisations considered, adding
the necessary new functions could disturb the existing structures of the
organisation.
Victims' assistance.
o Hotlines, if granted the possibility to conduct a proactive search, could
also receive requests from victims who want their images removed from
the internet, or cooperate on such requests with Europol.
o The role of a knowledge hub on victims' assistance would suffer from
similar drawbacks as in the case of prevention.
Detection, reporting and removal of CSA online.
o This option would build on Europol's experience and capacity to support the detection, reporting and removal process. If the NGO involved is a
hotline able to perform analysis of reports, it could also contribute to this
process. Legislation would be needed to allow proactive search by hotlines in this case.
In terms of structure and governance, in the case of an NGO, and particularly a hotline,
the EU and other relevant stakeholders may have a limited role in governance, limiting the
possibility for steer from the EU. Additionally, this scenario would suffer from potential coordination issues stemming from dividing the work of the centre between two entities.
EU executive agency
This choice would imply creating the centre as an executive agency established under
Regulation (EC) No 58/2003586. Executive agencies are established for a specific period
by the European Commission to manage specific activities related to EU programmes.
586 Council Regulation (EC) No 58/2003 of 19 December 2002 laying down the statute for executive
agencies to be entrusted with certain tasks in the management of Community programmes, OJ L 11,
16.1.2003.
This choice was discarded because an executive agency cannot address the array of
functions that the potential Centre will require. In particular, an agency created for a
finite time period cannot create the sustainable, long-lasting mechanisms needed to
achieve the policy objectives of this initiative.
Establishing an EU centre as part of a Member State authority
Functions:
Prevention
o Such an entity could be tasked with becoming an effective hub for
connecting and disseminating expertise. It could have the potential to
cooperate with all relevant stakeholders and take on the role of the
prevention functions of the centre.
Victims' assistance
o It would be able to conduct proactive searches of images and videos on
behalf of the victim. However, as a national authority of an EU Member
State, there could be limitations on its capacity to carry out its work at EU
level.
Detection, reporting and removal of CSA online
o An independent public law administrative body would be able to work
closely with hotlines and law enforcement. It would be well-suited to
collect data on efficiency and the time required to take down content, and
work with service providers.
o The possibilities of such an entity to process personal data may be limited.
Also, depending on the conditions in the Member State where it is
established, the function of receiving reports and maintaining a database of
indicators could fall under national law enforcement. This could limit its
capacity to work across the EU with service providers and other
stakeholders, given possible jurisdiction issues.
This choice was therefore discarded, mainly due to the possible limitations on working at
EU level while being a national entity.
Joint Undertakings
EU public/private partnerships are based on Article 187 TFEU and take the form of joint
undertakings. For the partnerships created before the Lisbon Treaty, the legal basis was
Article 157 TEC (now Article 173 TFEU) on industry. The objective of such legal
structures is to facilitate investment in knowledge and innovation in Europe. As a result,
this legal form could only cover some aspects of the centre's role in relation to research
and innovation (if the centre was tasked to conduct research).
The operational role of Joint Undertakings is limited to publishing open calls for
proposals and managing grants.
This choice was discarded because it would not allow the Centre to take on some of its
envisaged core functions, in particular facilitating detection, reporting and removal of
CSA online.
Centre as part of the INHOPE network / national reporting hotline in a Member State.
Functions:
Prevention
o The INHOPE network/national hotlines could to some degree facilitate the
implementation of the practical measures on prevention.
o However, hotlines specialise in the processing of reports of CSAM, and the
research role of INHOPE is focused on statistics about the reports it
receives and its work. Currently it does not have the resources and
expertise to become a hub of expertise and coordination on prevention.
Victims' assistance
o There is some potential in facilitating the implementation of the practical
measures on assistance to victims of measure 1, although the capacity of
INHOPE/national hotlines to become a hub of expertise and coordination
on assistance to victims is limited given the lack of experience and
existing resources.
o Hotlines, if granted the possibility to conduct a proactive search, would
be able to receive requests from victims who want their images removed
from the internet. It would require the establishment of a legal basis and a
significant increase in resources. On the other hand, national hotlines
could be a natural point of contact for survivors.
Detection, reporting and removal of CSA online
o The number of reports received from the public is not nearly as high as the
number of reports from service providers.587 If the EU-based service
providers were obliged to report to INHOPE hotlines, the volume of
reports would be higher than what the network could effectively handle
with its current resources.
In terms of structure, INHOPE is an international association of hotlines. Its governance
is in the hands of the members of the association. This greatly limits the possible steer
from the EU or other relevant stakeholders (e.g. child rights and victims' associations
and NGOs). While INHOPE is supported by the EU, it does not focus its activities on
Europe only, and needs to accommodate the needs of members globally. This could limit
the effectiveness of the centre as a European organisation.
587 According to the INHOPE 2020 report, there were 1,038,268 content URLs exchanged via ICCAM
globally. While this number does not specify how many reports were received by hotlines in relation to
those URLs, it has to be noted that this number encompasses the whole world. The only hotline to
perform proactive monitoring in Europe, the Internet Watch Foundation, indicated that, in 2020, only
13% of CSAM it detected resulted from user reports. NCMEC received around 300,000 reports from
users, out of a total of 21.7 million reports, equalling 1.4%.
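As a quick arithmetic check, the figures cited in the footnote do yield the stated share of user reports:

\[
\frac{300\,000}{21\,700\,000} \approx 0.0138 \approx 1.4\%
\]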
Therefore, while INHOPE would be a key partner for the centre, it does not appear to be
best placed to take on its role.
4. IMPACTS OF IMPLEMENTATION CHOICES
4.1. Qualitative assessment
The qualitative assessment of the implementation choices considers their social,
economic, and fundamental rights impacts.
Social impact
All proposed measures except the baseline scenario would improve, to differing degrees, the capacity of all relevant actors in the EU to respond to this crime and mitigate its social consequences. Any improvement of this capacity could also lead to improved deterrence for criminals, better protection of victims and improved security for children.
The impact level differs based on whether the Centre would coordinate and support
existing strands of work, or whether it would take on a leading role in the fight against
child sexual abuse, opening up new areas of work that could have a positive impact on
society in general.
Under all options except the baseline, support for the implementation of safety and
privacy by design features by online service providers provided by the centre could
considerably improve the protection of children online. The Centre could also provide
feedback to policymakers, both on prevention-related issues and as an advocate for
victims. This would increase the social impact of the Centre in the long-term, ensuring
that future policy can be based on a more solid body of evidence and hence may offer
improved solutions that better address actual problems.
Choice A: EU Centre on prevention and assistance to victims
Establishing an EU centre on prevention and assistance to victims would help to improve coordination and facilitate the implementation of practical measures in these areas. The centre is expected to bring a limited impact in terms of enhanced cooperation and
exchange of knowledge and best practices in the field of prevention and assistance to victims. It could also lead to some improvements in the feedback given to policy makers.
The support for practical measures on victim support and prevention would be expected
to have a positive impact on the ability of society and authorities to prevent these crimes
and on the experience of survivors of child sexual abuse, as they might have easier
access to information about available resources, and these resources might be
strengthened through exchange of best practice among Member States, facilitated by the
Centre. Similar positive impacts could be expected from the support for development of
codes of conduct and safety by design. However, the positive impact would be expected
to be limited due to the limited degree of standardisation that would likely result from
purely voluntary practical measures, especially in view of the sensitivity both of the
content to be identified and of the impact on the rights of all users.
This measure is therefore expected to have a positive social impact overall, albeit only to
a limited extent.
Choice B: Set up an EU Centre to prevent and counter child sexual abuse as an
independent EU body
This choice would improve the ability of relevant public authorities to respond to cases
of online child sexual abuse, leading to more victims being rescued and more cases of
crime prevented. The centre could facilitate the work of national law enforcement
agencies, alleviating their workload linked to handling of the reports.
Maintaining a single, reliable database in the EU of known CSAM would also improve
the ability of service providers to detect it in their systems. Europol has a good capacity
to host such a database, also in view of the necessary security and data protection
procedures and the channels it has already set up for cooperation with national law
enforcement and with NCMEC.
Assisting victims in removing images and videos depicting their abuse from the internet would address a gap in the current efforts. It could significantly improve their
psychological well-being by reducing the stress of knowing that images and videos
depicting their abuse are circulating online. The positive social impact of this choice would be that victims can focus on recovery rather than pursuing service providers to demand the removal, potentially causing retraumatisation and legal jeopardy given the
illegal nature of possession of CSAM. Victims may also be more inclined to turn to an
independent organisation, without links to law enforcement, for help. The centre would
also have a greater impact in realising the potential of the network of hotlines for victim
support. In addition, it would add credibility to the transparency and accountability tasks if these can be performed by a separate organisation whose mission is dedicated to
ensuring such transparency and accountability.
This option would also likely improve the rate and speed of take-down of CSAM , and
help to fully realise the potential of the currently underutilised network of hotlines,
thereby improving the cooperation between civil society organisations, service providers and public authorities.
An advantage of this option is that it encompasses all of the centre's roles, allowing
processes to be streamlined in one entity only. One designated entity taking up different
tasks in the fight against child sexual abuse facilitates processes and can potentially
increase their efficiency. It can reduce the burden on law enforcement and allow them to
focus on those tasks only they can perform, and it can provide a reliable and independent
point of contact for service providers as well. In addition, one entity taking up several
tasks related to the fight against child sexual abuse increases the visibility of such an
entity and could encourage victims to take all steps necessary for their recovery and for
fighting offenders. Creating a dedicated agency would also improve the centre's
visibility and send an important message about the dedication of the EU as a whole to
combating child sexual abuse more effectively and to ensuring that rules apply online as
they do offline. It would place the EU at one level with those leading the fight against
child sexual abuse worldwide, such as the United States with NCMEC.
One disadvantage of this option may be that a completely new entity would lack an
established network of expertise, organisations and communication channels at first,
potentially reducing the efficiency of its operations in the beginning. However, this
disadvantage most likely only concerns the first months after the creation of a centre, and
expertise and networks can be quickly built up, also based on cooperation with Europol.
In summary, this implementation choice would contribute to increased security of EU
citizens, children in particular; it would also serve to diminish criminals' feeling of
impunity. In reducing revictimisation caused by the recirculation of CSAM, maintaining
a database would facilitate the task for service providers and have a significant positive
impact on victims. If the centre is to be a new independent entity, this option can also
fully address the need for an improved framework for prevention efforts to decrease the
prevalence of child sexual abuse, especially where measures targeting would-be or repeat offenders are concerned, and would provide the important transparency and
accountability functions a centre would need to perform.
This choice is considered to have a highly positive impact on society by improving the
security of EU citizens and contributing to the well-being of victims of child sexual abuse.
Choice C: Set up an EU Centre to prevent and counter child sexual abuse with some
functions in Europol and others in a separate organisation under Member State law
In this choice, the impact on the ability of relevant public authorities and service
providers to respond to cases of online child sexual abuse would be improved when
compared to the baseline.
The choice would result in an improvement in terms of decreasing the prevalence of child
sexual abuse through prevention, and enhanced support for victims of child sexual abuse
through a holistic multi-stakeholder approach. It would also relieve Europol from the burden of having to assume all the tasks, allowing it to focus on the strictly operational elements of facilitating the detection, verification and investigation of child sexual abuse. This would reduce the pressure on Europol as an organisation and also reduce - but not eliminate - the associated risk of the task's getting deprioritised among Europol's many competing important objectives.
Dividing the tasks of the centre between two entities could limit its overall impact by
creating an additional burden of coordination and a potential for inefficiencies. For
example, charging one entity with the operational aspects of the centre's tasks and
another one with ensuring transparency and accountability of the process would be
highly complex and ineffective. Therefore, the part of the centre under another
organisation would solely focus on prevention and assistance to victims, without playing
any role in the detection, reporting and removal process. This would severely limit this
choice's impact in terms of ensuring accountability and transparency.
If given additional specialised resources, Europol would be well-suited to cover the law
enforcement support aspects of the Centre's work, and to perform the coordination roles;
at the same time, a significant effort would be needed to ensure cohesion between the
activities in all strands of work, which may run counter to the objective of establishing a
centre which acts as a hub/one-stop-shop. A centre split between two entities would risk
not having the same public impact as a dedicated and unified body, where the leadership
of the organisation would be solely dedicated to this topic and could focus on the precise
tasks of the centre, as well as on positioning the centre in the maze of relevant
stakeholders within the EU and beyond. Other concerns relate to the ability to coordinate
between the two separate bodies; the risk of the task being deprioritised in a large
organisation with many important tasks; and the fact that transparency reporting and
accountability measures based in an agency with a law enforcement mandate may not be
perceived as being sufficiently independent.
The impact of the centre's work in assisting victims in removing images and videos
related to their abuse from the internet would be positive, similarly to option B.
The overall societal impact of this choice is deemed to be moderately positive, as it would improve the security of EU citizens, contribute to the prevention, investigation and
prosecution of child sexual abuse crimes, and to the well-being of victims of child sexual abuse.
Choice D: Set up an EU centre to prevent and counter child sexual abuse within the
Fundamental Rights Agency (FRA)
In this choice, provided that FRA is given a legal basis that can cover all of the centre's
functions, the centre would contribute to improved processing of reports, likely leading
to an increase in removals, in investigations and eventually also in identifying and
rescuing victims. This would have a positive impact on society.
The focus of FRA on fundamental rights could reinforce the recognition of
independence, which is key to ensure transparency and accountability of companies'
efforts to detect CSA online and of the outcome of the follow-up of the reports by law
enforcement. This would help gain trust and buy-in from key stakeholders, which is
necessary for the success of the centre's actions.
Similarly to choice B, this choice would offer the possibility to carry out all relevant
functions in the same place (contribute to the detection of CSA online, support victims
and facilitate prevention) and liaise with all relevant stakeholder groups.
However, to effectively work with all relevant stakeholders, new structures and networks
would have to be established. While the main tasks of FRA also include strengthening
cooperation between fundamental rights actors, its main focus is helping policy makers
by collecting and analysing data and providing independent advice. The main focus of the EU centre to prevent and counter child sexual abuse is to become a practical knowledge and coordination hub; input for policy purposes would be an important but
secondary role. The EU centre is expected to support practitioners from all relevant
backgrounds in an operational manner, from education to law enforcement. This includes
e.g. collecting information on effectiveness of programmes for offenders. While there is a link to protecting fundamental rights, the main focus would need to be on practical and scientific expertise about the subject in an operational perspective. Addressing the needs of this stakeholder group on a regular basis would require a significant shift in the set-up of the agency. The expertise currently available in FRA would have to be expanded to cover other issues linked specifically to child sexual abuse, for example in the area of
prevention of offences. Similarly, the cooperation with Europol and national law
enforcement would have to be created anew.
Being part of a larger entity could also limit the ability of the centre to dispose of its own
resources and dedicate them exclusively to the fight against child sexual abuse, as it
could be constrained by other needs and priorities of the larger entity. It may also limit
the visibility of the centre, as child sexual abuse is only one of the many tasks FRA deals
with. The risk of locating the centre in FRA is therefore that it would be overshadowed by
activities to tackle other types of crime, both internally and externally, limiting the
overall impact the centre would have.
If the operational functions were assigned to another entity, namely Europol, once more
the disadvantages of close cooperation with law enforcement that would be required to
fulfil its tasks might call into question its overall status as an independent provider of
impartial advice and expertise on all fundamental rights. (See section 3.4 for a more detailed discussion on the possibility to embed the EU centre in Europol and another
organisation).
In summary, this implementation choice would improve the ability of relevant public
authorities to respond to cases of online child sexual abuse, leading to more victims
being rescued and more cases of crime prevented. A possible limitation of the positive
impact of this option would be the necessity to shift the role of the existing agency and
build up new networks among relevant stakeholders, including law enforcement.
Economic impact
The assessment of the economic impact focuses mostly on the costs which would be incurred if there is a decision to establish a Centre, both for its creation and for carrying out its duties on an ongoing basis. However, it is important to note that the costs incurred
by establishing the Centre would be accompanied by benefits in terms of limiting the societal cost of child sexual abuse. Economic costs include those of police and judicial services (e.g. criminal prosecution, correctional system), social services, victim support services (e.g. community organisations), victim compensation programmes, education, health, and employment costs.
Choice A: Set up an EU Centre on prevention and assistance to victims
Compared to the baseline scenario, the practical measures to set up a Centre as a hub of
knowledge and information would enhance coordination in the areas of prevention and
assistance to victims. This would have a positive impact on the Member States, which
could reduce duplication and improve effectiveness by making use of existing research
and best practices established in other countries. This, in turn, would allow for more
efficient use of financial resources to build further on existing research and experience
and implement initiatives on a more widespread and evidence-based basis.
The cost of supporting the centre in this form, including its activities with networks of
experts and practitioners, actions to increase capacity and exchange good practices, could
be covered under the Internal Security Fund. The economic impact of these actions is
deemed to be limited.
Such practical measures could be accompanied by increased funding through relevant
programmes (e.g. ISF, Horizon Europe), adding additional costs to the EU budget.
Improved coordination between relevant authorities of Member States and other
stakeholders would help to ensure that EU funds are used to the benefit of a broad range
of actors and therefore bring real value.
The centre could also stimulate efficient uptake of EU funds through coordination and
better information sharing, which would have a positive impact on the Member States
(which would be able to streamline EU funding to priority areas) and the EU budget
(better use of EU funding, e.g. avoiding duplication of parallel projects in the same
area).
Choice B: Set up an EU Centre to prevent and counter child sexual abuse as an
independent EU body
Establishing the EU centre as a new independent EU body would require higher initial
expenses. However, compared to choice C, as all the activities of the centre would be a
part of one organisation, this choice would allow minimising administrative costs by
avoiding duplicate structures. When setting up a new EU body, there is also room for
some degree of budget diversification, allowing funding from Member States, and
potentially private entities (NGOs, such as foundations and charities, and industry) under
strict conditions to preserve the independence of the centre. This could alleviate the
strain on the EU budget.
On the other hand, a more efficient and coordinated system of handling the reports would
likely lead to a net reduction of costs and necessary resources for each report for both
service providers and law enforcement authorities. In addition, the existence of a reliable
set of indicators of what is illegal in the EU and its Member States, as well as the access
to reliable technologies free of charge, should create efficiencies, as service providers can
rely on independently verified information for the whole of the Single Market.
Furthermore, the reduction of reporting channels in the EU would reduce the costs of
potentially needing to comply with several different national frameworks.
The centre could also stimulate efficient uptake of EU funds through coordination and
better information sharing, which would have a positive impact on the Member States
(which would be able to streamline EU funding to priority areas) and the EU budget
(better use of EU funding, e.g. avoiding duplication of parallel projects in the same
area).
The centre's activities could reduce duplication of efforts to combat CSA, leading to cost
savings in the long-term, and serve to reduce the long-term societal and economic impact
of these crimes. The positive impact for Choice B is expected to be somewhat greater
than that of the other analysed choices, as this option would relieve law enforcement of
all tasks that can be accomplished elsewhere and at the same time would provide an
independent counterpart to service providers.
Overall, setting up a completely new entity would incur significant costs in the
beginning. However, these initially high costs have to be viewed against the cost savings
the centre would trigger, namely limiting the risk of duplicating efforts and streamlining
activities in an economic manner. Moreover, the centre's contribution to the fight against
child sexual abuse would lead to decreasing the economic costs of this crime in the long
run.
Choice C: Set up an EU Centre to prevent and counter child sexual abuse with some
functions in Europol and others in a separate organisation under Member State law
As in implementation choice B, this choice would require increased funding for Europol,
at somewhat lesser levels than choice B. Additionally, under this implementation choice
a new organisation would be created with responsibility for parts of the functions of an
EU centre. While it would to a large extent be funded by the EU, the new entity could
also receive funding from additional sources. This additional funding could include:
contributions from the Member States and third countries,
contributions from industry,
contributions from not-for-profit organisations and charities.
Initially, the Centre would likely be funded entirely, or almost entirely, by the EU. With
time, this proportion could change. In a comparable example, approximately 60% of the
budget of NCMEC is provided by the US government.
The drawback of this choice is that splitting the centre between two organisations could
lead to duplication of services providing administrative and logistic support (with each
organisation having its own financial, human resources and communication units, for
example), ultimately leading to higher costs.
In terms of the economic impact on service providers and on society as a whole, the same
considerations as for Choice B apply to a large extent. There may be a positive economic
impact on society as a whole from the prevention measures targeting potential offenders,
which may be more effectively supported by a centre that is independent of law
enforcement.
In short, despite the costs associated with creating and running the centre, the effect it would have on the fight against child sexual abuse would lead to a positive economic
impact on society through decreasing the economic costs of this crime in the long run.
Choice D: Set up an EU Centre to prevent and counter child sexual abuse within the
Fundamental Rights Agency (FRA)
Embedding the centre within FRA would require an increase in its funding to provide for the cost of additional activities, including increasing staff numbers to handle the workload related to all functions of the centre.
With regard to detection, setting up and maintaining the infrastructure for the database
(both hardware and software) would incur significant one-time costs, as well as more
limited running costs. While the annual and initial costs may be lower than creating a
new body, they would still be substantial, e.g. to find, hire and train a number of
dedicated non-law enforcement experts, and to carry out the centre's functions (including
manually reviewing the reports from companies to filter false positives, and determining
the jurisdiction best placed to act).
With regard to the actions on prevention and support to victims, the costs incurred would
be higher compared to the baseline, and comparable to choices A and B (e.g. supporting
Member States on prevention and assistance to victims would require expertise that is
not currently present in FRA).
In all areas, the centre's work could reduce duplication of efforts to fight CSA, leading to cost savings in the long-term. The actions proposed in this choice would also contribute to reducing the economic impact of child sexual abuse on society in general through reductions in crime as a result of the centre's functions in support of law enforcement and service providers.
In short, similarly to previous options, the potential for decreasing the economic costs
of this crime in the long run is high and counterbalances the costs associated with
creating and running the centre.
Fundamental Rights impact
This section examines the impact of establishing a European Centre to prevent and counter child sexual abuse on fundamental rights as laid down in the Charter of Fundamental Rights of the European Union. Children, users of the services at issue and providers of such services were identified as the relevant rights holders for the centre:
1. Rights of the children: fundamental rights to human dignity and to the integrity of
the person, the prohibition of inhuman or degrading treatment, rights to respect for
private and family life and to protection of personal data, as well as the rights of the
child.588
2. Rights of the users whose data is accessed: the rights to respect for privacy
(including of communications, as part of the broader right to respect for private and
family life), to protection of personal data and to freedom of expression and
information.589
3. Rights of the service providers: the freedom to conduct a business.590
Overall, none of the options considered for the centre would have any significant negative impact on any fundamental right. Rather, one can observe a strengthening of
protection of specific fundamental rights - such as the rights of the child and the rights of all users to data protection - in line with the importance of the role of the centre in
providing legal certainty about what is illegal content, in facilitating swift analysis and
processing of reports, in improving prevention and victim assistance, and in ensuring accountability and transparency. The analysis shows that the Centre's own impact is limited from a fundamental rights perspective, but that it serves as an important safeguard to ensure that the measures strike a fair balance between the different rights at stake.
Choice A: Set up an EU Centre on prevention and assistance to victims
A limited positive impact on fundamental rights may be expected from better coordination of efforts on prevention and assistance to victims of child sexual abuse. Under this choice, there would be no improvement with regard to the rights of victims of
ongoing abuse in need of rescue, and those who wish to have their images removed from the internet. The rights of the persons affected by CSAM detection measures
implemented by service providers would remain as in the baseline.
Overall, the analysis suggests that this choice would only minimally improve the protection of fundamental rights.
588 Art. 1, 3, 4, 7, 8 and 24 of the Charter, respectively.
589 Art. 7, 8 and 11 of the Charter, respectively.
590 Art. 16 of the Charter.
Choice B: Set up an EU Centre to prevent and counter child sexual abuse as an independent EU body
This option would contribute to improved processing of reports, likely leading to an increase in removals, in investigations and eventually also in identifying and rescuing victims. This could have a significant positive impact on the fundamental rights of victims of ongoing abuse. The establishment of an EU database would also facilitate
prevention by stopping crimes from happening in cases where imminent abuse
(grooming) is detected, positively impacting the rights of people who may become victims of child sexual abuse.
A further benefit of this option would be improved coordination of efforts in relation to overall prevention and assistance to victims of child sexual abuse, leading to a positive impact on the fundamental rights of persons who are or may become victims of crime.
The centre would serve as a safeguard in the process of detection and reporting of CSA online. In case of potential false positives, companies would not be reporting innocent persons to law enforcement directly. The creation of transparency and accountability processes, which depend on the centre, serves as a safeguard to mitigate the impact on fundamental rights of users resulting from a detection obligation. Similarly, the creation of a single database of reliable indicators and facilitated access to reliable technology via a centre can mitigate the impact on the freedom of the provider to conduct a business, and contribute to balancing the impact on the fundamental rights of users by supporting service providers in improving the accuracy of their technologies. Overall, a positive impact on fundamental rights may be expected with respect to the relevant fundamental
rights of all three categories set out above.
Choice C: Set up an EU Centre to prevent and counter child sexual abuse with some functions in Europol and others in a separate organisation under Member State law
The impacts of this option with regard to improving investigations and rescuing victims are as in option A.
In this choice, the part of the centre under an independent organisation would solely focus on
prevention and assistance to victims, without playing any role in the detection, reporting and removal process. This could potentially limit the positive impact on fundamental
rights, and the centre's effectiveness as a safeguard ensuring transparency.
Overall, a moderate positive impact on fundamental rights of all groups affected is to be
expected.
Choice D: Set up an EU Centre to prevent and counter child sexual abuse within the Fundamental Rights Agency (FRA)
The impact of this choice is expected to be positive, as in choice A. The expertise of FRA in the field of fundamental rights would be an additional benefit.
With regard to ensuring transparency and accountability of efforts around combating child sexual abuse, while FRA focuses on ensuring the fundamental rights of citizens in policy making, it does not intervene in individual cases. It has not previously engaged
in overseeing the efforts of industry, including overseeing the development and implementation of technologies. Including these functions in FRA would shift the agency's structure and change its role from an independent and neutral observer into an actor in the field.
3.6. Quantitative assessment
Costs
The quantification of the costs and benefits of the policy measures/policy options is limited by the lack of data and requires the use of a number of assumptions:
1. The estimates of recurrent and one-off costs related to the functioning of the Centre are
based on the budgets of EU agencies and other organisations similar in size to what is predicted for the Centre:
| Name | Staff (approx.) | Budget (approx.) | Funding sources |
| Fundamental Rights Agency (FRA) | 105 | 24.3 M EUR/year | EU budget |
| European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) | 111 | 19 M EUR/year | EU budget |
| The European Union Agency for Law Enforcement Training (CEPOL) | 40 | 30 M EUR/year | 10 M EUR EU subsidy; other sources of funding include EU project funding |
| European Institute for Gender Equality (EIGE) | 40 | 8 M EUR/year | EU budget |
| US National Center for Missing and Exploited Children (NCMEC) | 300 | 15 M EUR/year591 | US government funding and voluntary donations |
| Canadian Centre for Child Protection | 45 | 4 M EUR/year | Supported by the Canadian government and private donors |
2. The cost estimates make the following assumptions:
Staff costs
o Detection, reporting and removal:
- The same number of staff would be required in all options to analyse the estimated surge in reports (×8 compared to 2020).
- Europol currently has dedicated staff from law enforcement to cross-match and enrich the reports. These staff could not be repurposed to contribute to the task, which the EU centre would carry out, of reviewing the reports to ensure that they are actionable.
591 NCMEC, 2019 Annual Report, when costs relating to missing child case management/information and case analysis are excluded.
359
- The staff costs for admin staff in charge of overheads (HR, accounting, management) would be lower in the Europol+ and FRA options, given the existing set-ups.
o Prevention and assistance to victims:
- Non-EU staff (28 posts) could be envisaged for these functions across all options. They could instead be funded by a grant to a separate organisation (NGO, foundation) selected via a call for proposals, so that there is no impact on the future EU budget (e.g. pensions, etc.). The operational staff would be the same in all options, as these would be new functions in all cases.
- The staff costs for admin staff in charge of overheads (HR, accounting, management) would be lowest for FRA, as it could benefit from economies of scale in the existing set-up and with the detection, reporting and removal function.
Infrastructure
o Initial costs are estimated at 3 M EUR to set up the databases of indicators, and 1–2 M EUR for the selection and fitting out of premises where necessary.
o Annual costs include notably the costs of running and maintaining the databases of indicators.
Operational expenditure
o This includes the costs of facilitating detection, reporting and removal (support to companies and law enforcement), as well as of supporting Member States on prevention and assistance to victims (e.g. studies, etc.). The centre would not have its own research budget for prevention and assistance to victims; research would be funded through calls for proposals under programmes like Horizon Europe.
3. The estimate assumes that the centre would take about two years to become operational and up to four years to reach its full size and operational capacity. The costs related to personnel and logistics are therefore projected to increase gradually in the first years, reaching a stable level after year 4. Some other costs, such as expenditure related to staff recruitment and training, may be higher in the early stages of setting up the centre. The continuous cost estimates refer to the situation in which the Centre is fully operational.
4. The one-off and recurring costs related to the creation of an EU database of hashes of known CSAM are based on a Commission study.592
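Conceptually, such a database of indicators supports a set-membership check over content fingerprints. The following minimal Python sketch is purely illustrative (the set contents and function name are hypothetical; operational systems generally rely on perceptual hashing, such as PhotoDNA, so that re-encoded copies of an image still match, rather than the exact cryptographic hash shown here):

    import hashlib

    # Hypothetical indicator database: a set of hex digests of known material.
    # (Placeholder value only; real indicator lists are curated by bodies such
    # as NCMEC and are not public.)
    KNOWN_INDICATORS = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def matches_known_material(content: bytes) -> bool:
        """Return True if this exact content appears in the indicator set."""
        digest = hashlib.sha256(content).hexdigest()
        return digest in KNOWN_INDICATORS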
The estimates in this section provide an idea of the order of magnitude of costs and benefits and therefore should not be taken as exact forecasts.
The following sections discuss the cost estimates for each of the implementation choices.
Choice A: Set up an EU Centre on prevention and assistance to victims
This choice assumes the creation of a centre through non-legislative measures.
The cost of non-legislative measures, namely creating a centre as a hub without legal personality, is estimated based on the assumption that it would take 4 full-time equivalent (FTE) units in the Commission to coordinate the hub. The cost of 1 FTE is based on the
following assumptions:
the average of salaries in the EU for activities classified under Section O (public administration) in the NACE Rev. 2 statistical classification of economic activities in the European Community.593
This cost includes compensation of employees, plus taxes, minus subsidies;
An additional 25% is added to account for overheads (i.e. expenses not related to direct labour, such as the cost of office equipment).
The value is 38.50 EUR/hour.
| Annual cost | Salary | FTEs |
| 320 286 EUR | 38.50 EUR/hour | 4 |
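As a consistency check on these figures (assuming a standard working year of 2 080 hours, i.e. 52 weeks × 40 hours, an assumption not stated in the document):

4 FTE × 2 080 hours/year × 38.50 EUR/hour ≈ 320 320 EUR/year,

which matches the 320 286 EUR annual cost above up to rounding of the hourly rate.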
The operational activities of the hub could be supported by a framework contract with an estimated value of 10 M EUR/year. This estimate is based on existing framework contracts, such as the one supporting the Radicalisation Awareness Network.594 The specific tasks to be carried out by the hub would be specified in the framework contract.
These tasks could include the development of activities and good practices by networks of practitioners, policy makers and researchers. The cost of this task is estimated at 3 M EUR/year, or 30% of the contract value. Administrative, logistical and technical support for the work of the hub is also expected to represent a significant cost due to the
592 Study on options for the creation of a European Centre to prevent and counter child sexual abuse, including the use of ICT for creation of a database of hashes of child sexual abuse material and connected data protection issues, 2021, p. 67.
593 Eurostat, NACE Rev. 2 – Statistical classification of economic activities, accessed 27 April 2021.
594 The annual costs of RAN are 7.5 M EUR/year for the practitioners network and 7.5 M EUR/year for the policy support network. They are implemented through two framework contracts of 30 M EUR each for 4 years. See for example European Commission, Technical Support to Prevent and Counter Radicalisation, accessed 21 May 2021.
hub's highly decentralised nature. These costs, which would cover the organisation of and reporting on events such as study visits and working groups, are also estimated at 3 M EUR/year.
Facilitation of coordination and research activities could be another significant task for the hub; however, due to the maximum duration of framework contracts of 4 years, the hub's abilities in this regard would be limited to short-term research. The cost of this task is estimated at 1.5 M EUR/year, or 15% of the value of the contract.
The hub could also organise cross-cutting thematic events bringing together stakeholders of different types, going beyond the topics of individual working groups. These could include a Steering Committee to provide strategic guidance and evaluation of the hub's overall work. The cost of this task is estimated at 2 M EUR/year, or 20% of the value of the contract.
Finally, due to the decentralised nature of the hub's operations, the maintenance of an EU website dedicated to the hub's activities is estimated at 5% of the contract value, or 0.5 M EUR/year.
Each of the above costs is assumed to be divided evenly between the hub's functions in relation to assistance to victims and prevention. The estimated costs of this choice are summarised in Table 2.
The total (continuous) cost of this choice is therefore estimated to be 10.32 M EUR/year. There would not be any one-off costs.
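As a cross-check of the breakdown above: the task shares of the framework contract sum to its full value (3 + 3 + 1.5 + 2 + 0.5 = 10 M EUR/year), and adding the Commission staff cost of approximately 0.32 M EUR/year yields the 10.32 M EUR/year total.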
Table 2: Estimated costs of Implementation Choice A (EUR million/year)

| Task | Assistance to victims | Prevention |
| Development of activities and good practices | 1.50 | 1.50 |
| Administrative, logistical and technical support | 1.50 | 1.50 |
| Research activities | 0.75 | 0.75 |
| Thematic events | 1.00 | 1.00 |
| Website | 0.25 | 0.25 |
| Total (framework contract) | 5.00 | 5.00 |
| Commission staff costs | 0.32 | |
| Grand total | 10.32 | |
Choice B: Set up an EU Centre to prevent and counter child sexual abuse as an independent EU body
This choice assumes the creation of a Centre as a new EU body (i.e. a decentralised agency) which would perform all of the roles considered in this Annex.
The Centre as an EU agency would incur initial costs of a total of EUR 5 million: EUR 3 million to set up the databases of indicators + EUR 2 million for the building. The costs of establishing databases of indicators of child sexual abuse online are based upon a Commission study595 and bilateral consultations with operators of similar databases.
This choice is estimated to entail an annual cost of EUR 25.7 million per year after the initial ramp-up. The cost estimates for this choice (as well as choices C and D) are based on the cost structures of similar organisations in the EU (FRA, EMCDDA, etc.) and of similar centres around the world (e.g. NCMEC596).597 The cost estimates include the costs of manually reviewing all the reports submitted. Cost estimates relating to the Centre's functions in the areas of prevention and victim support are also informed by the costs of NCMEC's activities in the areas of community outreach and training, which respectively develop and disseminate prevention materials and provide training to relevant professionals.
The following table gives an overview of all the costs to cover all the functions of the Centre: prevention, assistance to victims and facilitation of the process to detect,
report and remove CSA online:
595 Study on options for the creation of a European Centre to prevent and counter child sexual abuse, including the use of ICT for creation of a database of hashes of child sexual abuse material and connected data protection issues, 2021, p. 67.
596 See in particular National Center for Missing and Exploited Children, 2019 Audit Report, 31 December 2018 and 2019.
597 Staff costs include staff wellbeing programmes, in line with best practices in other serious crime areas such as terrorism (see for example here and here). For reference, these programmes represent 15% of staff costs in the Internet Watch Foundation.
Table 3: Estimated costs of Implementation Choice B (EUR; columns are Years 1–4 and Years 5–10, values constant from Year 5)

| | Year 1 | Year 2 | Year 3 | Year 4 | Years 5–10 |
| Staff expenditure of the Centre | | | | | |
| Salaries & allowances | 3.000.000 | 5.000.000 | 10.000.000 | 13.000.000 | 15.000.000 |
| Expenditure relating to staff recruitment | 600.000 | 600.000 | 600.000 | 200.000 | 50.000 |
| Mission expenses | 300.000 | 300.000 | 300.000 | 500.000 | 600.000 |
| Socio-medical infrastructure & training | 150.000 | 200.000 | 200.000 | 200.000 | 250.000 |
| Total staff costs | 4.050.000 | 6.100.000 | 11.100.000 | 13.900.000 | 15.900.000 |
| Infrastructure and operating expenditure of the Centre | | | | | |
| Rental of buildings and associated costs | 900.000 | 900.000 | 900.000 | 900.000 | 900.000 |
| ICT (not related to database) | 800.000 | 700.000 | 700.000 | 700.000 | 700.000 |
| Databases of indicators – technical maintenance | 0 | 200.000 | 300.000 | 400.000 | 500.000 |
| – allowance for annual hardware licensing | 50.000 | 50.000 | 100.000 | 100.000 | 100.000 |
| – annual hosting for databases | 50.000 | 100.000 | 150.000 | 200.000 | 300.000 |
| Movable property and associated costs | 30.000 | 50.000 | 70.000 | 80.000 | 100.000 |
| Current administrative expenditure | 50.000 | 50.000 | 70.000 | 80.000 | 100.000 |
| Audits | 500.000 | 500.000 | 500.000 | 500.000 | 500.000 |
| Total infrastructure costs | 2.380.000 | 2.550.000 | 2.790.000 | 2.960.000 | 3.200.000 |
| Operational expenditure | | | | | |
| Operational activities (e.g. technical meetings with stakeholders) | 500.000 | 1.000.000 | 1.500.000 | 2.000.000 | 2.000.000 |
| Support to expert networks (coordination activities, meetings) | 500.000 | 1.000.000 | 1.500.000 | 2.000.000 | 2.600.000 |
| Translation and interpretation | 300.000 | 300.000 | 400.000 | 400.000 | 500.000 |
| Publishing and research dissemination | 50.000 | 150.000 | 200.000 | 300.000 | 500.000 |
| Communication (incl. campaigns) | 500.000 | 600.000 | 700.000 | 1.000.000 | 1.000.000 |
| Total operational expenditure | 1.850.000 | 3.050.000 | 4.300.000 | 5.700.000 | 6.600.000 |
| TOTAL EXPENDITURE | 8.280.000 | 11.700.000 | 18.190.000 | 22.560.000 | 25.700.000 |
Choice C: Set up an EU Centre to prevent and counter child sexual abuse with some functions in Europol and others in a separate organisation under Member State law
This scenario assumes the creation of a centre with some roles performed by Europol and some by a separate organisation established under Member State law.
Europol would carry out the tasks of facilitating the detection, reporting and removal of CSA online. The independent organisation would carry out the tasks of facilitating Member S tates' action on prevention and assistance to victims.
Costs relating to central administration providing supporting services to the prevention and assistance to victims functions are expected to be higher under this implementation choice. These increases are due to the creation of a new, independent organisation, which would be unable to benefit from the existing structures and resources of Europol.
The cost estimates include the costs of manually reviewing all the reports submitted. The Centre in this form would incur initial costs of a total of EUR 5 million:
- EUR 4 million under Europol (EUR 3 million to set up the databases of indicators + EUR 1 million for the building); and
- EUR 1 million under the independent organisation (building).
This choice is estimated to entail an annual cost of EUR 24.1 million per year after the initial ramp-up.
The following table gives an overview of all the costs to cover all the functions of the Centre: prevention, assistance to victims and facilitation of the process to detect, report and remove CSA online:
Table 4: Estimated costs of Implementation Choice C (Europol component) (EUR; columns are Years 1–5 and Years 6–10, values constant from Year 6; cells marked "–" are illegible in the source)

| | Year 1 | Year 2 | Year 3 | Year 4 | Year 5 | Years 6–10 |
| Staff expenditure of the Centre | | | | | | |
| Salaries & allowances | 3.000.000 | 5.000.000 | 6.000.000 | 7.000.000 | 8.000.000 | 9.700.000 |
| Expenditure relating to staff recruitment | 400.000 | 400.000 | 400.000 | 200.000 | 50.000 | 50.000 |
| Mission expenses | 300.000 | 300.000 | 300.000 | 500.000 | 600.000 | 600.000 |
| Socio-medical infrastructure & training | 150.000 | 200.000 | 200.000 | 200.000 | 250.000 | 250.000 |
| Total staff costs | 3.850.000 | 5.900.000 | 6.900.000 | 7.900.000 | 8.900.000 | 10.600.000 |
| Infrastructure and operating expenditure of the Centre | | | | | | |
| Rental of buildings and ICT combined (the split between the two line items is illegible) | 1.100.000 | 1.100.000 | 1.100.000 | 1.100.000 | 1.100.000 | 1.100.000 |
| Databases of indicators – technical maintenance | 0 | 200.000 | 300.000 | 400.000 | 500.000 | 500.000 |
| – allowance for annual hardware licensing | 50.000 | 50.000 | 100.000 | 100.000 | 100.000 | 100.000 |
| – annual hosting for databases | 50.000 | 100.000 | 150.000 | 200.000 | 300.000 | 300.000 |
| Movable property and associated costs | 30.000 | 50.000 | 70.000 | 80.000 | 100.000 | 100.000 |
| Current administrative expenditure | 50.000 | 50.000 | 70.000 | 80.000 | 100.000 | 100.000 |
| Audits | 200.000 | 200.000 | 200.000 | 200.000 | 200.000 | 200.000 |
| Total infrastructure costs | 1.480.000 | 1.750.000 | 1.990.000 | 2.160.000 | 2.400.000 | 2.400.000 |
| Operational expenditure | | | | | | |
| Operational activities (e.g. technical meetings with stakeholders) | – | 100.000 | 100.000 | 200.000 | 200.000 | – |
| Support to expert networks (coordination activities, meetings) | – | – | 50.000 | 70.000 | 70.000 | 100.000 |
| Translation and interpretation | – | – | 100.000 | 200.000 | – | 400.000 |
| Publishing and research dissemination | – | 150.000 | 200.000 | 300.000 | – | – |
| Communication (incl. campaigns) | – | – | 700.000 | 1.000.000 | 1.000.000 | 1.000.000 |
| Total operational expenditure | 700.000 | 980.000 | 1.150.000 | 1.770.000 | 2.070.000 | 2.500.000 |
| TOTAL EXPENDITURE | 6.030.000 | 8.630.000 | 10.040.000 | 11.830.000 | 13.370.000 | 15.500.000 |
Table 5: Estimated costs of Implementation Choice C (separate entity component) (EUR; columns are Years 1–4 and Years 5–10, values constant from Year 5; cells marked "–" are illegible in the source)

| | Year 1 | Year 2 | Year 3 | Year 4 | Years 5–10 |
| Staff expenditure of the Centre | | | | | |
| Salaries & allowances | 1.000.000 | 2.000.000 | 3.000.000 | 3.500.000 | 3.500.000 |
| Expenditure relating to staff recruitment | 200.000 | 200.000 | 150.000 | 100.000 | 50.000 |
| Mission expenses | 50.000 | 50.000 | 100.000 | 150.000 | 200.000 |
| Socio-medical infrastructure & training | 50.000 | 100.000 | 100.000 | 100.000 | 150.000 |
| Total staff costs | 1.300.000 | 2.350.000 | 3.350.000 | 3.850.000 | 3.900.000 |
| Infrastructure and operating expenditure of the Centre | | | | | |
| Rental of buildings and associated costs | 400.000 | 400.000 | 400.000 | 400.000 | 400.000 |
| ICT (not related to database) | 100.000 | 100.000 | 100.000 | 100.000 | 100.000 |
| Databases of indicators (technical maintenance, hardware licensing, hosting) | 0 | 0 | 0 | 0 | 0 |
| Movable property and associated costs | – | – | 70.000 | – | 100.000 |
| Current administrative expenditure | – | – | 70.000 | – | 100.000 |
| Audits | – | – | – | – | – |
| Total infrastructure costs | 1.030.000 | 1.050.000 | 1.090.000 | 1.160.000 | 1.200.000 |
| Operational expenditure | | | | | |
| Operational activities (e.g. technical meetings with stakeholders) | 150.000 | 150.000 | 200.000 | 200.000 | 300.000 |
| Support to expert networks (coordination activities, meetings) | 500.000 | 1.000.000 | 1.500.000 | 2.000.000 | 2.000.000 |
| Translation and interpretation | – | 300.000 | 400.000 | 400.000 | – |
| Publishing and research dissemination | – | 150.000 | 200.000 | 300.000 | – |
| Communication (incl. campaigns) | 50.000 | 100.000 | 100.000 | 150.000 | 200.000 |
| Total operational expenditure | 1.050.000 | 1.700.000 | 2.400.000 | 3.050.000 | 3.500.000 |
| TOTAL EXPENDITURE | 3.380.000 | 5.100.000 | 6.840.000 | 8.060.000 | 8.600.000 |
Choice D: Set up an EU Centre to prevent and counter child sexual abuse within the Fundamental Rights Agency (FRA)
This scenario assumes the creation of a Centre fully integrated in the Fundamental Rights Agency. The Centre would carry out all the functions envisaged on prevention, assistance to victims, and facilitation of detection, reporting and removal of CSA online.
The cost estimates include the costs of manually reviewing all the reports submitted.
The Centre in this form would incur initial costs of a total of EUR 4 million: EUR 3 million to set up the databases of indicators + EUR 1 million for the building.
This choice is estimated to entail an annual cost of EUR 23.7 million per year after the initial ramp-up.
Table 6: Estimated costs of Implementation Choice D (Centre under FRA) (EUR; columns are Years 1–4 and Years 5–10, values constant from Year 5)

| | Year 1 | Year 2 | Year 3 | Year 4 | Years 5–10 |
| Staff expenditure of the Centre | | | | | |
| Salaries & allowances | 3.000.000 | 5.000.000 | 10.000.000 | 11.000.000 | 13.000.000 |
| Expenditure relating to staff recruitment | 600.000 | 600.000 | 600.000 | 200.000 | 50.000 |
| Mission expenses | 300.000 | 300.000 | 300.000 | 500.000 | 600.000 |
| Socio-medical infrastructure & training | 150.000 | 200.000 | 200.000 | 200.000 | 250.000 |
| Total staff costs | 4.050.000 | 6.100.000 | 11.100.000 | 11.900.000 | 13.900.000 |
| Infrastructure and operating expenditure of the Centre | | | | | |
| Rental of buildings and associated costs | 900.000 | 900.000 | 900.000 | 900.000 | 900.000 |
| ICT (not related to database) | 800.000 | 700.000 | 700.000 | 700.000 | 700.000 |
| Databases of indicators – technical maintenance | 0 | 200.000 | 300.000 | 400.000 | 500.000 |
| – allowance for annual hardware licensing | 50.000 | 50.000 | 100.000 | 100.000 | 100.000 |
| – annual hosting for databases | 50.000 | 100.000 | 150.000 | 200.000 | 300.000 |
| Movable property and associated costs | 30.000 | 50.000 | 70.000 | 80.000 | 100.000 |
| Current administrative expenditure | 50.000 | 50.000 | 70.000 | 80.000 | 100.000 |
| Audits | 500.000 | 500.000 | 500.000 | 500.000 | 500.000 |
| Total infrastructure costs | 2.380.000 | 2.550.000 | 2.790.000 | 2.960.000 | 3.200.000 |
| Operational expenditure | | | | | |
| Operational activities (e.g. technical meetings with stakeholders) | 500.000 | 1.000.000 | 1.500.000 | 2.000.000 | 2.000.000 |
| Support to expert networks (coordination activities, meetings) | 500.000 | 1.000.000 | 1.500.000 | 2.000.000 | 2.600.000 |
| Translation and interpretation | 300.000 | 300.000 | 400.000 | 400.000 | 500.000 |
| Publishing and research dissemination | 50.000 | 150.000 | 200.000 | 300.000 | 500.000 |
| Communication (incl. campaigns) | 500.000 | 600.000 | 700.000 | 1.000.000 | 1.000.000 |
| Total operational expenditure | 1.850.000 | 3.050.000 | 4.300.000 | 5.700.000 | 6.600.000 |
| TOTAL EXPENDITURE | 8.280.000 | 11.700.000 | 18.190.000 | 20.560.000 | 23.700.000 |
Benefits
The quantification of benefits is based on the estimated reduction of CSA crimes that could be attributed to the functions carried out by the Centre.
The EU Centre will facilitate the action of Member States and service providers in preventing and combating CSA, and support victims. This will generate cost savings, e.g. by helping avoid duplication of efforts and facilitating a more effective and efficient use of resources. In addition, the Centre's tasks would contribute to a reduction of the prevalence of CSA, and therefore to savings on the costs caused by those crimes.
It is not possible to quantify exactly what those benefits would be. In particular, it is not possible to isolate precisely the effects of the Centre from the effects of the other policy measures, in particular the obligations on service providers to detect, report and remove CSA online. This section therefore focuses on estimating those benefits as the reduction of the annual costs of CSA in the EU that could be attributed to the Centre alone.
To estimate how each implementation choice could reduce crime, the qualitative scores on the social impact (enhanced security through a more effective fight against crime, and prevention leading to a decreased prevalence of CSA) obtained in the assessment of each implementation choice were translated into percentages of decrease of child sexual abuse crimes.
The social impacts of the various implementation options for the centre are determined based on how effectively they would enhance security by helping increase the capacity to detect, report and remove child sexual abuse online, prevent these crimes, and increase the assistance to victims. This assumption was used for the sole purpose of comparing the options. Therefore, the total value of benefits derived from a reduction of crime for a given implementation must be interpreted in relation to the other options, rather than as an accurate estimate of the actual reduction of crime that a given policy option would cause.
See the quantitative comparison of benefits below for an estimate of the benefits based on the effectiveness ratings.
5. COMPARISON OF IMPLEMENTATION CHOICES
Qualitative comparison
The following criteria are used in assessing how the implementation choices would
potentially perform, compared to the baseline:
Effectiveness in achieving the specific objectives:
a) Help ensure that victims are rescued and assisted as soon as possible and offenders are brought to justice by facilitating detection, reporting and removal of CSA online.
b) Support Member States in putting in place usable, rigorously evaluated and effective prevention measures to decrease the prevalence of child sexual abuse in the EU.
c) Support Member States to ensure that victims have access to appropriate and holistic support, by facilitating efforts at EU level.
Efficiency: cost-benefit assessment of each policy option in achieving the specific objectives, including financial and administrative costs.
Coherence with relevant initiatives at national, EU and international level, using all the relevant policy instruments (legislation, coordination and funding).
The tables below summarise the qualitative scores for each main assessment criterion and each option. The options are compared below by listing positive (+), negative (-) and 'no-change' (~) impacts compared to the baseline (with > indicating more costs in relation to the baseline).
Table 7: Qualitative comparison of implementation choices for the Centre

| Criteria | A | B | C | D |
| Effectiveness | + | +++ | ++ | ++ |
| Efficiency: Costs | > | >>> | >>> | >>> |
| Efficiency: Benefits | + | +++ | ++ | ++ |
| Coherence | + | ++ | + | + |
Effectiveness
This criterion, closely linked to the social impact, concerns how effectively the various
implementation choices would achieve the specific objectives, including helping increase the capacity to detect, report and remove child sexual abuse online, prevent these crimes, and increase the assistance to victims.
a) Help ensure that victims are rescued and assisted as soon as possible and offenders are
brought to justice by facilitating detection, reporting and removal of CSA online.
Choice A would be the least effective in reaching this objective, as the Centre in this choice would not address the functions of facilitating detection, reporting and removal of CSA online, for which legislation is required. Choices B, C and D would cover these functions.
Under choice C, the Centre could benefit from Europol's expertise in the fight against CSA online, including the existing processes and relationships with stakeholders. On the other hand, its ability to appear as a neutral facilitator of the detection, reporting and removal process may be limited, given that it would be part of law enforcement. Choices B and D, as EU agencies independent from both service providers and law enforcement, could effectively play that facilitator role.
b) Support Member States in putting in place usable, rigorously evaluated and effective prevention measures to decrease the prevalence of child sexual abuse in the EU.
The four choices would be able to achieve this objective effectively.
c) Support Member States to ensure that victims have access to appropriate and
holistic support, by facilitating efforts at EU level.
The four choices would be able to achieve this objective, including by offering the possibility for the centre to support victims who want their images proactively removed from the internet. They would also harness the potential of the network of hotlines to improve support to victims. However, in choice C, this process could be more complicated as the centre would be split between two separate entities. The part of the centre which would be a suitable partner for work with victims, victims' associations and hotlines would be an independent entity, which would not be involved in the proactive search for CSAM. This separation of the centre's roles between two entities increases the risk of silos and therefore the risk of inefficiencies.
Efficiency
Costs
Choice A is the most cost-effective, as it covers only part of the envisaged functions for the Centre.
Choices B, C and D have very similar costs, both one-off and continuous. For one-off costs, the difference between the most expensive and the cheapest option is EUR 1 million. For continuous costs, the difference between the most expensive and the cheapest option is EUR 2 million. Whereas there are some savings in using an existing entity (e.g. Europol, FRA), these are offset by the need to build new functions, notably on prevention and assistance to victims, or to expand similar ones, like Europol's capacity to support detection, reporting and removal of CSA online.
Benefits
As discussed earlier, the main benefits are those linked to a reduction of CSA crimes, and therefore of the costs caused by their negative consequences on victims and society. This is directly correlated with the effectiveness of each choice. Therefore, the ratings for benefits are the same as those for effectiveness.
Coherence
Legislation
All choices would be coherent with existing and planned legislation at EU level relevant for the fight against CSA. In particular, the Centre in all the implementation choices would support Member States in the implementation of the prevention and assistance provisions of the CSA Directive, as well as the relevant ones from the Victims' Rights Directive. The Centre under all the implementation choices would also facilitate compliance with the future Digital Services Act, in relation to the provisions relevant to CSA online, notably the notice and takedown requirements.
Coordination
The main role of the Centre is to facilitate the efforts of both Member States and service providers in preventing CSA, assisting victims, and detecting, reporting and removing CSA online. All the choices allow the Centre to fulfil that role in a way that would ensure coherence with existing coordination mechanisms at national and EU level. In choice C, the ability of the Centre to ensure coherence with existing initiatives could be somewhat limited by its separation into two different entities, which could cause inefficiencies in coordination within the Centre itself.
Funding
The Centre in all the implementation choices would ensure coherence with existing funding mechanisms, as part of its facilitation efforts.
Quantitative comparison
Overall costs
The tables below summarise the one-off and continuous cost estimates for the retained implementation choices (Table 8), and give a detailed overview of the choices that require legislation (Table 9):
Table 8: One-off and continuous costs for the implementation choices of the Centre (EUR million)

| Implementation choice | One-off costs | Continuous (annual) costs |
| A | 0 | 10.3 |
| B | 5 | 25.7 |
| C | 5 | 24.1 |
| D | 4 | 23.7 |
Table 9: Summary of estimated costs for the choices that require legislation to set up the EU Centre (cells marked "–" are illegible in the source)

| | 1. EU body (e.g. agency) | 2. Europol + separate entity (Europol / separate entity) | 3. FRA |
| Staff (number of people) | | | |
| Detection, reporting, removal – operational / overheads staff | 70 / 15 | 70 / 5 (Europol); NA (separate entity) | 70 / 5 |
| Prevention – operational / overheads staff | 10 / 4 | NA; 10 / 4 | 10 / 2 |
| Assistance to victims – operational / overheads staff | 10 / 4 | NA; 10 / 4 | 10 / 2 |
| Total staff (number of people)598 | 113 | 103 (75 + 28) | 99 |
| Staff (M EUR/year) | 15.9 | 10.6 / 3.9 (total 14.5) | 13.9 |
| Infrastructure – initial costs (M EUR) | 5 | 4 / 1 (total 5) | 4 |
| Infrastructure – annual costs (M EUR/year) | 3.2 | 2.4 / 1.2 (total 3.6) | 3.2 |
| Operational expenditure (M EUR/year) | 6.6 | 2.5 / 3.5 (total 6.0) | 6.6 |
| Total annual costs (M EUR) | 25.7 | 15.5 / 8.6 (total 24.1) | 23.7 |
| Total initial costs (M EUR) | 5 | 5 | 4 |

598 28 posts corresponding to the prevention and assistance to victims functions in all options could be non-EU staff and be covered by a call for proposals/grant. They would therefore not be part of the EU establishment plan and would not have an impact on the future EU budget (e.g. pensions, etc.).
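As a cross-check against Tables 3–6, each choice's total annual cost equals staff plus annual infrastructure plus operational expenditure:

15.9 + 3.2 + 6.6 = 25.7 M EUR (choice B); 14.5 + 3.6 + 6.0 = 24.1 M EUR (choice C); 13.9 + 3.2 + 6.6 = 23.7 M EUR (choice D).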
Overall benefits
Following the rationale described in section 3.2, and taking into account the qualitative scores on effectiveness, a quantitative estimate of the benefits could be the following:
o The qualitative scores range from 0 (baseline) to +3 (choice B) (see Table 10 below).
o The qualitative scores range from + to +++. The model assumes that the decrease of crime could be proportional to this rating, as + (3%), ++ (6%) and +++ (9%).
o The total annual cost of CSA in the EU is EUR 13.8 billion.
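Restated as a formula (this is a restatement of the model just described, not an additional assumption): benefit = r × EUR 13.8 billion, with r = 3%, 6% or 9% depending on the qualitative score. For example, for choice A: 0.03 × 13.8 ≈ EUR 0.41 billion, matching Table 10 below.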
Table 10: Annual estimated benefits for the policy options (EUR billion)

| Implementation choice | Qualitative score for social impact | Estimated decrease of crime and its societal costs | Benefit |
| A | + | 3% | €0.41 |
| B | +++ | 9% | €1.23 |
| C | ++ | 6% | €0.89 |
| D | ++ | 6% | €0.89 |
Table 11: Annual estimated net benefits for the policy options (EUR billion)

| | A | B | C | D |
| Overall costs | €0.103 | €0.257 | €0.241 | €0.237 |
| Overall benefits | €0.41 | €1.23 | €0.89 | €0.89 |
| Total (savings) | €0.307 | €0.973 | €0.649 | €0.653 |
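Each 'Total (savings)' figure in Table 11 is simply overall benefits minus overall costs; for example, for choice B: 1.23 − 0.257 ≈ 0.973, and for choice D: 0.89 − 0.237 = 0.653.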
Given the limitations caused by the lack of data, the calculation of benefits as a reduction of crime was carried out for the main purpose of comparing the options. In consequence, the total value of benefits must be interpreted in relation to the other options, rather than as an accurate estimate of the actual reduction of crime that the preferred policy option would cause. That said, based upon this analysis, implementation choice B would offer the greatest benefits in the form of a reduction of crime.
6. PREFERRED IMPLEMENTATION CHOICE
On the basis of the assessment, the identified preferred choice is choice B, which includes:
- the creation of the EU centre in the form of a decentralised EU agency:
o providing support to the development and dissemination of research and expertise and facilitating coordination on prevention;
o providing support to the development and dissemination of research and expertise and facilitating coordination on victims' assistance;
o supporting victims in removing their images and videos from circulation;
o supporting the detection, reporting and removal of CSAM by receiving reports in relation to child sexual abuse from companies and maintaining a database of indicators to detect child sexual abuse online;
o providing a structured oversight role to ensure accountability and
transparency on efforts to tackle child sexual abuse online.
Main advantages
Effectively achieves the general and specific objectives
Choice B would effectively achieve the strategic objectives of the EU intervention. The form of the centre proposed in this choice would bring the greatest improvements in all envisaged areas of the centre's activity. In effect, it proposes the most efficient path to a coherent and holistic approach to the problem of CSA, now and in the future.
In terms of support to law enforcement and industry, choice B proposes solutions to improve the processing of reports of CSA and to maintain systematic information on child sexual abuse material at EU level. It allows for transparent and independent oversight of the efforts to combat CSA, and an improvement of cooperation between public authorities, civil society organisations and service providers, in particular by realising the full potential of hotlines.
Furthermore, it would contribute to improving the dissemination of expertise and research on prevention and assistance to victims at EU level, ultimately supporting and developing practical initiatives at Member State level. It also accommodates the possibility of supporting victims who want their images removed from the internet, offering a way to effectively address the issue of secondary victimisation.
Finally, the advantage of choice B over the other options is that it includes all the services the centre would provide in one organisation, avoiding the need for additional coordination between different institutions, which could drive up costs, lead to confusion for external organisations and victims seeking help, and potentially slow down processes.
All in all, choice B offers the possibility to create an EU centre which would have a significant impact on the fight against CSA in the EU. It would become the main point of reference for all aspects of this crime in the EU and an accessible contact point for victims. It would also become the main point of contact for international cooperation, allowing the EU to join those leading the fight against child sexual abuse.
The centre as an independent organisation would be a good fit with similar organisations around the world working in the area of child protection and victim assistance (e.g. the Canadian Centre for Child Protection), and would be a natural counterpart for cooperation with them.
There are examples showing that this type of organisation is able to perform similar functions to those envisaged for the Centre. Both NCMEC in the United States and the Canadian Centre for Child Protection have a similar legal personality (not-for-profit corporation and national charity, respectively), and have a proven record of successful and close cooperation with law enforcement while not being a public authority
themselves. Additionally, independent organisations can have advanced technical capability, including database hosting capacity. Some of the world's most important databases of CSAM are hosted within not-for-profit organisations (e.g. NCMEC, Internet Watch Foundation599).
In addition, the creation of a dedicated EU Centre as an EU agency would send an important message about the dedication of the EU to combating child sexual abuse more effectively. It would place the EU on a level with those leading the fight against child sexual abuse worldwide, which have made the same choice of creating one independent centre. It would also ensure independence from all stakeholders, allowing the centre to cooperate with all on the same terms. It would promote visibility, and ensure that all resources of the organisation are dedicated to one single objective.
Respects subsidiarity and proportionality
Subsidiarity: Choice B offers the highest added value of EU action. In particular, it facilitates Member States' action, enables the exchange of best practices, and reduces dependence on, and increases cooperation with, third countries. It addresses the fragmentation and inefficiencies of cooperation between law enforcement, public authorities, the private sector and civil society, as well as the varying levels of resources and expertise across EU Member States.
Proportionality: Choice B pursues a legitimate purpose, which is tackling child sexual abuse and exploitation online and offline, based on the massive number of crimes in this area. It corresponds to explicit calls for a more coordinated approach at EU level and does not go beyond what is necessary to achieve the objectives identified for the EU intervention. Considering the increasing trends and threats of child sexual abuse over the past years, choice B is also proportionate with regard to anticipated future developments in this crime area.
Protects fundamental rights
Choice B protects fundamental rights to human dignity, to the integrity of the person, and the fundamental rights of the child, among others, by boosting efforts to better
prevent and protect children from sexual abuse and better support victims.
Additionally, choice B provides an important and effective safeguard that can help ensure and continuously verify that the impact on the rights of users to data protection and
privacy of communications is limited to what is necessary, and support a fair balance between the different rights at stake.
Main disadvantages
Implies more extensive preparation efforts and higher costs
Choice B includes establishing a new organisation, which would incur higher initial and running costs than if the centre were established as part of an existing entity. It also creates additional workload in the preparatory phase with regard to finding the most suitable legal form and a Member State that could host it. Overall, the need to assemble resources, equipment and personnel will incur high implementation costs.
599 Internet Watch Foundation, Victims are rescued with the help of your reports, accessed 28 April 2021.
Trade-offs
Coherent and holistic approach implies higher costs
Choice B would enhance the overall response to the threat of child sexual abuse at EU level, but the EU budget and/or the Member States would face additional expenses linked to the establishment of a new organisation. Whereas this choice seeks to streamline Member States' efforts and ensure efficient use of resources in the big picture and in the long run, it is clear that additional human, technical and financial efforts are required to provide a central point for improving prevention, support to victims, and the detection and reporting mechanisms. Considering the increasing amount of child sexual abuse material online, the high cost of implementing such a Centre, which could respond to future threats more adequately than present mechanisms, appears reasonable.
A newly established entity's overall efficiency might suffer from a lack of an established network and communication channels in the beginning, meaning investments by Member States will take some time to pay off until this centre becomes fully operational in practice. However, considering that this is a pioneering initiative, that no comparable entity can be found in the EU to date and that global examples exist about the success of such Centres (e.g. NCMEC), the risk of making high investments for an unknown, new initiative appears worthwhile.
ANNEX 11: SME TEST
1. Identification of affected businesses
SMEs are among the service providers affected by the measures described in this impact assessment, although it is known that almost 95% of reports of child sexual abuse online from service providers are made by a single large provider (Facebook), while just 5 providers are responsible for 99% of such reports.600 This shows that SMEs account for only a small proportion of current reporting.
Estimates suggest that at least 10 000 service providers concerned by the proposal could be SMEs. In this regard, 45% of these SMEs are micro-enterprises and 40% constitute medium-sized businesses.601 Even though SMEs only account for a small proportion of the reports, their services are at particular risk of being misused for child sexual abuse online, since they tend to lack the capacity to hire trained staff or deploy state-of-the-art technology to fight malicious content on their services.
2. Consultation of SME Stakeholders
SME stakeholders provided feedback to the Inception Impact Assessment and
participated in the open public consultation through four industry associations:
- ETNO (European Telecommunications Network Operators' Association)
- EuroISPA (one of the largest 'umbrella' associations of Internet Service Providers in the world, which includes a significant number of SMEs)
- ACT – The App Association (representing more than 5,000 app companies and information technology firms across the mobile economy)
- Interactive Software Federation of Europe (ISFE) – European Games Developers Federation
And directly as individual micro, small and medium-sized enterprises:
- jurmatix Legal Intelligence UG
- Markus Hopfenspirger Malop.Net
- AiBA (spin-off company under establishment and administration of NTNU Technology Transfer AS)
- Safer Together
600 National Center for Missing and Exploited Children, 2020 Reports by Electronic Service Providers (ESP) (missingkids.org).
601 Estimates based on data available in the Dealroom database, https://dealroom.co/.
- Open-Xchange AG
- Mega Limited
- Yubo
- The Computer & Communications Industry Association (CCIA)
- Bumble
Several of the above-listed stakeholders raised concerns regarding the potential administrative burden and compliance costs for SMEs, and suggested a differentiated approach that takes into consideration the different circumstances of the various providers in order to avoid a one-size-fits-all approach. Although some stakeholders expressed support for obligatory detection, one stakeholder pointed out that while larger providers have the means to put in place mandatory detection systems, this is not always the case for SMEs. Some stakeholders expressed concerns regarding reporting obligations, which might also impose burdensome requirements on SMEs, in particular with regard to reporting to a central authority (since SMEs find it easier to report to national authorities). It was also pointed out that sanctions should be proportionate to the violation, especially for smaller players.
Nevertheless, several stakeholders recognised the need for legal clarity, and expressed general support for establishing obligations to detect, remove and report child sexual abuse, conditional on ensuring the necessary flexibility and a differentiated approach. It was also highlighted that all providers should be allowed to make use of the available automatic technical tools to detect CSAM and prevent its distribution.
3. Measurement of the impact on SMEs
The different measures have been found to have the following impacts on SMEs:
Baseline scenario
The baseline scenario disincentivises action by SMEs against child sexual abuse online. In this scenario, SMEs face legal uncertainty in relation to voluntary measures they may wish to implement against child sexual abuse online. Furthermore, certain SMEs will be impacted by the expiry of the Interim Regulation 3 years after its entry into application, which will result in a prohibition of such voluntary measures in their services. As such, the main impacts on SMEs in the baseline scenario are conditions which tend to discourage action against child sexual abuse online, preventing SMEs who wish to do so from making their services safer.
Non-legislative measures
Given that the practical measures are largely voluntary in nature and do not require participation by all service providers, SMEs can participate where they deem the measures to be cost-effective in view of their individual business model, corporate social responsibility and other factors. Therefore, the economic impact of the practical options does not go beyond what is necessary and should not disfavour SMEs. On the contrary, SMEs should benefit from standardised processes and improved feedback mechanisms
and communications channels, as well as practical support in the form of enhanced sharing of technologies and databases. The possibility to opt in to these practical measures may alleviate the cost burden for SMEs, increase the legal certainty of their actions when tackling illegal content and contribute to ensuring a level playing field with larger companies.
Legislative measures
All the legislative options (B, C, D and E) would have an impact on SMEs.
Option B could provide greater legal certainty for SMEs who wish to undertake voluntary measures. While these measures would be voluntary in nature, the requirements and safeguards in the legislation could represent a burden for those SMEs considering implementing them.
Options C, D and E contain obligations to detect child sexual abuse online, which would have a higher impact on SMEs than options A and B.
SMEs will be subject to the same obligations as larger providers. As the report indicates, they are particularly vulnerable to having their services exploited for illegal activities, including CSA, not least since they tend to have limited capacity to deploy state-of-the-art technological solutions to detect CSAM or specialised staff. Even though companies may have unequal resources to integrate technologies for the detection of CSAM into their products, this negative effect is outweighed by the fact that excluding them from this obligation would create a safe space for child sexual abuse and therefore defeat the purpose of the proposal.
The implementation of technologies for the detection of such abuse may create new barriers and present a burden to SMEs. While the EU Centre would make technologies available to SMEs without charge, the continuous operation of those technologies could also lead to increased costs. SMEs would also experience an increased burden in relation to ensuring the appropriate human resources for the process of detection, reporting and removal of CSA online, including responding to follow-up requests from law enforcement authorities. The additional costs would imply that SMEs might have less funds at their disposal for research and innovation, increasing their competitive disadvantage towards large companies.
It is not possible to quantify these costs exactly, since they would depend on the level of abuse to which a given provider is exposed. That, in turn, depends on the services offered and on the degree to which they can be subject to effective and efficient mitigation measures, rather than on the size of the company. For example, an SME with a small number of employees may offer a service with millions of users that is particularly prone to being misused for CSA online, whereas a larger company may offer relatively niche services where the possibilities of misuse to commit CSA online are very limited.
4. Assessment of alternative mechanisms and mitigating measures
The following mitigating measure was considered:
● Exempting SMEs from the scope of one or more measures on obligations to detect, report and remove child sexual abuse material online and to detect and report solicitation of children online.
This mitigating measure has not been retained, since such an exemption would risk creating a gap that could easily be exploited by offenders moving to services offered by SMEs. Smaller services becoming instrumental to the spread of child sexual abuse crimes would result in the infringement of the fundamental rights of victims, impacting the ability to pursue the specific objectives of the intervention.
The following mitigating measures were retained:
● Obligation for the competent national authorities to take into account the size and financial and technological capabilities of the provider when enforcing the Regulation, including in relation to the risk assessment, detection obligations and penalties.
● SMEs would be able to request free support from the EU Centre to conduct the risk assessment.
● Support from the Centre and the Commission in the form of:
o guidance, to inform SMEs about the new legal framework and the obligations incumbent on them. This guidance could be disseminated with the help of industry associations; and
o specific training, delivered in collaboration with Europol and the national authorities.
● Support from the Centre in the form of:
o tools free of charge to detect and facilitate reporting and removal of CSA online;
o human review of the reports, so that service providers (in particular SMEs) do not need to dedicate resources to it.
Pikk 61 / 15065 Tallinn / 612 5008 / [email protected] / www.siseministeerium.ee
Registry code 70000562
Dear partners

26.05.2022 No. 5-1/29-1

We hereby submit for your opinion the proposal for a regulation to prevent and combat child sexual abuse

On 11 May 2022, the European Commission published a proposal for a regulation laying down rules to prevent and combat child sexual abuse. The aim of the proposal is to establish clear rules, with conditions and safeguards, to effectively prevent online services from being used for the purpose of child sexual abuse. The new rules will help to stop ongoing child sexual abuse, prevent material from reappearing on the internet, and bring perpetrators to justice.
More specifically, the draft provides for the following:
- Obligations are established for information society service providers (hosting service providers, providers of interpersonal communication services) to combat content depicting child sexual abuse, through the preparation of risk assessments and the adoption of risk mitigation measures;
- Competent authorities are given the possibility to issue, where necessary, three different types of orders to service providers: for the detection, removal and blocking of content;
- A European Union centre to combat child sexual abuse (an EU agency) is established. The Centre would act as a hub of expertise, provide reliable information on identified material, receive and analyse reports submitted by service providers in order to identify erroneous reports and prevent them from reaching law enforcement authorities, swiftly forward relevant reports to law enforcement authorities, and offer support to victims.
We wish to involve you in shaping the positions of the Government of the Republic on this proposal. To that end, we are forwarding the proposed regulation, together with the accompanying impact assessment, for your opinion. The materials are annexed to this cover letter.

Given the timeframe for coordinating and approving the positions of the Government of the Republic, we would be grateful if you could submit your initial questions and comments by 7 June 2022 to [email protected]. This is intended to ensure that the initial interests of the Republic of Estonia regarding the draft are represented at the meeting of the Law Enforcement Working Party of the Council of the European Union on 10 June.

We await your official positions by 27 June 2022 at the latest, to the address

When forming your positions, please bear in mind that the draft presented by the European Commission may change as a result of negotiations both between Member States and between the institutions, and that Estonia, like the other Member States, has the opportunity to influence the content of the draft in the course of the negotiations. When providing your opinion, we therefore ask you to indicate both what in the draft is appropriate and should certainly be retained, and what should receive particular attention during the negotiations.
Yours sincerely

(digitally signed)

Veiko Kommusaar
Deputy Secretary General for Internal Security, Law Enforcement and Migration Policy

Annexes: 1. Proposal for a regulation laying down rules to prevent and combat child sexual abuse
2. Impact assessment of the proposal

Barbara Haage 612 5131
Council of the European Union
Brussels, 13 May 2022 (OR. en) 9068/22 JAI 641 ENFOPOL 256 CRIMORG 69 IXIM 119 DATAPROTECT 149 CYBER 170 COPEN 182 FREMP 98 TELECOM 216 COMPET 332 MI 388 CONSOM 117 DIGIT 97 CODEC 690 IA 71
Interinstitutional File: 2022/0155(COD)
PROPOSAL
From: Secretary-General of the European Commission, signed by Ms Martine DEPREZ, Director
date of receipt: 12 May 2022
To: General Secretariat of the Council
No. Cion doc.: COM(2022) 209 final
Subject: Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to prevent and combat child sexual abuse
Delegations will find attached document COM(2022) 209 final.
Encl.: COM(2022) 209 final
EUROPEAN COMMISSION
Brussels, 11.5.2022
COM(2022) 209 final
2022/0155 (COD)
Proposal for a
REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
laying down rules to prevent and combat child sexual abuse
(Text with EEA relevance)
{SEC(2022) 209 final} - {SWD(2022) 209 final} - {SWD(2022) 210 final}
EXPLANATORY MEMORANDUM
1. CONTEXT OF THE PROPOSAL
• Reasons for and objectives of the proposal
The United Nations Convention on the Rights of the Child (UNCRC) and Article 24(2) of the
Charter of Fundamental Rights of the European Union (‘the Charter’)1 enshrine as rights the
protection and care of children’s best interests and well-being. In 2021, the United Nations
Committee on the Rights of the Child underlined that these rights must be equally protected in
the digital environment2. The protection of children, both offline and online, is a Union
priority.
At least one in five children falls victim to sexual violence during childhood3. A 2021 global
study found that more than one in three respondents had been asked to do something sexually
explicit online during their childhood, and over half had experienced a form of child sexual
abuse online4. Children with disabilities face an even higher risk of experiencing sexual
violence: up to 68% of girls and 30% of boys with intellectual or developmental disabilities
will be sexually abused before their 18th birthday5. Child sexual abuse material is a product of
the physical sexual abuse of children. Its detection and reporting is necessary to prevent its
production and dissemination, and a vital means to identify and assist its victims. The
pandemic has exposed children to a significantly higher degree of unwanted approaches
online, including solicitation into child sexual abuse. Despite the fact that the sexual abuse
and sexual exploitation of children and child sexual abuse materials are criminalised across
the EU by the Child Sexual Abuse Directive6, adopted in 2011, it is clear that the EU is
currently still failing to protect children from falling victim to child sexual abuse, and that the
online dimension represents a particular challenge.
Therefore, on 24 July 2020, the European Commission adopted the EU Strategy for a More
Effective Fight Against Child Sexual Abuse,7 which sets out a comprehensive response to the
growing threat of child sexual abuse both offline and online, by improving prevention,
investigation, and assistance to victims. It includes eight initiatives to put in place a strong
legal framework for the protection of children and facilitate a coordinated approach across the
many actors involved in protecting and supporting children. These initiatives aim to identify
1 Charter of Fundamental Rights of the European Union, 2012/C 326/02, 26 October 2012.
2 UN General Comment No. 25 (2021) on Children's Rights in Relation to the Digital Environment.
3 One in Five Campaign, Council of Europe, 2010-2015.
4 Economist Impact survey of more than 5,000 18-20 year olds in 54 countries, published in the Global Threat Assessment, WeProtect Global Alliance, 2021.
5 UN Special Representative of the Secretary-General on Violence Against Children, Children with Disabilities.
6 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA.
7 EU strategy for a more effective fight against child sexual abuse, COM(2020) 607, 24 July 2020, p. 2.
legislative gaps and ensure that EU laws enable an effective response, strengthen law
enforcement efforts at national and EU level, enable EU countries to better protect children
through prevention, galvanise industry efforts to ensure the protection of children when using
the services they offer, and improve protection of children globally through multi-stakeholder
cooperation. This dedicated strategy is flanked by other complementary efforts. On 24 March
2021, the European Commission adopted its comprehensive EU Strategy on the Rights of the
Child, which proposes reinforced measures to protect children against all forms of violence,
including online abuse. In addition, it invites companies to continue their efforts to detect,
report and remove illegal online content, including online child sexual abuse, from their
platforms and services. The proposed European Declaration on Digital Rights and Principles
for the Digital Decade8 also includes a commitment to protect all children against illegal
content, exploitation, manipulation and abuse online, and to prevent the digital space from being used to commit or facilitate crimes9.
In this context, providers of hosting or interpersonal communication services (‘providers’)
play a particularly important role. Their responsible and diligent behaviour is essential for a
safe, predictable and trusted online environment and for the exercise of fundamental rights
guaranteed in the Charter. The circulation of images and videos of child sexual abuse, which
has increased dramatically with the development of the digital world, perpetuates the harm
experienced by victims, while offenders have also found new avenues through these services
to access and exploit children.
Certain providers already voluntarily use technologies to detect, report and remove online
child sexual abuse on their services. Yet the measures taken by providers vary widely, with
the vast majority of reports coming from a handful of providers, while a significant number take no action. The quality and relevance of reports received by EU law enforcement authorities from providers also vary considerably. Still, organisations such as the National Center for Missing and Exploited Children ('NCMEC'), to which US providers are obliged to report under US law when they become aware of child sexual abuse on their services, received over 21 million reports in 2020, of which over 1 million related to EU Member States. The most
recent reporting figure for 2021 shows a further increase, approaching the 30 million mark10.
Despite the important contribution made by certain providers, voluntary action has thus
proven insufficient to address the misuse of online services for the purposes of child sexual
abuse. As a consequence, several Member States have started preparing and adopting national
rules to fight against online child sexual abuse. As the Impact Assessment Report
accompanying this proposal demonstrates, this results in the development of divergent
national requirements, in turn leading to an increase in the fragmentation of the Digital Single
Market for services11. Against this background, uniform Union rules on the detection,
8 Proposed European Declaration on Digital Rights and Principles for the Digital Decade, COM(2022) 28, 26 January 2022.
9 EU strategy on the rights of the child, COM(2021) 142, 24 March 2021.
10 The 2021 reporting figure of approximately 29.4 million represents a 35% year-on-year increase, EU CyberTipline data snapshot, NCMEC, accessed 11 March 2022.
11 Illustrated by the setting up of diverse new or existing authorities responsible for monitoring and enforcing different obligations applicable to varying service provider types as constrained by the national laws of the Member States. See section 3 of Annex 5 of the Impact Assessment Report accompanying this proposal for further detail.
reporting and removal of online child sexual abuse are necessary to complement the Digital
Services Act, remove existing barriers to the Digital Single Market and prevent their
proliferation.12 Addressing the risk of fragmentation through this proposal must take account
of the need to guarantee children’s fundamental rights to care and to protection of their well-
being, mental health and best interest, and support the general public interest to effectively
prevent, investigate and prosecute the perpetration of the serious crime of child sexual abuse.
To address these challenges and in response to calls by the Council and the European
Parliament, this proposal therefore seeks to establish a clear and harmonised legal framework
on preventing and combating online child sexual abuse. It seeks to provide legal certainty to
providers as to their responsibilities to assess and mitigate risks and, where necessary, to
detect, report and remove such abuse on their services in a manner consistent with the
fundamental rights laid down in the Charter and as general principles of EU law. In
combating child sexual abuse as it manifests itself online, there are important rights and
interests at stake on all sides. It is therefore particularly important to establish a fair balance
between measures to protect child victims of sexual abuse and their fundamental rights and
thus to achieve important objectives of general societal interest, and the fundamental rights of
other users and of the providers.
This proposal therefore sets out targeted measures that are proportionate to the risk of misuse
of a given service for online child sexual abuse and are subject to robust conditions and
safeguards. It also seeks to ensure that providers can meet their responsibilities, by
establishing a European Centre to prevent and counter child sexual abuse (‘the EU Centre’) to
facilitate and support implementation of this Regulation and thus help remove obstacles to the
internal market, especially in connection to the obligations of providers under this Regulation
to detect online child sexual abuse, report it and remove child sexual abuse material. In
particular, the EU Centre will create, maintain and operate databases of indicators of online
child sexual abuse that providers will be required to use to comply with the detection
obligations. These databases should therefore be ready before the Regulation enters into
application. To ensure that, the Commission has already made funding available to Member
States to help with the preparations of these databases. The EU Centre should also carry out
certain complementary tasks, such as assisting competent national authorities in the
performance of their tasks under this Regulation and providing support to victims in
connection to the providers’ obligations. It should also use its central position to facilitate
cooperation and the exchange of information and expertise, including for the purposes of
evidence-based policy-making and prevention. Prevention is a priority in the Commission’s
efforts to fight against child sexual abuse.
• Consistency with existing policy provisions in the policy area
This proposal delivers on commitments made in the EU Strategy for a More Effective Fight
Against Child Sexual Abuse, notably to propose legislation to tackle child sexual abuse online
effectively, including by requiring providers to detect known child sexual abuse materials,
and to work towards the creation of a European Centre to prevent and counter child sexual
12 See section 4, Fragmentation of rules for digital services, in Business Journeys on the Single Market: Practical Obstacles and Barriers, SWD(2020) 54, 10 March 2020.
abuse. The current EU legal framework in this area consists of Union legislation relating to
child sexual abuse, such as the Child Sexual Abuse Directive, and Regulation (EU)
2021/1232 on combating online child sexual abuse13, which applies until 3 August 2024 (‘the
interim Regulation’).
By introducing an obligation for providers to detect, report, block and remove child sexual
abuse material from their services, the proposal enables improved detection, investigation and
prosecution of offences under the Child Sexual Abuse Directive. The proposed legislation
complements the new European Strategy for a Better Internet for Children14, which aims to
create safe digital experiences for children and to promote digital empowerment.
The EU Centre should work closely with Europol. It will receive the reports from providers,
check them to avoid reporting obvious false positives and forward them to Europol as well as
to national law enforcement authorities. A representative from Europol will be part of the
management board of the EU Centre. In turn, a representative from the EU Centre could be
part of the management board of Europol, to further ensure effective cooperation and
coordination.
The proposed legislation also contributes to the achievement of the objectives set in several
international law instruments. Relevant in this respect are the Council of Europe’s Lanzarote
Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse15,
ratified by all EU Member States, which establishes minimum requirements regarding
substantive criminal law, assistance to victims, and intervention programmes, and the Council
of Europe’s Budapest Convention on Cybercrime16, ratified by almost all EU Member States,
which requires parties to establish certain criminal offences relating to child sexual abuse
material.
• Consistency with other Union policies
The proposal builds on the General Data Protection Regulation17 (GDPR). In practice,
providers tend to invoke various grounds for processing provided for in the GDPR to carry
out the processing of personal data inherent in voluntary detection and reporting of child
sexual abuse online. The proposal sets out a system of targeted detection orders and specifies
the conditions for detection, providing greater legal certainty for those activities. As regards
the mandatory detection activities involving processing of personal data, the proposal, in
13 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (Text with EEA relevance).
14 COM(2022) 212, 11 May 2022.
15 Council of Europe Convention on Protection of Children against Sexual Exploitation and Sexual Abuse, CETS No. 201, 25 October 2007.
16 Council of Europe Convention on Cybercrime, ETS No. 185, 23 November 2001.
17 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
particular the detection orders issued on the basis thereof, thus establishes the ground for such
processing referred to in Article 6(1)(c) GDPR, which provides for the processing of personal
data that is necessary for compliance with a legal obligation under Union or Member State
law to which the controller is subject.
The proposal covers, inter alia, providers that offer interpersonal electronic communications
services and hence are subject to national provisions implementing the ePrivacy Directive18
and its proposed revision currently in negotiations19. The measures set out in the proposal
restrict in some respects the scope of the rights and obligations under the relevant provisions
of that Directive, namely, in relation to activities that are strictly necessary to execute
detection orders. In this regard, the proposal involves the application, by analogy, of Article
15(1) of that Directive.
The proposal is also coherent with the e-Commerce Directive and the Proposal for a Digital
Services Act (DSA)20, on which provisional political agreement between the co-legislators
has recently been reached21. In particular, the proposal lays down specific requirements for
combating particular forms of illegal activities conducted and illegal content exchanged
online, coupled with a set of safeguards. In that manner, it will complement the general
framework provided for by the DSA, once adopted. The proposal builds on the horizontal
framework of the DSA relying on it as a baseline where possible and setting out more specific
rules where needed for the particular case of combating online child sexual abuse. For
example, some providers may be subject to a more general obligation to assess systemic risks
related to the use of their services under the DSA, and a complementary obligation to perform
a specific assessment of risks of child sexual abuse online in the present proposal. Those
providers can build on the more general risk assessment in performing the more specific one,
and in turn, specific risks identified for children on their services pursuant to the specific risk
assessment under the present proposal can inform more general mitigating measures that also
serve to address obligations under the DSA.
The e-Commerce Directive and the DSA prohibit Member States from imposing on providers
of intermediary services general obligations to monitor or to actively seek facts or
circumstances indicating illegal activity. Whilst the precise contours of that prohibition
addressed to Member States are only gradually becoming clear, the proposed Regulation aims
to comply with the underlying requirement of fairly balancing the various conflicting
fundamental rights at stake that underlies that prohibition, taking into account the specific
context of combating online child sexual abuse and the importance of the public interest at
stake. It does so, in particular, by targeting the scope of the obligations imposed on providers
at risk and by setting out a set of clear and carefully balanced rules and safeguards, including
through a clear definition of the objectives pursued, the type of material and activities
concerned, a risk-based approach, the scope and nature of the relevant obligations, rules on
18 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market.
19 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications).
20 Proposal for a regulation on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825 final, 15 December 2020.
21 https://ec.europa.eu/commission/presscorner/detail/en/IP_22_2545
redress and relevant supervision and transparency mechanisms. It also includes strong
measures to facilitate and support implementation and hence reduce the burden on service
providers.
In delivering on its main objectives, the proposal also helps victims. As such, the proposed
Regulation is in coherence with the Victims’ Rights Directive as a horizontal instrument to
improve victims’ access to their rights22.
2. LEGAL BASIS, SUBSIDIARITY AND PROPORTIONALITY
• Legal basis
The legal basis to support action in this area is Article 114 of the Treaty on the Functioning of
the European Union (TFEU). The article provides for the establishment of measures to ensure
the functioning of the Internal Market. Article 114 is the appropriate legal basis for a
Regulation that seeks to harmonise the requirements imposed on providers of relevant online
services in the Digital Single Market. As mentioned above, barriers to the Digital Single
Market for Services have started to emerge following the introduction by some Member
States of diverging national rules to prevent and combat online child sexual abuse.
The proposed Regulation seeks to eliminate those existing divergences and to prevent the
emergence of future obstacles which would result from the further development of such
national rules. Given the intrinsic cross-border nature of the provision of online services, lack
of EU action leaving space for a regulatory framework fragmented along national lines would
result in a burden for providers having to comply with diverging sets of national rules and it
would create unequal conditions for providers across the EU, as well as possible loopholes.
• Subsidiarity
According to the principle of subsidiarity, EU action may only be taken if the envisaged aims
cannot be achieved by Member States alone, but can be better achieved at Union level.
The aim of ensuring a level playing field for providers across the Digital Single Market while
taking measures to prevent and combat online child sexual abuse cannot be achieved by the
Member States alone. As mentioned, Member States have started imposing requirements on
providers to tackle online child sexual abuse. Even those Member States who have not yet
introduced such requirements are increasingly considering national measures to that effect.
However, the providers covered typically operate across borders, often on an EU-wide basis,
or may wish to do so. Accordingly, national requirements imposed on such market players to
address online child sexual abuse increase fragmentation in the Digital Single Market and
22 Directive 2012/29/EU of the European Parliament and of the Council of 25 October 2012 establishing minimum standards on the rights, support and protection of victims of crime, and replacing Council Framework Decision 2001/220/JHA.
entail significant compliance costs for providers, while being insufficiently effective by virtue
of the cross-border nature of the services concerned.
Only EU level action can achieve the aim of eliminating barriers to the Digital Single Market
for the services concerned, enhancing legal certainty for providers and reducing compliance
costs, while at the same time ensuring that the requirements imposed on market players to
tackle online child sexual abuse are effective by virtue of their uniform applicability across
borders within the entire EU. Therefore, EU action is necessary to achieve the objectives of
the proposed Regulation and it presents a significant added value compared to national action.
• Proportionality
This proposal aims at eliminating existing barriers to the provision of relevant services within
the Digital Single Market and preventing the emergence of additional barriers, while allowing
for an effective fight against online child sexual abuse in full respect of the fundamental rights
under EU law of all parties affected. To achieve this objective, the proposal introduces
narrowly targeted and uniform obligations of risk assessment and mitigation, complemented
where necessary by orders for detection, reporting and removal of child sexual abuse content.
These obligations are applicable to relevant providers offering services on the Digital Single
Market regardless of where they have their principal establishment.
The proposed rules only apply to providers of certain types of online services which have
proven to be vulnerable to misuse for the purpose of dissemination of child sexual abuse
material or solicitation of children (known as ‘grooming’), principally by reason of their
technical features or the age composition of their typical user base. The scope of the
obligations is limited to what is strictly necessary to attain the objectives set out above. The
obligations are accompanied by measures to minimise the burden imposed on such providers,
as well as the introduction of a series of safeguards to minimise the interference with
fundamental rights, most notably the right to privacy of users of the services.
To reduce the number of false positives and prevent erroneous reporting to law enforcement
authorities, and to minimise the administrative and financial burden imposed on providers,
among other reasons, the proposal creates the EU Centre as an essential facilitator of
implementation of the obligations imposed on the providers. Among other tasks, the EU
Centre should facilitate access to reliable detection technologies to providers; make available
indicators created based on online child sexual abuse as verified by courts or independent
administrative authorities of Member States for the purpose of detection; provide certain
assistance, upon request, in connection to the performance of risk assessments; and provide
support in communicating with relevant national authorities.
Finally, the proposed Regulation contains safeguards to ensure that technologies used for the
purposes of detection, reporting and removal of online child sexual abuse to comply with a
detection order are the least privacy-intrusive and are in accordance with the state of the art in
the industry, and that they perform any necessary review on an anonymous basis and only
take steps to identify any user in case potential online child sexual abuse is detected. It
guarantees the fundamental right to an effective remedy in all phases of the relevant activities,
from detection to removal, and it limits the preservation of removed material and related data
to what is strictly necessary for certain specified purposes. Thereby, the proposed Regulation
limits the interference with the right to personal data protection of users and their right to
confidentiality of communications, to what is strictly necessary for the purpose of ensuring
the achievement of its objectives, that is, laying down harmonised rules for effectively
preventing and combating online child sexual abuse in the internal market.
• Choice of the instrument
Article 114 TFEU gives the Union’s legislator the possibility to adopt Regulations and
Directives. As the proposal aims at introducing uniform obligations on providers, which
usually offer their services in more than one Member State or may wish to do so, a directive
leaving a margin for divergent national transposition of EU rules would not be suitable to
achieve the relevant objectives. Divergent national rules transposing the requirements
imposed on providers by this instrument would lead to the continuation or reintroduction of
those barriers to the Digital Single Market for services that this initiative aims at eliminating.
Unlike a Directive, a Regulation ensures that the same obligations are imposed in a uniform
manner across the EU. A Regulation is also directly applicable, provides clarity and greater
legal certainty and avoids divergent transposition in the Member States. For these reasons, the
appropriate instrument to be used to achieve the objectives of this initiative is a Regulation. In
addition, in view of the date of expiry of the interim Regulation, there would in this case be
insufficient time to adopt a Directive and then transpose its rules at national level.
3. RESULTS OF EX-POST EVALUATIONS, STAKEHOLDER
CONSULTATIONS AND IMPACT ASSESSMENTS
• Stakeholder consultations
The Commission consulted relevant stakeholders over the course of two years to identify
problems and ways forward in the fight against child sexual abuse, both online and offline.
This was done through surveys, ranging from open public consultations to targeted surveys of
law enforcement authorities. Multiple group expert meetings and bilateral meetings were
organised between the Commission and relevant stakeholders to discuss the potential impacts
of legislation in this area, and the Commission participated in relevant workshops,
conferences and events on the rights of the child.
The Commission published an Inception Impact Assessment in December 2020 with the aim
of informing citizens and stakeholders about the planned initiative and seeking initial
feedback. This feedback showed significant support for the objective of tackling online child
sexual abuse. While the holistic approach of the potential Centre and expected improvements
regarding legal clarity were welcomed, some industry stakeholders expressed concerns
regarding the impact of mandatory detection and reporting of online child sexual abuse.
The Commission conducted an open public consultation in 2021. This process sought to
gather the views from across a broad range of stakeholders such as public authorities and
private citizens, industry and civil society. Despite efforts to ensure a balanced distribution of
responses, a significant proportion of contributions were received from private individuals in
Germany solely addressing questions relating to the subject of encryption. That apart, issues
of better cooperation and coordination, and sufficient resourcing and expertise to meet
continually increasing volumes of illegal content featured prominently across public
authorities, industry and civil society contributions. There was also widespread support
across all groups for swift takedown of reported child sexual abuse material, for action to
reduce online ‘grooming’ (solicitation of children) and for improvements to prevention efforts
and assistance to victims.
Regarding the possible imposition of legal obligations on providers to detect and report
various types of online child sexual abuse in their services, the consultation revealed strong
support from law enforcement authorities and organisations working in the area of children’s
rights, while privacy rights advocates and submissions from private individuals were largely
opposed to obligations.
• Collection and use of expertise
Targeted surveys of law enforcement authorities in the Member States revealed that reports
made by US providers currently constitute one of the most important sources of reports of
child sexual abuse. However, the quality and relevance of such reports vary, and some reports
are found not to constitute online child sexual abuse under the applicable national law.
These surveys also identified the elements necessary to ensure that a report is ‘actionable’,
i.e., that it is of sufficient quality and relevance that the relevant law enforcement authority
can take action. It is for this reason that harmonised reports at EU level, facilitated by the EU
Centre, would be the best strategy to maximise the use of expertise to counter online child
sexual abuse.
• Impact assessment
Following an initial negative opinion of the Regulatory Scrutiny Board on the Impact Assessment, the Board issued a positive opinion with reservations in February 2022, making various suggestions for improvement. The
Impact Assessment report was further revised taking into account the relevant feedback,
notably by clarifying the descriptions of the measures taken to ensure compatibility with
fundamental rights and with the prohibition of general monitoring obligations and by
providing more detailed descriptions of the policy options. The finalised Impact Assessment
report examines and compares several policy alternatives in relation to online child sexual
abuse and to the possible creation of an EU Centre to prevent and combat child sexual abuse.
The Impact Assessment shows that voluntary actions alone against online child sexual abuse
have proven insufficient, by virtue of their adoption by a small number of providers only, of the
considerable challenges encountered in the context of private-public cooperation in this field,
as well as of the difficulties faced by Member States in preventing the phenomenon and
guaranteeing an adequate level of assistance to victims. This situation has led to the adoption
of divergent sets of measures to fight online child sexual abuse in different Member States. In
the absence of Union action, legal fragmentation can be expected to develop further as
Member States introduce additional measures to address the problem at national level,
creating barriers to cross-border service provision on the Digital Single Market.
Given the need to address the situation and with a view to ensuring the good functioning of
the Digital Single Market for services while, at the same time, improving the mechanisms for
prevention, detection, reporting and removal of online child sexual abuse and ensuring
adequate protection and support for victims, EU level action was found to be necessary.
Five main policy options were considered besides the baseline scenario, with increasing levels
of effectiveness in addressing the objectives set out in the impact assessment and the overall
policy goal of ensuring the good functioning of the Digital Single Market for services while
ensuring that online child sexual abuse is detected, reported and removed throughout the
Union, thereby indirectly improving prevention, facilitating investigations and guaranteeing
adequate assistance to victims.
All options focused on the objective of ensuring detection, removal and reporting of
previously-known and new child sexual abuse material and grooming (material scope) by
relevant online service providers (personal scope) established in the EU and in third countries
- insofar as they offer their services in the Union (geographical scope).
The main differences between the five options relate to the scope of the obligations on
providers and the role and form of the EU Centre. Option A would consist of non-legislative,
practical measures to enhance prevention, detection and reporting of online child sexual
abuse, and assistance to victims. These include practical measures to increase the
implementation and efficiency of voluntary measures by providers to detect and report abuse,
and the creation of a European Centre on prevention and assistance to victims in the form of a
coordination hub managed by the Commission.
Option B would establish an explicit legal basis for voluntary detection of online child sexual
abuse, followed by mandatory reporting and removal. In the context of Option B, the EU
Centre would have been tasked with facilitating detection, reporting and removal and would
have become a fundamental component of the legislation, serving as a key safeguard for
service providers as well as a control mechanism to help ensure the effective
implementation of the proposal. After examining several options concerning the form that the
EU Centre could take, the Impact Assessment reached the conclusion that the need for
independence, own resources, visibility, staff and expertise needed to perform the relevant
functions would be best met by setting up the EU Centre as an EU decentralised agency. This
conclusion was confirmed and strengthened in relation to Options C to E, which adopt an
incremental approach, building on one another.
Options C and D, while building on Option B, would impose legal obligations on providers to
detect certain types of online child sexual abuse on their services. Option C would require
providers to detect known child sexual abuse material (CSAM), namely copies of material
that has previously been reliably verified as constituting CSAM. Option D would require
providers to detect not only ‘known’ CSAM (material confirmed to constitute child sexual
abuse material), but also ‘new’ CSAM (material that potentially constitutes child sexual abuse
material, but not (yet) confirmed as such by an authority).
The retained Option (Option E) builds on Option D, and requires providers to also detect
grooming, in addition to known and new CSAM.
The Impact Assessment concluded that Option E is the preferred option for several reasons.
Obligations to detect online child sexual abuse are preferable to dependence on voluntary
actions by providers (Options A and B), not only because those actions to date have proven
insufficient to effectively fight against online child sexual abuse, but also because only
uniform requirements imposed at Union level are suitable to achieve the objective of avoiding
the fragmentation of the Digital Single Market for services. Hence, Options A and B were
discarded.
The level of the impact on the good functioning of the Digital Single Market for services and
on the fight against online child sexual abuse increases progressively in line with the
increasing obligations that would be imposed under each option. While an obligation to detect
known CSAM (Option C) would help to reduce the recirculation of known material, such an
obligation would have only a limited impact in terms of the goal of preventing abuse and
providing assistance to victims of ongoing abuses, given that the material falling within the
scope of such an obligation might have been in circulation for years. An obligation to detect
both known and new CSAM (Option D) would allow for the identification and rescue of
victims from ongoing abuse and it would do so based on uniform criteria established at EU
level, thereby preventing the adoption of divergent national measures on this point.
Mandatory detection also of grooming (Option E) would go further, and provide the greatest
scope for preventing imminent abuse and guaranteeing a level playing field on the Digital
Single Market for services.
Option E was therefore deemed to be the option which best achieves the policy objective in an
effective and proportionate way, all the while ensuring proportionality through the
introduction of rigorous limits and safeguards so as to ensure, in particular, the required fair
balance of fundamental rights. In addition to the positive social impacts described above, the
preferred option is expected to have an economic impact on the affected providers as a result
of costs arising from compliance with their obligations, as well as on law enforcement
authorities and other competent national authorities as a result of the increased volume of
reports of potential online child sexual abuse. These are reduced as much as possible through
the provision of certain support by the EU Centre.
In turn, the establishment of the Centre is also expected to entail one-off and ongoing costs.
Quantitative estimates of the benefits and costs of each of the policy options were assessed in
the Impact Assessment for the purposes of comparing them. The preferred option was found
to lead to the greatest overall benefits, by virtue of the resulting improvement in the
functioning of the Digital Single Market and reduction of the societal costs linked to online
child sexual abuse.
To allow the EU Centre to achieve all of its objectives, it is of key importance that the EU
Centre is established at the same location as its closest partner, Europol. The cooperation
between the EU Centre and Europol will benefit from sharing location, ranging from
improved data exchange possibilities to greater opportunities to create a knowledge hub on
combatting CSAM by attracting specialised staff and/or external experts. This staff will also
have more career opportunities without the need to change location. It would also allow the
EU Centre, while being an independent entity, to rely on the support services of Europol (HR,
IT including cybersecurity, building, communication). Sharing such support services is more cost-efficient and ensures a more professional service than duplicating them from scratch for a relatively small entity such as the EU Centre will be.
The impact assessment analysed in detail the relevant impacts, i.e. social, economic and
fundamental rights. It also considered the impact on competitiveness and SMEs. The
Regulation incorporates some of the measures indicated in the impact assessment in relation
to SMEs. These include notably the need for the competent national authorities to take into
account the size and financial and technological capabilities of the provider when enforcing
the Regulation, including in relation to the risk assessment, detection obligations and
penalties, as well as the possibility for SMEs to request free support from the EU Centre to
conduct the risk assessment.
The impact assessment also considered the consistency with climate law, the ‘do no
significant harm’ principle and the ‘digital-by-default’ principle. The impact assessment also
analysed the application of the principle ‘one in, one out’ whereby each legislative proposal
creating new burdens should relieve people and businesses of an equivalent existing burden at
EU level in the same policy area, as well as the impacts in relation to the UN Sustainable
Development Goals, where SDG 5.2 (eliminate all forms of violence against women and girls)
and SDG 16.2 (end abuse, exploitation, trafficking and all forms of violence against children)
are particularly relevant for this Regulation.
• Fundamental rights
According to Article 52(1) of the Charter, any limitation on the exercise of the rights and
freedoms recognised by the Charter must be provided for by law and respect the essence of
those rights and freedoms. Subject to the principle of proportionality, limitations may be
made only if they are necessary and genuinely meet objectives of general interest recognised
by the Union or the need to protect the rights and freedoms of others.
The proposal aims to harmonise the rules that apply to prevent and combat child sexual abuse,
which is a particularly serious crime23. As such, the proposal pursues an objective of general
23 CSAM is also the only type of illegal content whose mere possession is illegal.
interest within the meaning of Article 52(1) of the Charter24. In addition, the proposal seeks to
protect the rights of others, namely of children. It concerns in particular their fundamental
rights to human dignity and to the integrity of the person, the prohibition of inhuman or
degrading treatment, as well as the rights of the child25. The proposal takes into account the
fact that in all actions relating to children, whether taken by public authorities or private
institutions, the child's best interests must be a primary consideration. Furthermore, the types
of child sexual abuse at issue here – notably, the exchange of photos or videos depicting such
abuse – can also affect the children’s rights to respect for private and family life and to
protection of personal data26. In connection to combating criminal offences against minors,
the Court of Justice of the EU has noted that at least some of the fundamental rights
mentioned can give rise to positive obligations of the relevant public authorities, including the
EU legislature, requiring them to adopt legal measures to protect the rights in question27.
At the same time, the measures contained in the proposal affect, in the first place, the exercise
of the fundamental rights of the users of the services at issue. Those rights include, in
particular, the fundamental rights to respect for privacy (including confidentiality of
communications, as part of the broader right to respect for private and family life), to
protection of personal data and to freedom of expression and information28. Whilst of great
importance, none of these rights is absolute and they must be considered in relation to their
function in society29. As indicated above, Article 52(1) of the Charter allows limitations to be
placed on the exercise of those rights, subject to the conditions set out in that provision.
In addition, the freedom to conduct a business of the providers covered by the proposal
comes into play as well30. Broadly speaking, this fundamental right precludes economic
operators from being made subject to excessive burdens. It includes the freedom to choose
with whom to do business and the freedom of contract. However, this right is not absolute
either; it allows for a broad range of interventions that may limit the exercise of economic
activities in the public interest31. Accordingly, the proposal seeks to achieve the
abovementioned objective of general interest and to protect said fundamental rights of
children, whilst ensuring proportionality and striking a fair balance between the fundamental
rights of all parties involved. To that aim, the proposal contains a range of limits and
safeguards, which are differentiated depending on the nature and level of the limit imposed on
the exercise of the fundamental rights concerned.
Specifically, obliging detection of online child sexual abuse on both ‘public-facing’ and
‘private’ services, including interpersonal communication services, results in varying levels of
intrusiveness in respect of the fundamental rights of users. In the case of material that is
accessible to the public, whilst there is an intrusion, the impact especially on the right to
24 Cf. e.g. CJEU, Digital Rights Ireland, Joined Cases C-293/12 and C-594/12; Joined Cases C-511/18, C-512/18 and C-520/18, para. 42.
25 Art. 1, 3, 4 and 24 of the Charter, respectively.
26 Art. 7 and 8 of the Charter, respectively.
27 See in particular CJEU, La Quadrature du Net, Joined Cases C-511/18, C-512/18 and C-520/18, para. 126.
28 Art. 7, 8 and 11 of the Charter, respectively.
29 Cf. e.g. CJEU, Joined Cases C-511/18, C-512/18 and C-520/18, para. 120.
30 Art. 16 of the Charter.
31 Cf. e.g. CJEU, Sky Österreich, Case C-283/11, para. 45-46.
privacy is generally smaller given the role of these services as ‘virtual public spaces’ for
expression and economic transactions. The impact on the right to privacy in relation to private
communications is greater.
Furthermore, the potential or actual removal of users’ material, in particular erroneous
removal (on the mistaken assumption that it concerns child sexual abuse material), can
potentially have a significant impact on users’ fundamental rights, especially to freedom of
expression and information. At the same time, online child sexual abuse material that is not
detected and left unremoved can have a significant negative impact on the aforementioned
fundamental rights of the children, perpetuating harm for children and for society at large.
Other factors to be taken into account in this regard include the nature of the users’ material in
question (text, photos, videos), the accuracy of the technology concerned, as well as the
‘absolute’ nature of the prohibition to exchange child sexual abuse material (which is in
principle not subject to any exceptions and is not context-sensitive).
As a result of the measures obliging providers to detect and report known and new child
sexual abuse material, the proposal would have a significantly positive impact on the
fundamental rights of victims whose images are circulating on the internet, in particular on
their right to respect for private and family life, right to protection of personal data and the
right to the integrity of the person.
These measures would significantly reduce the violation of victims’ rights inherent in the
circulation of material depicting their abuse. These obligations, in particular the requirement
to detect new child sexual abuse materials and ‘grooming’, would result in the identification
of new victims and create a possibility for their rescue from ongoing abuse, leading to a
significant positive impact on their rights and on society at large. The provision of a clear legal
basis for the mandatory detection and reporting of ‘grooming’ would also positively impact
these rights. Increased and more effective prevention efforts will also reduce the prevalence of
child sexual abuse, supporting the rights of children by preventing them from being
victimised. Measures to support victims in removing their images and videos would safeguard
their rights to protection of private and family life (privacy) and of personal data.
As mentioned, the imposition of obligations on providers would affect their right to freedom
to conduct a business, which can in principle be justified in view of the objective pursued,
having regard also to the role that their services play in connection to the abuse. The impact
on providers’ rights nevertheless needs to be limited to the maximum extent possible to
ensure that it does not go beyond what is strictly necessary. This would be ensured, for
instance, by providing certain forms of support to providers for the implementation of the
obligations imposed, including access to reliable sets of indicators of online child sexual
abuse that in turn provide means to use reliable automated detection technologies, and to free-
of-charge automated detection technologies, reducing the burden on them. In addition,
providers benefit from being subject to a single set of clear and uniform rules.
The processing of users’ personal data for the purposes of detecting, reporting and removing
online child sexual abuse has a significant impact on users’ rights and can be justified only in
view of the importance of preventing and combating online child sexual abuse. As a result,
the decision of whether to engage in these activities in principle cannot be left to the
providers; it rather pertains to the legislator. Nonetheless, any obligations need to be narrowly
targeted both in their personal and material scope and be coupled with adequate safeguards, in
order not to affect the essence of the rights and to be proportionate. This proposal therefore
sets out rules that correspond to these requirements, setting out limits and safeguards that are
differentiated in function of the potential impact on the fundamental rights at stake, increasing
generally speaking depending on the types of services concerned and whether the measures
aim to detect the dissemination of known child sexual abuse material, the dissemination of
new child sexual abuse material or the solicitation of children (‘grooming’).
As mentioned, detecting ‘grooming’ would have a positive impact on the fundamental rights
of potential victims especially by contributing to the prevention of abuse; if swift action is
taken, it may even prevent a child from suffering harm. At the same time, the detection
process is generally speaking the most intrusive one for users (compared to the detection of
the dissemination of known and new child sexual abuse material), since it requires
automatically scanning through texts in interpersonal communications. It is important to bear
in mind in this regard that such scanning is often the only possible way to detect it and that
the technology used does not ‘understand’ the content of the communications but rather looks
for known, pre-identified patterns that indicate potential grooming. Detection technologies
have also already acquired a high degree of accuracy32, although human oversight and review
remain necessary, and indicators of ‘grooming’ are becoming ever more reliable with time, as
the algorithms learn.
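To make the accuracy figure cited in footnote 32 concrete, a minimal worked reading (assuming, as that footnote suggests, that the 88% figure expresses the share of flagged conversations confirmed on human review, i.e. the tool's precision):

$$\text{precision} = \frac{\text{flagged conversations confirmed on review}}{\text{all flagged conversations}} = \frac{88}{100} = 0.88$$

On this reading, of every 100 conversations flagged, on the order of 12 would be excluded on review and not reported. Note that this figure says nothing about recall, i.e. the share of actual solicitation attempts that the tool detects, which the footnote does not quantify.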
Nonetheless, the interferences at stake remain highly sensitive. As a result, while robust limits
and safeguards are already applied to the detection of known child sexual abuse material, they
are more restrictive for new child sexual abuse materials and, especially, for the detection of
‘grooming’. These include adjusted criteria for the imposition of the detection orders, a more
limited period of application of those orders and reinforced reporting requirements during that
period. In addition, the proposal also sets out strong oversight mechanisms, which include
requirements regarding the independence and powers of the national authorities charged with
issuing the orders and overseeing their execution, as well as an assisting and advising role for
the EU Centre. The EU Centre also contributes by making available not only accurate and
reliable indicators, but also suitable technologies to providers, and by assessing reports of
potential online child sexual abuse made by providers. In this manner, the EU Centre helps to
minimise the risk of erroneous detection and reporting. In addition, various measures are
taken to ensure effective redress for both providers and users.
Whilst different in nature and generally speaking less intrusive, the newly created power to
issue removal orders in respect of known child sexual abuse material certainly also affects
fundamental rights, most notably those of the users concerned relating to freedom of
expression and information. In this respect, too, a set of limits and safeguards is provided for,
ranging from setting clear and standardised rules to ensuring redress and from guaranteeing
the issuing authorities’ independence to transparency and effective oversight.
32 For example, Microsoft reports that the accuracy of its grooming detection tool is 88%, meaning that
out of 100 conversations flagged as possible criminal solicitation of children, 12 can be excluded upon
review and will not be reported to law enforcement; see annex 8 of the Impact Assessment.
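For illustration, the footnote's figure can be translated into an expected review burden; the flag volume in this small calculation is hypothetical.

```python
# Illustrative arithmetic only: converts the footnote's 88% figure into
# expected review outcomes for a hypothetical volume of flagged conversations.
precision = 0.88           # share of flags confirmed on review (footnote 32)
flagged = 10_000           # hypothetical number of flagged conversations

confirmed = flagged * precision          # expected confirmed reports
excluded = flagged * (1 - precision)     # expected flags excluded on review

print(f"confirmed: {confirmed:.0f}, excluded on review: {excluded:.0f}")
# confirmed: 8800, excluded on review: 1200
```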
All references in the proposed Regulation to fundamental rights are to be understood as
referring solely to the fundamental rights recognised under EU law, that is, those enshrined in
the Charter and those recognised as general principles of EU law33.
4. BUDGETARY IMPLICATIONS
The budgetary impact of the proposal will be covered by the allocations foreseen in the Multi-
annual Financial Framework (MFF) 2021-27 under the financial envelopes of the Internal
Security Fund as detailed in the legislative financial statement accompanying this proposal for
a regulation, to the extent that it falls within the current budgetary perspective. These
implications also require reprogramming of Heading 7 of the Financial Perspective.
The legislative financial statement accompanying this proposal for a Regulation covers the
budgetary impacts for the Regulation itself.
5. OTHER ELEMENTS
• Implementation plans and monitoring, evaluation and reporting arrangements
The programme for monitoring the outputs, results and impacts of the proposed Regulation is
set out in its Article 83 and outlined in more detail in the Impact Assessment. The programme
sets out various indicators used to monitor the achievement of operational objectives and the
implementation of the Regulation.
The Commission will carry out an evaluation and submit a report to the European Parliament
and the Council at the latest five years after the entry into force of the Regulation, and every
six years thereafter. Based on the findings of the report, in particular on whether the
Regulation leaves any gaps which are relevant in practice, and taking into account
technological developments, the Commission will assess the need to adapt the scope of the
Regulation. If necessary, the Commission will submit proposals to adapt the Regulation.
• Detailed explanation of the specific provisions of the proposal
The proposed Regulation consists of two main building blocks: first, it imposes on providers
obligations concerning the detection, reporting, removal and blocking of known and new
child sexual abuse material, as well as solicitation of children, regardless of the technology
used in the online exchanges, and, second, it establishes the EU Centre on Child Sexual
Abuse as a decentralised agency to enable the implementation of the new Regulation.
33 See Art. 6 Treaty on European Union (TEU).
Chapter I sets out general provisions, including the subject matter and scope of the
Regulation (Article 1) and the definitions of key terms used in the Regulation (Article 2). The
reference to ‘child sexual abuse material’ builds on the relevant terms as defined in the Child
Sexual Abuse Directive, namely, child pornography and pornographic performance, and aims
to encompass all of the material covered therein insofar as such material can be disseminated
through the services in question (in practice, typically in the form of video and pictures). The
definition is in line with the one contained in the interim Regulation. The same holds true in
respect of the definition of ‘solicitation of children’ and ‘online child sexual abuse’. For the
definition of several other terms, the proposal relies on definitions contained in other acts of
EU law or proposals for such acts, in particular the European Electronic Communications Code (EECC)34
and the DSA proposal.
Chapter II establishes uniform obligations, applicable to all providers of hosting or
interpersonal communication services offering such services in the EU’s digital single market,
to perform an assessment of risks of misuse of their services for the dissemination of known
or new child sexual abuse material or for the solicitation of children (together defined as
‘online child sexual abuse’). It also includes targeted obligations for certain providers to
detect such abuse, to report it via the EU Centre, to remove or disable access to, or to block
online child sexual abuse material when so ordered.
Section 1 creates the aforementioned risk assessment obligations for hosting or interpersonal
communication service providers (Article 3). It also requires providers to adopt tailored and
proportionate measures to mitigate the risks identified (Article 4) and to report on the
outcome of the risk assessment and on the mitigation measures adopted to the Coordinating
Authorities designated by the Member States (Article 5). Finally, it imposes targeted
obligations on software application stores to assess whether any application that they
intermediate is at risk of being used for the purpose of solicitation and, if this is the case and
the risk is significant, take reasonable measures to identify child users and prevent them from
accessing it (Article 6).
Section 2 empowers Coordinating Authorities which have become aware – through a risk
assessment or other means – of evidence that a specific hosting or interpersonal
communications service is at a significant risk of being misused for the purpose of online
child sexual abuse to ask the competent judicial or independent administrative authority to
issue an order obliging the provider concerned to detect the type of online child sexual abuse
at issue on the relevant service (Articles 7 and 8). It contains a set of complementary
measures, such as those ensuring that providers have a right to challenge orders received
(Article 9). The section also establishes requirements and safeguards to ensure that detection
is carried out effectively and, at the same time, in a balanced and proportionate manner
(Article 10). Finally, it attributes to the Commission the power to adopt guidelines on the
application of Articles 7 to 10 (Article 11).
Section 3 obliges providers of hosting or interpersonal communication services that have
become aware, irrespective of the manner in which they have become aware, of any instance
of potential online child sexual abuse on their services provided in the Union to report it
34 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018
establishing the European Electronic Communications Code.
immediately to the EU Centre (Article 12) and specifies the requirements that the relevant
report has to fulfil (Article 13).
Section 4 empowers Coordinating Authorities to request the competent judicial or
independent administrative authority to issue an order obliging a hosting service provider to
remove child sexual abuse material on its services or to disable access to it in all Member
States, specifying the requirements that the order has to fulfil (Article 14). Where providers
detect online child sexual abuse, they are under no obligation under EU law to remove such
material. Nonetheless, given the manifestly illegal nature of most online child sexual abuse
and the risk of losing the liability exemption contained in the e-Commerce Directive and the
DSA proposal, providers will regularly choose to remove it (or to disable access thereto).
Where a provider does not remove online child sexual abuse material of its own motion, the
Coordinating Authorities can compel removal by issuing an order to that effect. The article
also requires providers of hosting services that have received such an order to inform the user
who provided the material, subject to exceptions to prevent interfering with activities for the
prevention, detection, investigation and prosecution of child sexual abuse offences. Other
measures, such as redress, are also regulated (Article 15). The rules contained in this section
have been inspired by those contained in the Terrorist Content Online Regulation (Regulation
2021/784).
Section 5 empowers Coordinating Authorities to request the competent judicial or
independent administrative authority to issue an order obliging a provider of internet access
services to disable access to uniform resource locators indicating specific items of child
sexual abuse material that cannot reasonably be removed at source (Articles 16 and 17). Article
18 ensures inter alia that providers that received such a blocking order have a right to
challenge it and that users’ redress is ensured as well, including through requests for re-
assessment by the Coordinating Authorities. These Articles, in combination with the
provisions on reliable identification of child sexual abuse material (Article 36) and data
quality (Article 46), set out conditions and safeguards for such orders, ensuring that they are
effective as well as balanced and proportionate.
Section 6 lays out an exemption from liability for child sexual abuse offences for providers of
relevant information society services carrying out activities to comply with this Regulation
(Article 19). This principally aims to prevent the risk of being held liable under national
criminal law for conduct required under this Regulation.
Section 6 also creates specific rights for victims, whose child sexual abuse images and videos
may be circulating online long after the physical abuse has ended. Article 20 gives victims of
child sexual abuse a right to receive from the EU Centre, via the Coordinating Authority of
their place of residence, information on reports of known child sexual abuse material
depicting them. Article 21 sets out a right for victims to seek assistance from providers of
hosting services concerned or, via the Coordinating Authority of their place of residence, the
support of the EU Centre, when they seek to obtain the removal or disabling of access to such
material.
This Section also exhaustively lists the purposes for which providers of hosting or
interpersonal communication services are to preserve content data and other data processed in
connection to the measures taken to comply with this Regulation and the personal data
generated through such processing, setting out a series of safeguards and guarantees,
including a maximum period of preservation of 12 months (Article 22).
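As an illustration of how a provider might operationalise the 12-month cap of Article 22, the following is a minimal sketch; the record structure and function names are hypothetical, and the only figure taken from the text is the maximum preservation period.

```python
# Minimal sketch of enforcing a maximum preservation period. The record
# store is hypothetical; Article 22's 12-month cap is the only element
# taken from the text. Records are assumed to carry a timezone-aware
# "preserved_at" timestamp.
from datetime import datetime, timedelta, timezone

MAX_PRESERVATION = timedelta(days=365)  # 12-month maximum (Article 22)

def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only records still within the preservation window."""
    now = datetime.now(timezone.utc)
    return [r for r in records if now - r["preserved_at"] <= MAX_PRESERVATION]
```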
Finally, it lays out the obligation for providers of relevant information society services to
establish a single point of contact to facilitate direct communication with the relevant public
authorities (Article 23), as well as the obligation for such providers not established in any
Member State, but offering their services in the EU, to designate a legal representative in the
EU, so as to facilitate enforcement (Article 24).
Chapter III contains provisions concerning the implementation and enforcement of this
Regulation. Section 1 lays down provisions concerning national competent authorities, in
particular Coordinating Authorities, which are the primary national authorities designated by
the Member States for the consistent application of this Regulation (Article 25). Coordinating
Authorities, like other designated competent authorities, are to be independent in all respects,
akin to a court, and are to perform their tasks impartially, transparently and in a timely
manner (Article 26).
Section 2 attributes specific investigatory and enforcement powers to Coordinating
Authorities in relation to providers of relevant information society services under the
jurisdiction of the Member State that designated the Coordinating Authorities (Articles 27 to
30). These provisions have mostly been inspired by the provisions in the DSA proposal. This
section also provides for the power to monitor compliance with this Regulation by conducting
searches of child sexual abuse material (Article 31) and to submit notices to providers of
hosting services to flag the presence of known child sexual abuse material on their services
(Article 32).
Section 3 includes further provisions on enforcement and penalties, by establishing that
Member States of the main establishment of the provider of relevant information society
services (or of its legal representative) have jurisdiction to apply and enforce this Regulation
(Article 33). It also ensures that Coordinating Authorities can receive complaints against such
providers for alleged breaches of their obligations laid down in this Regulation (Article 34). In
addition, Member States are to lay down rules on penalties applicable to breaches of those
obligations (Article 35).
Section 4 contains provisions on cooperation among Coordinating Authorities at EU level. It
sets out rules on the assessment of material or conversations so as to confirm that they
constitute online child sexual abuse, which is a task reserved for Coordinating Authorities,
other national independent administrative authorities or national courts, as well as for the
submission of the outcomes thereof to the EU Centre for the generation of indicators or,
where it concerns uniform resource locators, inclusion in the relevant list (Article 36). It also
contains rules for cross-border cooperation among Coordinating Authorities (Article 37) and
provides for the possibility that they undertake joint investigations, where relevant with the
support of the EU Centre (Article 38). These provisions have also been inspired by the DSA
proposal. Finally, this section provides for general rules on cooperation at EU level and on a
reliable and secure information-sharing system to support communication among the relevant
parties (Article 39).
Chapter IV concerns the EU Centre. Its provisions have been based on the Common
Approach of the European Parliament, the Council and the Commission on decentralised
agencies.
Section 1 establishes the EU Centre on Child Sexual Abuse (EUCSA) as a decentralised EU
Centre (Article 40) and regulates the EU Centre’s legal status and its seat (Articles 41 and 42).
To allow the Centre to achieve all of its objectives, it is of key importance that the EU Centre
is established at the same location as its closest partner, Europol. The cooperation between the EU
Centre and Europol will benefit from the shared location, with advantages ranging from improved data exchange
possibilities to greater opportunities to create a knowledge hub on child sexual abuse by
attracting specialised staff and/or external experts. This staff will also have more career
opportunities without the need to change location. It would also allow the EU Centre, while
being an independent entity, to rely on the support services of Europol (HR, IT including
cybersecurity, communication). Sharing such support services is more cost efficient and
ensures a more professional service than duplicating them by creating them from scratch for a
relatively small entity such as the EU Centre will be.
Section 2 specifies the tasks of the EU Centre under this Regulation. Those include support to
Coordinating Authorities, facilitation of the risk assessment, detection, reporting, removal and
blocking processes, and facilitating the generation and sharing of knowledge and expertise
(Article 43). The EU Centre is mandated to create and maintain databases of indicators of
online child sexual abuse (Article 44) and of reports (Article 45) and to grant relevant parties
such access to the databases of indicators as required, respecting the conditions and
safeguards specified (Article 46). The section also empowers the Commission to adopt
delegated acts supplementing this Regulation in relation to those databases (Article 47).
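Detection of known material against a database of indicators is, in practice, typically implemented as hash matching. The following is a minimal sketch assuming an exact SHA-256 hash list; deployed systems generally rely on perceptual hashes (such as PhotoDNA) so that re-encoded copies still match, and the indicator set and function names below are hypothetical.

```python
# Minimal sketch of indicator-based detection of known material, assuming
# an exact SHA-256 hash list. Real deployments typically use perceptual
# hashing so that resized or re-encoded copies still match; all names
# here are hypothetical.
import hashlib

KNOWN_INDICATORS: set[str] = set()  # hashes distributed from a central database

def register_indicator(digest_hex: str) -> None:
    """Add one verified indicator (hex-encoded hash) to the local set."""
    KNOWN_INDICATORS.add(digest_hex)

def is_known_material(content: bytes) -> bool:
    """Return True when the upload's hash matches a known indicator."""
    return hashlib.sha256(content).hexdigest() in KNOWN_INDICATORS
```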
In addition, this section clarifies that the EU Centre is intended to act as a dedicated reporting
channel for the entire EU, receiving reports on potential online child sexual abuse from all
providers of hosting or interpersonal communication services issued under this Regulation,
assessing them to determine whether reports may be manifestly unfounded, and forwarding
the reports that are not manifestly unfounded to Europol and competent law enforcement
authorities of the Member States (Article 48). Finally, this section establishes that, to facilitate
the monitoring of compliance with this Regulation, the EU Centre may under certain
circumstances conduct online searches for child sexual abuse material or notify such material
to the providers of hosting services concerned requesting removal or disabling of access, for
their voluntary consideration (Article 49). The EU Centre is also mandated to make available
relevant technologies for the execution of detection orders and to act as an information and
expertise hub, collecting information, conducting and supporting research and information-
sharing in the area of online child sexual abuse (Article 50).
Section 3 allows the EU Centre to process personal data for the purposes of this Regulation in
compliance with the rules on the processing of such data set by this Regulation and by other
acts of EU law on this subject-matter (Article 51).
Section 4 establishes channels of cooperation linking the EU Centre to the Coordinating
Authorities, through the designation of national contact officers (Article 52); to Europol
(Article 53); and to possible partner organisations, such as the INHOPE network of hotlines
for reporting child sexual abuse material (Article 54).
Section 5 sets out the administrative and management structure of the EU Centre (Article 55),
establishing the composition, structure, tasks, meeting frequency and voting rules of its
Management Board (Articles 56 to 60); the composition, appointment procedure, tasks and
voting rules of its Executive Board (Articles 61 to 63); as well as the appointment procedure
and tasks of its Executive Director (Articles 64 and 65). In light of the technical nature and
fast-paced evolution of the technologies used by providers of relevant information society
services and to support the EU Centre’s involvement in the monitoring and implementation of
this Regulation in this regard, this section establishes a Technology Committee within the EU
Centre, composed of technical experts and performing an advisory function (Article 66).
Section 6 provides for the establishment and structure of the budget (Article 67), the financial
rules applicable to the EU Centre (Article 68), the rules for the presentation, implementation
and control of the EU Centre’s budget (Article 69), as well as presentation of accounts and
discharge (Article 70).
Sections 7 and 8 contain closing provisions on composition and status of the EU Centre’s
staff, language arrangements, transparency and communications concerning its activities,
measures to combat fraud, contractual and non-contractual liability, possibility for
administrative inquiries, headquarters agreement and operating conditions, as well as the start
of the EU Centre’s activities (Articles 71 to 82).
Chapter V sets out data collection and transparency reporting obligations. It requires the EU
Centre, Coordinating Authorities and providers of hosting, interpersonal communications and
internet access services to collect aggregated data relating to their activities under this
Regulation and make the relevant information available to the EU Centre (Article 83), as well
as to report annually on their activities to the general public and the Commission (Article 84).
Chapter VI contains the final provisions of this Regulation. Those relate to the periodic
evaluation of this Regulation and of the activities of the EU Centre (Article 85); to the
adoption of delegated and implementing acts in accordance with Articles 290 and 291 TFEU,
respectively (Articles 86 and 87); to the repeal of the interim Regulation (Regulation
2021/1232) (Article 88) and finally to the entry into force and application of this Regulation
(Article 89).
2022/0155 (COD)
Proposal for a
REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
laying down rules to prevent and combat child sexual abuse
(Text with EEA relevance)
THE EUROPEAN PARLIAMENT AND THE COUNCIL OF THE EUROPEAN UNION,
Having regard to the Treaty on the Functioning of the European Union, and in particular
Article 114 thereof,
Having regard to the proposal from the European Commission,
After transmission of the draft legislative act to the national parliaments,
Having regard to the opinion of the European Economic and Social Committee35,
Having regard to the opinion of the Committee of the Regions36,
Having regard to the opinion of the European Data Protection Board and the European Data
Protection Supervisor37,
Acting in accordance with the ordinary legislative procedure,
Whereas:
(1) Information society services have become very important for communication,
expression, gathering of information and many other aspects of present-day life,
including for children but also for perpetrators of child sexual abuse offences. Such
offences, which are subject to minimum rules set at Union level, are very serious
criminal offences that need to be prevented and combated effectively in order to
protect children’s rights and well-being, as is required under the Charter of
Fundamental Rights of the European Union (‘Charter’), and to protect society at large.
Users of such services offered in the Union should be able to trust that the services
concerned can be used safely, especially by children.
(2) Given the central importance of relevant information society services, those aims can
only be achieved by ensuring that providers offering such services in the Union
behave responsibly and take reasonable measures to minimise the risk of their services
35 OJ C , , p. .
36 OJ C , , p. .
37 OJ C , , p. .
being misused for the purpose of child sexual abuse, those providers often being the
only ones in a position to prevent and combat such abuse. The measures taken should
be targeted, carefully balanced and proportionate, so as to avoid any undue negative
consequences for those who use the services for lawful purposes, in particular for the
exercise of their fundamental rights protected under Union law, that is, those enshrined
in the Charter and recognised as general principles of Union law, and so as to avoid
imposing any excessive burdens on the providers of the services.
(3) Member States are increasingly introducing, or are considering introducing, national
laws to prevent and combat online child sexual abuse, in particular by imposing
requirements on providers of relevant information society services. In the light of the
inherently cross-border nature of the internet and the service provision concerned,
those national laws, which diverge, have a direct negative effect on the internal
market. To increase legal certainty, eliminate the resulting obstacles to the provision of
the services and ensure a level playing field in the internal market, the necessary
harmonised requirements should be laid down at Union level.
(4) Therefore, this Regulation should contribute to the proper functioning of the internal
market by setting out clear, uniform and balanced rules to prevent and combat child
sexual abuse in a manner that is effective and that respects the fundamental rights of
all parties concerned. In view of the fast-changing nature of the services concerned
and the technologies used to provide them, those rules should be laid down in a
technology-neutral and future-proof manner, so as not to hamper innovation.
(5) In order to achieve the objectives of this Regulation, it should cover providers of
services that have the potential to be misused for the purpose of online child sexual
abuse. As they are increasingly misused for that purpose, those services should include
publicly available interpersonal communications services, such as messaging services
and web-based e-mail services, in so far as those services are publicly available. As
services which enable direct interpersonal and interactive exchange of information
merely as a minor ancillary feature that is intrinsically linked to another service, such
as chat and similar functions as part of gaming, image-sharing and video-hosting are
equally at risk of misuse, they should also be covered by this Regulation. However,
given the inherent differences between the various relevant information society
services covered by this Regulation and the related varying risks that those services
are misused for the purpose of online child sexual abuse and varying ability of the
providers concerned to prevent and combat such abuse, the obligations imposed on the
providers of those services should be differentiated in an appropriate manner.
(6) Online child sexual abuse frequently involves the misuse of information society
services offered in the Union by providers established in third countries. In order to
ensure the effectiveness of the rules laid down in this Regulation and a level playing
field within the internal market, those rules should apply to all providers, irrespective
of their place of establishment or residence, that offer services in the Union, as
evidenced by a substantial connection to the Union.
(7) This Regulation should be without prejudice to the rules resulting from other Union
acts, in particular Directive 2011/93 of the European Parliament and of the Council38,
Directive 2000/31/EC of the European Parliament and of the Council39 and Regulation
(EU) …/… of the European Parliament and of the Council40 [on a Single Market For
Digital Services (Digital Services Act) and amending Directive 2000/31/EC],
Directive 2010/13/EU of the European Parliament and of the Council 41, Regulation
(EU) 2016/679 of the European Parliament and of the Council42, and Directive
2002/58/EC of the European Parliament and of the Council43.
(8) This Regulation should be considered lex specialis in relation to the generally
applicable framework set out in Regulation (EU) …/… [on a Single Market For
Digital Services (Digital Services Act) and amending Directive 2000/31/EC] laying
down harmonised rules on the provision of certain information society services in the
internal market. The rules set out in Regulation (EU) …/… [on a Single Market For
Digital Services (Digital Services Act) and amending Directive 2000/31/EC] apply in
respect of issues that are not or not fully addressed by this Regulation.
(9) Article 15(1) of Directive 2002/58/EC allows Member States to adopt legislative
measures to restrict the scope of the rights and obligations provided for in certain
specific provisions of that Directive relating to the confidentiality of communications
when such restriction constitutes a necessary, appropriate and proportionate measure
within a democratic society, inter alia, to prevent, investigate, detect and prosecute
criminal offences, provided certain conditions are met, including compliance with the
Charter. Applying the requirements of that provision by analogy, this Regulation
should limit the exercise of the rights and obligations provided for in Articles 5(1), (3)
and 6(1) of Directive 2002/58/EC, insofar as strictly necessary to execute detection
orders issued in accordance with this Regulation with a view to preventing and combating
online child sexual abuse.
(10) In the interest of clarity and consistency, the definitions provided for in this Regulation
should, where possible and appropriate, be based on and aligned with the relevant
definitions contained in other acts of Union law, such as Regulation (EU) …/… [on a
Single Market For Digital Services (Digital Services Act) and amending Directive
2000/31/EC].
38 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
39 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
40 Regulation (EU) …/… of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (OJ L ….).
41 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (OJ L 95, 15.4.2010, p. 1).
42 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (OJ L 119, 4.5.2016, p. 1).
43 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (‘Directive on privacy and electronic communications’) (OJ L 201, 31.7.2002, p. 37).
(11) A substantial connection to the Union should be considered to exist where the relevant
information society service has an establishment in the Union or, in its absence, on
the basis of the existence of a significant number of users in one or more Member
States, or the targeting of activities towards one or more Member States. The targeting
of activities towards one or more Member States should be determined on the basis of
all relevant circumstances, including factors such as the use of a language or a
currency generally used in that Member State, or the possibility of ordering products
or services, or using a national top level domain. The targeting of activities towards a
Member State could also be derived from the availability of a software application in
the relevant national software application store, from the provision of local advertising
or advertising in the language used in that Member State, or from the handling of
customer relations such as by providing customer service in the language generally
used in that Member State. A substantial connection should also be assumed where a
service provider directs its activities to one or more Member States as set out in Article
17(1), point (c), of Regulation (EU) 1215/2012 of the European Parliament and of the
Council44. Mere technical accessibility of a website from the Union should not, alone,
be considered as establishing a substantial connection to the Union.
(12) For reasons of consistency and technological neutrality, the term ‘child sexual abuse
material’ should for the purpose of this Regulation be defined as referring to any type
of material constituting child pornography or pornographic performance within the
meaning of Directive 2011/93/EU, which is capable of being disseminated through the
use of hosting or interpersonal communication services. At present, such material
typically consists of images or videos, without it however being excluded that it takes
other forms, especially in view of future technological developments.
(13) The term ‘online child sexual abuse’ should cover not only the dissemination of
material previously detected and confirmed as constituting child sexual abuse material
(‘known’ material), but also of material not previously detected that is likely to
constitute child sexual abuse material but that has not yet been confirmed as such
(‘new’ material), as well as activities constituting the solicitation of children
(‘grooming’). That is needed in order to address not only past abuse, the re-
victimisation and violation of the victims’ rights it entails, such as those to privacy and
protection of personal data, but to also address recent, ongoing and imminent abuse, so
as to prevent it as much as possible, to effectively protect children and to increase the
likelihood of rescuing victims and stopping perpetrators.
(14) With a view to minimising the risk that their services are misused for the
dissemination of known or new child sexual abuse material or the solicitation of
children, providers of hosting services and providers of publicly available
interpersonal communications services should assess such risk for each of the services
that they offer in the Union. To guide their risk assessment, a non-exhaustive list of
elements to be taken into account should be provided. To allow for a full consideration
of the specific characteristics of the services they offer, providers should be allowed to
take account of additional elements where relevant. As risks evolve over time, depending
on developments such as those related to technology and the manners in
44 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on
jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L
351, 20.12.2012, p. 1).
which the services in question are offered and used, it is appropriate to ensure that the
risk assessment is updated regularly and when needed for particular reasons.
(15) Some of those providers of relevant information society services in scope of this
Regulation may also be subject to an obligation to conduct a risk assessment under
Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act)
and amending Directive 2000/31/EC] with respect to information that they store and
disseminate to the public. For the purposes of the present Regulation, those providers
may draw on such a risk assessment and complement it with a more specific
assessment of the risks of use of their services for the purpose of online child sexual
abuse, as required by this Regulation.
(16) In order to prevent and combat online child sexual abuse effectively, providers of
hosting services and providers of publicly available interpersonal communications
services should take reasonable measures to mitigate the risk of their services being
misused for such abuse, as identified through the risk assessment. Providers subject to
an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a
Single Market For Digital Services (Digital Services Act) and amending Directive
2000/31/EC] may consider to which extent mitigation measures adopted to comply
with that obligation, which may include targeted measures to protect the rights of the
child, including age verification and parental control tools, may also serve to address
the risk identified in the specific risk assessment pursuant to this Regulation, and to
which extent further targeted mitigation measures may be required to comply with this
Regulation.
(17) To allow for innovation and ensure proportionality and technological neutrality, no
exhaustive list of the compulsory mitigation measures should be established. Instead,
providers should be left a degree of flexibility to design and implement measures
tailored to the risk identified and the characteristics of the services they provide and
the manners in which those services are used. In particular, providers are free to design
and implement, in accordance with Union law, measures based on their existing
practices to detect online child sexual abuse in their services and indicate as part of the
risk reporting their willingness and preparedness to eventually be issued a detection
order under this Regulation, if deemed necessary by the competent national authority.
(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility
should be subject to the need to comply with Union law and, in particular, the
requirements of this Regulation on mitigation measures. Therefore, providers of
hosting services and providers of publicly available interpersonal communications
services should, when designing and implementing the mitigation measures, give
importance not only to ensuring their effectiveness, but also to avoiding any undue
negative consequences for other affected parties, notably for the exercise of users’
fundamental rights. In order to ensure proportionality, when determining which
mitigation measures should reasonably be taken in a given situation, account should
also be taken of the financial and technological capabilities and the size of the provider
concerned. When selecting appropriate mitigation measures, providers should at least
duly consider the possible measures listed in this Regulation, as well as, where
appropriate, other measures such as those based on industry best practices, including
as established through self-regulatory cooperation, and those contained in guidelines
from the Commission. When no risk has been detected after a diligently conducted or
updated risk assessment, providers should not be required to take any mitigation
measures.
(19) In the light of their role as intermediaries facilitating access to software applications
that may be misused for online child sexual abuse, providers of software application
stores should be made subject to obligations to take certain reasonable measures to
assess and mitigate that risk. The providers should make that assessment in a diligent
manner, making efforts that are reasonable under the given circumstances, having
regard inter alia to the nature and extent of that risk as well as their financial and
technological capabilities and size, and cooperating with the providers of the services
offered through the software application where possible.
(20) With a view to ensuring effective prevention and fight against online child sexual
abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of
a certain service for the purpose of online child sexual abuse, the Coordinating
Authorities designated by Member States under this Regulation should be empowered
to request the issuance of detection orders. In order to avoid any undue interference
with fundamental rights and to ensure proportionality, that power should be subject to
a carefully balanced set of limits and safeguards. For instance, considering that child
sexual abuse material tends to be disseminated through hosting services and publicly
available interpersonal communications services, and that solicitation of children
mostly takes place in publicly available interpersonal communications services, it
should only be possible to address detection orders to providers of such services.
(21) Furthermore, as parts of those limits and safeguards, detection orders should only be
issued after a diligent and objective assessment leading to the finding of a significant
risk of the specific service concerned being misused for a given type of online child
sexual abuse covered by this Regulation. One of the elements to be taken into account
in this regard is the likelihood that the service is used to an appreciable extent, that is,
beyond isolated and relatively rare instances, for such abuse. The criteria should vary
so as to take account of the different characteristics of the various types of online child
sexual abuse at stake and of the different characteristics of the services used to engage
in such abuse, as well as the related different degree of intrusiveness of the measures
to be taken to execute the detection order.
(22) However, the finding of such a significant risk should in itself be insufficient to justify
the issuance of a detection order, given that in such a case the order might lead to
disproportionate negative consequences for the rights and legitimate interests of other
affected parties, in particular for the exercise of users’ fundamental rights. Therefore,
it should be ensured that detection orders can be issued only after the Coordinating
Authorities and the competent judicial authority or independent administrative
authority have objectively and diligently assessed, identified and weighed, on a
case-by-case basis, not only the likelihood and seriousness of the potential
consequences of the service being misused for the type of online child sexual abuse at
issue, but also the likelihood and seriousness of any potential negative consequences
for other parties affected. With a view to avoiding the imposition of excessive
burdens, the assessment should also take account of the financial and technological
capabilities and size of the provider concerned.
(23) In addition, to avoid undue interference with fundamental rights and ensure
proportionality, when it is established that those requirements have been met and a
detection order is to be issued, it should still be ensured that the detection order is
targeted and specified so as to ensure that any such negative consequences for affected
parties do not go beyond what is strictly necessary to effectively address the
significant risk identified. This should concern, in particular, a limitation to an
identifiable part or component of the service where possible without prejudice to the
effectiveness of the measure, such as specific types of channels of a publicly available
interpersonal communications service, or to specific users or specific groups of users,
to the extent that they can be taken in isolation for the purpose of detection, as well as
the specification of the safeguards additional to the ones already expressly specified in
this Regulation, such as independent auditing, the provision of additional information
or access to data, or reinforced human oversight and review, and the further limitation
of the duration of application of the detection order that the Coordinating Authority
deems necessary. To avoid unreasonable or disproportionate outcomes, such
requirements should be set after an objective and diligent assessment conducted on a
case-by-case basis.
(24) The competent judicial authority or the competent independent administrative
authority, as applicable in accordance with the detailed procedural rules set by the
relevant Member State, should be in a position to take a well-informed decision on
requests for the issuance of detection orders. That is of particular importance to
ensure the necessary fair balance of the fundamental rights at stake and a consistent
approach, especially in connection to detection orders concerning the solicitation of
children. Therefore, a procedure should be provided for that allows the providers
concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU
Centre’) and, where so provided in this Regulation, the competent data protection
authority designated under Regulation (EU) 2016/679 to provide their views on the
measures in question. They should do so as soon as possible, having regard to the
important public policy objective at stake and the need to act without undue delay to
protect children. In particular, data protection authorities should do their utmost to
avoid extending the time period set out in Regulation (EU) 2016/679 for providing
their opinions in response to a prior consultation. Furthermore, they should normally
be able to provide their opinion well within that time period in situations where the
European Data Protection Board has already issued guidelines regarding the
technologies that a provider envisages deploying and operating to execute a detection
order addressed to it under this Regulation.
(25) Where new services are concerned, that is, services not previously offered in the
Union, the evidence available on the potential misuse of the service in the last 12
months is normally non-existent. Taking this into account, and to ensure the
effectiveness of this Regulation, the Coordinating Authority should be able to draw on
evidence stemming from comparable services when assessing whether to request the
issuance of a detection order in respect of such a new service. A service should be
considered comparable where it provides a functional equivalent to the service in
question, having regard to all relevant facts and circumstances, in particular its main
characteristics and functionalities, the manner in which it is offered and used, the user
base, the applicable terms and conditions and risk mitigation measures, as well as the
overall remaining risk profile.
(26) The measures taken by providers of hosting services and providers of publicly
available interpersonal communications services to execute detection orders addressed
to them should remain strictly limited to what is specified in this Regulation and in the
detection orders issued in accordance with this Regulation. In order to ensure the
effectiveness of those measures, allow for tailored solutions, remain technologically
neutral, and avoid circumvention of the detection obligations, those measures should
be taken regardless of the technologies used by the providers concerned in connection
to the provision of their services. Therefore, this Regulation leaves to the provider
concerned the choice of the technologies to be operated to comply effectively with
detection orders and should not be understood as incentivising or disincentivising the
use of any given technology, provided that the technologies and accompanying
measures meet the requirements of this Regulation. That includes the use of end-to-
end encryption technology, which is an important tool to guarantee the security and
confidentiality of the communications of users, including those of children. When
executing the detection order, providers should take all available safeguard measures
to ensure that the technologies employed by them cannot be used by them or their
employees for purposes other than compliance with this Regulation, nor by third
parties, and thus to avoid undermining the security and confidentiality of the
communications of users.
(27) In order to facilitate the providers’ compliance with the detection obligations, the EU
Centre should make available to providers detection technologies that they may
choose to use, on a free-of-charge basis, for the sole purpose of executing the detection
orders addressed to them. The European Data Protection Board should be consulted on
those technologies and the ways in which they should be best deployed to ensure
compliance with applicable rules of Union law on the protection of personal data. The
advice of the European Data Protection Board should be taken into account by the EU
Centre when compiling the lists of available technologies and also by the Commission
when preparing guidelines regarding the application of the detection obligations. The
providers may operate the technologies made available by the EU Centre or by others
or technologies that they developed themselves, as long as they meet the requirements
of this Regulation.
(28) With a view to constantly assessing the performance of the detection technologies and
ensuring that they are sufficiently reliable, as well as to identifying false positives and
avoiding, to the extent possible, erroneous reporting to the EU Centre, providers should ensure human
oversight and, where necessary, human intervention, adapted to the type of detection
technologies and the type of online child sexual abuse at issue. Such oversight should
include regular assessment of the rates of false negatives and positives generated by
the technologies, based on an analysis of anonymised representative data samples. In
particular where the detection of the solicitation of children in publicly available
interpersonal communications is concerned, service providers should ensure regular,
specific and detailed human oversight and human verification of conversations
identified by the technologies as involving potential solicitation of children.
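A minimal sketch of the oversight metric described in this recital: false-positive and false-negative rates computed over an anonymised, human-labelled sample. The sample format below is hypothetical.

```python
# Minimal sketch of oversight metrics: false-positive and false-negative
# rates over an anonymised, human-labelled sample. Each entry pairs the
# tool's decision with the reviewer's verdict; the format is hypothetical.
def error_rates(sample: list[tuple[bool, bool]]) -> tuple[float, float]:
    """sample: (flagged_by_tool, confirmed_by_reviewer) pairs."""
    fp = sum(1 for flagged, truth in sample if flagged and not truth)
    fn = sum(1 for flagged, truth in sample if not flagged and truth)
    negatives = sum(1 for _, truth in sample if not truth) or 1
    positives = sum(1 for _, truth in sample if truth) or 1
    return fp / negatives, fn / positives

# Tiny hypothetical sample: one true positive, one false positive,
# one false negative, one true negative.
fp_rate, fn_rate = error_rates([(True, True), (True, False),
                                (False, True), (False, False)])
print(f"FP rate: {fp_rate:.2f}, FN rate: {fn_rate:.2f}")  # 0.50, 0.50
```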
(29) Providers of hosting services and providers of publicly available interpersonal
communications services are uniquely positioned to detect potential online child
sexual abuse involving their services. The information that they may obtain when
offering their services is often indispensable to effectively investigate and prosecute
child sexual abuse offences. Therefore, they should be required to report on potential
online child sexual abuse on their services, whenever they become aware of it, that is,
when there are reasonable grounds to believe that a particular activity may constitute
online child sexual abuse. Where such reasonable grounds exist, doubts about the
potential victim’s age should not prevent those providers from submitting reports. In
the interest of effectiveness, it should be immaterial in which manner they obtain such
awareness. Such awareness could, for example, be obtained through the execution of
detection orders, information flagged by users or organisations acting in the public
interest against child sexual abuse, or activities conducted on the providers’ own
initiative. Those providers should report a minimum of information, as specified in
this Regulation, for competent law enforcement authorities to be able to assess
whether to initiate an investigation, where relevant, and should ensure that the reports
are as complete as possible before submitting them.
(30) To ensure that online child sexual abuse material is removed as swiftly as possible
after its detection, Coordinating Authorities of establishment should have the power to
request competent judicial authorities or independent administrative authorities to
issue a removal order addressed to providers of hosting services. As removal or
disabling of access may affect the right of users who have provided the material
concerned, providers should inform such users of the reasons for the removal, to
enable them to exercise their right of redress, subject to exceptions needed to avoid
interfering with activities for the prevention, detection, investigation and prosecution
of child sexual abuse offences.
(31) The rules of this Regulation should not be understood as affecting the requirements
regarding removal orders set out in Regulation (EU) …/… [on a Single Market For
Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
(32) The obligations of this Regulation do not apply to providers of hosting services that do
not offer their services in the Union. However, such services may still be used to
disseminate child sexual abuse material to or by users in the Union, causing harm to
children and society at large, even if the providers’ activities are not targeted towards
Member States and the total numbers of users of those services in the Union are
limited. For legal and practical reasons, it may not be reasonably possible to have
those providers remove or disable access to the material, not even through cooperation
with the competent authorities of the third country where they are established.
Therefore, in line with existing practices in several Member States, it should be
possible to require providers of internet access services to take reasonable measures to
block the access of users in the Union to the material.
(33) In the interest of consistency, efficiency and effectiveness and to minimise the risk of
circumvention, such blocking orders should be based on the list of uniform resource
locators, leading to specific items of verified child sexual abuse material, compiled and
provided centrally by the EU Centre on the basis of diligently verified submissions by
the relevant authorities of the Member States. In order to avoid the taking of
unjustified or disproportionate measures, especially those that would unduly affect the
fundamental rights at stake, notably, in addition to the rights of the children, the users’
freedom of expression and information and the providers’ freedom to conduct a
business, appropriate limits and safeguards should be provided for. In particular, it
should be ensured that the burdens imposed on the providers of internet access
services concerned are not unreasonable, that the need for and proportionality of the
blocking orders is diligently assessed also after their issuance and that both the
providers and the users affected have effective means of judicial as well as non-
judicial redress.
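To illustrate how an internet access provider might apply such a centrally compiled list, here is a minimal sketch of URL-based blocking. The normalisation rules and names are hypothetical, and a real deployment would need to handle many more edge cases.

```python
# Minimal sketch of URL-list-based blocking at an internet access service,
# assuming the centrally provided list contains exact URLs. The
# normalisation rules and names are hypothetical.
from urllib.parse import urlsplit, urlunsplit

BLOCKED_URLS: set[str] = set()  # populated from the centrally compiled list

def normalise(url: str) -> str:
    """Lower-case the scheme and host and drop fragments before comparison."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path, parts.query, ""))

def should_block(requested_url: str) -> bool:
    """Return True when the requested URL matches an entry on the list."""
    return normalise(requested_url) in BLOCKED_URLS
```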
(34) Considering that acquiring, possessing, knowingly obtaining access and transmitting
child sexual abuse material constitute criminal offences under Directive 2011/93/EU,
it is necessary to exempt providers of relevant information society services from
criminal liability when they are involved in such activities, insofar as their activities
remain strictly limited to what is needed for the purpose of complying with their
obligations under this Regulation and they act in good faith.
(35) The dissemination of child sexual abuse material is a criminal offence that affects the
rights of the victims depicted. Victims should therefore have the right to obtain, upon
request, from the EU Centre via the Coordinating Authorities, relevant information
if known child sexual abuse material depicting them is reported by providers of
hosting services or providers of publicly available interpersonal communications
services in accordance with this Regulation.
(36) Given the impact on the rights of victims depicted in such known child sexual abuse
material and the typical ability of providers of hosting services to limit that impact by
helping ensure that the material is no longer available on their services, those
providers should assist victims who request the removal or disabling of access to the
material in question. That assistance should remain limited to what can reasonably be
asked from the provider concerned under the given circumstances, having regard to
factors such as the content and scope of the request, the steps needed to locate the
items of known child sexual abuse material concerned and the means available to the
provider. The assistance could consist, for example, of helping to locate the items,
carrying out checks and removing or disabling access to the items. Considering that
carrying out the activities needed to obtain such removal or disabling of access can be
painful or even traumatic as well as complex, victims should also have the right to be
assisted by the EU Centre in this regard, via the Coordinating Authorities.
(37) To ensure the efficient management of such victim support functions, victims should
be allowed to contact and rely on the Coordinating Authority that is most accessible to
them, which should channel all communications between victims and the EU Centre.
(38) For the purpose of facilitating the exercise of the victims’ right to information and of
assistance and support for removal or disabling of access, victims should be allowed to
indicate the relevant item or items of child sexual abuse material in respect of which
they are seeking to obtain information or removal or disabling of access either by
means of providing the image or images or the video or videos themselves, or by
means of providing the uniform resource locators leading to the specific item or items
of child sexual abuse material, or by means of any other representation allowing for
the unequivocal identification of the item or items in question.
(39) To avoid disproportionate interferences with users’ rights to private and family life
and to protection of personal data, the data related to instances of potential online child
sexual abuse should not be preserved by providers of relevant information society
services, unless and for no longer than necessary for one or more of the purposes
specified in this Regulation and subject to an appropriate maximum duration. As those
preservation requirements relate only to this Regulation, they should not be understood
as affecting the possibility to store relevant content data and traffic data in accordance
with Directive 2002/58/EC or the application of any legal obligation to preserve data
that applies to providers under other acts of Union law or under national law that is in
accordance with Union law.
(40) In order to facilitate smooth and efficient communications by electronic means,
including, where relevant, by acknowledging the receipt of such communications,
relating to matters covered by this Regulation, providers of relevant information
society services should be required to designate a single point of contact and to publish
relevant information relating to that point of contact, including the languages to be
used in such communications. In contrast to the provider’s legal representative, the
point of contact should serve operational purposes and should not be required to have
a physical location. Suitable conditions should be set in relation to the languages of
communication to be specified, so as to ensure that smooth communication is not
unreasonably complicated. For providers subject to the obligation to establish a
compliance function and nominate compliance officers in accordance with Regulation
(EU) …/… [on a Single Market For Digital Services (Digital Services Act) and
amending Directive 2000/31/EC], one of these compliance officers may be designated
as the point of contact under this Regulation, in order to facilitate coherent
implementation of the obligations arising from both frameworks.
(41) In order to allow for effective oversight and, where necessary, enforcement of this
Regulation, providers of relevant information society services that are not established
in the Union but that offer services in the Union should have a legal
representative in the Union and inform the public and relevant authorities about how
the legal representative can be contacted. In order to allow for flexible solutions where
needed and notwithstanding their different purposes under this Regulation, it should be
possible, if the provider concerned has made this clear, for its legal representative to
also function as its point of contact, provided the relevant requirements of this
Regulation are complied with.
(42) Where relevant and convenient, subject to the choice of the provider of relevant
information society services and the need to meet the applicable legal requirements in
this respect, it should be possible for those providers to designate a single point of
contact and a single legal representative for the purposes of Regulation (EU) …/… [on
a Single Market For Digital Services (Digital Services Act) and amending Directive
2000/31/EC] and this Regulation.
(43) In the interest of the effective application and, where necessary, enforcement of this
Regulation, each Member State should designate at least one existing or newly
established authority competent to ensure such application and enforcement in respect
of providers of relevant information society services under the jurisdiction of the
designating Member State.
(44) In order to provide clarity and enable effective, efficient and consistent coordination
and cooperation both at national and at Union level, where a Member State designates
more than one competent authority to apply and enforce this Regulation, it should
designate one lead authority as the Coordinating Authority, whilst the designated
authority should automatically be considered the Coordinating Authority where a
Member State designates only one authority. For those reasons, the Coordinating
Authority should act as the single contact point with regard to all matters related to the
application of this Regulation, without prejudice to the enforcement powers of other
national authorities.
(45) Considering the EU Centre’s particular expertise and central position in connection to
the implementation of this Regulation, Coordinating Authorities should be able to
request the assistance of the EU Centre in carrying out certain of their tasks. Such
assistance should be without prejudice to the respective tasks and powers of the
Coordinating Authorities requesting assistance and of the EU Centre and to the
requirements applicable to the performance of their respective tasks and the exercise
of their respective powers provided in this Regulation.
(46) Given the importance of their tasks and the potential impact of the use of their powers
for the exercise of fundamental rights of the parties affected, it is essential that
Coordinating Authorities are fully independent. To that aim, the rules and assurances
applicable to Coordinating Authorities should be similar to those applicable to courts
and tribunals, in order to guarantee that they constitute, and can in all respects act as,
independent administrative authorities.
(47) The Coordinating Authority, as well as other competent authorities, plays a crucial role
in ensuring the effectiveness of the rights and obligations laid down in this Regulation
and the achievement of its objectives. Accordingly, it is necessary to ensure that those
authorities have not only the necessary investigatory and enforcement powers, but also
the necessary financial, human, technological and other resources to adequately carry
out their tasks under this Regulation. In particular, given the variety of providers of
relevant information society services and their use of advanced technology in offering
their services, it is essential that the Coordinating Authority, as well as other
competent authorities, are equipped with the necessary number of staff, including
experts with specialised skills. The resources of Coordinating Authorities should be
determined taking into account the size, complexity and potential societal impact of
the providers of relevant information society services under the jurisdiction of the
designating Member State, as well as the reach of their services across the Union.
(48) Given the need to ensure the effectiveness of the obligations imposed, Coordinating
Authorities should be granted enforcement powers to address infringements of this
Regulation. These powers should include the power to temporarily restrict access of
users of the service concerned by the infringement or, only where that is not
technically feasible, to the online interface of the provider on which the infringement
takes place. In light of the high level of interference with the rights of the service
providers that such a power entails, the latter should only be exercised when certain
conditions are met. Those conditions should include the condition that the
infringement results in the regular and structural facilitation of child sexual abuse
offences, which should be understood as referring to a situation in which it is apparent
from all available evidence that such facilitation has occurred on a large scale and over
an extended period of time.
(49) In order to verify that the rules of this Regulation, in particular those on mitigation
measures and on the execution of detection orders, removal orders or blocking orders
that it issued are effectively complied with in practice, each Coordinating Authority should
be able to carry out searches, using the relevant indicators provided by the EU Centre,
to detect the dissemination of known or new child sexual abuse material through
publicly available material in the hosting services of the providers concerned.
(50) With a view to ensuring that providers of hosting services are aware of the misuse
made of their services and to afford them an opportunity to take expeditious action to
remove or disable access on a voluntary basis, Coordinating Authorities of
establishment should be able to notify those providers of the presence of known child
sexual abuse material on their services and to request the removal or disabling of access
thereto, for the providers’ voluntary consideration. Such notifying activities should be
clearly distinguished from the Coordinating Authorities’ powers under this Regulation
to request the issuance of removal orders, which impose on the provider concerned a
binding legal obligation to remove or disable access to the material in question within
a set time period.
(51) In order to provide clarity and ensure effective enforcement of this Regulation, a
provider of relevant information society services should be under the jurisdiction of
the Member State where its main establishment is located, that is, where the provider
has its head office or registered office within which the principal financial functions
and operational control are exercised. In respect of providers that do not have an
establishment in the Union but that offer services in the Union, the Member State
where their appointed legal representative resides or is established should have
jurisdiction, considering the function of legal representatives under this Regulation.
(52) To ensure effective enforcement and the safeguarding of users’ rights under this
Regulation, it is appropriate to facilitate the lodging of complaints about alleged non-
compliance with obligations of providers of relevant information society services
under this Regulation. This should be done by allowing users to lodge such complaints
with the Coordinating Authority in the territory of the Member State where the users
reside or are established, irrespective of which Member State has jurisdiction in
respect of the provider concerned. For the purpose of lodging complaints, users can
decide to rely on organisations acting in the public interest against child sexual abuse.
However, in order not to endanger the aim of establishing a clear and effective system
of oversight and to avoid the risk of inconsistent decisions, it should remain solely for
the Coordinating Authority of establishment to subsequently exercise any of its
investigatory or enforcement powers regarding the conduct complained of, as
appropriate, without prejudice to the competence of other supervisory authorities
within their mandate.
(53) Member States should ensure that for infringements of the obligations laid down in
this Regulation there are penalties that are effective, proportionate and dissuasive,
taking into account elements such as the nature, gravity, recurrence and duration of the
infringement, in view of the public interest pursued, the scope and kind of activities
carried out, as well as the economic capacity of the provider of relevant information
society services concerned.
(54) The rules of this Regulation on supervision and enforcement should not be understood
as affecting the powers and competences of the data protection authorities under
Regulation (EU) 2016/679.
(55) It is essential for the proper functioning of the system of mandatory detection and
blocking of online child sexual abuse set up by this Regulation that the EU Centre
receives, via the Coordinating Authorities, material identified as constituting child
sexual abuse material or transcripts of conversations identified as constituting the
solicitation of children, such as may have been found for example during criminal
investigations, so that that material or conversations can serve as an accurate and
reliable basis for the EU Centre to generate indicators of such abuses. In order to
achieve that result, the identification should be made after a diligent assessment,
conducted in the context of a procedure that guarantees a fair and objective outcome,
either by the Coordinating Authorities themselves or by a court or an independent
administrative authority other than the Coordinating Authority. Whilst the swift assessment,
identification and submission of such material is important also in other contexts, it is
crucial in connection to new child sexual abuse material and the solicitation of
children reported under this Regulation, considering that this material can lead to the
identification of ongoing or imminent abuse and the rescuing of victims. Therefore,
specific time limits should be set in connection to such reporting.
(56) With a view to ensuring that the indicators generated by the EU Centre for the purpose
of detection are as complete as possible, the submission of relevant material and
transcripts should be done proactively by the Coordinating Authorities. However, the
EU Centre should also be allowed to bring certain material or conversations to the
attention of the Coordinating Authorities for those purposes.
(57) Certain providers of relevant information society services offer their services in
several or even all Member States, whilst under this Regulation only a single Member
State has jurisdiction in respect of a given provider. It is therefore imperative that the
Coordinating Authority designated by the Member State having jurisdiction takes
account of the interests of all users in the Union when performing its tasks and using
its powers, without making any distinction depending on elements such as the users’
location or nationality, and that Coordinating Authorities cooperate with each other in
an effective and efficient manner. To facilitate such cooperation, the necessary
mechanisms and information-sharing systems should be provided for. That
cooperation should be without prejudice to the possibility for Member States to provide
for regular exchanges of views with other public authorities where relevant for the
performance of the tasks of those other authorities and of the Coordinating Authority.
(58) In particular, in order to facilitate the cooperation needed for the proper functioning of
the mechanisms set up by this Regulation, the EU Centre should establish and
maintain the necessary information-sharing systems. When establishing and
maintaining such systems, the EU Centre should cooperate with the European Union
Agency for Law Enforcement Cooperation (‘Europol’) and national authorities to
build on existing systems and best practices, where relevant.
(59) To support the implementation of this Regulation and contribute to the achievement of
its objectives, the EU Centre should serve as a central facilitator, carrying out a range
of specific tasks. The performance of those tasks requires strong guarantees of
independence, in particular from law enforcement authorities, as well as a governance
structure ensuring the effective, efficient and coherent performance of its different
tasks, and legal personality to be able to interact effectively with all relevant
stakeholders. Therefore, it should be established as a decentralised Union agency.
(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be
listed in a clear and comprehensive manner. With a view to ensuring the proper
implementation of this Regulation, those tasks should relate in particular to the
facilitation of the detection, reporting and blocking obligations imposed on providers
of hosting services, providers of publicly available interpersonal communications
services and providers of internet access services. However, for that same reason, the
EU Centre should also be charged with certain other tasks, notably those relating to
the implementation of the risk assessment and mitigation obligations of providers of
relevant information society services, the removal of or disabling of access to child
sexual abuse material by providers of hosting services, the provision of assistance to
Coordinating Authorities, as well as the generation and sharing of knowledge and
expertise related to online child sexual abuse.
(61) The EU Centre should provide reliable information on which activities can reasonably
be considered to constitute online child sexual abuse, so as to enable the detection and
blocking thereof in accordance with this Regulation. Given the nature of child sexual
abuse material, that reliable information needs to be provided without sharing the
material itself. Therefore, the EU Centre should generate accurate and reliable
indicators, based on identified child sexual abuse material and solicitation of children
submitted to it by Coordinating Authorities in accordance with the relevant provisions
of this Regulation. These indicators should allow technologies to detect the
dissemination of either the same material (known material) or of different child sexual
abuse material (new material), or the solicitation of children, as applicable.
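As a non-normative illustration of the mechanism this recital describes, the sketch below models indicators as cryptographic hashes: an indicator is derived from identified material, and detection compares uploads against the indicator set without the source material ever being shared. Real deployments would more plausibly use perceptual hashes (robust to re-encoding and resizing) for known material and classifiers for new material; nothing here is prescribed by the Regulation, and all names are assumptions of the sketch.

    import hashlib

    known_indicators: set[str] = set()   # hypothetical stand-in for the
                                         # Article 44(1), point (a), database

    def add_indicator(identified_material: bytes) -> None:
        """Generate an indicator from material identified as constituting
        child sexual abuse; only the indicator is retained and shared."""
        known_indicators.add(hashlib.sha256(identified_material).hexdigest())

    def matches_known_material(uploaded: bytes) -> bool:
        """Detection as a provider would run it: compare the indicator of
        an uploaded item against the set, without access to the source."""
        return hashlib.sha256(uploaded).hexdigest() in known_indicators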
(62) For the system established by this Regulation to function properly, the EU Centre
should be charged with creating databases for each of those three types of online child
sexual abuse, and with maintaining and operating those databases. For accountability
purposes and to allow for corrections where needed, it should keep records of the
submissions and the process used for the generation of the indicators.
(63) For the purpose of ensuring the traceability of the reporting process and of any follow-
up activity undertaken based on reporting, as well as of allowing for the provision of
feedback on reporting to providers of hosting services and providers of publicly
available interpersonal communications services, generating statistics concerning
reports and the reliable and swift management and processing of reports, the EU
Centre should create a dedicated database of such reports. To be able to fulfil the
above purposes, that database should also contain relevant information relating to
those reports, such as the indicators representing the material and ancillary tags, which
can indicate, for example, the fact that a reported image or video is part of a series of
images and videos depicting the same victim or victims.
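The two recitals above imply a concrete data model: three indicator databases, a log of submissions kept for accountability and correction, and a report store whose entries carry the matched indicators and ancillary tags such as series membership. The following Python sketch renders one hypothetical such model; every field and name is illustrative, since the Regulation fixes only what must be recorded, not how.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Submission:                      # record kept for accountability
        indicator: str                     # value generated from the material
        source_authority: str              # submitting Coordinating Authority
        generated_at: datetime
        method: str                        # process used, to allow corrections

    @dataclass
    class Report:
        report_id: str
        provider: str
        received_at: datetime
        matched_indicators: list[str]
        tags: dict[str, str] = field(default_factory=dict)
        # e.g. tags["series"] = "S-104" where an image or video is part of
        # a series depicting the same victim or victims

    databases: dict[str, set[str]] = {
        "known_material": set(),           # same material, recital (61)
        "new_material": set(),             # different material
        "solicitation": set(),             # solicitation of children
    }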
(64) Given the sensitivity of the data concerned and with a view to avoiding any errors and
possible misuse, it is necessary to lay down strict rules on the access to those databases
of indicators and databases of reports, on the data contained therein and on their
security. In particular, the data concerned should not be stored for longer than is
strictly necessary. For the above reasons, access to the database of indicators should be
given only to the parties and for the purposes specified in this Regulation, subject to
the controls by the EU Centre, and be limited in time and in scope to what is strictly
necessary for those purposes.
(65) In order to avoid erroneous reporting of online child sexual abuse under this
Regulation and to allow law enforcement authorities to focus on their core
investigatory tasks, reports should pass through the EU Centre. The EU Centre should
assess those reports in order to identify those that are manifestly unfounded, that is,
where it is immediately evident, without any substantive legal or factual analysis, that
the reported activities do not constitute online child sexual abuse. Where the report is
manifestly unfounded, the EU Centre should provide feedback to the reporting
provider of hosting services or provider of publicly available interpersonal
communications services in order to allow for improvements in the technologies and
processes used and for other appropriate steps, such as reinstating material wrongly
removed. As every report could be an important means to investigate and prosecute
the child sexual abuse offences concerned and to rescue the victim of the abuse,
reports should be processed as quickly as possible.
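By way of illustration only, the triage step described in this recital could be modelled as a routing function that flags a report as manifestly unfounded solely on checks requiring no substantive legal or factual analysis, returning feedback to the provider in that case and forwarding everything else for swift processing. The checks and field names below are invented for the sketch.

    from dataclasses import dataclass

    @dataclass
    class IncomingReport:
        provider: str
        content_present: bool              # did the report carry any content?
        matched_indicator: str | None      # indicator cited by the provider

    def triage(report: IncomingReport) -> str:
        """Return the routing decision; feedback to the provider supports
        improvement of its technologies and processes."""
        if not report.content_present and report.matched_indicator is None:
            return "feedback_to_provider"   # manifestly unfounded
        return "forward_for_processing"     # process as quickly as possible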
(66) With a view to contributing to the effective application of this Regulation and the
protection of victims’ rights, the EU Centre should be able, upon request, to support
victims and to assist Competent Authorities by conducting searches of hosting services
for the dissemination of known child sexual abuse material that is publicly accessible,
using the corresponding indicators. Where it identifies such material after having
conducted such a search, the EU Centre should also be able to request the provider of
the hosting service concerned to remove or disable access to the item or items in
question, given that the provider may not be aware of their presence and may be
willing to do so on a voluntary basis.
(67) Given its central position resulting from the performance of its primary tasks under
this Regulation and the information and expertise it can gather in connection thereto,
the EU Centre should also contribute to the achievement of the objectives of this
Regulation by serving as a hub for knowledge, expertise and research on matters
related to the prevention and combating of online child sexual abuse. In this
connection, the EU Centre should cooperate with relevant stakeholders from both
within and outside the Union and allow Member States to benefit from the knowledge
and expertise gathered, including best practices and lessons learned.
(68) Processing and storing certain personal data is necessary for the performance of the
EU Centre’s tasks under this Regulation. In order to ensure that such personal data is
adequately protected, the EU Centre should only process and store personal data if
strictly necessary for the purposes detailed in this Regulation. It should do so in a
secure manner and limit storage to what is strictly necessary for the performance of the
relevant tasks.
(69) In order to allow for the effective and efficient performance of its tasks, the EU Centre
should closely cooperate with Coordinating Authorities, Europol and relevant
partner organisations, such as the US National Centre for Missing and Exploited
Children or the International Association of Internet Hotlines (‘INHOPE’) network of
hotlines for reporting child sexual abuse material, within the limits set by this
Regulation and other legal instruments regulating their respective activities. To
facilitate such cooperation, the necessary arrangements should be made, including the
designation of contact officers by Coordinating Authorities and the conclusion of
memoranda of understanding with Europol and, where appropriate, with one or more
of the relevant partner organisations.
(70) Longstanding Union support for both INHOPE and its member hotlines recognises
that hotlines are on the front line of the fight against online child sexual abuse. The EU
Centre should leverage the network of hotlines and encourage them to work together
effectively with the Coordinating Authorities, providers of relevant information
society services and law enforcement authorities of the Member States. The hotlines’
expertise and experience are an invaluable source of information on the early
identification of common threats and solutions, as well as on regional and national
differences across the Union.
(71) Considering Europol’s mandate and its experience in identifying competent national
authorities in unclear situations and its database of criminal intelligence, which can
contribute to identifying links to investigations in other Member States, the EU Centre
should cooperate closely with it, especially in order to ensure the swift identification
of competent national law enforcement authorities in cases where that is not clear or
where more than one Member State may be affected.
(72) Considering the need for the EU Centre to cooperate intensively with Europol, the EU
Centre’s headquarters should be located alongside Europol’s, which is located in The
Hague, the Netherlands. The highly sensitive nature of the reports shared with Europol
by the EU Centre and the technical requirements, such as on secure data connections,
both benefit from a shared location between the EU Centre and Europol. It would also
allow the EU Centre, while being an independent entity, to rely on the support services
of Europol, notably those regarding human resources management, information
technology (IT), including cybersecurity, the building and communications. Sharing
such support services is more cost-efficient and ensures a more professional service
than duplicating them by creating them anew.
(73) To ensure its proper functioning, the necessary rules should be laid down regarding the
EU Centre’s organisation. In the interest of consistency, those rules should be in line
with the Common Approach of the European Parliament, the Council and the
Commission on decentralised agencies.
(74) In view of the need for technical expertise in order to perform its tasks, in particular
the task of providing a list of technologies that can be used for detection, the EU
Centre should have a Technology Committee composed of experts with an advisory
function. The Technology Committee may, in particular, provide expertise to support
the work of the EU Centre, within the scope of its mandate, with respect to matters
related to detection of online child sexual abuse, to support the EU Centre in
contributing to a high level of technical standards and safeguards in detection
technology.
(75) In the interest of transparency and accountability and to enable evaluation and, where
necessary, adjustments, providers of hosting services, providers of publicly available
interpersonal communications services and providers of internet access services,
Coordinating Authorities and the EU Centre should be required to collect, record and
analyse information, based on anonymised gathering of non-personal data and to
publish annual reports on their activities under this Regulation. The Coordinating
Authorities should cooperate with Europol and with law enforcement authorities and
other relevant national authorities of the Member State that designated the
Coordinating Authority in question in gathering that information.
(76) In the interest of good governance and drawing on the statistics and information
gathered through the transparency reporting mechanisms provided for in this Regulation, the
Commission should carry out an evaluation of this Regulation within five years of the
date of its entry into force, and every five years thereafter.
(77) The evaluation should be based on the criteria of efficiency, necessity, effectiveness,
proportionality, relevance, coherence and Union added value. It should assess the
functioning of the different operational and technical measures provided for by this
Regulation, including the effectiveness of measures to enhance the detection, reporting
and removal of online child sexual abuse, the effectiveness of safeguard mechanisms
as well as the impacts on potentially affected fundamental rights, the freedom to
conduct a business, the right to private life and the protection of personal data. The
Commission should also assess the impact on potentially affected interests of third
parties.
(78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45 provides
for a temporary solution in respect of the use of technologies by certain providers of
publicly available interpersonal communications services for the purpose of combating
online child sexual abuse, pending the preparation and adoption of a long-term legal
framework. This Regulation provides that long-term legal framework. Regulation
(EU) 2021/1232 should therefore be repealed.
(79) In order to achieve the objectives of this Regulation, the power to adopt acts in
accordance with Article 290 of the Treaty on the Functioning of the European Union
should be delegated to the Commission to
amend the Annexes to this Regulation and to supplement it by laying down detailed
rules concerning the setting up, content and access to the databases operated by the EU
Centre, concerning the form, precise content and other details of the reports and the
reporting process, concerning the determination and charging of the costs incurred by
the EU Centre to support providers in the risk assessment, as well as concerning
technical requirements for the information sharing systems supporting
communications between Coordinating Authorities, the Commission, the EU Centre,
other relevant Union agencies and providers of relevant information society services.
(80) It is important that the Commission carry out appropriate consultations during its
preparatory work for delegated acts, including via open public consultation and at
expert level, and that those consultations be conducted in accordance with the
principles laid down in the Inter-institutional Agreement of 13 April 2016 on Better
Law Making46. In particular, to ensure equal participation in the preparation of
delegated acts, the European Parliament and the Council receive all documents at the
same time as Member States' experts, and their experts systematically have access to
meetings of the Commission expert groups dealing with the preparation of delegated
acts.
(81) In order to ensure uniform conditions for the implementation of the information-
sharing system, implementing powers should be conferred on the Commission. Those
powers should be exercised in accordance with Regulation (EU) No 182/2011 of the
European Parliament and of the Council47.
(82) In order to allow all affected parties sufficient time to take the necessary measures to
comply with this Regulation, provision should be made for an appropriate time period
between the date of its entry into force and that of its application.
45 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41).
46 Inter-institutional Agreement of 13 April 2016 on Better Law Making (OJ L 123, 12.5.2016, p. 1).
47 Regulation (EU) No 182/2011 of the European Parliament and of the Council of 16 February 2011 laying down the rules and general principles concerning mechanisms for control by the Member States of the Commission's exercise of implementing powers (OJ L 55, 28.2.2011, p. 13).
(83) Since the objectives of this Regulation, namely contributing to the proper functioning
of the internal market by setting out clear, uniform and balanced rules to prevent and
combat child sexual abuse in a manner that is effective and that respects the
fundamental rights, cannot be sufficiently achieved by the Member States but can
rather, by reason of its scale and effects, be better achieved at Union level, the Union
may adopt measures, in accordance with the principle of subsidiarity as set out in
Article 5 of the Treaty on European Union. In accordance with the principle of
proportionality, as set out in that Article, this Regulation does not go beyond what is
necessary in order to achieve that objective.
(84) The European Data Protection Supervisor and the European Data Protection Board
were consulted in accordance with Article 42(2) of Regulation (EU) 2018/1725 of the
European Parliament and of the Council48 and delivered their opinion on […].
HAVE ADOPTED THIS REGULATION:
CHAPTER I
GENERAL PROVISIONS
Article 1
Subject matter and scope
1. This Regulation lays down uniform rules to address the misuse of relevant
information society services for online child sexual abuse in the internal market.
It establishes, in particular:
(a) obligations on providers of relevant information society services to minimise
the risk that their services are misused for online child sexual abuse;
(b) obligations on providers of hosting services and providers of interpersonal
communications services to detect and report online child sexual abuse;
(c) obligations on providers of hosting services to remove or disable access to
child sexual abuse material on their services;
(d) obligations on providers of internet access services to disable access to child
sexual abuse material;
(e) rules on the implementation and enforcement of this Regulation, including as
regards the designation and functioning of the competent authorities of the
Member States, the EU Centre on Child Sexual Abuse established in Article 40
(‘EU Centre’) and cooperation and transparency.
48 Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the
protection of natural persons with regard to the processing of personal data by the Union institutions,
bodies, offices and agencies and on the free movement of such data, and repealing Regulation (EC) No
45/2001 and Decision No 1247/2002/EC (OJ L 295, 21.11.2018, p. 39).
2. This Regulation shall apply to providers of relevant information society services
offering such services in the Union, irrespective of their place of main establishment.
3. This Regulation shall not affect the rules laid down by the following legal acts:
(a) Directive 2011/93/EU on combating the sexual abuse and sexual exploitation
of children and child pornography, and replacing Council Framework Decision
2004/68/JHA;
(b) Directive 2000/31/EC and Regulation (EU) …/… [on a Single Market For
Digital Services (Digital Services Act) and amending Directive 2000/31/EC];
(c) Directive 2010/13/EU;
(d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725,
and, subject to paragraph 4 of this Article, Directive 2002/58/EC.
4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1)
and (3) and Article 6(1) of Directive 2002/58/EC insofar as necessary for the
execution of the detection orders issued in accordance with Section 2 of Chapter II of
this Regulation.
Article 2
Definitions
For the purpose of this Regulation, the following definitions apply:
(a) ‘hosting service’ means an information society service as defined in Article 2, point
(f), third indent, of Regulation (EU) …/… [on a Single Market For Digital Services
(Digital Services Act) and amending Directive 2000/31/EC];
(b) ‘interpersonal communications service’ means a publicly available service as defined
in Article 2, point 5, of Directive (EU) 2018/1972, including services which enable
direct interpersonal and interactive exchange of information merely as a minor
ancillary feature that is intrinsically linked to another service;
(c) ‘software application’ means a digital product or service as defined in Article 2,
point 13, of Regulation (EU) …/… [on contestable and fair markets in the digital
sector (Digital Markets Act)];
(d) ‘software application store’ means a service as defined in Article 2, point 12, of
Regulation (EU) …/… [on contestable and fair markets in the digital sector (Digital
Markets Act)];
(e) ‘internet access service’ means a service as defined in Article 2(2), point 2, of
Regulation (EU) 2015/2120 of the European Parliament and of the Council49;
49 Regulation (EU) 2015/2120 of the European Parliament and of the Council of 25 November 2015 laying down measures concerning open internet access and amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services and Regulation (EU) No 531/2012 on roaming on public mobile communications networks within the Union (OJ L 310, 26.11.2015, p. 1–18).
(f) ‘relevant information society services’ means all of the following services:
(i) a hosting service;
(ii) an interpersonal communications service;
(iii) a software application store;
(iv) an internet access service.
(g) ‘to offer services in the Union’ means to offer services in the Union as defined in
Article 2, point (d), of Regulation (EU) …/… [on a Single Market For Digital
Services (Digital Services Act) and amending Directive 2000/31/EC];
(h) ‘user’ means any natural or legal person who uses a relevant information society
service;
(i) ‘child’ means any natural person below the age of 18 years;
(j) ‘child user’ means a natural person who uses a relevant information society service
and who is below the age of 17 years;
(k) ‘micro, small or medium-sized enterprise’ means an enterprise as defined in
Commission Recommendation 2003/361 concerning the definition of micro, small
and medium-sized enterprises50;
(l) ‘child sexual abuse material’ means material constituting child pornography or
pornographic performance as defined in Article 2, points (c) and (e), respectively, of
Directive 2011/93/EU;
(m) ‘known child sexual abuse material’ means potential child sexual abuse material
detected using the indicators contained in the database of indicators referred to in
Article 44(1), point (a);
(n) ‘new child sexual abuse material’ means potential child sexual abuse material
detected using the indicators contained in the database of indicators referred to in
Article 44(1), point (b);
(o) ‘solicitation of children’ means the solicitation of children for sexual purposes as
referred to in Article 6 of Directive 2011/93/EU;
(p) ‘online child sexual abuse’ means the online dissemination of child sexual abuse
material and the solicitation of children;
(q) ‘child sexual abuse offences’ means offences as defined in Articles 3 to 7 of
Directive 2011/93/EU;
50 Commission Recommendation of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36–41).
(r) ‘recommender system’ means the system as defined in Article 2, point (o), of
Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services
Act) and amending Directive 2000/31/EC];
(s) ‘content data’ means data as defined in Article 2, point 10, of Regulation (EU) … [on
European Production and Preservation Orders for electronic evidence in criminal
matters (…/… e-evidence Regulation)];
(t) ‘content moderation’ means the activities as defined in Article 2, point (p), of
Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services
Act) and amending Directive 2000/31/EC];
(u) ‘Coordinating Authority of establishment’ means the Coordinating Authority for
child sexual abuse issues designated in accordance with Article 25 by the Member
State where the provider of information society services has its main establishment
or, where applicable, where its legal representative resides or is established;
(v) ‘terms and conditions’ means terms and conditions as defined in Article 2, point (q),
of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services
Act) and amending Directive 2000/31/EC];
(w) ‘main establishment’ means the head office or registered office of the provider of
relevant information society services within which the principal financial functions
and operational control are exercised.
CHAPTER II
OBLIGATIONS OF PROVIDERS OF RELEVANT INFORMATION SOCIETY
SERVICES TO PREVENT AND COMBAT ONLINE CHILD SEXUAL ABUSE
Section 1
Risk assessment and mitigation obligations
Article 3
Risk assessment
1. Providers of hosting services and providers of interpersonal communications services
shall identify, analyse and assess, for each such service that they offer, the risk of use
of the service for the purpose of online child sexual abuse.
2. When carrying out a risk assessment, the provider shall take into account, in
particular:
(a) any previously identified instances of use of its services for the purpose of
online child sexual abuse;
(b) the existence and implementation by the provider of a policy and the
availability of functionalities to address the risk referred to in paragraph 1,
including through the following:
– prohibitions and restrictions laid down in the terms and conditions;
– measures taken to enforce such prohibitions and restrictions;
– functionalities enabling age verification;
– functionalities enabling users to flag online child sexual abuse to the
provider through tools that are easily accessible and age-appropriate;
(c) the manner in which users use the service and the impact thereof on that risk;
(d) the manner in which the provider designed and operates the service, including
the business model, governance and relevant systems and processes, and the
impact thereof on that risk;
(e) with respect to the risk of solicitation of children:
(i) the extent to which the service is used or is likely to be used by children;
(ii) where the service is used by children, the different age groups of the child
users and the risk of solicitation of children in relation to those age
groups;
(iii) the availability of functionalities creating or reinforcing the risk of
solicitation of children, including the following functionalities:
– enabling users to search for other users and, in particular, for adult
users to search for child users;
– enabling users to establish contact with other users directly, in
particular through private communications;
– enabling users to share images or videos with other users, in
particular through private communications.
3. The provider may request the EU Centre to perform an analysis of representative,
anonymized data samples to identify potential online child sexual abuse, to support
the risk assessment.
The costs incurred by the EU Centre for the performance of such an analysis shall be
borne by the requesting provider. However, the EU Centre shall bear those costs
where the provider is a micro, small or medium-sized enterprise, provided the
request is reasonably necessary to support the risk assessment.
The Commission shall be empowered to adopt delegated acts in accordance with
Article 86 in order to supplement this Regulation with the necessary detailed rules on
the determination and charging of those costs and the application of the exemption
for micro, small and medium-sized enterprises.
4. The provider shall carry out the first risk assessment by [Date of application of this
Regulation + 3 months] or, where the provider did not offer the service in the Union
by [Date of application of this Regulation], by three months from the date at which
the provider started offering the service in the Union.
Subsequently, the provider shall update the risk assessment where necessary and at
least once every three years from the date at which it last carried out or updated the
risk assessment. However:
(a) for a service which is subject to a detection order issued in accordance with
Article 7, the provider shall update the risk assessment at the latest two months
before the expiry of the period of application of the detection order;
(b) the Coordinating Authority of establishment may require the provider to update
the risk assessment at a reasonable earlier date than the date referred to in the
second subparagraph, where there is evidence indicating a possible substantial
change in the risk that the service is used for the purpose of online child sexual
abuse.
5. The risk assessment shall include an assessment of any potential remaining risk that,
after taking the mitigation measures pursuant to Article 4, the service is used for the
purpose of online child sexual abuse.
6. The Commission, in cooperation with Coordinating Authorities and the EU Centre
and after having conducted a public consultation, may issue guidelines on the
application of paragraphs 1 to 5, having due regard in particular to relevant
technological developments and to the manners in which the services covered by
those provisions are offered and used.
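A provider implementing Article 3 might record the paragraph 2 factors in a structured form and derive from them an estimate of the remaining risk referred to in paragraph 5. The sketch below is one hypothetical such structure; the fields loosely mirror points (a), (b) and (e), and the scoring rule is a placeholder, not a methodology endorsed by the Regulation.

    from dataclasses import dataclass

    @dataclass
    class RiskAssessment:
        past_incidents: int                 # point (a)
        policy_in_place: bool               # point (b)
        age_verification_available: bool    # point (b)
        flagging_tools_available: bool      # point (b)
        share_of_child_users: float         # point (e)(i), 0.0 to 1.0
        private_messaging: bool             # point (e)(iii)
        user_search: bool                   # point (e)(iii)

        def residual_risk_score(self) -> float:
            """Toy aggregation for the remaining-risk estimate of
            Article 3(5); the weights are invented."""
            score = min(self.past_incidents, 10) / 10
            score += self.share_of_child_users
            score += 0.5 if self.private_messaging and self.user_search else 0.0
            mitigations = sum([self.policy_in_place,
                               self.age_verification_available,
                               self.flagging_tools_available])
            return max(0.0, score - 0.3 * mitigations)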
Article 4
Risk mitigation
1. Providers of hosting services and providers of interpersonal communications services
shall take reasonable mitigation measures, tailored to the risk identified pursuant to
Article 3, to minimise that risk. Such measures shall include some or all of the
following:
(a) adapting, through appropriate technical and operational measures and staffing,
the provider’s content moderation or recommender systems, its decision-
making processes, the operation or functionalities of the service, or the content
or enforcement of its terms and conditions;
(b) reinforcing the provider’s internal processes or the internal supervision of the
functioning of the service;
(c) initiating or adjusting cooperation, in accordance with competition law, with
other providers of hosting services or providers of interpersonal
communications services, public authorities, civil society organisations or,
where applicable, entities awarded the status of trusted flaggers in accordance
with Article 19 of Regulation (EU) …/… [on a Single Market For Digital
Services (Digital Services Act) and amending Directive 2000/31/EC].
2. The mitigation measures shall be:
(a) effective in mitigating the identified risk;
(b) targeted and proportionate in relation to that risk, taking into account, in
particular, the seriousness of the risk as well as the provider’s financial and
technological capabilities and the number of users;
(c) applied in a diligent and non-discriminatory manner, having due regard, in all
circumstances, to the potential consequences of the mitigation measures for the
exercise of fundamental rights of all parties affected;
(d) introduced, reviewed, discontinued or expanded, as appropriate, each time the
risk assessment is conducted or updated pursuant to Article 3(4), within three
months from the date referred to therein.
3. Providers of interpersonal communications services that have identified, pursuant to
the risk assessment conducted or updated in accordance with Article 3, a risk of use
of their services for the purpose of the solicitation of children, shall take the
necessary age verification and age assessment measures to reliably identify child
users on their services, enabling them to take the mitigation measures.
4. Providers of hosting services and providers of interpersonal communications services
shall clearly describe in their terms and conditions the mitigation measures that they
have taken. That description shall not include information that may reduce the
effectiveness of the mitigation measures.
5. The Commission, in cooperation with Coordinating Authorities and the EU Centre
and after having conducted a public consultation, may issue guidelines on the
application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant
technological developments and to the manners in which the services covered by
those provisions are offered and used.
Article 5
Risk reporting
1. Providers of hosting services and providers of interpersonal communications services
shall transmit, by three months from the date referred to in Article 3(4), to the
Coordinating Authority of establishment a report specifying the following:
(a) the process and the results of the risk assessment conducted or updated
pursuant to Article 3, including the assessment of any potential remaining risk
referred to in Article 3(5);
(b) any mitigation measures taken pursuant to Article 4.
2. Within three months after receiving the report, the Coordinating Authority of
establishment shall assess it and determine, on that basis and taking into account any
other relevant information available to it, whether the risk assessment has been
carried out or updated and the mitigation measures have been taken in accordance
with the requirements of Articles 3 and 4.
3. Where necessary for that assessment, that Coordinating Authority may require
further information from the provider, within a reasonable time period set by that
Coordinating Authority. That time period shall not be longer than two weeks.
The time period referred to in paragraph 2 shall be suspended until that
additional information is provided.
4. Without prejudice to Articles 7 and 27 to 29, where the requirements of Articles 3
and 4 have not been met, that Coordinating Authority shall require the provider to re-
conduct or update the risk assessment or to introduce, review, discontinue or expand,
as applicable, the mitigation measures, within a reasonable time period set by that
Coordinating Authority. That time period shall not be longer than one month.
5. Providers shall, when transmitting the report to the Coordinating Authority of
establishment in accordance with paragraph 1, transmit the report also to the EU
Centre.
6. Providers shall, upon request, transmit the report to the providers of software
application stores, insofar as necessary for the assessment referred to in Article 6(2).
Where necessary, they may remove confidential information from the reports.
Article 6
Obligations for software application stores
1. Providers of software application stores shall:
(a) make reasonable efforts to assess, where possible together with the providers of
software applications, whether each service offered through the software
applications that they intermediate presents a risk of being used for the purpose
of the solicitation of children;
(b) take reasonable measures to prevent child users from accessing the software
applications in relation to which they have identified a significant risk of use of
the service concerned for the purpose of the solicitation of children;
(c) take the necessary age verification and age assessment measures to reliably
identify child users on their services, enabling them to take the measures
referred to in point (b).
2. In assessing the risk referred to in paragraph 1, the provider shall take into account
all the available information, including the results of the risk assessment conducted
or updated pursuant to Article 3.
3. Providers of software application stores shall make publicly available information
describing the process and criteria used to assess the risk and describing the measures
referred to in paragraph 1. That description shall not include information that may
reduce the effectiveness of the assessment of those measures.
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre
and after having conducted a public consultation, may issue guidelines on the
application of paragraphs 1, 2 and 3, having due regard in particular to relevant
technological developments and to the manners in which the services covered by
those provisions are offered and used.
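Read together, points (b) and (c) of Article 6(1) amount to a simple gating rule: where an application has been assessed as presenting a significant solicitation risk, access is permitted only to users reliably assessed as not being child users (below 17 years, per Article 2, point (j)). A minimal sketch follows, with invented field names and the assumption that an unassessed age is treated as a child user for high-risk applications.

    from dataclasses import dataclass

    @dataclass
    class AppListing:
        app_id: str
        solicitation_risk_significant: bool  # outcome of the 6(1)(a) assessment

    def may_access(app: AppListing, assessed_age: int | None) -> bool:
        """Permit access unless the application is high-risk and the user
        is, or may be, a child user (below 17, Article 2, point (j))."""
        if not app.solicitation_risk_significant:
            return True
        # For high-risk apps, an unassessed age is treated as a child user.
        return assessed_age is not None and assessed_age >= 17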
Section 2
Detection obligations
Article 7
Issuance of detection orders
1. The Coordinating Authority of establishment shall have the power to request the
competent judicial authority of the Member State that designated it or another
independent administrative authority of that Member State to issue a detection order
requiring a provider of hosting services or a provider of interpersonal
communications services under the jurisdiction of that Member State to take the
measures specified in Article 10 to detect online child sexual abuse on a specific
service.
2. The Coordinating Authority of establishment shall, before requesting the issuance of
a detection order, carry out the investigations and assessments necessary to
determine whether the conditions of paragraph 4 have been met.
To that end, it may, where appropriate, require the provider to submit the necessary
information, additional to the report and the further information referred to in Article
5(1) and (3), respectively, within a reasonable time period set by that Coordinating
Authority, or request the EU Centre, another public authority or relevant experts or
entities to provide the necessary additional information.
3. Where the Coordinating Authority of establishment takes the preliminary view that
the conditions of paragraph 4 have been met, it shall:
(a) establish a draft request for the issuance of a detection order, specifying the
main elements of the content of the detection order it intends to request and the
reasons for requesting it;
(b) submit the draft request to the provider and the EU Centre;
(c) afford the provider an opportunity to comment on the draft request, within a
reasonable time period set by that Coordinating Authority;
(d) invite the EU Centre to provide its opinion on the draft request, within a time
period of four weeks from the date of receiving the draft request.
Where, having regard to the comments of the provider and the opinion of the EU
Centre, that Coordinating Authority continues to be of the view that the conditions of
paragraph 4 have been met, it shall re-submit the draft request, adjusted where appropriate,
to the provider. In that case, the provider shall do all of the following, within a
reasonable time period set by that Coordinating Authority:
(a) draft an implementation plan setting out the measures it envisages taking to
execute the intended detection order, including detailed information regarding
the envisaged technologies and safeguards;
(b) where the draft implementation plan concerns an intended detection order
concerning the solicitation of children other than the renewal of a previously
issued detection order without any substantive changes, conduct a data
protection impact assessment and a prior consultation procedure as referred to
in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to
the measures set out in the implementation plan;
(c) where point (b) applies, or where the conditions of Articles 35 and 36 of
Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where
necessary in view of the outcome of the data protection impact assessment and
in order to take into account the opinion of the data protection authority
provided in response to the prior consultation;
(d) submit to that Coordinating Authority the implementation plan, where
applicable attaching the opinion of the competent data protection authority and
specifying how the implementation plan has been adjusted in view of the
outcome of the data protection impact assessment and of that opinion.
Where, having regard to the implementation plan of the provider and the opinion of
the data protection authority, that Coordinating Authority continues to be of the view
that the conditions of paragraph 4 have been met, it shall submit the request for the
issuance of the detection order, adjusted where appropriate, to the competent judicial
authority or independent administrative authority. It shall attach the implementation
plan of the provider and the opinions of the EU Centre and the data protection
authority to that request.
4. The Coordinating Authority of establishment shall request the issuance of the
detection order, and the competent judicial authority or independent administrative
authority shall issue the detection order where it considers that the following
conditions are met:
(a) there is evidence of a significant risk of the service being used for the purpose
of online child sexual abuse, within the meaning of paragraphs 5, 6 and 7, as
applicable;
(b) the reasons for issuing the detection order outweigh negative consequences for
the rights and legitimate interests of all parties affected, having regard in
particular to the need to ensure a fair balance between the fundamental rights
of those parties.
When assessing whether the conditions of the first subparagraph have been met,
account shall be taken of all relevant facts and circumstances of the case at hand, in
particular:
(a) the risk assessment conducted or updated and any mitigation measures taken
by the provider pursuant to Articles 3 and 4, including any mitigation measures
introduced, reviewed, discontinued or expanded pursuant to Article 5(4) where
applicable;
(b) any additional information obtained pursuant to paragraph 2 or any other
relevant information available to it, in particular regarding the use, design and
operation of the service, regarding the provider’s financial and technological
capabilities and size and regarding the potential consequences of the measures
to be taken to execute the detection order for all other parties affected;
(c) the views and the implementation plan of the provider submitted in accordance
with paragraph 3;
(d) the opinions of the EU Centre and of the data protection authority submitted in
accordance with paragraph 3.
As regards the second subparagraph, point (d), where that Coordinating Authority
substantially deviates from the opinion of the EU Centre, it shall inform the EU
Centre and the Commission thereof, specifying the points at which it deviated and
the main reasons for the deviation.
5. As regards detection orders concerning the dissemination of known child sexual
abuse material, the significant risk referred to in paragraph 4, first subparagraph,
point (a), shall be deemed to exist where the following conditions are met:
(a) it is likely, despite any mitigation measures that the provider may have taken or
will take, that the service is used, to an appreciable extent, for the dissemination
of known child sexual abuse material;
(b) there is evidence of the service, or of a comparable service if the service has
not yet been offered in the Union at the date of the request for the issuance of
the detection order, having been used in the past 12 months and to an
appreciable extent for the dissemination of known child sexual abuse material.
6. As regards detection orders concerning the dissemination of new child sexual abuse
material, the significant risk referred to in paragraph 4, first subparagraph, point (a),
shall be deemed to exist where the following conditions are met:
(a) it is likely that, despite any mitigation measures that the provider may have
taken or will take, the service is used, to an appreciable extent, for the
dissemination of new child sexual abuse material;
(b) there is evidence of the service, or of a comparable service if the service has
not yet been offered in the Union at the date of the request for the issuance of
the detection order, having been used in the past 12 months and to an
appreciable extent, for the dissemination of new child sexual abuse material;
(c) for services other than those enabling the live transmission of pornographic
performances as defined in Article 2, point (e), of Directive 2011/93/EU:
(1) a detection order concerning the dissemination of known child sexual
abuse material has been issued in respect of the service;
(2) the provider submitted a significant number of reports concerning known
child sexual abuse material, detected through the measures taken to
execute the detection order referred to in point (1), pursuant to Article 12.
7. As regards detection orders concerning the solicitation of children, the significant
risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist
where the following conditions are met:
(a) the provider qualifies as a provider of interpersonal communication services;
(b) it is likely that, despite any mitigation measures that the provider may have
taken or will take, the service is used, to an appreciable extent, for the
solicitation of children;
(c) there is evidence of the service, or of a comparable service if the service has
not yet been offered in the Union at the date of the request for the issuance of
the detection order, having been used in the past 12 months and to an
appreciable extent, for the solicitation of children.
The detection orders concerning the solicitation of children shall apply only to
interpersonal communications where one of the users is a child user.
8. The Coordinating Authority of establishment, when requesting the issuance of
detection orders, and the competent judicial or independent administrative authority,
when issuing the detection order, shall target and specify it in such a manner that the
negative consequences referred to in paragraph 4, first subparagraph, point (b),
remain limited to what is strictly necessary to effectively address the significant risk
referred to in point (a) thereof.
To that aim, they shall take into account all relevant parameters, including the
availability of sufficiently reliable detection technologies, in that they limit to the
maximum extent possible the rate of errors regarding the detection, and their
suitability and effectiveness for achieving the objectives of this Regulation, as well
as the impact of the measures on the rights of the users affected, and shall require
the taking of the least intrusive measures, in accordance with Article 10, from among
several equally effective measures.
In particular, they shall ensure that:
(a) where that risk is limited to an identifiable part or component of a service, the
required measures are only applied in respect of that part or component;
(b) where necessary, in particular to limit such negative consequences, effective
and proportionate safeguards additional to those listed in Article 10(4), (5) and
(6) are provided for;
(c) subject to paragraph 9, the period of application remains limited to what is
strictly necessary.
9. The competent judicial authority or independent administrative authority shall
specify in the detection order the period during which it applies, indicating the start
date and the end date.
The start date shall be set taking into account the time reasonably required for the
provider to take the necessary measures to prepare the execution of the detection
order. It shall be no earlier than three months from the date on which the provider
received the detection order and no later than 12 months from that date.
The period of application of detection orders concerning the dissemination of known
or new child sexual abuse material shall not exceed 24 months and that of detection
orders concerning the solicitation of children shall not exceed 12 months.
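Read together, the second and third subparagraphs of this paragraph amount to three simple date constraints. The following minimal sketch illustrates those checks; the function, its names and the simplified month arithmetic are assumptions made for illustration, not part of the proposed text:

```python
from datetime import date

# Illustrative caps from paragraph 9; all names here are hypothetical.
MAX_MONTHS = {"known": 24, "new": 24, "solicitation": 12}

def add_months(d: date, months: int) -> date:
    """Naive month arithmetic: clamp the day to 28 to stay valid."""
    y, m = divmod(d.month - 1 + months, 12)
    return date(d.year + y, m + 1, min(d.day, 28))

def check_period(received: date, start: date, end: date, kind: str) -> list[str]:
    """Return the paragraph 9 conditions that a proposed period would violate."""
    issues = []
    if start < add_months(received, 3):
        issues.append("start date earlier than 3 months after receipt")
    if start > add_months(received, 12):
        issues.append("start date later than 12 months after receipt")
    if end > add_months(start, MAX_MONTHS[kind]):
        issues.append(f"period exceeds {MAX_MONTHS[kind]} months for a '{kind}' order")
    return issues

print(check_period(date(2024, 1, 1), date(2024, 5, 1), date(2026, 4, 1), "known"))  # []
```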
Article 8
Additional rules regarding detection orders
1. The competent judicial authority or independent administrative authority shall issue
the detection orders referred to in Article 7 using the template set out in Annex I.
Detection orders shall include:
(a) information regarding the measures to be taken to execute the detection order,
including the indicators to be used and the safeguards to be provided for,
including the reporting requirements set pursuant to Article 9(3) and, where
applicable, any additional safeguards as referred to in Article 7(8);
(b) identification details of the competent judicial authority or the independent
administrative authority issuing the detection order and authentication of the
detection order by that judicial or independent administrative authority;
(c) the name of the provider and, where applicable, its legal representative;
(d) the specific service in respect of which the detection order is issued and, where
applicable, the part or component of the service affected as referred to in
Article 7(8);
(e) whether the detection order issued concerns the dissemination of known or new
child sexual abuse material or the solicitation of children;
(f) the start date and the end date of the detection order;
(g) a sufficiently detailed statement of reasons explaining why the detection order
is issued;
(h) a reference to this Regulation as the legal basis for the detection order;
(i) the date, time stamp and electronic signature of the judicial or independent
administrative authority issuing the detection order;
(j) easily understandable information about the redress available to the addressee
of the detection order, including information about redress to a court and about
the time periods applicable to such redress.
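The list in paragraph 1 effectively specifies the record structure of the Annex I template. A hypothetical sketch of how an implementer might model those fields is given below; the class and field names are assumptions of this note, not the official template:

```python
from dataclasses import dataclass
from datetime import date, datetime

# Hypothetical model of the Annex I fields listed in paragraph 1.
@dataclass
class DetectionOrder:
    measures_and_safeguards: str      # point (a)
    issuing_authority: str            # point (b)
    authentication: bytes             # point (b)
    provider_name: str                # point (c)
    legal_representative: str | None  # point (c), where applicable
    service: str                      # point (d)
    part_or_component: str | None     # point (d), cf. Article 7(8)
    abuse_type: str                   # point (e): "known", "new" or "solicitation"
    start_date: date                  # point (f)
    end_date: date                    # point (f)
    statement_of_reasons: str         # point (g)
    legal_basis: str                  # point (h): reference to this Regulation
    issued_at: datetime               # point (i): date and time stamp
    electronic_signature: bytes       # point (i)
    redress_information: str          # point (j)
```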
2. The competent judicial authority or independent administrative authority issuing the
detection order shall address it to the main establishment of the provider or, where
applicable, to its legal representative designated in accordance with Article 24.
The detection order shall be transmitted to the provider’s point of contact referred to
in Article 23(1), to the Coordinating Authority of establishment and to the EU
Centre, through the system established in accordance with Article 39(2).
The detection order shall be drafted in the language declared by the provider
pursuant to Article 23(3).
3. If the provider cannot execute the detection order because it contains manifest errors
or does not contain sufficient information for its execution, the provider shall,
without undue delay, request the necessary clarification from the Coordinating
Authority of establishment, using the template set out in Annex II.
4. The Commission shall be empowered to adopt delegated acts in accordance with
Article 86 in order to amend Annexes I and II where necessary to improve the
templates in view of relevant technological developments or practical experiences
gained.
Article 9
Redress, information, reporting and modification of detection orders
1. Providers of hosting services and providers of interpersonal communications services
that have received a detection order, as well as users affected by the measures taken
to execute it, shall have a right to effective redress. That right shall include the right
to challenge the detection order before the courts of the Member State of the
competent judicial authority or independent administrative authority that issued the
detection order.
2. When the detection order becomes final, the competent judicial authority or
independent administrative authority that issued the detection order shall, without
undue delay, transmit a copy thereof to the Coordinating Authority of establishment.
The Coordinating Authority of establishment shall then, without undue delay,
transmit a copy thereof to all other Coordinating Authorities through the system
established in accordance with Article 39(2).
For the purpose of the first subparagraph, a detection order shall become final upon
the expiry of the time period for appeal where no appeal has been lodged in
accordance with national law or upon confirmation of the detection order following
an appeal.
3. Where the period of application of the detection order exceeds 12 months, or six
months in the case of a detection order concerning the solicitation of children, the
Coordinating Authority of establishment shall require the provider to report to it on
the execution of the detection order at least once, halfway through the period of
application.
Those reports shall include a detailed description of the measures taken to execute
the detection order, including the safeguards provided, and information on the
functioning in practice of those measures, in particular on their effectiveness in
detecting the dissemination of known or new child sexual abuse material or the
solicitation of children, as applicable, and on the consequences of those measures for
the rights and legitimate interests of all parties affected.
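For orientation, the reporting trigger in the first subparagraph can be computed mechanically from the order's period of application. The sketch below assumes a 365-day (or 183-day) reading of "12 months" (or "six months"); the helper and its name are illustrative only:

```python
from datetime import date, timedelta

# Illustrative computation of the paragraph 3 trigger: a report is due at
# least once, halfway through the period of application, when that period
# exceeds 12 months (6 months for solicitation orders).
def halfway_report_due(start: date, end: date, kind: str) -> date | None:
    threshold = timedelta(days=183) if kind == "solicitation" else timedelta(days=365)
    period = end - start
    if period <= threshold:
        return None  # no mid-term report required
    return start + period / 2

print(halfway_report_due(date(2024, 1, 1), date(2025, 7, 1), "known"))
# 2024-09-30: a report is due around the midpoint of the period
```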
4. In respect of the detection orders that the competent judicial authority or independent
administrative authority issued at its request, the Coordinating Authority of
establishment shall, where necessary and in any event following reception of the
reports referred to in paragraph 3, assess whether any substantial changes to the
grounds for issuing the detection orders occurred and, in particular, whether the
conditions of Article 7(4) continue to be met. In that regard, it shall take account of
additional mitigation measures that the provider may take to address the significant
risk identified at the time of the issuance of the detection order.
That Coordinating Authority shall request the competent judicial authority or
independent administrative authority that issued the detection order to modify or
revoke the order, where necessary in the light of the outcome of that assessment.
The provisions of this Section shall apply to such requests, mutatis mutandis.
Article 10
Technologies and safeguards
1. Providers of hosting services and providers of interpersonal communications services
that have received a detection order shall execute it by installing and operating
technologies to detect the dissemination of known or new child sexual abuse material
or the solicitation of children, as applicable, using the corresponding indicators
provided by the EU Centre in accordance with Article 46.
2. The provider shall be entitled to acquire, install and operate, free of charge,
technologies made available by the EU Centre in accordance with Article 50(1), for
the sole purpose of executing the detection order. The provider shall not be required
to use any specific technology, including those made available by the EU Centre, as
long as the requirements set out in this Article are met. The use of the technologies
made available by the EU Centre shall not affect the responsibility of the provider to
comply with those requirements and for any decisions it may take in connection to or
as a result of the use of the technologies.
3. The technologies shall be:
(a) effective in detecting the dissemination of known or new child sexual abuse
material or the solicitation of children, as applicable;
(b) not able to extract any other information from the relevant communications
than the information strictly necessary to detect, using the indicators referred to
in paragraph 1, patterns pointing to the dissemination of known or new child
sexual abuse material or the solicitation of children, as applicable;
(c) in accordance with the state of the art in the industry and the least intrusive in
terms of the impact on the users’ rights to private and family life, including the
confidentiality of communication, and to protection of personal data;
(d) sufficiently reliable, in that they limit to the maximum extent possible the rate
of errors regarding the detection.
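For known material, the combination of paragraph 1 (detection against EU Centre indicators) and paragraph 3, point (b) (no extraction beyond what is strictly necessary) is commonly satisfied in practice by hash comparison. The sketch below is one plausible illustration, not a mandated technique; real deployments would use perceptual rather than cryptographic hashes, and all names are assumptions of this note:

```python
import hashlib

# A minimal sketch of indicator-based detection of *known* material: content
# is reduced to a digest and compared against indicators supplied by the EU
# Centre, and nothing beyond the match result leaves the function.
def matches_known_indicator(content: bytes, indicators: set[str]) -> bool:
    digest = hashlib.sha256(content).hexdigest()
    return digest in indicators  # only the boolean is retained

indicators = {hashlib.sha256(b"example-known-item").hexdigest()}
print(matches_known_indicator(b"example-known-item", indicators))  # True
print(matches_known_indicator(b"unrelated message", indicators))   # False
```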
4. The provider shall:
(a) take all the necessary measures to ensure that the technologies and indicators,
as well as the processing of personal data and other data in connection thereto,
are used for the sole purpose of detecting the dissemination of known or new
child sexual abuse material or the solicitation of children, as applicable, insofar
as strictly necessary to execute the detection orders addressed to them;
(b) establish effective internal procedures to prevent and, where necessary, detect
and remedy any misuse of the technologies, indicators and personal data and
other data referred to in point (a), including unauthorised access to, and
unauthorised transfers of, such personal data and other data;
(c) ensure regular human oversight as necessary to ensure that the technologies
operate in a sufficiently reliable manner and, where necessary, in particular
when potential errors and potential solicitation of children are detected, human
intervention;
(d) establish and operate an accessible, age-appropriate and user-friendly
mechanism that allows users to submit to it, within a reasonable timeframe,
complaints about alleged infringements of its obligations under this Section, as
well as any decisions that the provider may have taken in relation to the use of
the technologies, including the removal or disabling of access to material
provided by users, blocking the users’ accounts or suspending or terminating
the provision of the service to the users, and process such complaints in an
objective, effective and timely manner;
(e) inform the Coordinating Authority, at the latest one month before the start date
specified in the detection order, on the implementation of the envisaged
measures set out in the implementation plan referred to in Article 7(3);
(f) regularly review the functioning of the measures referred to in points (a), (b),
(c) and (d) of this paragraph and adjust them where necessary to ensure that the
requirements set out therein are met, as well as document the review process
and the outcomes thereof and include that information in the report referred to
in Article 9(3).
5. The provider shall inform users in a clear, prominent and comprehensible way of the
following:
(a) the fact that it operates technologies to detect online child sexual abuse to
execute the detection order, the ways in which it operates those technologies
and the impact on the confidentiality of users’ communications;
(b) the fact that it is required to report potential online child sexual abuse to the EU
Centre in accordance with Article 12;
(c) the users’ right of judicial redress referred to in Article 9(1) and their rights to
submit complaints to the provider through the mechanism referred to in
paragraph 4, point (d) and to the Coordinating Authority in accordance with
Article 34.
The provider shall not provide information to users that may reduce the effectiveness
of the measures to execute the detection order.
6. Where a provider detects potential online child sexual abuse through the measures
taken to execute the detection order, it shall inform the users concerned without
undue delay, after Europol or the national law enforcement authority of a Member
State that received the report pursuant to Article 48 has confirmed that providing the
information to the users would not interfere with activities for the prevention,
detection, investigation and prosecution of child sexual abuse offences.
Article 11
Guidelines regarding detection obligations
The Commission, in cooperation with the Coordinating Authorities and the EU Centre and
after having conducted a public consultation, may issue guidelines on the application of
Articles 7 to 10, having due regard in particular to relevant technological developments and
the manners in which the services covered by those provisions are offered and used.
Section 3
Reporting obligations
Article 12
Reporting obligations
1. Where a provider of hosting services or a provider of interpersonal communications
services becomes aware in any manner other than through a removal order issued in
accordance with this Regulation of any information indicating potential online child
sexual abuse on its services, it shall promptly submit a report thereon to the EU
Centre in accordance with Article 13. It shall do so through the system established in
accordance with Article 39(2).
2. Where the provider submits a report pursuant to paragraph 1, it shall inform the user
concerned, providing information on the main content of the report, on the manner in
which the provider has become aware of the potential child sexual abuse concerned,
on the follow-up given to the report insofar as such information is available to the
provider and on the user’s possibilities of redress, including on the right to submit
complaints to the Coordinating Authority in accordance with Article 34.
The provider shall inform the user concerned without undue delay, either after
having received a communication from the EU Centre indicating that it considers the
report to be manifestly unfounded as referred to in Article 48(2), or after the expiry
of a time period of three months from the date of the report without having received
a communication from the EU Centre indicating that the information is not to be
provided as referred to in Article 48(6), point (a), whichever occurs first.
Where within the three months’ time period referred to in the second subparagraph
the provider receives such a communication from the EU Centre indicating that the
information is not to be provided, it shall inform the user concerned, without undue
delay, after the expiry of the time period set out in that communication.
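The timing rules of paragraph 2 can be summarised as a small decision function. The sketch below approximates "three months" as 90 days; the parameter names and the simplifications are assumptions of this note:

```python
from datetime import date, timedelta

# Sketch of the user-notification timing in paragraph 2.
def notify_user_on(reported: date,
                   unfounded_on: date | None = None,
                   no_inform_until: date | None = None) -> date:
    """Earliest date on which the provider informs the user of the report."""
    deadline = reported + timedelta(days=90)
    if no_inform_until is not None:
        return no_inform_until   # third subparagraph: wait out the set period
    if unfounded_on is not None and unfounded_on < deadline:
        return unfounded_on      # manifestly unfounded: inform without delay
    return deadline              # no EU Centre communication received in time

print(notify_user_on(date(2024, 1, 1)))                                 # 2024-03-31
print(notify_user_on(date(2024, 1, 1), unfounded_on=date(2024, 2, 1)))  # 2024-02-01
```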
3. The provider shall establish and operate an accessible, age-appropriate and user-
friendly mechanism that allows users to flag to the provider potential online child
sexual abuse on the service.
Article 13
Specific requirements for reporting
1. Providers of hosting services and providers of interpersonal communications services
shall submit the report referred to in Article 12 using the template set out in Annex
III. The report shall include:
(a) identification details of the provider and, where applicable, its legal
representative;
(b) the date, time stamp and electronic signature of the provider;
(c) all content data, including images, videos and text;
(d) all available data other than content data related to the potential online child
sexual abuse;
(e) whether the potential online child sexual abuse concerns the dissemination of
known or new child sexual abuse material or the solicitation of children;
(f) information concerning the geographic location related to the potential online
child sexual abuse, such as the Internet Protocol address;
(g) information concerning the identity of any user involved in the potential online
child sexual abuse;
(h) whether the provider has also reported, or will also report, the potential online
child sexual abuse to a public authority or other entity competent to receive
such reports of a third country and if so, which authority or entity;
(i) where the potential online child sexual abuse concerns the dissemination of
known or new child sexual abuse material, whether the provider has removed
or disabled access to the material;
(j) whether the provider considers that the report requires urgent action;
(k) a reference to this Regulation as the legal basis for reporting.
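Paragraph 1 enumerates the data fields of the Annex III template. A hypothetical serialisation is sketched below for orientation; the key names are illustrative, not the official template:

```python
import json
from datetime import datetime, timezone

# Hypothetical serialisation of the Annex III fields listed in paragraph 1.
report = {
    "provider": {"name": "ExampleHost", "legal_representative": None},  # (a)
    "timestamp": datetime.now(timezone.utc).isoformat(),                # (b)
    "content_data": ["<base64-encoded image/video/text>"],              # (c)
    "other_data": {"upload_ip": "198.51.100.7"},                        # (d), (f)
    "abuse_type": "known",                                              # (e)
    "user_identity": {"account_id": "u-12345"},                         # (g)
    "third_country_report": None,                                       # (h)
    "material_removed_or_disabled": True,                               # (i)
    "urgent": False,                                                    # (j)
    "legal_basis": "Regulation (EU) .../... [this Regulation]",         # (k)
}
print(json.dumps(report, indent=2))
```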
2. The Commission shall be empowered to adopt delegated acts in accordance with
Article 86 in order to amend Annex III to improve the template where necessary in
view of relevant technological developments or practical experiences gained.
Section 4
Removal obligations
Article 14
Removal orders
1. The Coordinating Authority of establishment shall have the power to request the
competent judicial authority of the Member State that designated it or another
independent administrative authority of that Member State to issue a removal order
requiring a provider of hosting services under the jurisdiction of the Member State
that designated that Coordinating Authority to remove or disable access in all
Member States of one or more specific items of material that, after a diligent
assessment, the Coordinating Authority or the courts or other independent
administrative authorities referred to in Article 36(1) identified as constituting child
sexual abuse material.
2. The provider shall execute the removal order as soon as possible and in any event
within 24 hours of receipt thereof.
3. The competent judicial authority or the independent administrative authority shall
issue a removal order using the template set out in Annex IV. Removal orders shall
include:
(a) identification details of the judicial or independent administrative authority
issuing the removal order and authentication of the removal order by that
authority;
(b) the name of the provider and, where applicable, of its legal representative;
(c) the specific service for which the removal order is issued;
(d) a sufficiently detailed statement of reasons explaining why the removal order is
issued and in particular why the material constitutes child sexual abuse
material;
(e) an exact uniform resource locator and, where necessary, additional information
for the identification of the child sexual abuse material;
(f) where applicable, the information about non-disclosure during a specified time
period, in accordance with Article 15(4), point (c);
(g) a reference to this Regulation as the legal basis for the removal order;
(h) the date, time stamp and electronic signature of the judicial or independent
administrative authority issuing the removal order;
(i) easily understandable information about the redress available to the addressee
of the removal order, including information about redress to a court and about
the time periods applicable to such redress.
4. The judicial authority or the independent administrative authority issuing the removal order
shall address it to the main establishment of the provider or, where applicable, to its
legal representative designated in accordance with Article 24.
It shall transmit the removal order to the point of contact referred to in Article 23(1)
by electronic means capable of producing a written record, under conditions that
allow the authentication of the sender, including the accuracy of the date and the
time of sending and receipt of the order, to be established, to the Coordinating
Authority of establishment and to the EU Centre, through the system established in
accordance with Article 39(2).
It shall draft the removal order in the language declared by the provider pursuant to
Article 23(3).
5. If the provider cannot execute the removal order on grounds of force majeure or de
facto impossibility not attributable to it, including for objectively justifiable technical
or operational reasons, it shall, without undue delay, inform the Coordinating
Authority of establishment of those grounds, using the template set out in Annex V.
The time period set out in paragraph 2 shall start to run as soon as the reasons
referred to in the first subparagraph have ceased to exist.
6. If the provider cannot execute the removal order because it contains manifest errors
or does not contain sufficient information for its execution, it shall, without undue
delay, request the necessary clarification from the Coordinating Authority of
establishment, using the template set out in Annex V.
The time period set out in paragraph 2 shall start to run as soon as the provider has
received the necessary clarification.
7. The provider shall, without undue delay and using the template set out in Annex VI,
inform the Coordinating Authority of establishment and the EU Centre, of the
measures taken to execute the removal order, indicating, in particular, whether the
provider removed the child sexual abuse material or disabled access thereto in all
Member States and the date and time thereof.
8. The Commission shall be empowered to adopt delegated acts in accordance with
Article 86 in order to amend Annexes IV, V and VI where necessary to improve the
templates in view of relevant technological developments or practical experiences
gained.
Article 15
Redress and provision of information
1. Providers of hosting services that have received a removal order issued in accordance
with Article 14, as well as the users who provided the material, shall have the right
to effective redress. That right shall include the right to challenge such a removal
order before the courts of the Member State of the competent judicial authority or
independent administrative authority that issued the removal order.
2. When the removal order becomes final, the competent judicial authority or
independent administrative authority that issued the removal order shall, without
undue delay, transmit a copy thereof to the Coordinating Authority of establishment.
The Coordinating Authority of establishment shall then, without undue delay,
transmit a copy thereof to all other Coordinating Authorities through the system
established in accordance with Article 39(2).
For the purpose of the first subparagraph, a removal order shall become final upon
the expiry of the time period for appeal where no appeal has been lodged in
accordance with national law or upon confirmation of the removal order following an
appeal.
3. Where a provider removes or disables access to child sexual abuse material pursuant
to a removal order issued in accordance with Article 14, it shall without undue delay,
inform the user who provided the material of the following:
(a) the fact that it removed the material or disabled access thereto;
(b) the reasons for the removal or disabling, providing a copy of the removal order
upon the user’s request;
(c) the users’ rights of judicial redress referred to in paragraph 1 and to submit
complaints to the Coordinating Authority in accordance with Article 34.
4. The Coordinating Authority of establishment may request, when requesting the
judicial authority or independent administrative authority to issue the removal order,
and after having consulted the relevant public authorities, that the provider not
disclose any information regarding the removal of or the disabling of access to the
child sexual abuse material, where and to the extent necessary to avoid interfering
with activities for the prevention, detection, investigation and prosecution of child
sexual abuse offences.
In such a case:
(a) the judicial authority or independent administrative authority issuing the
removal order shall set the time period not longer than necessary and not
exceeding six weeks, during which the provider is not to disclose such
information;
(b) the obligations set out in paragraph 3 shall not apply during that time period;
(c) that judicial authority or independent administrative authority shall inform the
provider of its decision, specifying the applicable time period.
That judicial authority or independent administrative authority may decide to extend
the time period referred to in the second subparagraph, point (a), by a further time
period of maximum six weeks, where and to the extent the non-disclosure continues
to be necessary. In that case, that judicial authority or independent administrative
authority shall inform the provider of its decision, specifying the applicable time
period. Article 14(3) shall apply to that decision.
Section 5
Blocking obligations
Article 16
Blocking orders
1. The Coordinating Authority of establishment shall have the power to request the
competent judicial authority of the Member State that designated it or an independent
administrative authority of that Member State to issue a blocking order requiring a
provider of internet access services under the jurisdiction of that Member State to
take reasonable measures to prevent users from accessing known child sexual abuse
material indicated by all uniform resource locators on the list of uniform resource
locators included in the database of indicators, in accordance with Article 44(2),
point (b), and provided by the EU Centre.
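In engineering terms, paragraph 1 asks internet access providers to match requested resources against the EU Centre's list of uniform resource locators. The sketch below shows one plausible, simplified mechanism; the Regulation prescribes only "reasonable measures", and the normalisation step and all names are assumptions of this note:

```python
from urllib.parse import urlsplit

# A minimal sketch of the paragraph 1 blocking obligation: requested URLs
# are normalised and refused if they appear on the EU Centre's list.
def normalise(url: str) -> str:
    parts = urlsplit(url.lower())
    return f"{parts.netloc}{parts.path}"

BLOCKLIST = {normalise("https://example.org/abuse/item1")}  # hypothetical entry

def is_blocked(requested_url: str) -> bool:
    return normalise(requested_url) in BLOCKLIST

print(is_blocked("HTTPS://EXAMPLE.ORG/abuse/item1"))  # True
print(is_blocked("https://example.org/other"))        # False
```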
2. The Coordinating Authority of establishment shall, before requesting the issuance of
a blocking order, carry out all investigations and assessments necessary to determine
whether the conditions of paragraph 4 have been met.
To that end, it shall, where appropriate:
(a) verify that, in respect of all or a representative sample of the uniform resource
locators on the list referred to in paragraph 1, the conditions of Article 36(1),
point (b), are met, including by carrying out checks to verify in cooperation
with the EU Centre that the list is complete, accurate and up-to-date;
(b) require the provider to submit, within a reasonable time period set by that
Coordinating Authority, the necessary information, in particular regarding the
accessing or attempting to access by users of the child sexual abuse material
indicated by the uniform resource locators, regarding the provider’s policy to
address the risk of dissemination of the child sexual abuse material and
regarding the provider’s financial and technological capabilities and size;
(c) request the EU Centre to provide the necessary information, in particular
explanations and assurances regarding the accuracy of the uniform resource
locators in indicating child sexual abuse material, regarding the quantity and
nature of that material and regarding the verifications by the EU Centre and the
audits referred to in Article 36(2) and Article 46(7), respectively;
(d) request any other relevant public authority or relevant experts or entities to
provide the necessary information.
3. The Coordinating Authority of establishment shall, before requesting the issuance of
the blocking order, inform the provider of its intention to request the issuance of the
blocking order, specifying the main elements of the content of the intended blocking
order and the reasons to request the blocking order. It shall afford the provider an
opportunity to comment on that information, within a reasonable time period set by
that Coordinating Authority.
4. The Coordinating Authority of establishment shall request the issuance of the
blocking order, and the competent judicial authority or independent administrative
authority shall issue the blocking order, where it considers that the following
conditions are met:
(a) there is evidence of the service having been used during the past 12 months, to
an appreciable extent, for accessing or attempting to access the child sexual
abuse material indicated by the uniform resource locators;
(b) the blocking order is necessary to prevent the dissemination of the child sexual
abuse material to users in the Union, having regard in particular to the quantity
and nature of that material, the need to protect the rights of the victims and the
existence and implementation by the provider of a policy to address the risk of
such dissemination;
(c) the uniform resource locators indicate, in a sufficiently reliable manner, child
sexual abuse material;
(d) the reasons for issuing the blocking order outweigh negative consequences for
the rights and legitimate interests of all parties affected, having regard in
particular to the need to ensure a fair balance between the fundamental rights
of those parties, including the exercise of the users’ freedom of expression and
information and the provider’s freedom to conduct a business.
When assessing whether the conditions of the first subparagraph have been met,
account shall be taken of all relevant facts and circumstances of the case at hand,
including any information obtained pursuant to paragraph 2 and the views of the
provider submitted in accordance with paragraph 3.
5. The Coordinating Authority of establishment, when requesting the issuance of
blocking orders, and the competent judicial or independent administrative authority,
when issuing the blocking order, shall:
(a) specify effective and proportionate limits and safeguards necessary to ensure
that any negative consequences referred to in paragraph 4, point (d), remain
limited to what is strictly necessary;
(b) subject to paragraph 6, ensure that the period of application remains limited to
what is strictly necessary.
6. The competent judicial authority or independent administrative authority shall
specify in the blocking order the period during which it applies, indicating the start
date and the end date.
The period of application of blocking orders shall not exceed five years.
7. In respect of the blocking orders that the competent judicial authority or independent
administrative authority issued at its request, the Coordinating Authority shall, where
necessary and at least once every year, assess whether any substantial changes to the
grounds for issuing the blocking orders occurred and, in particular, whether the
conditions of paragraph 4 continue to be met.
That Coordinating Authority shall request the competent judicial authority or
independent administrative authority that issued the blocking order to modify or
revoke the order, where necessary in the light of the outcome of that assessment or
to take account of justified requests or the reports referred to in Article 18(5) and
(6), respectively. The provisions of this Section shall apply to such requests, mutatis
mutandis.
Article 17
Additional rules regarding blocking orders
1. The competent judicial authority or the independent administrative authority shall
issue the blocking orders referred to in Article 16 using the template set out in
Annex VII. Blocking orders shall
include:
(a) the reference to the list of uniform resource locators, provided by the EU
Centre, and the safeguards to be provided for, including the limits and
safeguards specified pursuant to Article 16(5) and, where applicable, the
reporting requirements set pursuant to Article 18(6);
(b) identification details of the competent judicial authority or the independent
administrative authority issuing the blocking order and authentication of the
blocking order by that authority;
(c) the name of the provider and, where applicable, its legal representative;
(d) the specific service in respect of which the blocking order is issued;
(e) the start date and the end date of the blocking order;
(f) a sufficiently detailed statement of reasons explaining why the blocking order
is issued;
(g) a reference to this Regulation as the legal basis for the blocking order;
(h) the date, time stamp and electronic signature of the judicial authority or the
independent administrative authority issuing the blocking order;
(i) easily understandable information about the redress available to the addressee
of the blocking order, including information about redress to a court and about
the time periods applicable to such redress.
2. The competent judicial authority or independent administrative authority issuing the
blocking order shall address it to the main establishment of the provider or, where
applicable, to its legal representative designated in accordance with Article 24.
3. The blocking order shall be transmitted to the provider’s point of contact referred to
in Article 23(1), to the Coordinating Authority of establishment and to the EU
Centre, through the system established in accordance with Article 39(2).
4. The blocking order shall be drafted in the language declared by the provider pursuant
to Article 23(3).
5. If the provider cannot execute the blocking order because it contains manifest errors
or does not contain sufficient information for its execution, the provider shall,
without undue delay, request the necessary clarification from the Coordinating
Authority of establishment, using the template set out in Annex VIII.
6. The Commission shall be empowered to adopt delegated acts in accordance with
Article 86 in order to amend Annexes VII and VIII where necessary to improve the
templates in view of relevant technological developments or practical experiences
gained.
Article 18
Redress, information and reporting of blocking orders
1. Providers of internet access services that have received a blocking order, as well as
users who provided or were prevented from accessing a specific item of material
indicated by the uniform resource locators in execution of such orders, shall have a
right to effective redress. That right shall include the right to challenge the blocking
order before the courts of the Member State of the competent judicial authority or
independent administrative authority that issued the blocking order.
2. When the blocking order becomes final, the competent judicial authority or
independent administrative authority that issued the blocking order shall, without
undue delay, transmit a copy thereof to the Coordinating Authority of establishment.
The Coordinating Authority of establishment shall then, without undue delay,
transmit a copy thereof to all other Coordinating Authorities through the system
established in accordance with Article 39(2).
For the purpose of the first subparagraph, a blocking order shall become final upon
the expiry of the time period for appeal where no appeal has been lodged in
accordance with national law or upon confirmation of the blocking order following
an appeal.
3. The provider shall establish and operate an accessible, age-appropriate and user-
friendly mechanism that allows users to submit to it, within a reasonable timeframe,
complaints about alleged infringements of its obligations under this Section. It shall
process such complaints in an objective, effective and timely manner.
4. Where a provider prevents users from accessing the uniform resource locators
pursuant to a blocking order issued in accordance with Article 17, it shall take
reasonable measures to inform the users of the following:
(a) the fact that it does so pursuant to a blocking order;
(b) the reasons for doing so, providing, upon request, a copy of the blocking order;
(c) the users’ right of judicial redress referred to in paragraph 1, their rights to
submit complaints to the provider through the mechanism referred to in
paragraph 3 and to the Coordinating Authority in accordance with Article 34,
as well as their right to submit the requests referred to in paragraph 5.
5. The provider and the users referred to in paragraph 1 shall be entitled to request the
Coordinating Authority that requested the issuance of the blocking order to assess
whether users are wrongly prevented from accessing a specific item of material
indicated by uniform resource locators pursuant to the blocking order. The provider
shall also be entitled to request modification or revocation of the blocking order,
where it considers it necessary due to substantial changes to the grounds for issuing
the blocking orders that occurred after the issuance thereof, in particular substantial
changes preventing the provider from taking the required reasonable measures to
execute the blocking order.
The Coordinating Authority shall, without undue delay, diligently assess such
requests and inform the provider or the user submitting the request of the outcome
thereof. Where it considers the request to be justified, it shall request modification or
revocation of the blocking order in accordance with Article 16(7) and inform the EU
Centre.
6. Where the period of application of the blocking order exceeds 24 months, the
Coordinating Authority of establishment shall require the provider to report to it on
the measures taken to execute the blocking order, including the safeguards provided
for, at least once, halfway through the period of application.
Section 6
Additional provisions
Article 19
Liability of providers
Providers of relevant information society services shall not be liable for child sexual abuse
offences solely because they carry out, in good faith, the necessary activities to comply with
the requirements of this Regulation, in particular activities aimed at detecting, identifying,
removing, disabling of access to, blocking or reporting online child sexual abuse in
accordance with those requirements.
Article 20
Victims’ right to information
1. Persons residing in the Union shall have the right to receive, upon their request, from
the Coordinating Authority designated by the Member State where they reside,
information regarding any instances where the dissemination of known child sexual
abuse material depicting them is reported to the EU Centre pursuant to Article 12.
Persons with disabilities shall have the right to ask for and receive such information
in a manner accessible to them.
That Coordinating Authority shall transmit the request to the EU Centre through the
system established in accordance with Article 39(2) and shall communicate the
results received from the EU Centre to the person making the request.
2. The request referred to in paragraph 1 shall indicate:
(a) the relevant item or items of known child sexual abuse material;
(b) where applicable, the individual or entity that is to receive the information on
behalf of the person making the request;
(c) sufficient elements to demonstrate the identity of the person making the
request.
3. The information referred to in paragraph 1 shall include:
(a) the identification of the provider that submitted the report;
(b) the date of the report;
(c) whether the EU Centre forwarded the report in accordance with Article 48(3)
and, if so, to which authorities;
(d) whether the provider reported having removed or disabled access to the
material, in accordance with Article 13(1), point (i).
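Paragraphs 2 and 3 together define a simple request/response exchange. The hypothetical shapes below illustrate the fields involved; the key names are assumptions of this note, not the Regulation's terminology:

```python
# Hypothetical shapes for the Article 20 information flow: the request
# fields of paragraph 2 and the response fields of paragraph 3.
request = {
    "items": ["<identifier of the known material depicting the requester>"],  # 2(a)
    "recipient": None,                                         # 2(b): optional delegate
    "identity_evidence": "<elements demonstrating identity>",  # 2(c)
}
response = {
    "reporting_provider": "ExampleHost",   # 3(a)
    "report_date": "2024-03-02",           # 3(b)
    "forwarded_to": ["Europol"],           # 3(c), cf. Article 48(3)
    "removed_or_disabled": True,           # 3(d), cf. Article 13(1), point (i)
}
print(f"{len(response)} fields returned for {len(request['items'])} item(s)")
```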
Article 21
Victims’ right of assistance and support for removal
1. Providers of hosting services shall provide reasonable assistance, on request, to
persons residing in the Union that seek to have one or more specific items of known
child sexual abuse material depicting them removed or to have access thereto
disabled by the provider.
2. Persons residing in the Union shall have the right to receive, upon their request, from
the Coordinating Authority designated by the Member State where the person
resides, support from the EU Centre when they seek to have a provider of hosting
services remove or disable access to one or more specific items of known child
sexual abuse material depicting them. Persons with disabilities shall have the right to
ask for and receive any information relating to such support in a manner accessible
to them.
That Coordinating Authority shall transmit the request to the EU Centre through the
system established in accordance with Article 39(2) and shall communicate the
results received from the EU Centre to the person making the request.
3. The requests referred to in paragraphs 1 and 2 shall indicate the relevant item or
items of child sexual abuse material.
4. The EU Centre’s support referred to in paragraph 2 shall include, as applicable:
(a) support in connection to requesting the provider’s assistance referred to in
paragraph 1;
(b) verifying whether the provider removed or disabled access to that item or those
items, including by conducting the searches referred to in Article 49(1);
(c) notifying the item or items of known child sexual abuse material depicting the
person to the provider and requesting removal or disabling of access, in
accordance with Article 49(2);
(d) where necessary, informing the Coordinating Authority of establishment of the
presence of that item or those items on the service, with a view to the issuance
of a removal order pursuant to Article 14.
Article 22
Preservation of information
1. Providers of hosting services and providers of interpersonal communications services
shall preserve the content data and other data processed in connection to the
measures taken to comply with this Regulation and the personal data generated
through such processing, only for one or more of the following purposes, as
applicable:
(a) executing a detection order issued pursuant to Article 7, or a removal order
issued pursuant to Article 14;
(b) reporting potential online child sexual abuse to the EU Centre pursuant to
Article 12;
(c) blocking the account of, or suspending or terminating the provision of the
service to, the user concerned;
(d) handling users’ complaints to the provider or to the Coordinating Authority, or
the exercise of users’ right to administrative or judicial redress, in respect of
alleged infringements of this Regulation;
(e) responding to requests issued by competent law enforcement authorities and
judicial authorities in accordance with the applicable law, with a view to
providing them with the necessary information for the prevention, detection,
investigation or prosecution of child sexual abuse offences, insofar as the
content data and other data relate to a report that the provider has submitted to
the EU Centre pursuant to Article 12.
As regards the first subparagraph, point (a), the provider may also preserve the
information for the purpose of improving the effectiveness and accuracy of the
technologies to detect online child sexual abuse for the execution of a detection order
issued to it in accordance with Article 7. However, it shall not store any personal data
for that purpose.
2. Providers shall preserve the information referred to in paragraph 1 for no longer than
necessary for the applicable purpose and, in any event, no longer than 12 months
from the date of the reporting or of the removal or disabling of access, whichever
occurs first.
They shall, upon request from the competent national authority or court, preserve the
information for a further specified period, set by that authority or court, where and to
the extent necessary for ongoing administrative or judicial redress proceedings, as
referred to in paragraph 1, point (d).
Providers shall ensure that the information referred to in paragraph 1 is preserved in
a secure manner and that the preservation is subject to appropriate technical and
organisational safeguards. Those safeguards shall ensure, in particular, that the
information can be accessed and processed only for the purpose for which it is
preserved, that a high level of security is achieved and that the information is deleted
upon the expiry of the applicable time periods for preservation. Providers shall
regularly review those safeguards and adjust them where necessary.
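Paragraph 2 amounts to a retention rule keyed to the earlier of two anchor dates, with a possible court-ordered extension. The sketch below approximates "12 months" as 365 days; the record layout and helper name are assumptions of this note:

```python
from datetime import date, timedelta

# Illustrative retention check for Article 22(2): data is kept no longer
# than 12 months from the report or the removal/disabling, whichever came
# first, unless an authority has set a further specified period.
def must_delete(record: dict, today: date) -> bool:
    anchors = [d for d in (record.get("reported_on"), record.get("removed_on")) if d]
    if not anchors:
        return False  # nothing preserved under Article 22 yet
    expiry = min(anchors) + timedelta(days=365)
    extension = record.get("extended_until")  # set by a national authority or court
    return today > max(expiry, extension or expiry)

rec = {"reported_on": date(2024, 1, 10), "removed_on": date(2024, 1, 12)}
print(must_delete(rec, date(2025, 2, 1)))   # True: past 12 months, no extension
rec["extended_until"] = date(2025, 6, 1)
print(must_delete(rec, date(2025, 2, 1)))   # False: ordered extension applies
```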
Article 23
Points of contact
1. Providers of relevant information society services shall establish a single point of
contact allowing for direct communication, by electronic means, with the
Coordinating Authorities, other competent authorities of the Member States, the
Commission and the EU Centre, for the application of this Regulation.
2. The providers shall communicate to the EU Centre and make public the information
necessary to easily identify and communicate with their single points of contact,
including their names, addresses, electronic mail addresses and telephone numbers.
3. The providers shall specify in the information referred to in paragraph 2 the official
language or languages of the Union that can be used to communicate with their
points of contact.
The specified languages shall include at least one of the official languages of the
Member State in which the provider has its main establishment or, where applicable,
where its legal representative resides or is established.
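Paragraphs 2 and 3 reduce to publishing a small contact record and checking that the declared languages include at least one official language of the Member State of main establishment. The sketch below uses a deliberately truncated language mapping and a hypothetical record layout:

```python
# Truncated, illustrative mapping of Member States to official languages.
OFFICIAL_LANGUAGES = {"IE": {"en", "ga"}, "BE": {"nl", "fr", "de"}, "EE": {"et"}}

contact_point = {
    "name": "ExampleChat Compliance Desk",
    "address": "1 Example Street, Dublin",
    "email": "csa-regulation@example.com",
    "telephone": "+353 1 000 0000",
    "languages": ["en", "fr"],          # declared under paragraph 3
    "main_establishment": "IE",
}

def languages_valid(record: dict) -> bool:
    """Check the paragraph 3 floor: at least one official language declared."""
    official = OFFICIAL_LANGUAGES[record["main_establishment"]]
    return bool(official.intersection(record["languages"]))

print(languages_valid(contact_point))  # True: "en" is official in Ireland
```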
Article 24
Legal representative
1. Providers of relevant information society services which do not have their main
establishment in the Union shall designate, in writing, a natural or legal person as
their legal representative in the Union.
2. The legal representative shall reside or be established in one of the Member States
where the provider offers its services.
3. The provider shall mandate its legal representative to be addressed in addition to or
instead of the provider by the Coordinating Authorities, other competent authorities
of the Member States and the Commission on all issues necessary for the receipt of,
compliance with and enforcement of decisions issued in relation to this Regulation,
including detection orders, removal orders and blocking orders.
4. The provider shall provide its legal representative with the necessary powers and
resources to cooperate with the Coordinating Authorities, other competent authorities
of the Member States and the Commission and comply with the decisions referred to
in paragraph 3.
5. The designated legal representative may be held liable for non-compliance with
obligations of the provider under this Regulation, without prejudice to the liability
and legal actions that could be initiated against the provider.
6. The provider shall notify the name, address, the electronic mail address and
telephone number of its legal representative designated pursuant to paragraph 1 to
the Coordinating Authority in the Member State where that legal representative
resides or is established, and to the EU Centre. The provider shall ensure that that
information is up to date and publicly available.
7. The designation of a legal representative within the Union pursuant to paragraph 1
shall not amount to an establishment in the Union.
CHAPTER III
SUPERVISION, ENFORCEMENT AND COOPERATION
Section 1
Coordinating Authorities for child sexual abuse issues
Article 25
Coordinating Authorities for child sexual abuse issues and other competent authorities
1. Member States shall, by [Date - two months from the date of entry into force of this
Regulation], designate one or more competent authorities as responsible for the
application and enforcement of this Regulation (‘competent authorities’).
2. Member States shall, by the date referred to in paragraph 1, designate one of the
competent authorities as their Coordinating Authority for child sexual abuse issues
(‘Coordinating Authority’).
The Coordinating Authority shall be responsible for all matters related to application
and enforcement of this Regulation in the Member State concerned, unless that
Member State has assigned certain specific tasks or sectors to other competent
authorities.
The Coordinating Authority shall in any event be responsible for ensuring
coordination at national level in respect of those matters and for contributing to the
effective, efficient and consistent application and enforcement of this Regulation
throughout the Union.
3. Where a Member State designates more than one competent authority in addition to
the Coordinating Authority, it shall ensure that the respective tasks of those
authorities and of the Coordinating Authority are clearly defined and that they
cooperate closely and effectively when performing their tasks. The Member State
concerned shall communicate the name of the other competent authorities as well as
their respective tasks to the EU Centre and the Commission.
4. Within one week after the designation of the Coordinating Authorities and any other
competent authorities pursuant to paragraph 1, Member States shall make publicly
available, and communicate to the Commission and the EU Centre, the name of their
Coordinating Authority. They shall keep that information updated.
5. Each Member State shall ensure that a contact point is designated or established
within the Coordinating Authority’s office to handle requests for clarification,
feedback and other communications in relation to all matters related to the
application and enforcement of this Regulation in that Member State. Member States
shall make the information on the contact point publicly available and communicate
it to the EU Centre. They shall keep that information updated.
6. Within two weeks after the designation of the Coordinating Authorities pursuant to
paragraph 2, the EU Centre shall set up an online register listing the Coordinating
Authorities and their contact points. The EU Centre shall regularly publish any
modification thereto.
7. Coordinating Authorities may, where necessary for the performance of their tasks
under this Regulation, request the assistance of the EU Centre in carrying out those
tasks, in particular by requesting the EU Centre to:
(a) provide certain information or technical expertise on matters covered by this
Regulation;
(b) assist in assessing, in accordance with Article 5(2), the risk assessment
conducted or updated or the mitigation measures taken by a provider of hosting
or interpersonal communication services under the jurisdiction of the Member
State that designated the requesting Coordinating Authority;
(c) verify the possible need to request competent national authorities to issue a
detection order, a removal order or a blocking order in respect of a service
under the jurisdiction of the Member State that designated that Coordinating
Authority;
(d) verify the effectiveness of a detection order or a removal order issued upon the
request of the requesting Coordinating Authority.
8. The EU Centre shall provide such assistance free of charge and in accordance with
its tasks and obligations under this Regulation and insofar as its resources and
priorities allow.
9. The requirements applicable to Coordinating Authorities set out in Articles 26, 27,
28, 29 and 30 shall also apply to any other competent authorities that the Member
States designate pursuant to paragraph 1.
Article 26
Requirements for Coordinating Authorities
1. Member States shall ensure that the Coordinating Authorities that they designated
perform their tasks under this Regulation in an objective, impartial, transparent and
timely manner, while fully respecting the fundamental rights of all parties affected.
Member States shall ensure that their Coordinating Authorities have adequate
technical, financial and human resources to carry out their tasks.
2. When carrying out their tasks and exercising their powers in accordance with this
Regulation, the Coordinating Authorities shall act with complete independence. To
that aim, Member States shall ensure, in particular, that they:
(a) are legally and functionally independent from any other public authority;
(b) have a status enabling them to act objectively and impartially when carrying
out their tasks under this Regulation;
(c) are free from any external influence, whether direct or indirect;
(d) neither seek nor take instructions from any other public authority or any private
party;
(e) are not charged with tasks relating to the prevention or combating of child
sexual abuse, other than their tasks under this Regulation.
3. Paragraph 2 shall not prevent supervision of the Coordinating Authorities in
accordance with national constitutional law, to the extent that such supervision does
not affect their independence as required under this Regulation.
4. The Coordinating Authorities shall ensure that relevant members of staff have the
required qualifications, experience and technical skills to perform their duties.
5. The management and other staff of the Coordinating Authorities shall, in accordance
with Union or national law, be subject to a duty of professional secrecy both during
and after their term of office, with regard to any confidential information which has
come to their knowledge in the course of the performance of their tasks. Member
States shall ensure that the management and other staff are subject to rules
guaranteeing that they can carry out their tasks in an objective, impartial and
independent manner, in particular as regards their appointment, dismissal,
remuneration and career prospects.
Section 2
Powers of Coordinating Authorities
Article 27
Investigatory powers
1. Where needed for carrying out their tasks, Coordinating Authorities shall have the
following powers of investigation, in respect of providers of relevant information
society services under the jurisdiction of the Member State that designated them:
(a) the power to require those providers, as well as any other persons acting for
purposes related to their trade, business, craft or profession that may
reasonably be aware of information relating to a suspected infringement of this
Regulation, to provide such information within a reasonable time period;
(b) the power to carry out on-site inspections of any premises that those providers
or the other persons referred to in point (a) use for purposes related to their
trade, business, craft or profession, or to request other public authorities to do
so, in order to examine, seize, take or obtain copies of information relating to a
suspected infringement of this Regulation in any form, irrespective of the
storage medium;
(c) the power to ask any member of staff or representative of those providers or the
other persons referred to in point (a) to give explanations in respect of any
information relating to a suspected infringement of this Regulation and to
record the answers;
(d) the power to request information, including to assess whether the measures
taken to execute a detection order, removal order or blocking order comply
with the requirements of this Regulation.
2. Member States may grant additional investigative powers to the Coordinating
Authorities.
Article 28
Enforcement powers
1. Where needed for carrying out their tasks, Coordinating Authorities shall have the
following enforcement powers, in respect of providers of relevant information
society services under the jurisdiction of the Member State that designated them:
(a) the power to accept the commitments offered by those providers in relation to
their compliance with this Regulation and to make those commitments binding;
(b) the power to order the cessation of infringements of this Regulation and, where
appropriate, to impose remedies proportionate to the infringement and
necessary to bring the infringement effectively to an end;
(c) the power to impose fines, or request a judicial authority in their Member State
to do so, in accordance with Article 35 for infringements of this Regulation,
including non-compliance with any of the orders issued pursuant to Article 27
and to point (b) of this paragraph;
(d) the power to impose a periodic penalty payment in accordance with Article 35
to ensure that an infringement of this Regulation is terminated in compliance
with an order issued pursuant to point (b) of this paragraph or for failure to
comply with any of the orders issued pursuant to Article 27 and to point (b) of
this paragraph;
(e) the power to adopt interim measures to avoid the risk of serious harm.
2. Member States may grant additional enforcement powers to the Coordinating
Authorities.
3. As regards paragraph 1, points (c) and (d), Coordinating Authorities shall have the
enforcement powers set out in those points also in respect of the other persons
referred to in Article 27, for failure to comply with any of the orders issued to them
pursuant to that Article.
4. They shall only exercise those enforcement powers after having provided those other
persons in good time with all relevant information relating to such orders, including
the applicable time period, the fines or periodic payments that may be imposed for
failure to comply and redress possibilities.
Article 29
Additional enforcement powers
1. Where needed for carrying out their tasks, Coordinating Authorities shall have the
additional enforcement powers referred to in paragraph 2, in respect of providers of
relevant information society services under the jurisdiction of the Member State that
designated them, provided that:
(a) all other powers pursuant to Articles 27 and 28 to bring about the cessation of
an infringement of this Regulation have been exhausted;
(b) the infringement persists;
(c) the infringement causes serious harm which cannot be avoided through the
exercise of other powers available under Union or national law.
2. Coordinating Authorities shall have the additional enforcement powers to take the
following measures:
(a) require the management body of the providers to examine the situation within a
reasonable time period and to:
(i) adopt and submit an action plan setting out the necessary measures to
terminate the infringement;
(ii) ensure that the provider takes those measures;
(iii) report on the measures taken;
(b) request the competent judicial authority or independent administrative
authority of the Member State that designated the Coordinating Authority to
order the temporary restriction of access of users of the service concerned by
the infringement or, only where that is not technically feasible, to the online
interface of the provider on which the infringement takes place, where the
Coordinating Authority considers that:
(i) the provider has not sufficiently complied with the requirements of point
(a);
(ii) the infringement persists and causes serious harm;
(iii) the infringement results in the regular and structural facilitation of child
sexual abuse offences.
3. The Coordinating Authority shall, prior to submitting the request referred to in
paragraph 2, point (b), invite interested parties to submit written observations on its
intention to submit that request within a reasonable time period set by that
Coordinating Authority. That time period shall not be less than two weeks.
The invitation to submit written observations shall:
(a) describe the measures that it intends to request;
(b) identify the intended addressee or addressees thereof.
The provider, the intended addressee or addressees and any other third party
demonstrating a legitimate interest shall be entitled to participate in the proceedings
regarding the request.
4. Any measure ordered upon the request referred to in paragraph 2, point (b), shall be
proportionate to the nature, gravity, recurrence and duration of the infringement,
without unduly restricting access to lawful information by users of the service
concerned.
The temporary restriction shall apply for a period of four weeks, subject to the
possibility for the competent judicial authority, in its order, to allow the Coordinating
Authority to extend that period for further periods of the same length, subject to a
maximum number of extensions set by that judicial authority.
The Coordinating Authority shall only extend the period where it considers, having
regard to the rights and legitimate interests of all parties affected by the restriction
and all relevant facts and circumstances, including any information that the provider,
the addressee or addressees and any other third party that demonstrated a legitimate
interest may provide to it, that both of the following conditions have been met:
(a) the provider has failed to take the necessary measures to terminate the
infringement;
(b) the temporary restriction does not unduly restrict access to lawful information
by users of the service, having regard to the number of users affected and
whether any adequate and readily accessible alternatives exist.
Where the Coordinating Authority considers that those two conditions have been met
but it cannot further extend the period pursuant to the second subparagraph, it shall
submit a new request to the competent judicial authority, as referred to in paragraph
2, point (b).
Article 30
Common provisions on investigatory and enforcement powers
1. The measures taken by the Coordinating Authorities in the exercise of their
investigatory and enforcement powers referred to in Articles 27, 28 and 29 shall be
effective, dissuasive and proportionate, having regard, in particular, to the nature,
gravity, recurrence and duration of the infringement of this Regulation or suspected
infringement to which those measures relate, as well as the economic, technical and
operational capacity of the provider of relevant information society services
concerned, where applicable.
2. Member States shall ensure that any exercise of the investigatory and enforcement
powers referred to in Articles 27, 28 and 29 is subject to adequate safeguards laid
down in the applicable national law to respect the fundamental rights of all parties
affected. In particular, those measures shall only be taken in accordance with the
right to respect for private life and the rights of defence, including the rights to be
heard and of access to the file, and subject to the right to an effective judicial remedy
of all parties affected.
Article 31
Searches to verify compliance
Coordinating Authorities shall have the power to carry out searches on publicly accessible
material on hosting services to detect the dissemination of known or new child sexual abuse
material, using the indicators contained in the databases referred to in Article 44(1), points (a)
and (b), where necessary to verify whether the providers of hosting services under the
jurisdiction of the Member State that designated the Coordinating Authorities comply with
their obligations under this Regulation.
Article 32
Notification of known child sexual abuse material
Coordinating Authorities shall have the power to notify providers of hosting services under
the jurisdiction of the Member State that designated them of the presence on their service of
one or more specific items of known child sexual abuse material and to request them to
remove or disable access to that item or those items, for the providers’ voluntary
consideration.
The request shall clearly set out the identification details of the Coordinating Authority
making the request and information on its contact point referred to in Article 25(5), the
necessary information for the identification of the item or items of known child sexual abuse
material concerned, as well as the reasons for the request. The request shall also clearly state
that it is for the provider’s voluntary consideration.
Section 3
Other provisions on enforcement
Article 33
Jurisdiction
1. The Member State in which the main establishment of the provider of relevant
information society services is located shall have jurisdiction for the purposes of this
Regulation.
2. A provider of relevant information society services which does not have an
establishment in the Union shall be deemed to be under the jurisdiction of the
Member State where its legal representative resides or is established.
Where a provider failed to appoint a legal representative in accordance with Article
24, all Member States shall have jurisdiction. Where a Member State decides to
exercise jurisdiction under this subparagraph, it shall inform all other Member States
and ensure that the principle of ne bis in idem is respected.
Article 34
Right of users of the service to lodge a complaint
1. Users shall have the right to lodge a complaint alleging an infringement of this
Regulation affecting them against providers of relevant information society services
with the Coordinating Authority designated by the Member State where the user
resides or is established.
2. Coordinating Authorities shall provide child-friendly mechanisms to submit a
complaint under this Article and adopt a child-sensitive approach when handling
complaints submitted by children, taking due account of the child's age, maturity,
views, needs and concerns.
3. The Coordinating Authority receiving the complaint shall assess the complaint and,
where appropriate, transmit it to the Coordinating Authority of establishment.
Where the complaint falls under the responsibility of another competent authority of
the Member State that designated the Coordinating Authority receiving the
complaint, that Coordinating Authority shall transmit it to that other competent
authority.
Article 35
Penalties
1. Member States shall lay down the rules on penalties applicable to infringements of
the obligations pursuant to Chapters II and V of this Regulation by providers of
relevant information society services under their jurisdiction and shall take all the
necessary measures to ensure that they are implemented.
The penalties shall be effective, proportionate and dissuasive. Member States shall,
by [Date of application of this Regulation], notify the Commission of those rules and
of those measures and shall notify it, without delay, of any subsequent amendments
affecting them.
2. Member States shall ensure that the maximum amount of penalties imposed for an
infringement of this Regulation shall not exceed 6 % of the annual income or global
turnover of the preceding business year of the provider.
3. Penalties for the supply of incorrect, incomplete or misleading information, failure to
reply or rectify incorrect, incomplete or misleading information or to submit to an
on-site inspection shall not exceed 1% of the annual income or global turnover of the
preceding business year of the provider or the other person referred to in Article 27.
4. Member States shall ensure that the maximum amount of a periodic penalty payment
shall not exceed 5 % of the average daily global turnover of the provider or the other
person referred to in Article 27 in the preceding financial year per day, calculated
from the date specified in the decision concerned.
EN 78 EN
5. Member States shall ensure that, when deciding whether to impose a penalty and
when determining the type and level of penalty, account is taken of all relevant
circumstances, including:
(a) the nature, gravity and duration of the infringement;
(b) whether the infringement was intentional or negligent;
(c) any previous infringements by the provider or the other person;
(d) the financial strength of the provider or the other person;
(e) the level of cooperation of the provider or the other person;
(f) the nature and size of the provider or the other person, in particular whether it
is a micro, small or medium-sized enterprise;
(g) the degree of fault of the provider or other person, taking into account the
technical and organisational measures taken by it to comply with this
Regulation.
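The caps in paragraphs 2 to 4 reduce to simple percentages of turnover. The following worked example in Python uses a wholly hypothetical provider; note that the 365-day divisor for the daily average is an assumption for illustration, as the Regulation does not specify how average daily turnover is computed.

```python
# Hypothetical provider figures; the Regulation sets only the percentage caps.
annual_turnover_eur = 500_000_000  # global turnover, preceding business year

max_penalty = 0.06 * annual_turnover_eur              # Article 35(2): 30,000,000 EUR
max_information_penalty = 0.01 * annual_turnover_eur  # Article 35(3): 5,000,000 EUR

# Article 35(4): periodic penalty capped per day at 5 % of average daily
# global turnover; a 365-day year is assumed here, the text does not say.
average_daily_turnover = annual_turnover_eur / 365
max_daily_periodic_penalty = 0.05 * average_daily_turnover  # ~68,493 EUR per day

print(max_penalty, max_information_penalty, round(max_daily_periodic_penalty))
```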
Section 4
Cooperation
Article 36
Identification and submission of online child sexual abuse
1. Coordinating Authorities shall submit to the EU Centre, without undue delay and
through the system established in accordance with Article 39(2):
(a) specific items of material and transcripts of conversations that Coordinating
Authorities or that the competent judicial authorities or other independent
administrative authorities of a Member State have identified, after a diligent
assessment, as constituting child sexual abuse material or the solicitation of
children, as applicable, for the EU Centre to generate indicators in accordance
with Article 44(3);
(b) exact uniform resource locators indicating specific items of material that
Coordinating Authorities or that competent judicial authorities or other
independent administrative authorities of a Member State have identified, after
a diligent assessment, as constituting child sexual abuse material, hosted by
providers of hosting services not offering services in the Union, that cannot be
removed due to those providers’ refusal to remove or disable access thereto and
to the lack of cooperation by the competent authorities of the third country
having jurisdiction, for the EU Centre to compile the list of uniform resource
locators in accordance with Article 44(3).
Member States shall take the necessary measures to ensure that the Coordinating
Authorities that they designated receive, without undue delay, the material identified
as child sexual abuse material, the transcripts of conversations identified as the
solicitation of children, and the uniform resource locators, identified by a competent
judicial authority or an independent administrative authority other than the Coordinating
Authority, for submission to the EU Centre in accordance with the first
subparagraph.
2. Upon the request of the EU Centre where necessary to ensure that the data contained
in the databases referred to in Article 44(1) are complete, accurate and up-to-date,
Coordinating Authorities shall verify or provide clarifications or additional
information as to whether the conditions of paragraph 1, points (a) and (b) have been
and, where relevant, continue to be met, in respect of a given submission to the EU
Centre in accordance with that paragraph.
3. Member States shall ensure that, where their law enforcement authorities receive a
report of the dissemination of new child sexual abuse material or of the solicitation
of children forwarded to them by the EU Centre in accordance with Article 48(3), a
diligent assessment is conducted in accordance with paragraph 1 and, if the material
or conversation is identified as constituting child sexual abuse material or as the
solicitation of children, the Coordinating Authority submits the material to the EU
Centre, in accordance with that paragraph, within one month from the date of
reception of the report or, where the assessment is particularly complex, two months
from that date.
4. They shall also ensure that, where the diligent assessment indicates that the material
does not constitute child sexual abuse material or the solicitation of children, the
Coordinating Authority is informed of that outcome and subsequently informs the
EU Centre thereof, within the time periods specified in the first subparagraph.
Article 37
Cross-border cooperation among Coordinating Authorities
1. Where a Coordinating Authority that is not the Coordinating Authority of
establishment has reasons to suspect that a provider of relevant information society
services infringed this Regulation, it shall request the Coordinating Authority of
establishment to assess the matter and take the necessary investigatory and
enforcement measures to ensure compliance with this Regulation.
Where the Commission has reasons to suspect that a provider of relevant information
society services infringed this Regulation in a manner involving at least three
Member States, it may recommend that the Coordinating Authority of establishment
assess the matter and take the necessary investigatory and enforcement measures to
ensure compliance with this Regulation.
2. The request or recommendation referred to in paragraph 1 shall at least indicate:
(a) the point of contact of the provider as set out in Article 23;
(b) a description of the relevant facts, the provisions of this Regulation concerned
and the reasons why the Coordinating Authority that sent the request, or the
Commission, suspects that the provider infringed this Regulation;
(c) any other information that the Coordinating Authority that sent the request, or
the Commission, considers relevant, including, where appropriate, information
gathered on its own initiative and suggestions for specific investigatory or
enforcement measures to be taken.
3. The Coordinating Authority of establishment shall assess the suspected infringement,
taking into utmost account the request or recommendation referred to in paragraph 1.
Where it considers that it has insufficient information to assess the suspected
infringement or to act upon the request or recommendation and has reasons to
consider that the Coordinating Authority that sent the request, or the Commission,
could provide additional information, it may request such information. The time
period laid down in paragraph 4 shall be suspended until that additional information
is provided.
4. The Coordinating Authority of establishment shall, without undue delay and in any
event not later than two months following receipt of the request or recommendation
referred to in paragraph 1, communicate to the Coordinating Authority that sent the
request, or the Commission, the outcome of its assessment of the suspected
infringement, or that of any other competent authority pursuant to national law where
relevant, and, where applicable, an explanation of the investigatory or enforcement
measures taken or envisaged in relation thereto to ensure compliance with this
Regulation.
Article 38
Joint investigations
1. Coordinating Authorities may participate in joint investigations, which may be
coordinated with the support of the EU Centre, of matters covered by this
Regulation, concerning providers of relevant information society services that offer
their services in several Member States.
Such joint investigations are without prejudice to the tasks and powers of the
participating Coordinating Authorities and the requirements applicable to the
performance of those tasks and exercise of those powers provided for in this
Regulation.
2. The participating Coordinating Authorities shall make the results of the joint
investigations available to other Coordinating Authorities, the Commission and the
EU Centre, through the system established in accordance with Article 39(2), for the
fulfilment of their respective tasks under this Regulation.
Article 39
General cooperation and information-sharing system
1. Coordinating Authorities shall cooperate with each other, any other competent
authorities of the Member State that designated the Coordinating Authority, the
Commission, the EU Centre and other relevant Union agencies, including Europol, to
facilitate the performance of their respective tasks under this Regulation and ensure
its effective, efficient and consistent application and enforcement.
2. The EU Centre shall establish and maintain one or more reliable and secure
information sharing systems supporting communications between Coordinating
Authorities, the Commission, the EU Centre, other relevant Union agencies and
providers of relevant information society services.
3. The Coordinating Authorities, the Commission, the EU Centre, other relevant Union
agencies and providers of relevant information society services shall use the
information-sharing systems referred to in paragraph 2 for all relevant
communications pursuant to this Regulation.
4. The Commission shall adopt implementing acts laying down the practical and
operational arrangements for the functioning of the information-sharing systems
referred to in paragraph 2 and their interoperability with other relevant systems.
Those implementing acts shall be adopted in accordance with the advisory procedure
referred to in Article 87.
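Article 39 requires only that the system be reliable and secure, leaving the technical design to the implementing acts under paragraph 4. As a non-authoritative sketch of what message-level integrity protection could look like, the Python fragment below wraps each communication in an envelope authenticated with an HMAC; the key handling, field names and parties are assumptions made for the example, not part of the Regulation.

```python
import hashlib
import hmac
import json

# Hypothetical shared key; real key management is left to the implementing
# acts referred to in Article 39(4) and would not look like this.
SHARED_KEY = b"example-key-distributed-out-of-band"

def seal(sender: str, recipient: str, body: dict) -> dict:
    """Wrap a communication in an integrity-protected envelope."""
    payload = json.dumps({"from": sender, "to": recipient, "body": body},
                         sort_keys=True)
    tag = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify(envelope: dict) -> bool:
    expected = hmac.new(SHARED_KEY, envelope["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["tag"])

msg = seal("Coordinating Authority (EE)", "EU Centre",
           {"type": "submission", "legal_basis": "Article 36(1)(a)"})
assert verify(msg)
```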
CHAPTER IV
EU CENTRE TO PREVENT AND COMBAT CHILD SEXUAL ABUSE
Section 1
Principles
Article 40
Establishment and scope of action of the EU Centre
1. A European Union Agency to prevent and combat child sexual abuse, the EU Centre
on Child Sexual Abuse, is established.
2. The EU Centre shall contribute to the achievement of the objective of this Regulation
by supporting and facilitating the implementation of its provisions concerning the
detection, reporting, removal or disabling of access to, and blocking of, online child
sexual abuse, and by gathering and sharing information and expertise and facilitating
cooperation between relevant public and private parties in connection to the
prevention and combating of child sexual abuse, in particular online.
Article 41
Legal status
1. The EU Centre shall be a body of the Union with legal personality.
2. In each of the Member States the EU Centre shall enjoy the most extensive legal
capacity accorded to legal persons under their laws. It may, in particular, acquire and
dispose of movable and immovable property and be party to legal proceedings.
3. The EU Centre shall be represented by its Executive Director.
Article 42
Seat
The seat of the EU Centre shall be The Hague, The Netherlands.
Section 2
Tasks
Article 43
Tasks of the EU Centre
The EU Centre shall:
(1) facilitate the risk assessment process referred to in Section 1 of Chapter II, by:
(a) supporting the Commission in the preparation of the guidelines referred to in
Article 3(8), Article 4(5), Article 6(4) and Article 11, including by collecting
and providing relevant information, expertise and best practices, taking into
account advice from the Technology Committee referred to in Article 66;
(b) upon request from a provider of relevant information society services, providing an
analysis of anonymised data samples for the purpose referred to in Article 3(3);
(2) facilitate the detection process referred to in Section 2 of Chapter II, by:
(a) providing the opinions on intended detection orders referred to in Article 7(3),
first subparagraph, point (d);
(b) maintaining and operating the databases of indicators referred to in Article 44;
(c) giving providers of hosting services and providers of interpersonal
communications services that received a detection order access to the relevant
databases of indicators in accordance with Article 46;
(d) making technologies available to providers for the execution of detection
orders issued to them, in accordance with Article 50(1);
(3) facilitate the reporting process referred to in Section 3 of Chapter II, by:
(a) maintaining and operating the database of reports referred to in Article 45;
(b) assessing, processing and, where necessary, forwarding the reports and
providing feedback thereon in accordance with Article 48;
(4) facilitate the removal process referred to in Section 4 of Chapter II and the other
processes referred to in Sections 5 and 6 of that Chapter, by:
(a) receiving the removal orders transmitted to it pursuant to Article 14(4) in order
to fulfil the verification function referred to in Article 49(1);
(b) cooperating with and responding to requests of Coordinating Authorities in
connection to intended blocking orders as referred to in Article 16(2);
(c) receiving and processing the blocking orders transmitted to it pursuant to
Article 17(3);
(d) providing information and support to victims in accordance with Articles 20
and 21;
(e) maintaining up-to-date records of contact points and legal representatives of
providers of relevant information society services as provided in accordance
with Article 23(2) and Article 24(6);
(5) support the Coordinating Authorities and the Commission in the performance of their
tasks under this Regulation and facilitate cooperation, coordination and
communication in connection to matters covered by this Regulation, by:
(a) creating and maintaining an online register listing the Coordinating Authorities
and their contact points referred to in Article 25(6);
(b) providing assistance to the Coordinating Authorities as provided for in Article
25(7);
(c) assisting the Commission, upon its request, in connection to its tasks under the
cooperation mechanism referred to in Article 37;
(d) creating, maintaining and operating the information-sharing system referred to
in Article 39;
(e) assisting the Commission in the preparation of the delegated and implementing
acts and the guidelines that the Commission adopts under this Regulation;
(f) providing information to Coordinating Authorities, upon their request or on its
own initiative, relevant for the performance of their tasks under this
Regulation, including by informing the Coordinating Authority of
establishment of potential infringements identified in the performance of the
EU Centre’s other tasks;
(6) facilitate the generation and sharing of knowledge with other Union institutions,
bodies, offices and agencies, Coordinating Authorities or other relevant authorities of
the Member States to contribute to the achievement of the objective of this
Regulation, by:
(a) collecting, recording, analysing and providing information, providing analysis
based on anonymised and non-personal data gathering, and providing expertise
on matters regarding the prevention and combating of online child sexual
abuse, in accordance with Article 51;
(b) supporting the development and dissemination of research and expertise on
those matters and on assistance to victims, including by serving as a hub of
expertise to support evidence-based policy;
(c) drawing up the annual reports referred to in Article 84.
Article 44
Databases of indicators
1. The EU Centre shall create, maintain and operate databases of the following three
types of indicators of online child sexual abuse:
(a) indicators to detect the dissemination of child sexual abuse material previously
detected and identified as constituting child sexual abuse material in
accordance with Article 36(1);
(b) indicators to detect the dissemination of child sexual abuse material not
previously detected and identified as constituting child sexual abuse material in
accordance with Article 36(1);
(c) indicators to detect the solicitation of children.
2. The databases of indicators shall solely contain:
(a) relevant indicators, consisting of digital identifiers to be used to detect the
dissemination of known or new child sexual abuse material or the solicitation
of children, as applicable, on hosting services and interpersonal
communications services, generated by the EU Centre in accordance with
paragraph 3;
(b) as regards paragraph 1, point (a), the relevant indicators shall include a list of
uniform resource locators compiled by the EU Centre in accordance with
paragraph 3;
(c) the necessary additional information to facilitate the use of the indicators in
accordance with this Regulation, including identifiers allowing for a distinction
between images, videos and, where relevant, other types of material for the
detection of the dissemination of known and new child sexual abuse material
and language identifiers for the detection of solicitation of children.
3. The EU Centre shall generate the indicators referred to in paragraph 2, point (a),
solely on the basis of the child sexual abuse material and the solicitation of children
identified as such by the Coordinating Authorities or the courts or other independent
authorities of the Member States, submitted to it by the Coordinating Authorities
pursuant to Article 36(1), point (a).
The EU Centre shall compile the list of uniform resource locators referred to in
paragraph 2, point (b), solely on the basis of the uniform resource locators submitted
to it pursuant to Article 36(1), point (b).
4. The EU Centre shall keep records of the submissions and of the process applied to
generate the indicators and compile the list referred to in the first and second
subparagraphs. It shall keep those records for as long as the indicators, including the
uniform resource locators, to which they correspond are contained in the databases of
indicators referred to in paragraph 1.
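Technically, the indicators of paragraph 2, point (a), are "digital identifiers", which in practice usually means hashes of known material. The Regulation does not prescribe any hashing scheme; the sketch below uses a cryptographic hash purely for illustration (deployed systems generally prefer perceptual hashes, which also match re-encoded copies), and every name in it is hypothetical.

```python
import hashlib

class IndicatorDatabase:
    """Toy stand-in for the Article 44 databases; every design choice
    here is an assumption, as the Regulation prescribes no technology."""

    def __init__(self) -> None:
        self.known_indicators: set[str] = set()  # Article 44(1), point (a)
        self.url_list: set[str] = set()          # Article 44(2), point (b)

    def add_known(self, content: bytes) -> str:
        # SHA-256 matches only bit-identical files; real deployments use
        # perceptual hashes so that re-encoded copies still match.
        digest = hashlib.sha256(content).hexdigest()
        self.known_indicators.add(digest)
        return digest

    def matches_known(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() in self.known_indicators

# Usage: a provider executing a detection order checks an item against
# the indicators it was given access to under Article 46.
db = IndicatorDatabase()
db.add_known(b"previously identified item")
print(db.matches_known(b"previously identified item"))  # True
print(db.matches_known(b"some other item"))             # False
```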
Article 45
Database of reports
1. The EU Centre shall create, maintain and operate a database for the reports submitted
to it by providers of hosting services and providers of interpersonal communications
services in accordance with Article 12(1) and assessed and processed in accordance
with Article 48.
2. The database of reports shall contain the following information:
(a) the report;
(b) where the EU Centre considered the report manifestly unfounded, the reasons
and the date and time of informing the provider in accordance with Article
48(2);
(c) where the EU Centre forwarded the report in accordance with Article 48(3), the
date and time of such forwarding and the name of the competent law
enforcement authority or authorities to which it forwarded the report or, where
applicable, information on the reasons for forwarding the report solely to
Europol for further analysis;
(d) where applicable, information on the requests for and provision of additional
information referred to in Article 48(5);
(e) where available, information indicating that the provider that submitted a
report concerning the dissemination of known or new child sexual abuse
material removed or disabled access to the material;
(f) where applicable, information on the EU Centre’s request to the Coordinating
Authority of establishment to issue a removal order pursuant to Article 14 in
relation to the item or items of child sexual abuse material to which the report
relates;
(g) relevant indicators and ancillary tags associated with the reported potential
child sexual abuse material.
Article 46
Access, accuracy and security
1. Subject to paragraphs 2 and 3, solely EU Centre staff and auditors duly authorised by
the Executive Director shall have access to and be entitled to process the data
contained in the databases referred to in Articles 44 and 45.
2. The EU Centre shall give providers of hosting services, providers of interpersonal
communications services and providers of internet access services access to the
databases of indicators referred to in Article 44, where and to the extent necessary for
them to execute the detection or blocking orders that they received in accordance
with Articles 7 or 16. It shall take measures to ensure that such access remains
limited to what is strictly necessary for the period of application of the detection or
blocking orders concerned and that such access does not in any way endanger the
proper operation of those databases and the accuracy and security of the data
contained therein.
3. The EU Centre shall give Coordinating Authorities access to the databases of
indicators referred to in Article 44 where and to the extent necessary for the
performance of their tasks under this Regulation.
4. The EU Centre shall give Europol and the competent law enforcement authorities of
the Member States access to the databases of indicators referred to in Article 44
where and to the extent necessary for the performance of their tasks of investigating
suspected child sexual abuse offences.
5. The EU Centre shall give Europol access to the databases of reports referred to in
Article 45, where and to the extent necessary for the performance of its tasks of
assisting investigations of suspected child sexual abuse offences.
6. The EU Centre shall provide the access referred to in paragraphs 2, 3, 4 and 5 only
upon the reception of a request, specifying the purpose of the request, the modalities
of the requested access, and the degree of access needed to achieve that purpose. The
requests for the access referred to in paragraph 2 shall also include a reference to the
detection order or the blocking order, as applicable.
The EU Centre shall diligently assess those requests and only grant access where it
considers that the requested access is necessary for and proportionate to the specified
purpose.
7. The EU Centre shall regularly verify that the data contained in the databases referred
to in Articles 44 and 45 is, in all respects, complete, accurate and up-to-date and
continues to be necessary for the purposes of reporting, detection and blocking in
accordance with this Regulation, as well as for facilitating and monitoring accurate
detection technologies and processes. In particular, as regards the uniform resource
locators contained in the database referred to in Article 44(1), point (a), the EU Centre
shall, where necessary in cooperation with the Coordinating Authorities, regularly
verify that the conditions of Article 36(1), point (b), continue to be met. Those
verifications shall include audits, where appropriate. Where necessary in view of
those verifications, it shall immediately complement, adjust or delete the data.
8. The EU Centre shall ensure that the data contained in the databases referred to in
Articles 44 and 45 is stored in a secure manner and that the storage is subject to
appropriate technical and organisational safeguards. Those safeguards shall ensure,
in particular, that the data can be accessed and processed only by duly authorised
persons for the purpose for which the person is authorised and that a high level of
security is achieved. The EU Centre shall regularly review those safeguards and
adjust them where necessary.
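Paragraph 6 effectively defines the minimum schema of an access request: purpose, modalities, degree of access and, for providers, a reference to the underlying detection or blocking order. A minimal sketch of such a record and its formal admissibility check might look as follows; the substantive necessity-and-proportionality assessment is a human decision and is only stubbed here, and all identifiers are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AccessRequest:
    requester: str                         # provider, Coordinating Authority, ...
    purpose: str                           # Article 46(6): purpose of the request
    modalities: str                        # Article 46(6): modalities of access
    degree: str                            # Article 46(6): degree of access needed
    order_reference: Optional[str] = None  # detection/blocking order (providers)

def admissible(request: AccessRequest, is_provider: bool) -> bool:
    """Formal check only; the diligent necessity-and-proportionality
    assessment required by Article 46(6) cannot be reduced to code."""
    if not (request.purpose and request.modalities and request.degree):
        return False  # request does not specify what Article 46(6) requires
    if is_provider and request.order_reference is None:
        return False  # provider requests must cite the underlying order
    return True

# "DO-2024-001" is an invented order reference, used only for the example.
req = AccessRequest("hosting provider", "execute a detection order",
                    "indicator lookups via the Centre's interface",
                    "known-material indicators only",
                    order_reference="DO-2024-001")
print(admissible(req, is_provider=True))  # True
```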
Article 47
Delegated acts relating to the databases
The Commission shall be empowered to adopt delegated acts in accordance with Article 86 in
order to supplement this Regulation with the necessary detailed rules concerning:
(a) the types, precise content, set-up and operation of the databases of indicators referred
to in Article 44(1), including the indicators and the necessary additional information
to be contained therein referred to in Article 44(2);
(b) the processing of the submissions by Coordinating Authorities, the generation of the
indicators, the compilation of the list of uniform resource locators and the record-
keeping, referred to in Article 44(3);
(c) the precise content, set-up and operation of the database of reports referred to in
Article 45(1);
(d) access to the databases referred to in Articles 44 and 45, including the modalities of
the access referred to in Article 46(1) to (5), the content, processing and assessment
of the requests referred to in Article 46(6), procedural matters related to such
requests and the necessary measures referred to in Article 46(6);
(e) the regular verifications and audits to ensure that the data contained in those
databases is complete, accurate and up-to-date referred to in Article 46(7) and the
security of the storage of the data, including the technical and organisational
safeguards and regular review referred to in Article 46(8).
Article 48
Reporting
1. The EU Centre shall expeditiously assess and process reports submitted by providers
of hosting services and providers of interpersonal communications services in
accordance with Article 12 to determine whether the reports are manifestly
unfounded or are to be forwarded.
2. Where the EU Centre considers that the report is manifestly unfounded, it shall
inform the provider that submitted the report, specifying the reasons why it considers
the report to be unfounded.
3. Where the EU Centre considers that a report is not manifestly unfounded, it shall
forward the report, together with any additional relevant information available to it,
to Europol and to the competent law enforcement authority or authorities of the
Member State likely to have jurisdiction to investigate or prosecute the potential
child sexual abuse to which the report relates.
Where that competent law enforcement authority or those competent law
enforcement authorities cannot be determined with sufficient certainty, the EU
Centre shall forward the report, together with any additional relevant information
available to it, to Europol, for further analysis and subsequent referral by Europol to
the competent law enforcement authority or authorities.
4. Where a provider that submitted the report has indicated that the report requires
urgent action, the EU Centre shall assess and process that report as a matter of
priority and, where it forwards the report in accordance with paragraph 3 and it
considers that the report requires urgent action, shall ensure that the forwarded report
is marked as such.
5. Where the report does not contain all the information required in Article 13, the EU
Centre may request the provider that submitted the report to provide the missing
information.
6. Where so requested by a competent law enforcement authority of a Member State in
order to avoid interfering with activities for the prevention, detection, investigation
and prosecution of child sexual abuse offences, the EU Centre shall:
(a) communicate to the provider that submitted the report that it is not to inform
the user concerned, specifying the time period during which the provider is not
to do so;
(b) where the provider that submitted the report is a provider of hosting services
and the report concerns the potential dissemination of child sexual abuse
material, communicate to the provider that it is not to remove or disable access
to the material, specifying the time period during which the provider is not to
do so.
7. The time periods referred to in paragraph 6, points (a) and (b), shall be
those specified in the competent law enforcement authority’s request to the EU
Centre, provided that they remain limited to what is necessary to avoid interference
with the relevant activities and do not exceed 18 months.
8. The EU Centre shall verify whether a provider of hosting services that submitted a
report concerning the potential dissemination of child sexual abuse material removed
or disabled access to the material, insofar as the material is publicly accessible.
Where it considers that the provider did not remove or disable access to the material
expeditiously, the EU Centre shall inform the Coordinating Authority of
establishment thereof.
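Paragraphs 2 to 4 describe a triage flow with three routing outcomes. The following sketch encodes only that decision structure, with all names invented for the example; it is an illustration, not a description of the EU Centre's actual systems.

```python
from enum import Enum, auto

class Route(Enum):
    INFORM_PROVIDER = auto()           # Article 48(2): manifestly unfounded
    EUROPOL_AND_NATIONAL_LEA = auto()  # Article 48(3), first subparagraph
    EUROPOL_ONLY = auto()              # Article 48(3), second subparagraph

def triage(manifestly_unfounded: bool, jurisdiction_determined: bool,
           urgent: bool) -> tuple[Route, bool]:
    """Route a provider report; the returned flag marks urgency (Article 48(4))."""
    if manifestly_unfounded:
        return Route.INFORM_PROVIDER, False
    if jurisdiction_determined:
        # Forwarded to Europol and to the competent national authorities.
        return Route.EUROPOL_AND_NATIONAL_LEA, urgent
    # Jurisdiction not determinable with sufficient certainty: Europol
    # analyses the report and refers it onward itself.
    return Route.EUROPOL_ONLY, urgent

print(triage(False, True, True))  # (Route.EUROPOL_AND_NATIONAL_LEA, True)
```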
Article 49
Searches and notification
1. The EU Centre shall have the power to conduct searches on hosting services for the
dissemination of publicly accessible child sexual abuse material, using the relevant
indicators from the database of indicators referred to in Article 44(1), points (a) and
(b), in the following situations:
(a) where so requested to support a victim by verifying whether the provider of
hosting services removed or disabled access to one or more specific items of
known child sexual abuse material depicting the victim, in accordance with
Article 21(4), point (c);
(b) where so requested to assist a Coordinating Authority by verifying the possible
need for the issuance of a detection order or a removal order in respect of a
specific service or the effectiveness of a detection order or a removal order that
the Coordinating Authority issued, in accordance with Article 25(7), points (c)
and (d), respectively.
2. The EU Centre shall have the power to notify, after having conducted the searches
referred to in paragraph 1, providers of hosting services of the presence of one or
more specific items of known child sexual abuse material on their services and
request them to remove or disable access to that item or those items, for the
providers’ voluntary consideration.
The request shall clearly set out the identification details of the EU Centre and a
contact point, the necessary information for the identification of the item or items, as
well as the reasons for the request. The request shall also clearly state that it is for the
provider’s voluntary consideration.
3. Where so requested by a competent law enforcement authority of a Member State in
order to avoid interfering with activities for the prevention, detection, investigation
and prosecution of child sexual abuse offences, the EU Centre shall not submit a
notice, for as long as necessary to avoid such interference but no longer than 18
months.
Article 50
Technologies, information and expertise
1. The EU Centre shall make available technologies that providers of hosting services
and providers of interpersonal communications services may acquire, install and
operate, free of charge, where relevant subject to reasonable licensing conditions, to
execute detection orders in accordance with Article 10(1).
To that aim, the EU Centre shall compile lists of such technologies, having regard to
the requirements of this Regulation and in particular those of Article 10(2).
Before including specific technologies on those lists, the EU Centre shall request the
opinion of its Technology Committee and of the European Data Protection Board.
The Technology Committee and the European Data Protection Board shall deliver
their respective opinions within eight weeks. That period may be extended by a
further six weeks where necessary, taking into account the complexity of the subject
matter. The Technology Committee and the European Data Protection Board shall
inform the EU Centre of any such extension within one month of receipt of the
request for consultation, together with the reasons for the delay.
2. The EU Centre shall collect, record, analyse and make available relevant, objective,
reliable and comparable information on matters related to the prevention and
combating of child sexual abuse, in particular:
(a) information obtained in the performance of its tasks under this Regulation
concerning detection, reporting, removal or disabling of access to, and
blocking of online child sexual abuse;
(b) information resulting from the research, surveys and studies referred to in
paragraph 3;
(c) information resulting from research or other activities conducted by Member
States’ authorities, other Union institutions, bodies, offices and agencies, the
competent authorities of third countries, international organisations, research
centres and civil society organisations.
3. Where necessary for the performance of its tasks under this Regulation, the EU
Centre shall carry out, participate in or encourage research, surveys and studies,
either on its own initiative or, where appropriate and compatible with its priorities
and its annual work programme, at the request of the European Parliament, the
Council or the Commission.
4. The EU Centre shall provide the information referred to in paragraph 2 and the
information resulting from the research, surveys and studies referred to in paragraph
3, including its analysis thereof, and its opinions on matters related to the prevention
and combating of online child sexual abuse to other Union institutions, bodies,
offices and agencies, Coordinating Authorities, other competent authorities and other
public authorities of the Member States, either on its own initiative or at the request of
the relevant authority. Where appropriate, the EU Centre shall make such
information publicly available.
5. The EU Centre shall develop a communication strategy and promote dialogue with
civil society organisations and providers of hosting or interpersonal communication
services to raise public awareness of online child sexual abuse and measures to
prevent and combat such abuse.
Section 3
Processing of information
Article 51
Processing activities and data protection
1. In so far as is necessary for the performance of its tasks under this Regulation, the
EU Centre may process personal data.
2. The EU Centre shall process personal data as strictly necessary for the purposes of:
(a) providing the opinions on intended detection orders referred to in Article 7(3);
(b) cooperating with and responding to requests of Coordinating Authorities in
connection to intended blocking orders as referred to in Article 16(2);
(c) receiving and processing blocking orders transmitted to it pursuant to Article
17(3);
(d) cooperating with Coordinating Authorities in accordance with Articles 20 and
21 on tasks related to victims’ rights to information and assistance;
(e) maintaining up-to-date records of contact points and legal representatives of
providers of relevant information society services as provided in accordance
with Article 23(2) and Article 24(6);
(f) creating and maintaining an online register listing the Coordinating Authorities
and their contact points referred to in Article 25(6);
(g) providing assistance to Coordinating Authorities in accordance with Article
25(7);
(h) assisting the Commission, upon its request, in connection to its tasks under the
cooperation mechanism referred to in Article 37;
(i) creating, maintaining and operating the databases of indicators referred to in
Article 44;
(j) creating, maintaining and operating the database of reports referred to in Article 45;
(k) providing and monitoring access to the databases of indicators and of reports in
accordance with Article 46;
(l) performing data quality control measures in accordance with Article 46(7);
(m) assessing and processing reports of potential online child sexual abuse in
accordance with Article 48;
(n) cooperating with Europol and partner organisations in accordance with Articles
53 and 54, including on tasks related to the identification of victims;
(o) generating statistics in accordance with Article 83.
3. The EU Centre shall store the personal data referred to in paragraph 2 only where
and for as long as strictly necessary for the applicable purposes listed in paragraph 2.
4. It shall ensure that the personal data is stored in a secure manner and that the storage
is subject to appropriate technical and organisational safeguards. Those safeguards
shall ensure, in particular, that the personal data can be accessed and processed only
for the purpose for which it is stored, that a high level of security is achieved and that
the personal data is deleted when no longer strictly necessary for the applicable
purposes. It shall regularly review those safeguards and adjust them where necessary.
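Paragraphs 3 and 4 amount to purpose-bound, time-limited retention. A minimal sketch of such a retention ledger, with a hypothetical record key and retention date, could look like this:

```python
import datetime

# Hypothetical retention ledger: key -> (payload, purpose, necessary_until).
store: dict[str, tuple[bytes, str, datetime.datetime]] = {}

def retain(key: str, data: bytes, purpose: str,
           necessary_until: datetime.datetime) -> None:
    """Store personal data only together with its purpose and retention
    limit (Article 51(3))."""
    store[key] = (data, purpose, necessary_until)

def purge(now: datetime.datetime) -> None:
    """Delete data that is no longer strictly necessary (Article 51(4))."""
    for key in [k for k, (_, _, until) in store.items() if until <= now]:
        del store[key]

retain("report-123", b"...", "assessing reports (Article 48)",
       datetime.datetime(2025, 1, 1))
purge(datetime.datetime(2025, 6, 1))
print(store)  # {} - the retention period has elapsed, so the data is gone
```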
Section 4
Cooperation
Article 52
Contact officers
1. Each Coordinating Authority shall designate at least one contact officer, who shall be
the main contact point for the EU Centre in the Member State concerned. The contact
officers may be seconded to the EU Centre. Where several contact officers are
designated, the Coordinating Authority shall designate one of them as the main
contact officer.
2. Contact officers shall assist in the exchange of information between the EU Centre
and the Coordinating Authorities that designated them. Where the EU Centre
receives reports submitted in accordance with Article 12 concerning the potential
dissemination of new child sexual abuse material or the potential solicitation of
children, the contact officers designated by the competent Member State shall
facilitate the process to determine the illegality of the material or conversation, in
accordance with Article 36(1).
3. The Management Board shall determine the rights and obligations of contact officers
in relation to the EU Centre. Contact officers shall enjoy the privileges and
immunities necessary for the performance of their tasks.
4. Where contact officers are seconded to the EU Centre, the EU Centre shall cover the
costs of providing them with the necessary premises within the building and
adequate support for contact officers to perform their duties. All other costs that arise
in connection with the designation of contact officers and the performance of their
tasks shall be borne by the Coordinating Authority that designated them.
Article 53
Cooperation with Europol
1. Where necessary for the performance of its tasks under this Regulation, within their
respective mandates, the EU Centre shall cooperate with Europol.
2. Europol and the EU Centre shall provide each other with the fullest possible access
to relevant information and information systems, where necessary for the
performance of their respective tasks and in accordance with the acts of Union law
regulating such access.
Without prejudice to the responsibilities of the Executive Director, the EU Centre
shall maximise efficiency by sharing administrative functions with Europol,
including functions relating to personnel management, information technology (IT)
and budget implementation.
3. The terms of cooperation and working arrangements shall be laid down in a
memorandum of understanding.
Article 54
Cooperation with partner organisations
1. Where necessary for the performance of its tasks under this Regulation, the EU
Centre may cooperate with organisations and networks with information and
expertise on matters related to the prevention and combating of online child sexual
abuse, including civil society organisations and semi-public organisations.
2. The EU Centre may conclude memoranda of understanding with organisations
referred to in paragraph 1, laying down the terms of cooperation.
Section 5
Organisation
Article 55
Administrative and management structure
The administrative and management structure of the EU Centre shall comprise:
(a) a Management Board, which shall exercise the functions set out in Article 57;
(b) an Executive Board which shall perform the tasks set out in Article 62;
(c) an Executive Director of the EU Centre, who shall exercise the responsibilities
set out in Article 64;
(d) a Technology Committee as an advisory group, which shall exercise the tasks
set out in Article 66.
Part 1: Management Board
Article 56
Composition of the Management Board
1. The Management Board shall be composed of one representative from each Member
State and two representatives of the Commission, all as members with voting rights.
2. The Management Board shall also include one independent expert observer
designated by the European Parliament, without the right to vote.
Europol may designate a representative to attend the meetings of the Management
Board as an observer on matters involving Europol, at the request of the Chairperson
of the Management Board.
3. Each member of the Management Board shall have an alternate. The alternate shall
represent the member in his/her absence.
4. Members of the Management Board and their alternates shall be appointed in the
light of their knowledge in the field of combating child sexual abuse, taking into
account relevant managerial, administrative and budgetary skills. Member States
shall appoint a representative of their Coordinating Authority, within four months of
[date of entry into force of this Regulation]. All parties represented in the
Management Board shall make efforts to limit turnover of their representatives, in
order to ensure continuity of its work. All parties shall aim to achieve a balanced
representation between men and women on the Management Board.
5. The term of office for members and their alternates shall be four years. That term
may be renewed.
Article 57
Functions of the Management Board
1. The Management Board shall:
(a) give the general orientations for the EU Centre's activities;
(b) contribute to facilitating effective cooperation with and between the
Coordinating Authorities;
(c) adopt rules for the prevention and management of conflicts of interest in
respect of its members, as well as for the members of the Technology
Committee and of any other advisory group it may establish, and publish
annually on its website the declarations of interests of the members of the
Management Board;
(d) adopt the assessment of performance of the Executive Board referred to in
Article 61(2);
(e) adopt and make public its Rules of Procedure;
(f) appoint the members of the Technology Committee, and of any other advisory
group it may establish;
(g) adopt the opinions on intended detection orders referred to in Article 7(4), on
the basis of a draft opinion provided by the Executive Director;
(h) adopt and regularly update the communication and dissemination plans
referred to in Article 77(3) based on an analysis of needs.
Article 58
Chairperson of the Management Board
1. The Management Board shall elect a Chairperson and a Deputy Chairperson from
among its members. The Chairperson and the Deputy Chairperson shall be elected by
a majority of two thirds of the members of the Management Board.
The Deputy Chairperson shall automatically replace the Chairperson if he/she is
prevented from attending to his/her duties.
2. The term of office of the Chairperson and the Deputy Chairperson shall be four years.
Their term of office may be renewed once. If, however, their membership of the
Management Board ends at any time during their term of office, their term of office
shall automatically expire on that date.
Article 59
Meetings of the Management Board
1. The Chairperson shall convene the meetings of the Management Board.
2. The Executive Director shall take part in the deliberations, without the right to vote.
3. The Management Board shall hold at least two ordinary meetings a year. In addition,
it shall meet on the initiative of its Chairperson, at the request of the Commission, or
at the request of at least one-third of its members.
4. The Management Board may invite any person whose opinion may be of interest to
attend its meetings as an observer.
5. The members of the Management Board and their alternates may, subject to its rules
of procedure, be assisted at the meetings by advisers or experts.
6. The EU Centre shall provide the secretariat for the Management Board.
Article 60
Voting rules of the Management Board
1. Unless provided otherwise in this Regulation, the Management Board shall take
decisions by absolute majority of its members.
2. Each member shall have one vote. In the absence of a member, his/her alternate shall
be entitled to exercise his/her right to vote.
3. The Executive Director shall not take part in the voting.
4. The Management Board's rules of procedure shall establish more detailed voting
arrangements, in particular the circumstances in which a member may act on behalf
of another member.
Part 2: Executive Board
Article 61
Composition and appointment of the Executive Board
1. The Executive Board shall be composed of the Chairperson and the Deputy
Chairperson of the Management Board, two other members appointed by the
Management Board from among its members with the right to vote and two
representatives of the Commission to the Management Board. The Chairperson of
the Management Board shall also be the Chairperson of the Executive Board.
The Executive Director shall participate in meetings of the Executive Board without
the right to vote.
2. The term of office of members of the Executive Board shall be four years. In the
course of the 12 months preceding the end of the four-year term of office of the
Chairperson and five members of the Executive Board, the Management Board or a
smaller committee selected among Management Board members including a
Commission representative shall carry out an assessment of performance of the
Executive Board. The assessment shall take into account an evaluation of the
Executive Board members’ performance and the EU Centre’s future tasks and
challenges. Based on the assessment, the Management Board may extend their term
of office once.
Article 62
Tasks of the Executive Board
1. The Executive Board shall be responsible for the overall planning and the execution
of the tasks conferred on the EU Centre pursuant to Article 43. The Executive Board
shall adopt all the decisions of the EU Centre with the exception of the decisions that
shall be taken by the Management Board in accordance with Article 57.
2. In addition, the Executive Board shall have the following tasks:
(a) adopt, by 30 November of each year, on the basis of a proposal by the
Executive Director, the draft Single Programming Document, and shall
transmit it for information to the European Parliament, the Council and the
Commission by 31 January the following year, as well as any other updated
version of the document;
(b) adopt the draft annual budget of the EU Centre and exercise other functions in
respect of the EU Centre’s budget;
(c) assess and adopt a consolidated annual activity report on the EU Centre's
activities, including an overview of the fulfilment of its tasks and send it, by 1
July each year, to the European Parliament, the Council, the Commission and
the Court of Auditors and make the consolidated annual activity report public;
(d) adopt an anti-fraud strategy, proportionate to fraud risks, taking into account the
costs and benefits of the measures to be implemented, an efficiency gains and
synergies strategy, a strategy for cooperation with third countries and/or
international organisations, and a strategy for the organisational management
and internal control systems;
(e) adopt rules for the prevention and management of conflicts of interest in
respect of its members;
(f) adopt its rules of procedure;
(g) exercise, with respect to the staff of the EU Centre, the powers conferred by the
Staff Regulations on the Appointing Authority and by the Conditions of
Employment of Other Servants on the Authority Empowered to Conclude a
Contract of Employment [51] ("the appointing authority powers");
(h) adopt appropriate implementing rules for giving effect to the Staff Regulations
and the Conditions of Employment of Other Servants in accordance with
Article 110(2) of the Staff Regulations;
(i) appoint the Executive Director and remove him/her from office, in accordance
with Article 65;
(j) appoint an Accounting Officer, who may be the Commission's Accounting
Officer, subject to the Staff Regulations and the Conditions of Employment of
other servants, who shall be totally independent in the performance of his/her
duties;
(k) ensure adequate follow-up to findings and recommendations stemming from
the internal or external audit reports and evaluations, as well as from
investigations of the European Anti-Fraud Office (OLAF);
(l) adopt the financial rules applicable to the EU Centre;
(m) take all decisions on the establishment of the EU Centre’s internal structures
and, where necessary, their modification;
(n) appoint a Data Protection Officer;
51 Regulation (EEC, Euratom, ECSC) No 259/68 of the Council of 29 February 1968 laying down the
Staff Regulations of Officials and the Conditions of Employment of Other Servants of the European
Communities and instituting special measures temporarily applicable to officials of the Commission
(OJ L 56, 4.3.1968, p. 1).
(o) adopt internal guidelines further specifying the procedures for the processing of
information in accordance with Article 51, after consulting the European Data
Protection Supervisor;
(p) authorise the conclusion of memoranda of understanding referred to in Article
53(3) and Article 54(2).
3. With respect to the powers mentioned in paragraph 2, points (g) and (h), the Executive
Board shall adopt, in accordance with Article 110(2) of the Staff Regulations, a
decision based on Article 2(1) of the Staff Regulations and Article 6 of the
Conditions of Employment, delegating relevant appointing authority powers to the
Executive Director. The Executive Director shall be authorised to sub-delegate those
powers.
4. In exceptional circumstances, the Executive Board may by way of a decision
temporarily suspend the delegation of the appointing authority powers to the
Executive Director and any sub-delegation by the latter and exercise them itself or
delegate them to one of its members or to a staff member other than the Executive
Director.
5. Where necessary because of urgency, the Executive Board may take certain
provisional decisions on behalf of the Management Board, in particular on
administrative management matters, including the suspension of the delegation of the
appointing authority powers and budgetary matters.
Article 63
Voting rules of the Executive Board
1. The Executive Board shall take decisions by simple majority of its members. Each
member of the Executive Board shall have one vote. The Chairperson shall have a
casting vote in case of a tie.
2. The representatives of the Commission shall have a right to vote whenever matters
pertaining to Article 62(2), points (a) to (l) and (p) are discussed and decided upon.
For the purposes of taking the decisions referred to in Article 62(2), points (f) and
(g), the representatives of the Commission shall have one vote each. The decisions
referred to in Article 62(2), points (b) to (e), (h) to (l) and (p), may only be taken if
the representatives of the Commission cast a positive vote. For the purposes of
taking the decisions referred to in Article 62(2), point (a), the consent of the
representatives of the Commission shall only be required on the elements of the
decision not related to the annual and multi-annual working programme of the EU
Centre.
The Executive Board's rules of procedure shall establish more detailed voting
arrangements, in particular the circumstances in which a member may act on behalf
of another member.
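For illustration, the interaction of the simple-majority rule, the Chairperson's casting vote and the Commission-consent requirement in this Article can be expressed as a short sketch. The following Python fragment is a minimal model; the function name, the vote encoding and the decision-point labels are assumptions of this example, not part of the Regulation.

```python
# Decision points of Article 62(2) for which a positive vote of the
# Commission representatives is required: points (b) to (e), (h) to (l) and (p).
COMMISSION_CONSENT_POINTS = set("bcde") | set("hijkl") | {"p"}

def executive_board_decision(votes_for: int, votes_against: int,
                             chair_votes_for: bool,
                             commission_votes_for: bool,
                             decision_point: str) -> bool:
    """Model of the Article 63 voting rules (illustrative only)."""
    # For the listed decision points, Commission consent is a precondition.
    if decision_point in COMMISSION_CONSENT_POINTS and not commission_votes_for:
        return False
    # Simple majority; each member of the Executive Board has one vote.
    if votes_for > votes_against:
        return True
    # In case of a tie, the Chairperson has a casting vote.
    if votes_for == votes_against:
        return chair_votes_for
    return False

# Example: a 3-3 tie on the draft annual budget (point (b)) carries only if
# the Commission representatives voted in favour and the Chairperson is for.
print(executive_board_decision(3, 3, chair_votes_for=True,
                               commission_votes_for=True,
                               decision_point="b"))  # True
```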
Part 3: Executive Director
Article 64
Responsibilities of the Executive Director
1. The Executive Director shall manage the EU Centre. The Executive Director shall be
accountable to the Management Board.
2. The Executive Director shall report to the European Parliament on the performance
of his/her duties when invited to do so. The Council may invite the Executive
Director to report on the performance of his/her duties.
3. The Executive Director shall be the legal representative of the EU Centre.
4. The Executive Director shall be responsible for the implementation of the tasks
assigned to the EU Centre by this Regulation. In particular, the Executive Director
shall be responsible for:
(a) the day-to-day administration of the EU Centre;
(b) preparing decisions to be adopted by the Management Board;
(c) implementing decisions adopted by the Management Board;
(d) preparing the Single Programming Document and submitting it to the
Executive Board after consulting the Commission;
(e) implementing the Single Programming Document and reporting to the
Executive Board on its implementation;
(f) preparing the Consolidated Annual Activity Report (CAAR) on the EU
Centre’s activities and presenting it to the Executive Board for assessment and
adoption;
(g) preparing an action plan following-up conclusions of internal or external audit
reports and evaluations, as well as investigations by the European Anti-Fraud
Office (OLAF) and by the European Public Prosecutor’s Office (EPPO) and
reporting on progress twice a year to the Commission and regularly to the
Management Board and the Executive Board;
(h) protecting the financial interests of the Union by applying preventive measures
against fraud, corruption and any other illegal activities, without prejudicing
the investigative competence of OLAF and EPPO by effective checks and, if
irregularities are detected, by recovering amounts wrongly paid and, where
appropriate, by imposing effective, proportionate and dissuasive
administrative, including financial penalties;
(i) preparing an anti-fraud strategy, an efficiency gains and synergies strategy, a
strategy for cooperation with third countries and/or international organisations
and a strategy for the organisational management and internal control systems
for the EU Centre and presenting them to the Executive Board for approval;
(j) preparing draft financial rules applicable to the EU Centre;
(k) preparing the EU Centre’s draft statement of estimates of revenue and
expenditure and implementing its budget;
(l) preparing and implementing an IT security strategy, ensuring appropriate risk
management for all IT infrastructure, systems and services, which are
developed or procured by the EU Centre as well as sufficient IT security
funding.
(m) implementing the annual work programme of the EU Centre under the control
of the Executive Board;
(n) drawing up a draft statement of estimates of the EU Centre’s revenue and
expenditure as part of the EU Centre’s Single Programming Document and
implementing the budget of the EU Centre pursuant to Article 67;
(o) preparing a draft report describing all activities of the EU Centre with a section
on financial and administrative matters;
(p) fostering recruitment of appropriately skilled and experienced EU Centre staff,
while ensuring gender balance.
5. Where exceptional circumstances so require, the Executive Director may decide to
locate one or more staff in another Member State for the purpose of carrying out the
EU Centre’s tasks in a more efficient, effective and coherent manner. Before
deciding to establish a local office, the Executive Director shall obtain the prior
consent of the Commission, the Management Board and the Member State
concerned. The decision shall be based on an appropriate cost-benefit analysis that
demonstrates in particular the added value of such a decision and specifies the scope of
the activities to be carried out at the local office in a manner that avoids unnecessary
costs and duplication of administrative functions of the EU Centre. A headquarters
agreement with the Member State(s) concerned may be concluded.
Article 65
Executive Director
1. The Executive Director shall be engaged as a temporary agent of the EU Centre
under Article 2(a) of the Conditions of Employment of Other Servants.
2. The Executive Director shall be appointed by the Executive Board, from a list of
candidates proposed by the Commission, following an open and transparent selection
procedure.
3. For the purpose of concluding the contract with the Executive Director, the EU
Centre shall be represented by the Chairperson of the Executive Board.
4. The term of office of the Executive Director shall be five years. Six months before
the end of the Executive Director’s term of office, the Commission shall complete an
assessment that takes into account an evaluation of the Executive Director's
performance and the EU Centre's future tasks and challenges.
5. The Executive Board, acting on a proposal from the Commission that takes into
account the assessment referred to in paragraph 4, may extend the term of office of
the Executive Director once, for no more than five years.
6. An Executive Director whose term of office has been extended may not participate in
another selection procedure for the same post at the end of the overall period.
7. The Executive Director may be dismissed only upon a decision of the Executive
Board acting on a proposal from the Commission.
8. The Executive Board shall take decisions on appointment, extension of the term of
office or dismissal of the Executive Director by a majority of two-thirds of its
members with voting rights.
Subsection 5: Technology Committee
Article 66
Establishment and tasks of the Technology Committee
1. The Technology Committee shall consist of technical experts appointed by the
Management Board in view of their excellence and their independence, following the
publication of a call for expressions of interest in the Official Journal of the
European Union.
2. Procedures concerning the appointment of the members of the Technology
Committee and its operation shall be specified in the rules of procedure of the
Management Board and shall be made public.
3. The members of the Committee shall be independent and shall act in the public
interest. The list of members of the Committee shall be made public and shall be
updated by the EU Centre on its website.
4. When a member no longer meets the criteria of independence, he or she shall inform
the Management Board. Alternatively, the Management Board may declare, on a
proposal of at least one third of its members or of the Commission, a lack of
independence and revoke the appointment of the person concerned. The Management Board shall
appoint a new member for the remaining term of office in accordance with the
procedure for ordinary members.
5. The mandates of members of the Technology Committee shall be four years. Those
mandates shall be renewable once.
6. The Technology Committee shall:
(a) contribute to the EU Centre’s opinions referred to in Article 7(3), first
subparagraph, point (d);
(b) contribute to the EU Centre’s assistance to the Coordinating Authorities, the
Management Board, the Executive Board and the Executive Director, in
respect of matters related to the use of technology;
(c) provide internally, upon request, expertise on matters related to the use of
technology for the purposes of prevention and detection of child sexual abuse
online.
Section 6
Establishment and Structure of the Budget
Subsection 1
Single Programming Document
Article 67
Budget establishment and implementation
1. Each year the Executive Director shall draw up a draft statement of estimates of the
EU Centre’s revenue and expenditure for the following financial year, including an
establishment plan, and shall send it to the Executive Board.
2. The Executive Board shall, on the basis of the draft statement of estimates, adopt a
provisional draft estimate of the EU Centre’s revenue and expenditure for the
following financial year and shall send it to the Commission by 31 January each
year.
3. The Executive Board shall send the final draft estimate of the EU Centre’s revenue
and expenditure, which shall include a draft establishment plan, to the European
Parliament, the Council and the Commission by 31 March each year.
4. The Commission shall send the statement of estimates to the European Parliament
and the Council, together with the draft general budget of the Union.
5. On the basis of the statement of estimates, the Commission shall enter in the draft
general budget of the Union the estimates that it considers necessary for the
establishment plan and the amount of the contribution to be charged to the general
budget, which it shall place before the European Parliament and the Council in
accordance with Articles 313 and 314 of the Treaty on the Functioning of the
European Union.
6. The European Parliament and the Council shall authorise the appropriations for the
contribution from the Union to the EU Centre.
7. The European Parliament and the Council shall adopt the EU Centre’s establishment
plan.
8. The EU Centre’s budget shall be adopted by the Executive Board. It shall become
final following the final adoption of the general budget of the Union. Where
necessary, it shall be adjusted accordingly.
9. The Executive Director shall implement the EU Centre’s budget.
10. Each year the Executive Director shall send to the European Parliament and the
Council all information relevant to the findings of any evaluation procedures.
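The annual sequence laid down in this Article can be summarised, for illustration, as a small calendar. The sketch below is a convenience under stated assumptions, not an official data format; estimates for financial year N are prepared in year N - 1.

```python
from datetime import date

def budget_calendar(financial_year: int) -> dict[str, date]:
    """Key Article 67 milestones for a given financial year (illustrative)."""
    n = financial_year - 1  # estimates are prepared in the preceding year
    return {
        "provisional draft estimate sent to the Commission": date(n, 1, 31),
        "final draft estimate sent to EP, Council and Commission": date(n, 3, 31),
    }

for step, deadline in budget_calendar(2027).items():
    print(deadline.isoformat(), "-", step)
```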
Article 68
Financial rules
The financial rules applicable to the EU Centre shall be adopted by the Executive Board after
consultation with the Commission. They shall not depart from Delegated Regulation (EU)
2019/715 [52] unless such a departure is specifically required for the operation of the EU Centre
and the Commission has given its prior consent.
Subsection 2
Presentation, implementation and control of the budget
Article 69
Budget
1. Estimates of all revenue and expenditure for the EU Centre shall be prepared each
financial year, which shall correspond to the calendar year, and shall be shown in the
EU Centre’s budget, which shall be balanced in terms of revenue and of expenditure.
2. Without prejudice to other resources, the EU Centre’s revenue shall comprise a
contribution from the Union entered in the general budget of the Union.
3. The EU Centre may benefit from Union funding in the form of delegation
agreements or ad hoc grants in accordance with its financial rules referred to in
Article 68 and with the provisions of the relevant instruments supporting the policies
of the Union.
4. The EU Centre’s expenditure shall include staff remuneration, administrative and
infrastructure expenses, and operating costs.
5. Budgetary commitments for actions relating to large-scale projects extending over
more than one financial year may be broken down into several annual instalments.
Article 70
Presentation of accounts and discharge
1. The EU Centre’s accounting officer shall send the provisional accounts for the
financial year (year N) to the Commission's accounting officer and to the Court of
Auditors by 1 March of the following financial year (year N + 1).
2. The EU Centre shall send a report on the budgetary and financial management for
year N to the European Parliament, the Council and the Court of Auditors by 31
March of year N + 1.
52 OJ L 122, 10.5.2019, p. 1.
3. The Commission's accounting officer shall send the EU Centre’s provisional
accounts for year N, consolidated with the Commission's accounts, to the Court of
Auditors by 31 March of year N + 1.
4. The Management Board shall deliver an opinion on the EU Centre’s final accounts
for year N.
5. The EU Centre’s accounting officer shall, by 1 July of year N + 1, send the final
accounts for year N to the European Parliament, the Council, the Commission, the
Court of Auditors and national parliaments, together with the Management Board's
opinion.
6. The final accounts for year N shall be published in the Official Journal of the
European Union by 15 November of year N + 1.
7. The Executive Director shall send to the Court of Auditors, by 30 September of year
N + 1, a reply to the observations made in its annual report. He or she shall also send
the reply to the Management Board.
8. The Executive Director shall submit to the European Parliament, at the latter's
request, any information required for the smooth application of the discharge
procedure for year N.
9. On a recommendation from the Council acting by a qualified majority, the European
Parliament shall, before 15 May of year N + 2, grant a discharge to the Executive
Director in respect of the implementation of the budget for year N.
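The deadlines of this Article, all expressed relative to financial year N, lend themselves to a compact illustrative summary. The list below is a sketch only; the step labels are paraphrases of the paragraphs above, not the legal wording.

```python
from datetime import date

def accounts_calendar(n: int) -> list[tuple[date, str]]:
    """Article 70 deadlines for financial year N (illustrative paraphrase)."""
    return sorted([
        (date(n + 1, 3, 1), "provisional accounts to the Commission's accounting officer and the Court of Auditors"),
        (date(n + 1, 3, 31), "report on budgetary and financial management to EP, Council and Court of Auditors"),
        (date(n + 1, 3, 31), "consolidated provisional accounts to the Court of Auditors"),
        (date(n + 1, 7, 1), "final accounts to EP, Council, Commission, Court of Auditors and national parliaments"),
        (date(n + 1, 9, 30), "Executive Director's reply to the Court of Auditors' observations"),
        (date(n + 1, 11, 15), "publication of the final accounts in the Official Journal"),
        (date(n + 2, 5, 15), "discharge by the European Parliament (to be granted before this date)"),
    ])

for deadline, step in accounts_calendar(2026):
    print(deadline.isoformat(), "-", step)
```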
Section 7
Staff
Article 71
General provisions
1. The Staff Regulations and the Conditions of Employment of Other Servants and the
rules adopted by agreement between the institutions of the Union for giving effect
thereto shall apply to the EU Centre for all matters not covered by this Regulation.
2. The Executive Board, in agreement with the Commission, shall adopt the necessary
implementing measures, in accordance with the arrangements provided for in Article
110 of the Staff Regulations.
3. The EU Centre staff, in particular those working in areas related to detection,
reporting and removal of online child sexual abuse, shall have access to appropriate
counselling and support services.
Article 72
Seconded national experts and other staff
1. The EU Centre may make use of seconded national experts or other staff not
employed by it.
2. The Executive Board shall adopt rules related to staff from Member States, including
the contact officers referred to in Article 52, to be seconded to the EU Centre and
update them as necessary. Those rules shall include, in particular, the financial
arrangements related to those secondments, including insurance and training. Those
rules shall take into account the fact that the staff are seconded and are to be deployed as
staff of the EU Centre. They shall include provisions on the conditions of
deployment. Where relevant, the Executive Board shall aim to ensure consistency
with the rules applicable to reimbursement of the mission expenses of the statutory
staff.
Article 73
Privileges and immunities
Protocol No 7 on the Privileges and Immunities of the European Union annexed to
the Treaty on the Functioning of the European Union shall apply to the EU Centre
and its staff.
Privileges and immunities of contact officers and members of their families shall be
subject to an agreement between the Member State where the seat of the EU Centre
is located and the other Member States. That agreement shall provide for such
privileges and immunities as are necessary for the proper performance of the tasks of
contact officers.
Article 74
Obligation of professional secrecy
1. Members of the Management Board and the Executive Board, and all members of
the staff of the EU Centre, including officials seconded by Member States on a
temporary basis, and all other persons carrying out tasks for the EU Centre on a
contractual basis, shall be subject to the requirements of professional secrecy
pursuant to Article 339 of the Treaty on the Functioning of the European Union even
after their duties have ceased.
2. The Executive Board shall ensure that individuals who provide any service, directly
or indirectly, permanently or occasionally, relating to the tasks of the EU Centre,
including officials and other persons authorised by the Executive Board or appointed
by the coordinating authorities for that purpose, are subject to requirements of
professional secrecy equivalent to those in paragraph 1.
3. The EU Centre shall establish practical arrangements for implementing the
confidentiality rules referred to in paragraphs 1 and 2.
4. The EU Centre shall apply Commission Decision (EU, Euratom) 2015/444 [53].
53 Commission Decision (EU, Euratom) 2015/444 of 13 March 2015 on the security rules for protecting
EU classified information (OJ L 72, 17.3.2015, p. 53).
Article 75
Security rules on the protection of classified and sensitive non-classified information
1. The EU Centre shall adopt its own security rules equivalent to the Commission’s
security rules for protecting European Union Classified Information (EUCI) and
sensitive non-classified information, as set out in Commission Decisions (EU,
Euratom) 2015/443 [54] and (EU, Euratom) 2015/444. The security rules of the EU
Centre shall cover, inter alia, provisions for the exchange, processing and storage of
such information. The Executive Board shall adopt the EU Centre’s security rules
following approval by the Commission.
2. Any administrative arrangement on the exchange of classified information with the
relevant authorities of a third country or, in the absence of such arrangement, any
exceptional ad-hoc release of EUCI to those authorities, shall be subject to the
Commission’s prior approval.
Section 8
General provisions
Article 76
Language arrangements
The provisions laid down in Regulation No 1 [55] shall apply to the EU Centre. The translation
services required for the functioning of the EU Centre shall be provided by the Translation
Centre for the bodies of the European Union.
Article 77
Transparency and communication
1. Regulation (EC) No 1049/2001 [56] shall apply to documents held by the EU Centre.
The Management Board shall, within six months of the date of its first meeting,
adopt the detailed rules for applying that Regulation.
2. The processing of personal data by the EU Centre shall be subject to Regulation
(EU) 2018/1725. The Management Board shall, within six months of the date of its
first meeting, establish measures for the application of that Regulation by the EU
Centre, including those concerning the appointment of a Data Protection Officer of
the EU Centre. Those measures shall be established after consultation of the
European Data Protection Supervisor.
54 Commission Decision (EU, Euratom) 2015/443 of 13 March 2015 on Security in the Commission (OJ L 72, 17.3.2015, p. 41).
55 Regulation No 1 determining the languages to be used by the European Economic Community (OJ 17, 6.10.1958, p. 385/58).
56 Regulation (EC) No 1049/2001 of the European Parliament and of the Council of 30 May 2001 regarding public access to European Parliament, Council and Commission documents (OJ L 145, 31.5.2001, p. 43).
3. The EU Centre may engage in communication activities on its own initiative within
its field of competence. Communication activities shall be carried out in accordance
with relevant communication and dissemination plans adopted by the Management
Board.
Article 78
Anti-fraud measures
1. In order to combat fraud, corruption and other unlawful activities, Regulation (EU,
Euratom) No 883/2013 [57] shall apply.
2. The EU Centre shall accede to the Interinstitutional Agreement of 25 May 1999
between the European Parliament, the Council of the European Union and the
Commission of the European Communities concerning internal investigations by
OLAF within six months from [date of start of operations as set out in Article 82]
and shall adopt the appropriate provisions applicable to its staff using the template
set out in the Annex to that Agreement.
3. The European Court of Auditors shall have the power of audit, on the basis of
documents and on the spot, over all grant beneficiaries, contractors and
subcontractors who have received Union funds from the EU Centre.
4. OLAF may carry out investigations, including on-the-spot checks and inspections
with a view to establishing whether there has been fraud, corruption or any other
illegal activity affecting the financial interests of the Union in connection with a
grant or a contract funded by the EU Centre, in accordance with the provisions and
procedures laid down in Regulation (EU, Euratom) No 883/2013 and Council
Regulation (Euratom, EC) No 2185/96 [58].
5. Without prejudice to paragraphs 1, 2, 3, and 4, cooperation agreements with third
countries and international organisations, contracts, grant agreements and grant
decisions of the EU Centre shall contain provisions expressly empowering the
European Court of Auditors and OLAF to conduct such audits and investigations, in
accordance with their respective competences.
Article 79
Liability
1. The EU Centre's contractual liability shall be governed by the law applicable to the
contract in question.
57 Regulation (EU, Euratom) No 883/2013 of the European Parliament and of the Council of 11 September 2013 concerning investigations conducted by the European Anti-Fraud Office (OLAF) and repealing Regulation (EC) No 1073/1999 of the European Parliament and of the Council and Council Regulation (Euratom) No 1074/1999 (OJ L 248, 18.9.2013, p. 1).
58 Council Regulation (Euratom, EC) No 2185/96 of 11 November 1996 concerning on-the-spot checks and inspections carried out by the Commission in order to protect the European Communities' financial interests against fraud and other irregularities (OJ L 292, 15.11.1996, p. 2).
2. The Court of Justice of the European Union shall have jurisdiction to give judgment
pursuant to any arbitration clause contained in a contract concluded by the EU
Centre.
3. In the case of non-contractual liability, the EU Centre shall, in accordance with the
general principles common to the laws of the Member States, make good any
damage caused by its departments or by its staff in the performance of their duties.
4. The Court of Justice of the European Union shall have jurisdiction in disputes over
compensation for damages referred to in paragraph 3.
5. The personal liability of its staff towards the Centre shall be governed by the
provisions laid down in the Staff Regulations or Conditions of Employment
applicable to them.
Article 80
Administrative inquiries
The activities of the EU Centre shall be subject to the inquiries of the European Ombudsman
in accordance with Article 228 of the Treaty on the Functioning of the European Union.
Article 81
Headquarters Agreement and operating conditions
1. The necessary arrangements concerning the accommodation to be provided for the
EU Centre in the Member State where the seat of the EU Centre is located and the
facilities to be made available by that Member State, together with the specific rules
applicable in that Member State to the Executive Director, members of the Executive
Board, EU Centre staff and members of their families shall be laid down in a
Headquarters Agreement between the EU Centre and the Member State where the
seat of the EU Centre is located, concluded after obtaining the approval of the
Executive Board and no later than [2 years after the entry into force of this
Regulation].
2. The Member State where the seat of the EU Centre is located shall provide the best
possible conditions to ensure the smooth and efficient functioning of the EU Centre,
including multilingual, European-oriented schooling and appropriate transport
connections.
Article 82
Start of the EU Centre's activities
1. The Commission shall be responsible for the establishment and initial operation of
the EU Centre until the Executive Director has taken up his or her duties following
his or her appointment by the Executive Board in accordance with Article 65(2). For
that purpose:
(a) the Commission may designate a Commission official to act as interim
Executive Director and exercise the duties assigned to the Executive Director;
(b) by derogation from Article 62(2)(g) and until the adoption of a decision as
referred to in Article 62(4), the interim Executive Director shall exercise the
appointing authority power;
(c) the Commission may offer assistance to the EU Centre, in particular by
seconding Commission officials to carry out the activities of the EU Centre
under the responsibility of the interim Executive Director or the Executive
Director;
(d) the interim Executive Director may authorise all payments covered by
appropriations entered in the EU Centre's budget after approval by the
Executive Board and may conclude contracts, including staff contracts,
following the adoption of the EU Centre's establishment plan.
CHAPTER V
DATA COLLECTION AND TRANSPARENCY REPORTING
Article 83
Data collection
1. Providers of hosting services, providers of interpersonal communications services
and providers of internet access services shall collect data on the following topics
and make that information available to the EU Centre upon request:
(a) where the provider has been subject to a detection order issued in accordance
with Article 7:
– the measures taken to comply with the order, including the technologies
used for that purpose and the safeguards provided;
– the error rates of the technologies deployed to detect online child sexual
abuse and measures taken to prevent or remedy any errors;
– in relation to complaints and cases submitted by users in connection to
the measures taken to comply with the order, the number of complaints
submitted directly to the provider, the number of cases brought before a
judicial authority, the basis for those complaints and cases, the decisions
taken in respect of those complaints and in those cases, the average time
needed for taking those decisions and the number of instances where
those decisions were subsequently reversed;
(b) the number of removal orders issued to the provider in accordance with Article
14 and the average time needed for removing or disabling access to the item or
items of child sexual abuse material in question;
(c) the total number of items of child sexual abuse material that the provider
removed or to which it disabled access, broken down by whether the items
were removed or access thereto was disabled pursuant to a removal order or to
a notice submitted by a Competent Authority, the EU Centre or a third party or
at the provider’s own initiative;
(d) the number of blocking orders issued to the provider in accordance with Article
16;
(e) the number of instances in which the provider invoked Article 8(3), Article
14(5) or (6) or Article 17(5), together with the grounds therefor;
2. The Coordinating Authorities shall collect data on the following topics and make that
information available to the EU Centre upon request:
(a) the follow-up given to reports of potential online child sexual abuse that the
EU Centre forwarded in accordance with Article 48(3), specifying for each
report:
– whether the report led to the launch of a criminal investigation,
contributed to an ongoing investigation, led to taking any other action or
led to no action;
– where the report led to the launch of a criminal investigation or
contributed to an ongoing investigation, the state of play or outcome of
the investigation, including whether the case was closed at pre-trial stage,
whether the case led to the imposition of penalties, whether victims were
identified and rescued and if so their numbers differentiating by gender
and age, and whether any suspects were arrested and any perpetrators
were convicted and if so their numbers;
– where the report led to any other action, the type of action, the state of
play or outcome of that action and the reasons for taking it;
– where no action was taken, the reasons for not taking any action;
(b) the most important and recurrent risks of online child sexual abuse, as reported
by providers of hosting services and providers of interpersonal
communications services in accordance with Article 3 or identified through
other information available to the Coordinating Authority;
(c) a list of the providers of hosting services and providers of interpersonal
communications services to which the Coordinating Authority addressed a
detection order in accordance with Article 7;
(d) the number of detection orders issued in accordance with Article 7, broken
down by provider and by type of online child sexual abuse, and the number of
instances in which the provider invoked Article 8(3);
(e) a list of providers of hosting services to which the Coordinating Authority
issued a removal order in accordance with Article 14;
(f) the number of removal orders issued in accordance with Article 14, broken
down by provider, the time needed to remove or disable access to the item or
items of child sexual abuse material concerned, and the number of instances in
which the provider invoked Article 14(5) and (6);
(g) the number of blocking orders issued in accordance with Article 16, broken
down by provider, and the number of instances in which the provider invoked
Article 17(5);
(h) a list of relevant information society services to which the Coordinating
Authority addressed a decision taken pursuant to Articles 27, 28 or 29, the type
of decision taken, and the reasons for taking it;
(i) the instances in which the opinion of the EU Centre pursuant to Article 7(4)(d)
substantially deviated from the opinion of the Coordinating Authority,
specifying the points at which it deviated and the main reasons for the
deviation.
3. The EU Centre shall collect data and generate statistics on the detection, reporting and
removal of, or disabling of access to, online child sexual abuse under this Regulation.
The data shall be in particular on the following topics:
(a) the number of indicators in the databases of indicators referred to in Article 44
and the development of that number as compared to previous years;
(b) the number of submissions of child sexual abuse material and solicitation of
children referred to in Article 36(1), broken down by Member State that
designated the submitting Coordinating Authorities, and, in the case of child
sexual abuse material, the number of indicators generated on the basis thereof
and the number of uniform resource locators included in the list of uniform
resource locators in accordance with Article 44(3);
(c) the total number of reports submitted to the EU Centre in accordance with
Article 12, broken down by provider of hosting services and provider of
interpersonal communications services that submitted the report and by
Member State the competent authority of which the EU Centre forwarded the
reports to in accordance with Article 48(3);
(d) the online child sexual abuse to which the reports relate, including the number
of items of potential known and new child sexual abuse material and instances
of potential solicitation of children, the Member State the competent authority
of which the EU Centre forwarded the reports to in accordance with Article
48(3), and type of relevant information society service that the reporting
provider offers;
(e) the number of reports that the EU Centre considered manifestly unfounded, as
referred to in Article 48(2);
(f) the number of reports relating to potential new child sexual abuse material and
solicitation of children that were assessed as not constituting child sexual abuse
material of which the EU Centre was informed pursuant to Article 36(3),
broken down by Member State;
(g) the results of the searches in accordance with Article 49(1), including the
number of images, videos and URLs by Member State where the material is
hosted;
(h) where the same item of potential child sexual abuse material was reported more
than once to the EU Centre in accordance with Article 12 or detected more than
once through the searches in accordance with Article 49(1), the number of
times that that item was reported or detected in that manner;
(i) the number of notices and number of providers of hosting services notified by
the EU Centre pursuant to Article 49(2);
(j) the number of victims of online child sexual abuse assisted by the EU Centre
pursuant to Article 21(2), and the number of these victims that requested to
receive such assistance in a manner accessible to them due to disabilities.
4. The providers of hosting services, providers of interpersonal communications
services and providers of internet access services, the Coordinating Authorities and
the EU Centre shall ensure that the data referred to in paragraphs 1, 2 and 3,
respectively, is stored no longer than is necessary for the transparency reporting
referred to in Article 84. The data stored shall not contain any personal data.
5. They shall ensure that the data is stored in a secure manner and that the storage is
subject to appropriate technical and organisational safeguards. Those safeguards
shall ensure, in particular, that the data can be accessed and processed only for the
purpose for which it is stored, that a high level of security is achieved and that the
information is deleted when no longer necessary for that purpose. They shall
regularly review those safeguards and adjust them where necessary.
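The provider-side data points of paragraph 1 of this Article can be pictured, for illustration, as a typed record. The field names below are assumptions of this sketch; the Regulation prescribes the topics to be collected, not any particular data format.

```python
from dataclasses import dataclass, field

@dataclass
class DetectionOrderStats:
    """Data kept where the provider was subject to a detection order (point (a))."""
    measures_taken: str
    technologies_used: list[str]
    error_rate: float
    complaints_to_provider: int
    cases_before_judicial_authority: int
    decisions_reversed: int

@dataclass
class ProviderStats:
    """Data a provider must make available to the EU Centre on request (Art. 83(1))."""
    detection_orders: list[DetectionOrderStats] = field(default_factory=list)
    removal_orders_received: int = 0        # point (b)
    avg_removal_time_hours: float = 0.0     # point (b)
    items_removed_or_disabled: int = 0      # point (c)
    blocking_orders_received: int = 0       # point (d)
    derogations_invoked: int = 0            # point (e)

    def validate(self) -> None:
        # Paragraph 4: the stored data must not contain personal data; this
        # sketch only checks that the aggregate counters are plausible.
        counters = (self.removal_orders_received, self.items_removed_or_disabled,
                    self.blocking_orders_received, self.derogations_invoked)
        if any(c < 0 for c in counters):
            raise ValueError("aggregate counters must be non-negative")
```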
Article 84
Transparency reporting
1. Each provider of relevant information society services shall draw up an annual report
on its activities under this Regulation. That report shall compile the information
referred to in Article 83(1). The providers shall, by 31 January of every year
subsequent to the year to which the report relates, make the report available to the
public and communicate it to the Coordinating Authority of establishment, the
Commission and the EU Centre.
2. Each Coordinating Authority shall draw up an annual report on its activities under
this Regulation. That report shall compile the information referred to in Article
83(2). It shall, by 31 March of every year subsequent to the year to which the report
relates, make the report available to the public and communicate it to the
Commission and the EU Centre.
3. Where a Member State has designated several competent authorities pursuant to
Article 25, it shall ensure that the Coordinating Authority draws up a single report
covering the activities of all competent authorities under this Regulation and that the
Coordinating Authority receives all relevant information and support needed to that
effect from the other competent authorities concerned.
4. The EU Centre, working in close cooperation with the Coordinating Authorities,
shall draw up an annual report on its activities under this Regulation. That report
shall also compile and analyse the information contained in the reports referred to in
paragraphs 2 and 3. The EU Centre shall, by 30 June of every year subsequent to the
year to which the report relates, make the report available to the public and
communicate it to the Commission.
5. The annual transparency reports referred to in paragraphs 1, 2 and 3 shall not include
any information that may prejudice ongoing activities for the assistance to victims or
the prevention, detection, investigation or prosecution of child sexual abuse offences.
They shall also not contain any personal data.
6. The Commission shall be empowered to adopt delegated acts in accordance with
Article 86 in order to supplement this Regulation with the necessary templates and
detailed rules concerning the form, precise content and other details of the reports
and the reporting process pursuant to paragraphs 1, 2 and 3.
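The reporting cycle of this Article follows a fixed rhythm: each actor reports on year N in year N + 1, with staggered deadlines. The lookup below is a sketch; the actor labels are assumptions of this example.

```python
from datetime import date

REPORT_DUE = {
    "provider": (1, 31),                # paragraph 1: by 31 January
    "coordinating_authority": (3, 31),  # paragraph 2: by 31 March
    "eu_centre": (6, 30),               # paragraph 4: by 30 June
}

def report_due_date(actor: str, reporting_year: int) -> date:
    """Due date of the annual report covering `reporting_year`."""
    month, day = REPORT_DUE[actor]
    return date(reporting_year + 1, month, day)

print(report_due_date("provider", 2026))  # 2027-01-31
```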
CHAPTER VI
FINAL PROVISIONS
Article 85
Evaluation
1. By [five years after the entry into force of this Regulation], and every five years
thereafter, the Commission shall evaluate this Regulation and submit a report on its
application to the European Parliament and the Council.
2. By [five years after the entry into force of this Regulation], and every five years
thereafter, the Commission shall ensure that an evaluation in accordance with
Commission guidelines of the EU Centre’s performance in relation to its objectives,
mandate, tasks, governance and location is carried out. The evaluation shall, in
particular, address the possible need to modify the tasks of the EU Centre, and the
financial implications of any such modification.
3. On the occasion of every second evaluation referred to in paragraph 2, the results
achieved by the EU Centre shall be assessed, having regard to its objectives and
tasks, including an assessment of whether the continuation of the EU Centre is still
justified with regard to those objectives and tasks.
4. The Commission shall report to the European Parliament and the Council the
findings of the evaluation referred to in paragraph 3. The findings of the evaluation
shall be made public.
5. For the purpose of carrying out the evaluations referred to in paragraphs 1, 2 and 3,
the Coordinating Authorities and Member States and the EU Centre shall provide
information to the Commission at its request.
6. In carrying out the evaluations referred to in paragraphs 1, 2 and 3, the Commission
shall take into account the relevant evidence at its disposal.
7. Where appropriate, the reports referred to in paragraphs 1 and 4 shall be
accompanied by legislative proposals.
Article 86
Exercise of the delegation
1. The power to adopt delegated acts is conferred on the Commission subject to the
conditions laid down in this Article.
2. The power to adopt delegated acts referred to in Articles 3, 8, 13, 14, 17, 47 and 84
shall be conferred on the Commission for an indeterminate period of time from [date
of adoption of the Regulation].
3. The delegation of power referred to in Articles 3, 8, 13, 14, 17, 47 and 84 may be
revoked at any time by the European Parliament or by the Council. A decision to
revoke shall put an end to the delegation of the power specified in that decision. It
shall take effect the day after the publication of the decision in the Official Journal of
the European Union or at a later date specified therein. It shall not affect the validity
of any delegated acts already in force.
4. Before adopting a delegated act, the Commission shall consult experts designated by
each Member State in accordance with the principles laid down in the Interinstitutional
Agreement of 13 April 2016 on Better Law-Making.
5. As soon as it adopts a delegated act, the Commission shall notify it simultaneously to
the European Parliament and to the Council.
6. A delegated act adopted pursuant to Articles 3, 8, 13, 14, 17, 47 and 84 shall enter
into force only if no objection has been expressed either by the European Parliament
or the Council within a period of two months of notification of that act to the
European Parliament and the Council or if, before the expiry of that period, the
European Parliament and the Council have both informed the Commission that they
will not object. That period shall be extended by two months at the initiative of the
European Parliament or of the Council.
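For illustration, the two-month objection window of paragraph 6, extendable by a further two months, can be computed as below. The month arithmetic is a simplification assumed for this sketch; time limits in Union law are formally computed under Regulation (EEC, Euratom) No 1182/71.

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    # Clamp to the last day of the target month where needed (e.g. 31 Dec + 2 months).
    month_index = d.month - 1 + months
    year, month = d.year + month_index // 12, month_index % 12 + 1
    return date(year, month, min(d.day, calendar.monthrange(year, month)[1]))

def objection_deadline(notified: date, extended: bool = False) -> date:
    """End of the objection period for a delegated act notified on `notified`."""
    return add_months(notified, 4 if extended else 2)

print(objection_deadline(date(2025, 12, 31)))        # 2026-02-28
print(objection_deadline(date(2025, 12, 31), True))  # 2026-04-30
```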
Article 87
Committee procedure
1. For the purposes of the adoption of the implementing acts referred to in Article
39(4), the Commission shall be assisted by a committee. That committee shall be a
committee within the meaning of Regulation (EU) No 182/2011.
2. Where reference is made to this paragraph, Article 4 of Regulation (EU) No
182/2011 shall apply.
Article 88
Repeal
Regulation (EU) 2021/1232 is repealed from [date of application of this Regulation].
Article 89
Entry into force and application
This Regulation shall enter into force on the twentieth day following that of its publication in
the Official Journal of the European Union.
It shall apply from 6 months after its entry into force.
This Regulation shall be binding in its entirety and directly applicable in all Member States.
Done at Brussels,
For the European Parliament For the Council
The President The President
LEGISLATIVE FINANCIAL STATEMENT
1. FRAMEWORK OF THE PROPOSAL/INITIATIVE
1.1. Title of the proposal/initiative
1.2. Policy area concerned
1.3. The proposal relates to
1.4. Objectives
1.5. Grounds for the proposal/initiative
1.6. Duration and financial impact of the proposal/initiative
1.7. Management modes planned
2. MANAGEMENT MEASURES
2.1. Monitoring and reporting rules
2.2. Management and control systems
2.3. Measures to prevent fraud and irregularities
3. ESTIMATED FINANCIAL IMPACT OF THE PROPOSAL/INITIATIVE
3.1. Headings of the multiannual financial framework and expenditure budget line
3.2. Estimated impact on expenditure
3.3. Estimated impact on revenue
LEGISLATIVE FINANCIAL STATEMENT
1. FRAMEWORK OF THE PROPOSAL/INITIATIVE
1.1. Title of the proposal/initiative
Regulation of the European Parliament and of the Council laying down rules to prevent and combat
child sexual abuse
1.2. Policy area(s) concerned
Policy area: Security
Activity: EU strategy for a more effective fight against child sexual abuse [59]
1.3. The proposal relates to
☒ a new action
☐ a new action following a pilot project/preparatory action [60]
☐ the extension of an existing action
☐ a merger of one or more actions towards another/a new action
1.4. Objective(s)
General objective(s)
The general objective is to improve the functioning of the internal market by introducing harmonised
EU rules aimed at better identifying, protecting and supporting victims of Child Sexual Abuse
(CSA), ensuring effective prevention and facilitating investigations, notably through a clarification of
the role and responsibilities of online service providers when it comes to CSA.
This objective directly contributes to achieving the most relevant SDGs for this initiative, SDG 5.2
(eliminate all forms of violence against women and girls) and SDG 16.2 (end abuse, exploitation,
trafficking and all forms of violence against children), and partially addresses SDG 17 with respect to
the collection of data on children with disabilities seeking information and assistance from the EU Centre.
Specific objective(s)
1. ensure the effective detection, reporting and removal of online child sexual abuse;
59 EU strategy for a more effective fight against child sexual abuse, COM(2020) 607 of 24.7.2020.
60 As referred to in Article 58(2)(a) or (b) of the Financial Regulation.
2. improve legal certainty, transparency and accountability, and ensure the protection of fundamental
rights;
3. reduce the proliferation and effects of child sexual abuse through better coordination.
Expected result(s) and impact
Providers of information society services are expected to benefit from the legal certainty of harmonised
EU rules on the detection, reporting and removal of online child sexual abuse, and from higher levels
of trust where their services demonstrate greater accountability through the adoption of safer-by-design
methods, and through improved and standardised transparency reporting.
All internet users and especially child users are expected to benefit from a more structured approach to
preventing, detecting, reporting and removing online child sexual abuse across the Union, facilitated by
the EU Centre, and from higher levels of trust in online services that adopt safer-by-design methods.
National authorities are expected to benefit from the EU Centre’s facilitation of the detection, reporting
and removal process, which in particular contributes to ensuring that the reports on online child sexual
abuse received by national law enforcement agencies are relevant and contain sufficient information for
law enforcement to act. National authorities will also benefit from the facilitation of the exchange of
expertise provided by the EU Centre in terms of sharing best practices and lessons learned across the
EU and globally on prevention and assistance to victims.
Indicators of performance
A dedicated monitoring framework, including a number of indicators per the specific objectives, is
described in the Impact Assessment Report accompanying the proposal.
In addition, detailed objectives and expected results including performance indicators will be
established by the EU Centre’s annual work programme, while the multi-annual work programme will
set out overall strategic objectives, expected results and performance indicators.
1.5. Grounds for the proposal/initiative
1.5.1. Requirement(s) to be met in the short or long term including a detailed timeline for roll-out of
the implementation of the initiative
The proposal is based on Article 114 TFEU, focused on the establishment and functioning of the
internal market.
The choice of legal basis reflects the main objectives and scope of the initiative given that the Internet
is by nature cross-border. Article 114 is the appropriate legal basis to address differences between
provisions of Member States’ laws which are such as to obstruct the fundamental freedoms and thus
have a direct effect on the functioning of the internal market, and to prevent the emergence of future
obstacles to trade resulting from differences in the way national laws have developed.
This initiative aims to ensure common rules creating the best conditions for maintaining a safe online
environment with responsible and accountable behaviour of service providers. At the same time, the
intervention provides for the appropriate supervision of relevant service providers and cooperation
between authorities at EU level, with the involvement and support of the EU Centre where appropriate.
As such, the initiative should increase legal certainty, trust, innovation and growth in the single market
for digital services.
A five-year timeline from the date of the legislation coming into force is envisaged for the proposed EU
Centre to achieve full operational capacity. Commission resources would also be deployed to support
the setting up of the Centre during this lead-in period.
1.5.2. Added value of Union involvement (it may result from different factors, e.g. coordination
gains, legal certainty, greater effectiveness or complementarities). For the purposes of this
point 'added value of Union involvement' is the value resulting from Union intervention which
is additional to the value that would have been otherwise created by Member States alone.
Reasons for action at European level
A satisfactory improvement as regards the rules applicable to relevant online service providers active
on the internal market aimed at stepping up the fight against CSA cannot be sufficiently achieved by
Member States acting alone or in an uncoordinated way. In particular, a single Member State cannot
effectively prevent or stop the circulation online of a CSA image or video, or the online grooming of a
child, without the ability to cooperate and coordinate with the private entities who provide services in
several (if not all) Member States.
In the absence of EU action, Member States would have to keep adopting individual national laws to
respond to current and emerging challenges with the likely consequence of fragmentation and
diverging laws likely to negatively affect the internal market, particularly with regard to online service
providers active in more than one Member State.
Expected added value for the Union
The expected added value for the Union of the initiative includes the following:
- Reduce fragmentation and compliance/operational costs, improving the functioning of the
internal market. The EU Centre will contribute notably by facilitating the implementation of the
obligations on service providers to detect, report and remove CSA online, and the action of law
enforcement to follow up on those reports.
- Facilitate and support Member States’ action on prevention and assistance to victims to increase
efficiency and effectiveness. The EU Centre will contribute notably by facilitating the exchange of best
practices and serving as a knowledge hub for Member States.
- Reduce dependence on and facilitate cooperation with third countries. The EU Centre will
contribute notably by exchanging best practices with third countries, and facilitate Member States’
access to expertise and lessons learned from actions to fight against CSA around the globe.
1.5.3. Lessons learned from similar experiences in the past
This proposal is informed by two items of sectoral legislation addressing the area of child sexual abuse.
The first is Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children
and child pornography; the second is Regulation (EU) 2021/1232 on a temporary derogation from
certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-
independent interpersonal communications services for the processing of personal and other data for
the purpose of combating online child sexual abuse.
The 2011 Directive, which then represented an important step forward, must be fully transposed by
Member States as a matter of urgency. The Commission will continue to use its enforcement powers
under the Treaties through infringement procedures to ensure swift implementation. In parallel to this,
and as indicated in the EU Strategy for a more effective fight against child sexual abuse, the
Commission has initiated a study to prepare the evaluation of the 2011 Directive and its possible future
revision.
The aim of Regulation (EU) 2021/1232 (the “Interim Regulation”) was to enable certain online
communications services to continue to use technologies to detect and report child sexual abuse online
and to remove child sexual abuse material on their services. It has a limited duration and a narrow
scope, confined to the voluntary activities of certain online services during an interim period of at most
three years, which will expire in August 2024.
The current proposal builds on the 2011 Directive, in particular for the definition of child sexual abuse
offences, and on the Interim Regulation, in particular on its safeguards for the detection of online child
sexual abuse.
1.5.4. Compatibility with the Multiannual Financial Framework and possible synergies with other
appropriate instruments
The 2020 EU Strategy for a more effective fight against CSA set out eight initiatives, highlighting the
importance of a holistic response to this crime area. Legislation is one such element. Accordingly, this
proposal sets out to develop and implement an appropriate legal framework, enhance the law
enforcement response, and stimulate coordinated multi-stakeholder action on prevention, investigation
and assistance to victims.
This proposal is reflected under the heading of ‘Promoting our European way of life’ in the
Commission Work Programme 2021.
This proposal builds on the proposed Digital Services Act, which seeks to ensure the best
conditions for innovative cross-border digital services to develop in the EU across national territories
while maintaining a safe online environment for all EU citizens.
This proposal aims to create a specific EU framework to prevent and combat online CSA, with
elements similar to those of the Terrorist Content Online Regulation, building on the Digital
Services Act provisions that create a harmonised baseline for addressing all illegal content, and
targeting child sexual abuse online and grooming in particular.
The EU Centre, a fundamental component to support the implementation of the obligations on service
providers to detect, report and remove online child sexual abuse, is expected to generate important
efficiency gains for Member States by facilitating their cooperation and in mutualising resources for
technical assistance at EU level.
1.5.5. Assessment of the different available financing options, including scope for redeployment
Central to the assessment of the different financing options was the need for the proposed EU Centre
to be independent, so that it can serve as a facilitator of the work of providers of information society
services in detecting, reporting, and removing online child sexual abuse, and of the work of law
enforcement in following up on those reports from service providers.
Other options for the EU Centre were addressed in the accompanying Impact Assessment. For
example, incorporating the EU Centre into the EU Agency for Fundamental Rights (FRA) was found,
inter alia, to create a significant imbalance in FRA’s mandate: the agency would double in size, with
half of it dedicated to CSA and the other half to its current tasks, and this would entail further
complications associated with rewiring FRA’s governance and underlying legislation.
Accordingly, in order to further support the Centre’s independence it is proposed that the Centre be
financially independent and be funded by the EU.
The Centre should also be independent from national public entities of the Member State that
would host it in order to avoid the risk of prioritising and favouring efforts in this particular
Member State. This is without prejudice to the opportunity to draw on the expertise of
Member States and EU Justice and Home Affairs agencies to assist with building a critical
mass of expertise within the proposed EU Centre.
1.6. Duration and financial impact of the proposal/initiative
☐ limited duration
– Proposal/initiative in effect from [DD/MM]YYYY to [DD/MM]YYYY
– Financial impact from YYYY to YYYY
☒ unlimited duration
– Implementation with a 5-year start-up period from 2025 onwards, followed by full-scale operation.
1.7. Management mode(s) planned (61)
☐ Direct management by the Commission through executive agencies
☐ Shared management with the Member States
☒ Indirect management by entrusting budget implementation tasks to:
☐ international organisations and their agencies (to be specified);
☐ the EIB and the European Investment Fund;
☒ bodies referred to in Articles 70 and 71 of the Financial Regulation;
☐ public law bodies;
☐ bodies governed by private law with a public service mission to the extent that they provide adequate financial guarantees;
☐ bodies governed by the private law of a Member State that are entrusted with the implementation of a public-private partnership and that provide adequate financial guarantees;
☐ persons entrusted with the implementation of specific actions in the CFSP pursuant to Title V of the TEU, and identified in the relevant basic act.
Comments
The level of EU contribution to the CSA Centre has been identified based on the Impact Assessment
carried out.
61 Details of management modes and references to the Financial Regulation can be found on the BudgWeb site.
2. MANAGEMENT MEASURES
2.1. Monitoring and reporting rules
The implementation and functioning of the Regulation will be reviewed and evaluated periodically
through reporting.
To monitor the implementation of the regulation, the EU Centre (along with service providers and
Coordinating Authorities) shall collect and analyse data relevant for measuring the effectiveness of the
detection, reporting and removal obligations. Coordinating Authorities and hosting or interpersonal
communication service providers will contribute to data collection and reporting on aspects falling into
their realm of responsibility. The data collected by the EU Centre should be made available to the
Coordinating Authorities and to the Commission to enable assessment of implementation.
The EU Centre shall publish annual transparency reports. These reports, to be made public and
communicated to the Commission, should compile and analyse the information contained in the annual
reports from relevant information service providers and Coordinating Authorities, complemented with
other relevant sources, and include information on the activities of the Centre.
Drawing on the statistics and information gathered from the structured processes and transparency
mechanisms provided for under this Regulation, the Commission should carry out an evaluation of this
Regulation within five years of the date of its entry into force, and every five years thereafter. The
Commission will report on the findings of the evaluation to the European Parliament and the Council.
All Union agencies work under a strict monitoring system involving an internal control coordinator, the
Internal Audit Service of the Commission, the Management Board, the Commission, the Court of
Auditors and the Budgetary Authority. This system is reflected and laid down in Chapter 4 of the
proposed regulation setting up the EU Centre to Prevent and Combat Child Sexual Abuse.
In accordance with the Joint Statement on the EU decentralised agencies, the annual work programme
of the Centre shall comprise detailed objectives and expected results, including performance indicators.
The Centre will attach key performance indicators to the activities included in its work programme,
and its performance against those indicators will then be measured in the Annual Activity Report.
The annual work programme shall be coherent with the multi-annual work programme, and both shall
be included in an annual single programming document to be submitted to the European Parliament,
the Council and the Commission.
The Management Board of the EU Centre will be responsible for the general orientation of the EU
Centre’s activities. An Executive Board will be responsible for the efficient and effective
administrative, budgetary and operational management of the EU Centre, and will adopt the Centre’s
budget estimate before transmitting it to the Commission.
2.2. Management and control system(s)
2.2.1. Justification of the management mode(s), the funding implementation mechanism(s), the
payment modalities and the control strategy proposed
Given that the majority of funding under this proposal relates to setting up a new EU Centre, the EU
budget financing will be implemented via indirect management.
An appropriate internal control strategy will be instituted to ensure that this budget is implemented in
an effective and efficient manner.
Regarding ex-post controls, the EU Centre, as a decentralised agency, is subject to:
- internal audit by the Internal Audit Service of the Commission;
- annual reports by the European Court of Auditors, giving a statement of assurance as to the reliability
of the annual accounts and the legality and regularity of the underlying transactions;
- annual discharge granted by the European Parliament;
- possible investigations conducted by OLAF to ensure, in particular, that the resources allocated to
agencies are put to proper use.
As a Justice and Home Affairs agency partnered with DG HOME, the EU Centre will be subject to DG
HOME’s Control Strategy on decentralised agencies to ensure reliable reporting in the framework of its
Annual Activity Report. While decentralised agencies have full responsibility for the implementation
of their budget, DG HOME is responsible for the regular payment of the annual contributions
established by the Budgetary Authority.
The activities of the EU Centre will also be subject to the supervision of the Ombudsman in accordance
with Article 228 of the Treaty.
2.2.2. Information concerning the risks identified and the internal control system(s) set up to mitigate
them
As the Centre will be a newly created EU body, there is a risk that recruitment may fall behind
schedule, which would affect the Centre’s operational capacity. The support of the parent DG is
therefore crucial with respect to the role of Authorising Officer and the exercise of the powers
conferred by the Staff Regulations on the appointing authority (AIPN)62 until the Centre achieves full
administrative autonomy.
Frequent meetings and regular contacts will be required between the parent DG and the Centre
throughout the 5-year start-up phase to ensure that the Centre is autonomous and operational as
scheduled.
A further risk to the effective implementation of this proposal stems from the Regulation’s aim to
improve and enhance the detection, reporting and removal of online CSA across the Union: the wider
application of the Regulation is expected to bring a significant increase in the volume and quality of
reporting. While the impact assessment provides estimates of the number of reports expected, the
actual number of reports that the Centre will receive, and therefore the Centre’s workload, may vary
from those estimates.
62 C(2013) 3288 final of 4 June 2013.
The EU Centre will be required to implement an Internal Control Framework in line with the European
Commission’s Internal Control Framework. Information on the EU Centre’s internal controls will be
included in the Centre’s annual reports.
An internal audit capability will be established to take account of risks specific to the operation of the
EU Centre, bringing a systematic and disciplined approach to evaluating the effectiveness of risk
management, control and governance processes, and issuing recommendations for their improvement.
DG HOME runs an annual risk management exercise to identify and assess potential high risks related
to agencies’ operations. Risks considered critical are reported annually in DG HOME’s management
plan and are accompanied by an action plan setting out the mitigating action.
2.2.3. Estimation and justification of the cost-effectiveness of the controls (ratio of "control costs ÷
value of the related funds managed"), and assessment of the expected levels of risk of error (at
payment & at closure)
The ratio of “control costs/value of the related funds managed” is reported on by the Commission. DG
HOME’s 2020 Annual Activity Report reports 0.16% for this ratio in relation to Indirect Management
Entrusted Entities and Decentralised Agencies.
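For orientation, this ratio is simply total control costs divided by the value of the funds managed. A minimal illustration follows; the euro figures below are hypothetical, chosen only to show how a 0.16% ratio reads, and are not taken from the Annual Activity Report:

```python
# Hypothetical figures chosen only to illustrate how the 0.16% ratio is read;
# they are not taken from DG HOME's 2020 Annual Activity Report.
control_costs = 160_000        # EUR spent on controls (hypothetical)
funds_managed = 100_000_000    # EUR of funds managed (hypothetical)
print(f"{control_costs / funds_managed:.2%}")  # prints: 0.16%
```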
2.3. Measures to prevent fraud and irregularities
The existing fraud prevention measures applicable to the Commission will cover the additional
appropriations necessary for this Regulation.
Concerning the proposed EU Centre, DG HOME has developed, and regularly updates, an in-house
anti-fraud strategy by reference to that provided by OLAF. The proposed EU Centre, established as a
decentralised agency, would fall within the scope of this strategy.
DG HOME, in its 2020 Annual Activity Report, concluded that the fraud prevention and detection
processes provided reasonable assurance on the achievement of the internal control objectives.
3. ESTIMATED FINANCIAL IMPACT OF THE PROPOSAL/INITIATIVE
3.1. Heading(s) of the multiannual financial framework and expenditure budget line(s) affected
New budget lines requested
In order of multiannual financial framework headings and budget lines.
| Heading of multiannual financial framework | Budget line | Type of expenditure (Diff./Non-diff.) | Contribution from EFTA countries | Contribution from candidate countries | Contribution from third countries | Contribution within the meaning of Article 21(2)(b) of the Financial Regulation |
| --- | --- | --- | --- | --- | --- | --- |
| 5 | 12 10 04 – EU Centre to prevent and counter child sexual abuse (“CSA”) | Non-diff. | YES/NO | YES/NO | YES/NO | YES/NO |
3.2. Estimated impact on expenditure *
3.2.1. Summary of estimated impact on expenditure
EUR million (to three decimal places)
Heading of multiannual financial framework: 5 ‘Security and Defence’

| CSA | 2025 (63) | 2026 | 2027 | Total MFF 2021-2027 | 2028 | 2029 | 2030 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Title 1: Commitments (1) | 11,122 | 10,964 | 16,497 | 38,583 | 22,269 | 26,694 | 28,477 |
| Title 1: Payments (2) | 11,122 | 10,964 | 16,497 | 38,583 | 22,269 | 26,694 | 28,477 |
| Title 2: Commitments (1a) / Payments (2a) | – | – | – | – | – | – | – |
| Title 3: Commitments (3a) / Payments (3b) | – | – | – | – | – | – | – |
| TOTAL appropriations for CSA – Commitments (=1+1a+3a) | 11,122 | 10,964 | 16,497 | 38,583 | 22,269 | 26,694 | 28,477 |
| TOTAL appropriations for CSA – Payments (=2+2a+3b) | 11,122 | 10,964 | 16,497 | 38,583 | 22,269 | 26,694 | 28,477 |

No appropriations arise in the years 2022-2024.
* Note: All calculations have been made on the assumption of a Brussels seat, as the seat of the EU Centre is not yet determined. The start-up period for the
establishment of the EU Centre has been assessed at five years, commencing in 2025, with full operational capacity by end 2029. Total Centre expenditure
reaches EUR 28,477 million in 2030, the first year in which the full-year cost of the full staff complement falls due. The overall budget of the Centre
increases by 2% every year to cover inflation.
63 Year 1 includes €5 million initial set-up costs for infrastructure (i.e. a database of indicators and building)
Heading of multiannual financial framework: 7 ‘Administrative expenditure’

EUR million (to three decimal places)

| DG HOME | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 | TOTAL |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Human resources | 0,201 | 0,780 | 1,174 | 1,197 | 1,221 | 1,245 | 5,818 |
| Other administrative expenditure | – | 0,660 | 0,660 | 0,330 | – | – | 1,650 |
| TOTAL DG HOME appropriations | 0,201 | 1,440 | 1,834 | 1,527 | 1,221 | 1,245 | 7,468 |
| TOTAL appropriations under HEADING 7 of the multiannual financial framework (total commitments = total payments) | 0,201 | 1,440 | 1,834 | 1,527 | 1,221 | 1,245 | 7,468 |

EUR million (to three decimal places)

| TOTAL appropriations under HEADINGS 1 to 7 of the multiannual financial framework | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 | TOTAL |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Commitments | 0,201 | 1,440 | 1,834 | 12,649 | 12,185 | 17,742 | 46,051 |
| Payments | 0,201 | 1,440 | 1,834 | 12,649 | 12,185 | 17,742 | 46,051 |
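As a quick arithmetic cross-check of the two tables above: the ‘Total MFF 2021-2027’ column equals the sum of the 2025-2027 appropriations, and each year’s total under Headings 1 to 7 equals the Heading 5 figure plus the Heading 7 administrative line. A minimal sketch verifying this from the figures as printed:

```python
# Cross-check of the expenditure tables above; figures in EUR million, as printed.
heading5 = {2025: 11.122, 2026: 10.964, 2027: 16.497}  # Centre appropriations
heading7 = {2025: 1.527, 2026: 1.221, 2027: 1.245}     # DG HOME administrative lines

print(round(sum(heading5.values()), 3))  # 38.583 -> 'Total MFF 2021-2027' column
print({y: round(heading5[y] + heading7[y], 3) for y in heading5})
# {2025: 12.649, 2026: 12.185, 2027: 17.742} -> totals under Headings 1 to 7
```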
3.2.2. Estimated impact on CSA body's appropriations
☐ The proposal/initiative does not require the use of operational appropriations
☒ The proposal/initiative requires the use of operational appropriations, as explained below:
Commitment appropriations in EUR million (to three decimal places)

| Objectives and outputs | 2025 | 2026 | 2027 | Total MFF 2021-27 | 2028 | 2029 | 2030 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| SPECIFIC OBJECTIVE No 1: Effective detection, reporting and removal of online child sexual abuse | | | | | | | |
| – Output: Services and supports to public authorities and service providers | 1,919 | 3,741 | 5,835 | 11,494 | 8,017 | 9,700 | 10,448 |
| – Output: Communication and facilitation activities | 0,411 | 0,802 | 1,250 | 2,463 | 1,718 | 2,079 | 2,239 |
| – Output: Research, audit and investigative activities | 0,411 | 0,802 | 1,250 | 2,463 | 1,718 | 2,079 | 2,239 |
| Subtotal for specific objective No 1 | 2,741 | 5,344 | 8,335 | 16,420 | 11,453 | 13,857 | 14,926 |
| SPECIFIC OBJECTIVE No 2: Improved legal certainty, ensuring the protection of fundamental rights, transparency and accountability | | | | | | | |
| – Output: Services and supports to assist implementation of the Regulation | 0,582 | 1,136 | 1,771 | 3,489 | 2,434 | 2,944 | 3,172 |
| – Output: Communication and facilitation activities | 0,103 | 0,200 | 0,313 | 0,616 | 0,429 | 0,520 | 0,560 |
| Subtotal for specific objective No 2 | 0,685 | 1,336 | 2,084 | 4,105 | 2,863 | 3,464 | 3,732 |
| SPECIFIC OBJECTIVE No 3: Reduction in the proliferation and effects of child sexual abuse through increased coordination of efforts | | | | | | | |
| – Output: Services and supports to public authorities, providers and experts | 6,887 | 2,999 | 4,255 | 14,141 | 5,567 | 6,561 | 6,873 |
| – Output: Communication and facilitation activities | 0,404 | 0,643 | 0,912 | 1,959 | 1,193 | 1,406 | 1,473 |
| – Output: Research and evaluation – victim assistance and prevention | 0,404 | 0,643 | 0,912 | 1,959 | 1,193 | 1,406 | 1,473 |
| Subtotal for specific objective No 3 | 7,696 | 4,284 | 6,078 | 18,058 | 7,953 | 9,373 | 9,819 |
| TOTAL | 11,122 | 10,964 | 16,497 | 38,583 | 22,269 | 26,694 | 28,477 |
3.2.3. Estimated impact on CSA body's human resources
Summary
☐ The proposal/initiative does not require the use of appropriations of an administrative nature
☒ The proposal/initiative requires the use of appropriations of an administrative nature, as explained below:
EUR million (to three decimal places)
| | 2025 | 2026 | 2027 | Total MFF 2021-27 | 2028 | 2029 | 2030 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Temporary agents (AD grades) | 1,166 | 3,229 | 5,547 | 9,942 | 7,956 | 9,919 | 11,037 |
| Temporary agents (AST grades) | 0,500 | 1,445 | 2,687 | 4,631 | 3,978 | 4,779 | 5,151 |
| Contract staff | 0,226 | 0,690 | 1,173 | 2,089 | 1,675 | 2,197 | 2,490 |
| Seconded National Experts | – | – | – | – | – | – | – |
| TOTAL | 1,892 | 5,363 | 9,407 | 16,662 | 13,610 | 16,895 | 18,677 |
Staff requirements (FTE):

| | 2025 | 2026 | 2027 | Total MFF 2021-27 | 2028 | 2029 | 2030 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Temporary agents (AD grades) | 14 | 24 | 40 | 60 | 50 | 60 | 60 |
| Temporary agents (AST grades) | 6 | 11 | 20 | 20 | 25 | 28 | 28 |
| Contract staff | 5 | 10 | 15 | 15 | 20 | 25 | 25 |
| Seconded National Experts | – | – | – | – | – | – | – |
| TOTAL | 25 | 45 | 75 | 75 | 95 | 113 | 113 |
For new recruitment, a calculation of 50% of the staff costs for the year of recruitment, and 50% of the
costs of additional staff recruited in the following years, has been applied.
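To make the proration concrete, the following minimal sketch applies that rule. It assumes the average annual staff cost of EUR 157,000 (2022 prices) and the 2% yearly inflation uplift from 2023 that section 4.1 of the Annex below states for AD staff in the parent DG; extending those same rates to the Centre’s AD temporary agents is an assumption, but it reproduces the AD cost line in the table above (14 new posts in 2025, rising to 24 in 2026):

```python
# Minimal sketch of the staff-cost proration rule described above.
# Assumptions (from section 4.1 of the Annex, applied here to the Centre's
# AD temporary agents): average cost EUR 157 000/year at 2022 prices,
# uplifted by 2% per year from 2023; new recruits count at 50% in year one.
AVG_COST_2022 = 157_000
INFLATION = 1.02

def rate(year):
    """Average annual cost (EUR) of one AD post in a given year."""
    return AVG_COST_2022 * INFLATION ** max(0, year - 2022)

def year_cost(continuing, new, year):
    """Staff cost in EUR million for the year, new recruits at half cost."""
    return (continuing + 0.5 * new) * rate(year) / 1e6

print(round(year_cost(0, 14, 2025), 3))   # 1.166 -> matches the 2025 AD line
print(round(year_cost(14, 10, 2026), 3))  # 3.229 -> matches the 2026 AD line
```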
3.2.4. Estimated requirements of human resources for the parent DG HOME
☐ The proposal/initiative does not require the use of human resources.
☒ The proposal/initiative requires the use of human resources, as explained below:
Estimate to be expressed in full amounts (or at most to one decimal place)

| | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 |
| --- | --- | --- | --- | --- | --- | --- |
| Establishment plan posts (officials and temporary staff): 20 01 02 01 and 20 01 02 02 (Headquarters and Commission’s Representation Offices) | 2 | 5 | 5 | 5 | 5 | 5 |
| Establishment plan posts: 20 01 02 03 (Delegations); 01 01 01 01 (Indirect research); 10 01 05 01 (Direct research) | – | – | – | – | – | – |
| External staff (in FTE) (64): 20 02 01 (AC, END, INT from the ‘global envelope’) | 1 | 4 | 4 | 4 | 4 | 4 |
| External staff: 20 02 03 (AC, AL, END, INT and JPD in the Delegations); other budget lines (65), at Headquarters (66) and in Delegations; 01 01 01 02 and 10 01 05 02 (AC, END, INT – indirect and direct research); other budget lines (specify) | – | – | – | – | – | – |
| TOTAL | 3 | 9 | 9 | 9 | 9 | 9 |
The human resources required will be met by staff from the DG who are already assigned to
management of the action and/or have been redeployed within the DG, together if necessary with any
additional allocation which may be granted to the managing DG under the annual allocation procedure
and in the light of budgetary constraints.
64 AC = Contract Staff; AL = Local Staff; END = Seconded National Expert; INT = agency staff; JPD = Junior Professionals in Delegations.
65 Sub-ceiling for external staff covered by operational appropriations (former ‘BA’ lines).
66 Mainly for the EU Cohesion Policy Funds, the European Agricultural Fund for Rural Development (EAFRD) and the European Maritime Fisheries and Aquaculture Fund (EMFAF).
Description of tasks to be carried out:
Officials and temporary staff: Commission staff drawn from DG HOME will work on 1) preparing the
ground for the setting up of the Centre, including the development of its work programme and activity
reporting; 2) preparing guidance on operational processes relating to the risk assessment, detection,
reporting and removal obligations under the legislation; 3) continuing to advance Centre-related
activities in the prevention and victim assistance areas; 4) providing administrative support for the
setting-up of the Centre; and 5) providing the secretariat to the Centre’s Management Board once
established.
External staff: External staff, as incrementally recruited into the EU Centre once it is established, will
assume certain responsibilities from Commission staff and will operationalise the Centre’s systems and
processes relating to detection, reporting and removal. Centre staff will also begin to assist with
building networks of expertise across the span of the Centre’s responsibilities. Details of the tasks of
the EU Centre are set out in Chapter 4, Section 2 of the proposed Regulation.
The method used to calculate the cost of FTE units is described in section 4 of the Annex below.
3.2.5. Compatibility with the current multiannual financial framework
The proposal/initiative is compatible with the current multiannual financial framework.
The proposal/initiative will entail reprogramming of the relevant heading in the
multiannual financial framework.
The proposal includes additional financial and human resources for the CSA Centre. The budgetary
impact of the additional financial resources for the CSA Centre will be offset through a compensatory
reduction from programmed spending under Heading 5.
The proposal/initiative requires application of the flexibility instrument or revision of the multiannual
financial framework (67).
3.2.6. Third-party contributions
The proposal/initiative does not provide for co-financing by third parties.
The proposal/initiative provides for the co-financing estimated below:
EUR million (to three decimal places)

| | Year N | Year N+1 | Year N+2 | Year N+3 | Enter as many years as necessary to show the duration of the impact (see point 1.6) | Total |
| --- | --- | --- | --- | --- | --- | --- |
| Specify the co-financing body | | | | | | |
| TOTAL appropriations co-financed | | | | | | |
67 See Articles 12 and 13 of Council Regulation (EU, Euratom) No 2093/2020 of 17 December 2020
laying down the multiannual financial framework for the years 2021 to 2027.
3.3. Estimated impact on revenue
☒ The proposal/initiative has no financial impact on revenue.
☐ The proposal/initiative has the following financial impact:
– on own resources
– on other revenue
– please indicate, if the revenue is assigned to expenditure lines
EUR million (to three decimal places)

| Budget revenue line | Appropriations available for the current financial year | Impact of the proposal/initiative (68): Year N | Year N+1 | Year N+2 | Year N+3 | Enter as many years as necessary to show the duration of the impact (see point 1.6) |
| --- | --- | --- | --- | --- | --- | --- |
| Article …………. | | | | | | |
For miscellaneous ‘assigned’ revenue, specify the budget expenditure line(s) affected.
[…]
Specify the method for calculating the impact on revenue.
[…]
68 As regards traditional own resources (customs duties, sugar levies), the amounts indicated must be net
amounts, i.e. gross amounts after deduction of 20 % for collection costs.
1. ANNEX TO THE LEGISLATIVE FINANCIAL STATEMENT
Name of the proposal/initiative:
Regulation of the European Parliament and of the Council laying down rules to prevent and
combat child sexual abuse
1. NUMBER and COST of HUMAN RESOURCES CONSIDERED NECESSARY
2. COST of OTHER ADMINISTRATIVE EXPENDITURE
3. TOTAL ADMINISTRATIVE COSTS
4. METHODS of CALCULATION USED for ESTIMATING COSTS
4.1. Human resources
4.2. Other administrative expenditure
This annex must accompany the legislative financial statement when the inter-services consultation is
launched.
The data tables are used as a source for the tables contained in the legislative financial statement.
They are strictly for internal use within the Commission.
1. Cost of human resources considered necessary
☐ The proposal/initiative does not require the use of human resources
☒ The proposal/initiative requires the use of human resources, as explained below:
EUR million (to three decimal places)

HEADING 7 of the multiannual financial framework (each cell shows FTE / appropriations)

| | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 | TOTAL |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 20 01 02 01 – Headquarters and Representation offices: AD (officials and temporary staff) | 2 / 0,157 | 5 / 0,560 | 5 / 0,817 | 5 / 0,833 | 5 / 0,850 | 5 / 0,867 | 5 / 4,084 |
| 20 01 02 01 – AST; 20 01 02 03 – Union Delegations (AD, AST) | – | – | – | – | – | – | – |
| External staff (69) – 20 02 01 and 20 02 02 (Headquarters and Representation offices): AC | 0 / 0,000 | 3 / 0,130 | 3 / 0,265 | 3 / 0,271 | 3 / 0,276 | 3 / 0,282 | 3 / 1,224 |
| External staff – 20 02 01 and 20 02 02: END | 1 / 0,044 | 1 / 0,090 | 1 / 0,092 | 1 / 0,093 | 1 / 0,095 | 1 / 0,097 | 1 / 0,511 |
| External staff – INT; 20 02 03 – Union Delegations (AC, AL, END, INT, JPD); other HR-related budget lines | – | – | – | – | – | – | – |
| Subtotal HR – HEADING 7 | 3 / 0,201 | 9 / 0,780 | 9 / 1,174 | 9 / 1,197 | 9 / 1,221 | 9 / 1,245 | 9 / 5,818 |

69 AC = Contract Staff; AL = Local Staff; END = Seconded National Expert; INT = agency staff; JPD = Junior Professionals in Delegations.
The human resources required will be met by staff from the DG who are already assigned to management of the action and/or have been redeployed within the DG, together if necessary with
any additional allocation which may be granted to the managing DG under the annual allocation procedure and in the light of budgetary constraints.
Outside HEADING 7 of the multiannual financial framework: no establishment plan posts (indirect or
direct research) and no external staff financed from operational appropriations (former ‘BA’ lines) or
from the research budget are required, at Headquarters or in Union delegations; all corresponding lines
for 2022-2027 are empty.

Total HR (all MFF headings), each cell showing FTE / appropriations in EUR million:

| | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 | TOTAL |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Total HR (all MFF headings) | 3 / 0,201 | 9 / 0,780 | 9 / 1,174 | 9 / 1,197 | 9 / 1,221 | 9 / 1,245 | 9 / 5,818 |
2. Cost of other administrative expenditure
☐ The proposal/initiative does not require the use of administrative appropriations
☒ The proposal/initiative requires the use of administrative appropriations, as explained below:
EUR million (to three decimal places)
HEADING 7 of the multiannual financial framework

| | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| At headquarters or within EU territory: 20 02 06 01 – Mission and representation expenses | 0,000 | 0,200 | 0,200 | 0,100 | 0,000 | 0,000 | 0,500 |
| 20 02 06 02 – Conference and meeting costs | 0,000 | 0,460 | 0,460 | 0,230 | 0,000 | 0,000 | 1,150 |
| 20 02 06 03 – Committees (76); 20 02 06 04 – Studies and consultations; 20 04 – IT expenditure, corporate (77); other non-HR budget lines | – | – | – | – | – | – | – |
| In Union delegations: 20 02 07 01 – Missions, conferences and representation expenses; 20 02 07 02 – Further training of staff; 20 03 05 – Infrastructure and logistics; other non-HR budget lines | – | – | – | – | – | – | – |
| Subtotal Other – HEADING 7 of the multiannual financial framework | 0,000 | 0,660 | 0,660 | 0,330 | 0,000 | 0,000 | 1,650 |

76 Specify the type of committee and the group to which it belongs.
77 The opinion of DG DIGIT – IT Investments Team is required (see the Guidelines on Financing of IT, C(2020)6126 final of 10.9.2020, page 7).
EUR million (to three decimal places)

Outside HEADING 7 of the multiannual financial framework: no expenditure on technical and administrative assistance (not including external staff) from operational appropriations (former ‘BA’ lines), no other management expenditure for research, and no policy or corporate IT expenditure on operational programmes is foreseen; all corresponding lines for 2022-2027 are empty.

| | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Total other administrative expenditure (all MFF headings) | 0,000 | 0,660 | 0,660 | 0,330 | 0,000 | 0,000 | 1,650 |
3. Total administrative costs (all Headings MFF)
EUR million (to three decimal places)
| Summary | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Heading 7 – Human resources | 0,201 | 0,780 | 1,174 | 1,197 | 1,221 | 1,245 | 5,818 |
| Heading 7 – Other administrative expenditure | 0,000 | 0,660 | 0,660 | 0,330 | 0,000 | 0,000 | 1,650 |
| Subtotal Heading 7 | 0,201 | 1,440 | 1,834 | 1,527 | 1,221 | 1,245 | 7,468 |
| Outside Heading 7 – Human resources | – | – | – | – | – | – | – |
| Outside Heading 7 – Other administrative expenditure | – | – | – | – | – | – | – |
| Subtotal other headings | – | – | – | – | – | – | – |
| TOTAL HEADING 7 and outside HEADING 7 | 0,201 | 1,440 | 1,834 | 1,527 | 1,221 | 1,245 | 7,468 |
The administrative appropriations required will be met by the appropriations which are already assigned to management of the action and/or which have been redeployed, together if necessary with any
additional allocation which may be granted to the managing DG under the annual allocation procedure and in the light of existing budgetary constraints.
4. Methods of calculation used to estimate costs
4.1 Human resources
This part sets out the method of calculation used to estimate the human resources considered necessary
(workload assumptions, including specific jobs (Sysper 2 work profiles), staff categories and the corresponding
average costs)
HEADING 7 of the multiannual financial framework
NB: The average costs for each category of staff at Headquarters are available on BudgWeb:
https://myintracomm.ec.europa.eu/budgweb/EN/pre/legalbasis/Pages/pre-040-020_preparation.aspx
Officials and temporary staff
The costs for the officials in the parent DG HOME have been calculated on the basis of the following average cost: EUR 157,000 per year (reference: Circular note of DG BUDGET to RUF, Ares(2021)7378761 of 30/11/2021), applying an inflation increase of 2% per year from 2023.
The LFS proposes to use additional human resources in the parent DG (DG HOME): an additional 9 FTEs on top of those already working in the Security in the Digital Age policy area on the wider EU CSA Strategy and in administrative support.
The human resources are split as follows (in FTE): 5 AD.
External staff
The costs for the Seconded National Expert and Contract Agents in the parent DG have been calculated on the basis of the following average costs: EUR 88,000 and EUR 85,000 per year respectively (reference: Circular note of DG BUDGET to RUF, Ares(2021)7378761 of 30/11/2021), applying an inflation increase of 2% per year from 2023.
The human resources are split as follows (in FTE): 1 SNE and 3 AC.
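Read together with the 50% first-year rule stated in section 3.2.3, these average costs reproduce the parent-DG lines of the table in section 1 of this Annex. A minimal sketch under the same assumptions (2% inflation from 2023; new recruits at half cost in their year of recruitment):

```python
# Sketch reproducing the parent-DG HR appropriations from the averages above:
# AD officials EUR 157 000, SNE EUR 88 000, AC EUR 85 000 (2022 prices),
# uplifted by 2% per year from 2023; recruits count at 50% in year one.
def appropriations(base_cost, headcount):
    """EUR million per year for one category; `headcount` maps year -> FTEs."""
    out, prev = {}, 0
    for year in sorted(headcount):
        rate = base_cost * 1.02 ** max(0, year - 2022)
        new = headcount[year] - prev              # recruits in this year
        out[year] = (prev + 0.5 * new) * rate / 1e6
        prev = headcount[year]
    return out

ad = appropriations(157_000, {2022: 2, 2023: 5, 2024: 5, 2025: 5, 2026: 5, 2027: 5})
print({y: round(v, 3) for y, v in ad.items()})
# {2022: 0.157, 2023: 0.56, 2024: 0.817, 2025: 0.833, 2026: 0.85, 2027: 0.867}
# -> matches the AD line above (0,157 / 0,560 / 0,817 / 0,833 / 0,850 / 0,867)
```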
Outside HEADING 7 of the multiannual financial framework
Only posts financed from the research budget
External staff
4.2 Other administrative expenditure
Give details of the method of calculation used for each budget line and in particular the underlying assumptions
(e.g. number of meetings per year, average costs, etc.)
HEADING 7 of the multiannual financial framework
These costs will cover: operational activities (e.g. technical meetings with stakeholders); support to expert networks (coordination activities, meetings); translation and interpretation; publishing and research dissemination; and communication (including campaigns).
Outside HEADING 7 of the multiannual financial framework