| Document register | Ministry of Justice and Digital Affairs |
| Reference | 7-1/9632 |
| Registered | 30.12.2025 |
| Synchronised | 02.01.2026 |
| Type | Incoming letter |
| Function | 7 Participation in EU decision-making processes and international cooperation |
| Series | 7-1 Documents related to the decision-making processes of EU institutions (draft legislation, working group materials, deadlines for transposition of the acquis) (Of archival value) |
| File | 7-1/2025 |
| Access restriction | Public |
| Addressee | Veriff OÜ |
| Method of arrival/dispatch | Veriff OÜ |
| Responsible person | Kristiina Krause (Ministry of Justice and Digital Affairs, Secretary General's area of responsibility, General Department, Communications and Foreign Cooperation Division) |
| Original | Open in new window |
Changes to GDPR Article 9
To: Ministry of Justice and Digital Affairs
Date: 29.12.2025
From: Veriff OÜ, Aleksander Tsuiman [email protected]
The proposed amendment
Article 9 is amended as follows:
(a) in paragraph 2, the following points are added:
‘(k) processing in the context of the development and operation of an AI system as
defined in Article 3, point (1), of Regulation (EU) 2024/1689 or an AI model, subject to
the conditions referred to in paragraph 5.
(l) processing of biometric data is necessary for the purpose of confirming the identity of
a data subject (verification), where the biometric data or the means needed for the
verification is under the sole control of the data subject.’
(b) the following paragraph is added:
‘5. For processing referred to in point (k) of paragraph 2, appropriate organisational and
technical measures shall be implemented to avoid the collection and otherwise
processing of special categories of personal data. Where, despite the implementation of
such measures, the controller identifies special categories of personal data in the
datasets used for training, testing or validation or in the AI system or AI model, the
controller shall remove such data. If removal of those data requires disproportionate
effort, the controller shall in any event effectively protect without undue delay such data
from being used to produce outputs, from being disclosed or otherwise made available
to third parties.’
Comments, observations, suggestions
Addition of Article 9 section 2 (k) and Article 9 section 5
- It is not clear what the intended purpose of the addition is; specifically:
- How does the change correlate with Article 10 section 5 of Regulation (EU)
2024/1689 (AI Act), which allows the processing of special categories of data for the
purposes of bias mitigation? It is unclear whether Article 10 section 5 of the AI Act
would then be considered applicable only to high-risk AI systems, with Article 9
section 2 (k) applicable to all non-high-risk systems, OR to all
systems, including high-risk systems.
- The proposed wording, i.e. “processing in the context of /–/ operation of an AI
system /–/ or an AI model”, makes it appear as if operating an AI system were a
processing activity in its own right. However, an AI system or model is a
technological means and should not receive different treatment from any other
technological means of processing data. Read together with Article 9 section 5, this
wording can even create a confusing obligation to remove data from processing.
- Furthermore, the wording is extremely specific, focusing on technology as it stands
today. This runs against the principle of technology neutrality, a core EU
regulatory principle under which laws should focus on outcomes and functions rather
than on specific technologies, so as to avoid stifling innovation, to future-proof
rules, and to let markets choose the best solutions.
- The proposed wording does not take into account the realities of operating biometric
technology. The legislator should understand that where AI models and AI systems
process biometrics, those models and systems need to be properly trained.
Training and operating such models presumes the existence of such data as part of the
processing operations. This means that such data is used to produce outputs, as otherwise
those systems would not be able to operate. Hence, the current wording of Article 9
section 5 is far removed from how a well-performing and well-“behaving” AI-based
biometric system works.
- Additionally, requiring the controller to remove such data unless doing so involves
disproportionate effort imposes an extremely high standard. This could significantly
impede the development and functioning of many AI models and systems.
- The phrase “protect data from being used to produce output” is ambiguous because it does
not define what “produce output” entails. Since disclosure and making data available are
addressed separately, “produce” can only be interpreted as preventing training data from
contributing to outcomes. However, this would be technically infeasible, as training data
is inherently part of generating results.
- Suggestions:
- Remove the notion of “operation” from article 9 section 2 (k)
- Reword Article 9 section 5 to state only: “For processing referred to in point (k) of
paragraph 2, appropriate organisational and technical measures shall be
implemented to avoid the unnecessary processing of special categories of personal
data.”
- Alternatively, reword Article 9 section 5 as follows: “For processing referred
to in point (k) of paragraph 2, appropriate organisational and technical measures
shall be implemented to avoid the collection and otherwise processing of special
categories of personal data. Where, despite the implementation of such measures,
the controller identifies special categories of personal data in the datasets used for
training, testing or validation or in the OPERATION OF THE AI system or AI model,
the controller shall remove such data AT THE EARLIEST POSSIBILITY. If removal
of those data requires disproportionate effort OR IT CONTRADICTS THE
PURPOSE OF THE AI SYSTEM OR AI MODEL, the controller shall in any event
effectively protect without undue delay such data from being used to produce
outputs, from being disclosed or otherwise made available to third parties
IMPLEMENT APPROPRIATE ORGANISATIONAL AND TECHNICAL MEASURES.”
Addition of Article 9 section 2 (l) and section (34) of the preamble
- It is not clear what the intended purpose of the addition is; specifically:
- What is the provision meant to enable?
- What is the provision meant to protect against (considering the exception from
the general prohibition set forth under Article 9)?
- Our approach is that the alleviation should enable two elements:
i) creating a framework for the use of biometrics as a safe and increasingly widespread
security feature that businesses can apply and enable in the course of their regular
business activities, while adhering to the already robust standard privacy
framework; and
ii) fraud prevention at scale while ensuring a good and seamless UX. Easing
the requirements for user verification alone does not recognise that many
fraud prevention processes depend on one-to-many matching (“authentication”), e.g.
enabling account login and password resetting using biometrics.
- In terms of protecting data subjects against the potential risks arising from the processing
of special categories of personal data, the legislator seems to place the emphasis on the “sole
control” of the data subject. “Sole control” would refer to very few technological solutions,
which are rarely used today – the technology mostly used retains the data under
the control of the service provider for verification or authentication. A further reason why
the “sole control” of the data subject does not fulfil the aim of protecting data subjects’
rights is that technical control over the process, considering the complexity of the
underlying technology, is not what protects the data subject against the actual risks.
The data subject receives more protection when: i) the underlying technology is secure
(e.g. encryption is applied, as already suggested in the proposal); and ii) the data subject is
aware of the processing and can choose whether or not to be subject to it. Therefore,
the emphasis should be put not on “sole control” as a technical means but rather on the fact
that confirming the identity of the data subject using special categories of
data (both verification and authentication) is done at the request of, or due to services
requested by, the data subject. The current wording of the preamble seems to
address a very narrow technological solution that is only one, or a very limited set, of the
privacy-preserving technologies available.
- Suggestions:
- Expand the biometrics use-cases to authentication in addition to verification.
Ideally, the expansion would also include fraud prevention and detection;
- The concept of “sole control” should be revised so that control rests with the
service provider rather than exclusively with the data subject. This approach would
ensure technical security of the data while linking processing activities to
operations that are directly or indirectly related to services requested by the data
subject. The primary objectives should remain data security and transparency for
the data subject.
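For illustration only, the one-to-one versus one-to-many distinction relied on above can be sketched in a few lines of code. This is a hypothetical example — the function names, the cosine-similarity matcher and the 0.9 threshold are assumptions made for exposition, not a description of Veriff's or any real system's implementation:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two illustrative biometric feature vectors ("templates")."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def verify(probe, enrolled_template, threshold=0.9):
    """One-to-one (verification): does the probe match THIS data subject's template?"""
    return cosine_similarity(probe, enrolled_template) >= threshold

def identify(probe, gallery, threshold=0.9):
    """One-to-many matching: which enrolled identity, if any, does the probe match?
    This is the mode that fraud-prevention workflows (account login,
    password reset) typically rely on."""
    best_id, best_score = None, threshold
    for identity, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Toy data: two enrolled identities and one incoming probe.
gallery = {"alice": [0.9, 0.1, 0.0], "bob": [0.0, 0.9, 0.1]}
probe = [0.88, 0.12, 0.01]
one_to_one = verify(probe, gallery["alice"])
one_to_many = identify(probe, gallery)
```

The relevance to the amendment is structural: `verify` needs only the single template that the data subject could in principle hold, whereas `identify` necessarily requires the service provider to hold a gallery of enrolled templates — a mode that cannot operate if the data must remain under the data subject's "sole control".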
We remain available for further discussions.
Aleksander Tsuiman
From: Karmen Turk | TRINITI <[email protected]>
Sent: Monday, December 29, 2025 4:21 PM
To: Helen Uustalu - JUSTDIGI <[email protected]>
Cc: Kristi Värk - JUSTDIGI <[email protected]>
Subject: FW: Invitation to give feedback on the EU digital omnibus, which brings Europe's data protection, cybersecurity and AI regulation under one roof
Hello Helen and Kristi
As the official deadline has long since passed, I am sending Veriff OÜ's comments directly to you at their request.
Happy end of the old year to you too!
Best regards!
Karmen Turk
Attorney-at-law / Partner
TRINITI Law Firm
Maakri Kvartal | Maakri 19/1, Tallinn
Meetings at new heights, Maakri 19/1, 25th floor
T: +372 685 0950 |
M: +372 513 3181
This e-mail and its attachments are confidential and may contain client secrets protected by law.
If you are not the intended recipient of this e-mail, please notify us and delete the e-mail together with its attachments.
Unauthorised disclosure, copying, distribution or any other use of the contents of this e-mail is prohibited.
From: Helen Uustalu - JUSTDIGI <[email protected]>
Sent: Thursday, December 4, 2025 2:26 PM
To: Liia Hänni ([email protected])
<[email protected]>;
Dan Bogdanov <[email protected]>;
Nele Siitam <[email protected]>;
Karmen Turk | TRINITI <[email protected]>;
[email protected];
[email protected];
Pille Lehis - AKI <[email protected]>;
Liiri Oja - OKK <[email protected]>;
[email protected];
[email protected];
[email protected];
Paloma Krõõt Tupay <[email protected]>
Cc: Kristi Värk - JUSTDIGI <[email protected]>
Subject: Invitation to give feedback on the EU digital omnibus, which brings Europe's data protection, cybersecurity and AI regulation under one roof
Dear members of the Data Protection Council
I am forwarding to you the call to submit your opinion on the EU omnibus, together with information about the engagement seminar.
As we agreed – we will discuss all your opinions and positions in January and incorporate them into Estonia's position accordingly.
The Ministry of Justice and Digital Affairs invites feedback on the EU digital omnibus, which brings European data, cybersecurity and AI regulation under one roof
The European Commission has launched an extensive package to update European legislation, referred to as the EU digital omnibus. It is a substantial set of proposals that consolidates and modernises European data protection law, data law, cybersecurity and the rules on the use of artificial intelligence. The Ministry of Justice and Digital Affairs has begun a substantive analysis of the package and is participating in shaping Estonia's positions, in order to ensure a clearer and more user-friendly European legal space for our people and businesses.
The aim is to bring the topics together into a single legal act and to make the subject clearer and simpler for people.
"The landscape of digital rules is fragmented and has so far been difficult for businesses and people to understand. The so-called digital omnibus brings several previously separate digital legal acts under one roof, makes the rules clearer and easier to apply, and creates an opportunity to reduce unnecessary administrative burden across the EU," said Minister of Justice and Digital Affairs Liisa Pakosta.
Pakosta further emphasised that this is a moment of decisive importance for the European legal space: "The digital omnibus is a very big change to the entire system of European digital rules. Its aim is to increase legal clarity, eliminate contradictions between legal acts and make the rules more understandable for people. If regulation is clear and workable, then users, businesses and the public sector all win."
Regulation must support innovation and protect people's fundamental rights
The digital omnibus proposals also concern clarifying the General Data Protection Regulation (GDPR), streamlining so-called cookie consent, and simplifying the AI Act. All the changes are still at the analysis stage, but their impact is extensive, affecting the use of data, rights in the digital environment and the possibilities for deploying artificial intelligence.
Discussions continue in Estonia and in Europe
The "ride" of the digital omnibus is only beginning, and whether and in what form its precise content is adopted will become clear in discussions both in the Member States and in the European institutions. Estonia is actively participating in the European-level discussion and in gathering and providing input, in order to secure future rules that support the functioning of the digital state, the competitiveness of the business environment and people's rights.
To this end, we are involving stakeholders in shaping Estonia's positions and are submitting the draft regulations for comment.
The Ministry of Justice and Digital Affairs awaits opinions on the digital omnibus proposals by 19.12.2025 at the latest, at the address [email protected].
The digital omnibus is available here and the AI part here. At the moment only the English-language version is available, but once the Estonian translation is published it will also be accessible via the links above.
We will also introduce the topic at an engagement event taking place on 09.12.2025 from 12.00 to 14.00 on MS Teams. Please indicate your wish to participate here: Digital omnibus engagement seminar. The Teams link will be displayed on the same form after registration.
Yours sincerely,
Helen Uustalu