DPIA

Latest revision as of 10:08, 25 November 2021

General goal and planning

As part of the ethics management in DE4A, the project is required to provide a data protection impact assessment (DPIA) that satisfies the requirements of European data protection law (principally the GDPR).

This is not a formal deliverable, but delivery was a recommendation in our ethics review, and DE4A has committed to providing it.

A DPIA must contain:

(a) a systematic description of the envisaged processing operations and the purposes of the processing

(b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes

(c) an assessment of the risks to the rights and freedoms of data subjects

(d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with the GDPR

The wiki will be used principally to collect inputs on points c and d; points a and b can be based on existing deliverables.

Timeline

Given that the DPIA is not a formal deliverable, there is some flexibility. However, in order to be useful, the DPIA should be completed by mid-December at the latest. This will also allow its content to be re-used in deliverable D7.2 (Initial Report on legal and ethical recommendations and best practices), which must be submitted by the end of December.

For that reason, the following timeline will be followed:

Step | Description | Deadline
1 | Initial wiki set-up - collecting initial inputs from prior deliverables | 5/11/2021
2a | Draft inputs collected from pilot participants and Member States | 26/11/2021
2b | WP7 team collects data on points a and b of the DPIA (see introduction above) | 26/11/2021
3 | WP7 bundles, reviews and completes first draft DPIA, containing all four points above | 5/12/2021
4 | Draft is circulated to pilot participants and Member States | 6/12/2021
5 | Feedback and review | 13/12/2021
6 | Finalisation | 17/12/2021

Objectives and principles

All DE4A partners may provide inputs and suggestions. Examples of risks include:

- illegitimate access to data (loss of confidentiality);

- unwanted change (loss of integrity);

- disappearance (loss or corruption) of data (loss of availability);

- disproportionate collection of data;

- unlawful monitoring or crosslinking of data;

- inadequate transparency on data collection, use or access;

- disregard of data subject rights (loss of access or deletion rights);

- unlawful data sharing or re-use;

- disproportionate retention.

However, for the avoidance of doubt:

- a DPIA should identify risks for data subjects. Risks for systems, data, or public administrations can be included only if they are presented from the perspective of the data subject. E.g. rather than saying "the government database could be corrupted", consider saying "the citizen's data could get corrupted"; rather than saying "the servers could go offline", consider saying "the citizen may not be able to use the e-government service".

- when describing risks, the fact that other Member States lawfully use data differently than in the citizen's home country is not a risk. E.g. if data is retained for 5 years and cannot be shared with other administrations in Member State A, but Member State B allows retention for 20 years and sharing with designated other administrations, that is not considered a risk, since it is lawful and remains within the confines of the GDPR.

- the scope of the DPIA is DE4A (not the SDGR), and specifically our implementation of the technical system (not building blocks that we re-use, or national infrastructures). The objective is not to assess e-government in the EU in general (or any Member State-specific elements).

Current inputs on data protection risks

This section summarises current inputs on the risks to the rights and freedoms of data subjects (needed for points c and d above). The table should be concise - the goal is to collect inputs, not to draft the DPIA here.

Inputs from D2.1 and D2.2 have been taken into account, together with inputs from the wiki, from the Commission's draft DPIA on the draft implementing act, and from the DBA pilot area.

Description of the data protection risk | Likelihood (low, medium, high) | Severity (low, medium, high) | Applicable to all pilot areas, or pilot specific? | Have the risks been mitigated? Are any risks remaining?
Identity mapping of the user relies on an imperfect model - no guarantee that the evidence relates to the exact user | Low | High | All pilots | Some mitigation through best practices for identity mapping, but risks remain. The user intervention pattern also mitigates this.
Powers of representation/mandates have no mature system | High | Medium | Pilot specific - DBA (representation of companies), or representation of families | Mitigation through national infrastructure (national eID linked to national company registers), but this becomes difficult cross-border. Also, DBA does not pilot three-MS scenarios, so within DE4A that risk has been eliminated for our project (although it is still a general compliance risk that needs to be managed).
Preview with the evidence requesting administration, rather than per data source - requires all data providers to maintain perfect compliance | High | Low | All pilots | Not mitigated
SSO towards evidence providers - no reidentification allowed, reliance on eIDAS identification | High | Low | All pilots | Not mitigated (other than via identity mapping, which has its own risks)
Structured and unstructured evidences - requires original and canonical evidence, which could mismatch | Medium | Medium | All pilots | Both are exchanged, but this imposes a duty of diligence on the data requester
Translation and misinterpretation - semantics | High | Medium | All pilots | Semantic work is ongoing, but requires long-term sustained commitment.
Identity fraud by using outdated evidence - evidence is exchanged successfully, but changes afterwards; the receiving administration is not notified | High | Low | All pilots | Can be mitigated through the subscription and lookup patterns, but this is not explicitly supported under the SDGR.
Data minimisation not perfectly implemented - standardised rather than tailored data is exchanged | Medium | Low | All pilots | The problem is not worse than for traditional paper-based exchanges
Evidence duplication - multiple sources, which could contain conflicting data, leading to evidence shopping | Low | Low | All pilots | The problem is not worse than for traditional paper-based exchanges
Evidence is outdated or inaccurate | Medium | High | All pilots | Mitigated by the preview requirement, but this is controlled by users only. No overview of authoritative sources (or their guarantees) exists at this stage.
Data control is removed from the user after the exchange - receiving administrations can use evidence as required by their own laws | High | Low | All pilots | The risk is inherent to cross-border exchanges. Does not qualify as a risk as long as the receiving administration respects its own laws.
Where the DE and DO apply separate authentication processes (e.g. in the USI pattern), a second person could do the second authentication, thus enabling identity fraud | Medium | High | All pilots (but depending on the patterns used) | The risk is more for the data consumer than for the data subject, since in principle the attack requires collusion between two data subjects. While this leads to less reliable data processing, it is not a risk for the data subjects created by DE4A - it is a risk created by their unlawful behaviour (which may, however, impact third parties that can also be data subjects).
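
For partners who prefer to record their inputs in a structured way, the minimal sketch below (Python, purely illustrative - the class, field and function names are assumptions and are not part of any DE4A component) models one row of the register above, so that entries can be collected and filtered consistently before the DPIA itself is drafted.

from dataclasses import dataclass
from enum import Enum

class Level(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

@dataclass
class RiskEntry:
    """One row of the risk register: a risk to the rights and freedoms of data subjects."""
    description: str   # description of the data protection risk
    likelihood: Level  # likelihood of the risk (low, medium, high)
    severity: Level    # severity of the impact (low, medium, high)
    scope: str         # "All pilots" or the specific pilot area (e.g. "DBA")
    mitigation: str    # mitigation measures applied, and any remaining risk

# Example entry, copied from the first row of the table above.
identity_mapping = RiskEntry(
    description=("Identity mapping of the user relies on an imperfect model - "
                 "no guarantee that the evidence relates to the exact user"),
    likelihood=Level.LOW,
    severity=Level.HIGH,
    scope="All pilots",
    mitigation=("Some mitigation through best practices for identity mapping, but risks "
                "remain; the user intervention pattern also mitigates this."),
)

def high_severity(entries: list[RiskEntry]) -> list[RiskEntry]:
    # Filter helper: list the high-severity entries, e.g. when reviewing which
    # mitigation measures still need to be described for point d of the DPIA.
    return [e for e in entries if e.severity is Level.HIGH]

print([e.description for e in high_severity([identity_mapping])])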