Guidelines on DPIA, new recommendations from the Italian Institute for Privacy

Rome, 12 September 2017 – A task force of experts from the Italian Institute for Privacy (IIP, www.istitutoprivacy.it) has analysed the new ISO/IEC 29134:2017 standard on privacy impact assessment and, more specifically with regard to EU personal data protection, the proposed Guidelines on DPIA released by the Article 29 Working Party of European Data Protection Supervisors on 4 April 2017.

Reading these documents, the IIP’s experts took several aspects into account and discussed them; some major doubts arose during their work, particularly concerning a possibly wrong or, at least, incomplete interpretation of the DPIA as a juridical methodology, one which should never be understood as a mere data security measure.

Luca Bolognini, President of the IIP, stated: “A DPIA, according to Article 35 of the General Data Protection Regulation (EU) 2016/679, should focus on the high risks that a personal data processing operation is likely to pose to the rights and freedoms of natural persons: this means that the preliminary analysis in a DPIA should not consist (only) of a personal information risk analysis; it should also be seen as something more legal and “humanistic”, and less technological, than the other types of analysis to be carried out under Articles 32 and 33 GDPR in order to prevent data breaches.”

The apparent misinterpretation of the DPIA as a mere data security tool can be partly inferred from both the new ISO standard and the WP29’s draft Guidelines, particularly their Annex 2 (which, moreover, contains an evident error, referring rights and freedoms to “data subjects” rather than to natural persons in general); in Chapter III.C of the same Guidelines, by contrast, the WP29 appears to have correctly adopted the legal and “humanistic” point of view.

The above-mentioned documents and the methods/templates they suggest seem unbalanced: they consider only the “pathological” condition of a data processing operation (which may result from the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed) as implying risks for the rights and freedoms of natural persons. These models, instead, seem not focused enough on the possible intrinsic risks for rights and freedoms implied by a data processing operation in itself, physiologically, without any accident or unlawful action; following the clear ratio legis of Article 35 GDPR, these intrinsic risks are even more relevant.

Bolognini underlined: “It would be a lost opportunity to finally adopt official Guidelines that do not take into account the fact that a personal data processing activity, even if perfectly secured, could still be unsafe for human beings and could impact negatively on the rights and freedoms of natural persons: in principle, it is clear that this perspective (focusing on the potential physiological danger deriving from a data processing operation) was precisely one of the main reasons that led the EU legislator to approve the current formulation of Article 35 GDPR.”

The Italian Institute for Privacy strongly recommends that EU DPAs correct and clarify the relevant parts of the WP29’s Guidelines on DPIA, and suggests envisaging two distinct phases of preliminary risk assessment in a DPIA:

1. a first phase, dedicated to analysing the severity and likelihood of the risks threatening the personal data and the whole processing activity/supporting assets; this first phase is necessary in order to assess the possible “IT pathological/extrinsic risks” implied by data breaches;

2. a second phase, analysing the severity and likelihood of the risks of impacts on the rights and freedoms of natural persons, as implied:
2.1. by the possible data breaches taken into account in the first phase,
but also – an element that is missing in the current draft –
2.2. as a possible/likely “physiological” and intrinsic consequence of that specific data processing, considered in itself, even in the absence of breaches.

Good mitigation measures, in line with the spirit of the GDPR, can be identified only by taking into account both the “pathological/extrinsic” and the “physiological/intrinsic” risks that a personal data processing operation poses to the rights and freedoms of natural persons.
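For readers who prefer a concrete illustration, the sketch below models the two-phase preliminary assessment described above as a simple data structure. It is only an illustrative assumption, not part of the IIP’s recommendations or of any official methodology: the class names, the four-level scale and the severity-times-likelihood scoring are hypothetical choices made for the example.

```python
from dataclasses import dataclass, field
from enum import IntEnum
from typing import List


class Level(IntEnum):
    """Illustrative four-point scale for severity and likelihood (assumption)."""
    NEGLIGIBLE = 1
    LIMITED = 2
    SIGNIFICANT = 3
    MAXIMUM = 4


@dataclass
class Risk:
    description: str
    severity: Level
    likelihood: Level

    @property
    def score(self) -> int:
        # Simple severity x likelihood product, used here for illustration only.
        return self.severity * self.likelihood


@dataclass
class PreliminaryAssessment:
    """Two-phase preliminary risk assessment for a DPIA, as suggested above."""
    # Phase 1: "pathological/extrinsic" risks threatening the personal data,
    # the processing activity and its supporting assets (data breaches).
    extrinsic_risks: List[Risk] = field(default_factory=list)
    # Phase 2.1: impacts on rights and freedoms deriving from those breaches.
    breach_impacts: List[Risk] = field(default_factory=list)
    # Phase 2.2: "physiological"/intrinsic impacts of the processing in itself,
    # even in the absence of any breach (the element missing from the draft).
    intrinsic_impacts: List[Risk] = field(default_factory=list)

    def high_risks(self, threshold: int = 9) -> List[Risk]:
        """Return every risk, from either phase, at or above an assumed threshold."""
        all_risks = self.extrinsic_risks + self.breach_impacts + self.intrinsic_impacts
        return [r for r in all_risks if r.score >= threshold]


# Usage example (hypothetical processing operation):
assessment = PreliminaryAssessment(
    extrinsic_risks=[Risk("Unauthorised access to the database", Level.SIGNIFICANT, Level.LIMITED)],
    breach_impacts=[Risk("Identity theft following a breach", Level.MAXIMUM, Level.LIMITED)],
    intrinsic_impacts=[Risk("Discriminatory profiling by the processing itself", Level.SIGNIFICANT, Level.SIGNIFICANT)],
)
for risk in assessment.high_risks():
    print(f"High risk to address with mitigation measures: {risk.description}")
```

The point of the sketch is structural rather than computational: mitigation planning draws on all three lists, so the intrinsic impacts of the processing itself are weighed alongside the breach-related ones, rather than being left out as in the current draft.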