The automation of privacy- and data protection impact assessments
Salzburg, 23 February 2018, 16:00-17:30
Organized by: Brussels Laboratory for Data Protection & Privacy Impact Assessments (d.pia.lab) at the Vrije Universiteit Brussel (VUB)
Chairs: Dariusz Kloza (VUB, BE), Marco Giacalone (VUB, BE), István Böröcz (VUB, BE)
Invitees: Georg Philip Krog (Signatu, NO); Michele Marrali (Studio Storti, IT); Robert Sindlinger (OneTrust, DE); Erwin Rigter (Privacy Company, NL).
It is well known that the new legal framework for personal data protection in the European Union, applicable from May 2018, brings to the fore a plethora of novel solutions aiming, inter alia, at better safeguarding the interests of individuals whenever their personal data are handled. One of these novelties is an obligation imposed on data controllers to conduct a data protection impact assessment (DPIA) for those processing operations that could present a “high risk” to the “rights and freedoms of natural persons”. Although these novelties still pose questions as to their practical application, they have already prompted the development of dedicated guidance materials, templates, tools, etc. – all aimed at enabling and facilitating their use, including the conduct of a DPIA.
Amongst these, in particular, we have recently observed a proliferation of automated tools for conducting a DPIA. Various software developers have already offered to diverse clients varied pieces of software that – upon being fed with some descriptions, etc. – would help analyse the intended processing operations and often recommend measures to lower the level of risk to individuals, to maximise benefits or even – if possible – to eliminate the risk entirely. These tools merit academic attention not only because of their novel character, uncharted potential and vulnerabilities, and doubts about their quality and effectiveness, but – more importantly – because their usage has ramifications for the protection of individual interests. There is thus a need to evaluate the extent to which these tools can contribute, if at all, to offering such protection.
To that end, we want to explore some of the existing software for conducting DPIAs and – more broadly – privacy impact assessments (PIAs). In preparation for this panel, we have invited a number of software developers to make their tools available to us in advance, for analysis and testing. Having done so, we come to this interactive panel with observations, questions and comments, which we will deliver and discuss right after the invited developers have presented their tools. A discussion with the audience will follow.
8 December 2017
Privacy, personal data and risk: what parameters for their future cohabitation?
This study day is organised by:
Chaire Géopolitique du Risque, Ecole normale supérieure (ENS)
Brussels Laboratory for Data Protection and Privacy Impact Assessment (d.pia.lab), Research Group on Law, Science, Technology and Society (LSTS), Vrije Universiteit Brussel (VUB)
Laboratoire Connaissance Organisation et Systèmes TECHniques (COSTECH), Département Technologies et Sciences de l’Homme, Université de Technologie de Compiègne (UTC)
With the General Data Protection Regulation (GDPR) becoming applicable on 25 May 2018, Europe will write a new page in the rapid evolution of the digital realm. This is a double-edged movement: on the one hand, personal data are increasingly put at the service of a calculated predictability – supposed or real – of an unpredictable future whose uncertainty asserts itself ever more strongly; on the other hand, new forms of risk management are developing in the service of companies concerned about the legal and commercial risks generated by the GDPR.
The collection, storage and processing of data are growing at a formidable pace and in proportions that seem impossible to curb or to govern. The digital realm both threatens us and offers us salvation. The new uncertainties generated by the massive processing of personal data are addressed through the prism of “risk”. But what exactly does this term “risk” cover? How is it constructed? Would a unified approach to the question be conceivable?
We propose to examine the new articulations of risk and uncertainty in the new discourse on data protection. The study day aims to bring together researchers in law, philosophy, political science, and the sociology of science and technology (STS).
17 November 2017
Policy debate, organized by d.pia.lab and Nemzeti Adatvédelmi és Információszabadság Hatóság
Data Protection Impact Assessments in the European Union: Complementing the New Legal Framework
Chair: Vagelis Papakonstantinou, VUB-LSTS (BE)
Speakers: Dariusz Kloza, VUB-LSTS (BE), PRIO (NO), István Böröcz, VUB-LSTS (BE)
Discussants: David Wright, TRI (UK), Bálint Halász, Knight Bird & Bird Iroda (HU)
Policy Brief No. 1/2017 of the d.pia.lab provides recommendations for the European Union (EU) to complement the requirement for a data protection impact assessment (DPIA), as set forth in the General Data Protection Regulation (GDPR), with a view to achieving more robust protection of personal data. In April 2016 the EU concluded the core part of the reform of its legal framework for personal data protection. The Union is currently preparing implementing measures and guidelines to give full effect to the new legal provisions before they become applicable in May 2018. This reform introduces, among other ‘novelties’, a legal requirement to conduct a DPIA. However, this requirement has a few weak points. In order to remedy them by informing the ongoing policy-making process, the policy brief attempts to draft a best practice for a generic type of impact assessment.
26 October 2017
Data protection and ethics. Does more ethics imply more duties for controllers?
Chair: Dariusz Kloza, VUB-LSTS (BE), PRIO (NO)
Speaker: J. Peter Burgess, ENS (FR), KU (DK), VUB-LSTS (BE)
Intervention: Wojciech R Wiewiórowski, Assistant European Data Protection Supervisor (EDPS)
The European project is many things. Among these, it is an ethical project: the enactment of a certain number of core values in a changing material reality. These values are everywhere evoked in the principal texts of the European Union, and unambiguously supported by the Charter of Fundamental Rights and other key documents. Our digital age imposes technological novelty on European society at breakneck speed, challenging the meaning and relevance of these traditional principles and values, and presenting us with new applications and new invitations. Although this has always been the work of ethics, the velocity of technological innovation today creates the need for a more continuous and far-reaching ethical reflection on the application of the European core values to the technological transformations of our time. In short, if we understand by ‘digital ethics’ a systematic reflection on the core values of the European project, interpreted and applied to a new generation of challenges to data protection and privacy, then our duties, while not more numerous, are certainly more pressing.
25-27 January 2017
CPDP 2017 - Computers, Privacy & Data Protection: The Age of Intelligent Machines
Roundtable: Brace for Impact Assessments - How to be prepared? (organized by d.pia.lab)
Chair: Paul Quinn, VUB-LSTS (BE)
Roundtable: Massimo Attoresi, EDPS (EU), Roger Clarke, Xamax Consultancy (AU), Dariusz Kloza, VUB-LSTS (BE) & PRIO (NO), Eugenio Mantovani, VUB-LSTS (BE), Claudia Quelle, Tilburg University (NL), Paolo Sinibaldi, European Investment Fund (LU), Niels van Dijk, VUB-LSTS (BE)
With the adoption of the European Union’s General Data Protection Regulation and the Criminal Justice Data Protection Directive in April 2016, and with the ongoing modernisation of the Council of Europe’s Convention 108, the well-established concept of ‘impact assessment’ has been adapted to the needs and reality of European data protection law. Some have welcomed this novelty with enthusiasm, others with reserve. In any case, it has sparked continuous debates on its rationale, efficiency and practical application, further urged by the imminent applicability of the new laws. Therefore, this roundtable will tackle four pertinent issues:
25-27 January 2017
CPDP 2017 - Computers, Privacy & Data Protection: The Age of Intelligent Machines
EDPL Young Scholar Award (organised by European Data Protection Law Review (EDPL))
Chair: Bart van der Sloot, TILT (NL)
Panel: István Böröcz, VUB (BE), Worku Gedefa Urgessa, University of Oslo (NO), Raphaël Gellert, LSTS-VUB (BE), Franziska Boehm, Karlsruhe Institute of Technology (DE), Maja Brkan, Maastricht University (NL)
Award presented by: Serge Gutwirth, LSTS-VUB (BE)
The Young Scholars Award, hosted by the European Data Protection Law Review, is given annually to outstanding emerging researchers in the field of privacy and data protection law. During this panel session, the three best young academics will present their research and discuss it with the competition jury members and the audience. Serge Gutwirth (LSTS, VUB) will present the award to the winning young scholar during a ceremony at the end of the panel.
The topics of the finalists:
Risk to the right to the protection of personal data - an analysis through the lenses of Hermagoras /István Böröcz/
One of the novelties of the General Data Protection Regulation will be the application of the risk-based approach in European data protection law on a larger scale. Although the Regulation uses the term ‘risk’ in numerous provisions, it does not answer the question ‘What is a risk to a right and how should it be assessed?’ Although Article 35 (Data Protection Impact Assessment, DPIA) provides a tool to assess these risks, keeping the GDPR suitable for assessing new technologies requires that the conduct of a DPIA be based on a solid and clear understanding of the provisions. The applicability and suitability of a risk assessment process is yet to be established where the risk relates to a fundamental right. A unified perception of risk to a right is necessary, as it is the core element of the risk-based approach; moreover, a varying perception of risk to a right would undermine the GDPR’s harmonisation endeavours. This contribution elaborates on the attributes of risk to a right and advocates a unified understanding of risk to a right and of risk to the right to the protection of personal data.
We Have Always Managed Risks in Data Protection Law: Understanding the Similarities and Differences Between the Rights-Based and the Risk-Based Approaches to Data Protection /Raphaël Gellert/
Recent years have seen the emergence of a so-called risk-based approach to data protection. It is meant to address the purported shortcomings of the traditional EU data protection principles (such as data minimisation, purpose limitation, etc) with regard to evolving data processing practices (eg, profiling, big data). It does so by replacing these principles with risk analysis tools, the goal of which is to assess the benefits and harms of each processing operation and on this basis to manage the risk, that is, to decide whether or not to undertake the processing at stake. This risk-based approach has been hailed as diametrically opposed to the legal, rights-based nature of data protection. This contribution investigates this opposition and finds that the two approaches (risk-based and rights-based) are actually much more similar than is currently acknowledged. Both aim at managing the risks stemming from data processing operations. This is epitomised by the fact that they have the exact same modus operandi, namely two balancing tests, with risk reduction measures (known as safeguards in the legal context) associated with the second balancing test. Yet, if both approaches manage data processing risks, they nonetheless do so differently. Whereas the risk-based approach manages risks in a contextual, tailor-made manner, the rights-based approach manages risks once and for all, from the outset. The contribution concludes with a discussion and possible policy recommendations highlighting the benefits and drawbacks of each approach.
25-26 November 2016
Brno, Czech Republic
Cyberspace 2016 - 14th International Conference
Data Protection & Privacy Impact Assessments (special track organized by d.pia.lab)
Impact assessment in the European Union’s new data protection law /Dariusz Kloza, István Böröcz/
The reform process of the European Union’s legal framework for personal data protection culminated on 27 April 2016 with the enactment of the General Data Protection Regulation and – less publicised – the Police and Criminal Justice Data Protection Directive. Both instruments bring to the fore multiple uncharted novelties, one of which is the ‘data protection impact assessment’ (‘DPIA’). Once the new legal framework becomes applicable (25 May 2018), data controllers will be obliged to conduct such an assessment for personal data processing operations that are “likely to result in a high risk to the rights and freedoms of natural persons” (cf. Art 35 of the Regulation and Art 27 of the Directive). All these novelties have sparked continuous debates on their effectiveness, efficiency and practical application, further urged by the imminent applicability of the new laws.
Therefore we could not help but take part in this debate and reflect on the way the well-established concept of impact assessment has been adapted to the needs and reality of European data protection law. Having briefly overviewed the history of impact assessments in the areas of environment, technology and privacy, we critically assess the two legal requirements for a ‘DPIA’ set forth by the new Regulation and the Directive. We point out their positive, acceptable and negative elements. We conclude that these ‘DPIA’ requirements – predominantly due to their limited scope – have rather failed to live up to the expectations vested therein. Yet this failure could be remedied by a complementary policy on impact assessment that would genuinely safeguard both individual and collective interests related to privacy. We therefore conclude with a few modest suggestions as to the contents of such a policy.
Addressing issues with DPIA methodologies: What can we learn from law? /Raphaël Gellert, Niels van Dijk/
The introduction of data protection impact assessments (DPIAs) is one of the novelties of the General Data Protection Regulation. They present new elements and challenges to data protection practice. At their core, DPIAs seem to consist of risk management methodologies (imported from organisational and business spheres), aiming to assess and manage “risks to the rights and freedoms of data subjects” resulting from data processing operations.
The idea of assessing risks to rights is, however, not as straightforward as it might seem. Beyond the fact that risks and rights are very different practices (one probabilistic and anticipatory, the other drawing on legal knowledge and operating ex post), this contribution focuses on challenges concerning DPIA methodology. Risk management methodologies have faced serious criticism in other assessment fields, such as environmental and health law. In particular, their pretence of objectivity has been at the centre of discussions, due to their tendency to reduce “the full range of uncertainties to the more comforting illusion of controllable, probabilistic but deterministic processes”. Such reduction already presupposes a number of epistemic and theoretical commitments, which often mask subjective choices. In other words, whereas these methodologies present themselves as objective, the notion of risk has an inherent subjective dimension.
These findings have serious implications for the type and quality of data protection to be expected from DPIAs. Depending on the methodological choices, DPIAs could amount to little more than a managerialisation of data protection, telling us very little about what “risks to the rights and freedoms of data subjects” are, and could ultimately even undermine the data protection legal framework. Alternatively, a robust management methodology might have the potential to improve the protection of personal data, not least because of its anticipatory nature.
In this paper we argue that one way to ensure that DPIAs amount to more than “the new risk-based box-ticking” is to integrate lessons from legal practices with experience in articulating relations between risks and fundamental rights. This requires an analysis of case law concerning privacy and data protection on the one hand, and impact assessments on the other. We will extract two kinds of lessons. The procedural lessons will relate to how to organise the process of assessment, the status of risk as contestable evidence, the participation of those affected by the technology, and the proportional balancing of risk-based and rights-based knowledge. The substantive legal lessons will relate to the concepts of risk, harm and probability at the core of DPIAs. We will explore whether the incorporation of such foundational legal lessons has the potential to transform the DPIA into a tool that can anticipate data protection issues in a legal fashion, which we call a court of upstream adjudication.
The Potential for Impact Assessments in Projects Related to eHealth and mHealth /Paul Quinn/
The use of impact assessments has gradually become more common in areas of technological innovation or novel practices where questions of privacy arise. This trend will likely take further root given the requirement set forth by Article 35 of the General Data Protection Regulation (GDPR). This article requires data controllers to conduct an impact assessment in a number of instances, including where the rights and freedoms of data subjects are at high risk. As this presentation will discuss, the nature of such an impact assessment and the situations in which it is required make it ideal for use in projects related to eHealth and mHealth. Such projects frequently make use of large amounts of sensitive data and raise risks in terms of a number of important rights, including but not limited to rights linked to data protection. The broad nature of the impact assessment invoked in Article 35 GDPR makes it suitable for considering not only questions linked to data protection and privacy but also issues related to stigmatisation, discrimination and other ethical concerns often linked to health care projects. This presentation will discuss the potential use of impact assessments in such instances and the benefits they can bring.