WS12 • WORKSHOP SUMMARY

WORKSHOP • 26 OCTOBER 2017

LUNCHTIME WORKSHOP:

“Data protection and ethics. Does more ethics imply more duties for controllers?”


by Laura Drechsler, Brussels Privacy Hub, LSTS, VUB

SUMMARY


On 26 October 2017, the Brussels Privacy Hub, in collaboration with the Brussels Laboratory for Data Protection & Privacy Impact Assessments (d.pia.lab), organised an event on “Data protection and ethics. Does more ethics imply more duties for controllers?”, with a keynote speech by Professor J. Peter Burgess (Ecole Normale Supérieure, Københavns Universitet and Vrije Universiteit Brussel) and an intervention by Wojciech R. Wiewiórowski (Assistant European Data Protection Supervisor). Dariusz Kloza (Vrije Universiteit Brussel (VUB) – Brussels Laboratory for Data Protection & Privacy Impact Assessments (d.pia.lab) and Peace Research Institute Oslo (PRIO)) chaired the seminar.


The event aimed to provide an introduction to data protection concepts from an ethical perspective and to highlight some potential (ethical) implications of the entry into force of the General Data Protection Regulation (GDPR).


The event started with introductory remarks by Dariusz Kloza, who highlighted the parallels between the development of the European regulatory framework on data protection and the European integration project itself, as both initially started as market integration initiatives (i.e. they were, to a large extent, ‘economy-oriented’) and progressively moved towards the incorporation of fundamental values. As far as the European Union (EU) is concerned, democracy, human rights and the rule of law were gradually added to the initial focus on market integration, now institutionalised in the Charter of Fundamental Rights of the EU (CFR EU) and, for example, the Copenhagen criteria for potential new EU Member States. Similarly, the story of the regulation of the processing of personal data in the EU started as an internal market affair, when it was realised what practical implications diverging data protection rules have for businesses working cross-border in the EU. These beginnings can still be found in the dual aim of the Data Protection Directive 95/46/EC, in which the protection of data pertaining to an individual and the free flow thereof are both put forward as legislative objectives. With the Lisbon Treaty and the CFR EU, personal data protection is now a fundamental right, which for many people means that the main aim of data protection is the guarding of the individual, while the functioning of the internal market is rather a secondary aim. However, the importance of the European integration project and, within it, of the regulation of personal data protection necessitates looking beyond democracy, human rights and the rule of law: the reasoning behind these values is equally important. This is where ethics can provide an answer.


Professor J. Peter Burgess started his keynote speech by dissecting the words “duty” and “more ethics” in the title of the talk. For duty, he explained that it can essentially mean two things. On the one hand, a duty can be a quantifiable and verifiable task. Duties would then be tasks to be completed under compulsion: it is clear when a task is over, how to assess its outcome, and who is responsible for it. It is a chore or a closed plan. The compulsion to complete such a task is external; the duty is executed because there is an obligation that comes from outside the person. Such duties can sometimes contradict personal interest, which does not make them incoherent, since the obligation is external. Act and compulsion are the major elements of this type of duty. On the other hand, the second type of duty is an obligation that is internal. It is fulfilled not because of the will of an external authority, but because of an internal compulsion. Actor and act are inseparable: to understand the act, one needs to understand the motivation. To evaluate such duties, the internal motivation (morals) needs to be known. Unlike the first type, the second type of duty cannot be evaluated from outside, but only from the point of view of the person carrying it out (self-assessment). This type of duty is also endless, infinite and more richly moral; it does not stop with death but goes on, and it is inseparable from a moral conscience. The duty in the title of the talk needs to be understood as this second type of duty.


Professor Burgess also reflected on the meaning of “more or less ethics” in the title. More ethics could be understood as meaning more hours to be spent on ethical evaluations, more thinking on ethics or more reports on ethics. However, the idea of more ethics is only meaningful when one understands that being human is being ethical. Saying that one is more ethical should mean being more aligned with one’s humanity and attuned to the humanness of the whole situation and its values, feelings, beliefs, sociability and culture. That is what it should mean to be more ethical.


These explanations serve to situate data protection in the universe of ethics. Data protection, like duty, can be understood either as a checklist-type task or as a moral obligation, and each understanding has different cultural, social and political consequences. If data protection is understood only as the first type of duty, its value is lost. If it is just a checklist to be completed, it will soon be done by algorithms, and the human element, and therefore the ethical aspect, would be lost. Data protection cannot mean merely adapting to an external rule book. There needs to be an implicit connection to the data protection rules, and this link can be found in ethics.


In the second part of his keynote, Professor Burgess reflected on the essence of ethics. In general, ethical reflections are often issued in two types of statements, “it is” and “it should be” – the normative and the ethical. This differentiates between how the world is and how it should be, and it rests on the common assumption that the ethical is external to the world. People put a division between the world and ethics. Ethics is seen as a cookbook that needs to be followed: we need it to manage life, but we no longer reflect on its origins or on why we are doing it. Obeying the ethical code externally makes us blind to the real ethics beneath it, and to the reasons for an ethical decision.


Linking this back to the European integration project mentioned in the introduction, an example of this lack of real ethical thinking can be found in a central concept of European culture and European values – the person. Personhood is the pillar of the European way of life, of the rule of law, and of human rights. However, nowadays we take the person as given and assume it is sacrosanct. There is a need to think about what is being done: this is the core idea of ethics.


Similarly, the relationship between ethics and law needs to be reflected upon. Law, as a body of more or less concrete rules and as an institution, needs to be combined with ethics, which is deeply cultural and spiritual. There needs to be a balance between self-reflection (ethics) and the checklist (law). Law should include a systematic self-reflection on its purpose.


In the final part of his keynote, Professor Burgess elaborated on data protection and ethics. He pointed out that “data”, as a Latin word, translates roughly to “given”, and as such it can be found in all Western languages. Data are what is given. However, we should ask more about who is giving the data. Another expression for “given” is “granted”: data are taken for granted. Data are the lowest common denominator of information, below meaning and property. The GDPR has no solution for property. Data in its bare form cannot be owned; only when linked to a person does it become personal data. Information needs to be attached to a person to be an object of data protection. The tradition of property law and the philosophy of property stop at data: this is the one thing that is given and cannot be owned. Data protection is only meaningful in relation to a person. Personhood is needed to understand data.


As mentioned, the person is taken for granted, just like data. All fundamental documents of the European construction take the person more or less for granted. In Europe, the definition of the person was put forward by Kant: a person is defined by rationality, liberty, and autonomy. Ethics for Kant was reflecting about the person; it was a reflection on the conditions for freedom. Given that personhood is at the core of data protection and that persons are taken for granted, it is hard to see ethics in the digital age. In the digital age, the definition of the person is changing. As a consequence, there needs to be vigorous reflection on how that affects the way we think about ethics. There is a worrying tendency for persons to become “data zombies” who are governed by data and in whom machines and human beings converge. These are issues to deal with when rethinking the definition of personhood.


As a way forward, Professor Burgess urged us to stop taking the person for granted. People need to stop being Kantians and update their definitions. The result should not be a new fixed concept of personhood, but a reflection mechanism on how the concept of personhood has impact. Furthermore, there is a need to rethink ethics and law. This requires an examination of the concept of dignity as, after personhood, the core value of the European tradition. Dignity means that the person should never be a means, but an end in itself (Kant). We might need a new kind of vocabulary and fresh concepts to talk about this evolution of the person.


In the end, Professor Burgess answered the initial question “Does more ethics imply more duties for controllers?” with a resolute “no”. However, this “no” stands only if duty is meant in the ethical sense explained at the start. Understood ethically, the GDPR merely brings a continuation of the ethical tasks controllers are already performing. Compliance simply needs to be upgraded in argument and vocabulary, and to reflect upon what a person is and how this is being changed by technology. If that is done, the technological challenges of this new era can be tackled in an ethical way.


The keynote was concluded with some reflective remarks by Wojciech R. Wiewiórowski, who noted that one problem of ethics in data protection lies in the lack of clear outlines. Without even knowing what the right ethical questions are, regulators and consumers feel at a loss. In practice, most controllers insist on the practicalities of ethics. Of course, if ethics is just ticking boxes on a form, the essential reflective (ethical) part is lost. However, without guidance, controllers will simply think ethics is used deliberately to create an everlasting discussion and to make their lives harder. Including ethics in data protection law leads to uncertainty about what a controller needs to comply with: complying with the law will not be enough. Controllers are concerned about this and fear that the debate on ethics is never-ending. For them, more ethics does not mean more ethical behaviour but more endless discussions. Controllers also often criticise, perhaps rightly, that regulators are escaping the law by framing difficult decisions as ethical issues. In this way, regulators escape from their role as regulators, which includes making decisions; it is the task of regulators to check compliance with legal requirements, not to debate moral duties. To summarise, Wiewiórowski pleaded for more clarity on what the questions are, and for a proposal of a way in which answers could be found, even if the discussions are ethically never-ending.


The presentation was followed by a lively debate with the participants about the difficult balance between ethics as a purposefully endless debate and the need to give practical guidance to controllers, the possible institutionalisation of ethics, the enforcement of ethics, the conflict between locally differing ethics and harmonising laws, and the necessity of keeping ethics flexible.

