Brussels Privacy Hub has moved to a new website as of 18 May 2022. The new website is available at www.brusselsprivacyhub.com. This version of the website will be stored for archiving purposes. Please see the new website for the latest updates.
WORKSHOP • 6 JUNE 2018
by Laura Drechsler, Brussels Privacy Hub, LSTS, VUB
On 6 June 2018, the Brussels Privacy Hub organised the third edition of its Brussels Privacy Hub Law Enforcement Data Access Series, titled “Encrypted Data: Challenges and Opportunities”, chaired by Catherine Van de Heyning (FRC, VUB). The event was split into two rounds of discussion: in the first, Graham Willmott (Head of Unit DG Home, European Commission) and Owen Bennett (Mozilla) elaborated on “Encryption and back doors”; in the second, Joachim Meese (University of Antwerp) and Jan Kerkhofs (Belgian Federal Prosecutor’s Office, Counter Terrorism Unit & Cybercrime Unit) debated “Encryption and self-incrimination”.
The event was opened with short introductory remarks by Van de Heyning, who illustrated why encryption was such a contentious issue by pointing to recent stories of the FBI inflating the number of phones it could not access because of encryption, and to the disagreement among European courts on whether court orders can be used to compel decryption. She highlighted that the debate was further complicated by the fact that both sides had strong arguments for or against encryption, as represented by the different speakers on the panel.
The first debate, on “Encryption and back doors”, started with Willmott explaining the European Commission’s stance on encryption. He noted that the Commission has been looking into the different issues surrounding encryption since it was mandated to do so by the Council in 2016. Since then, the Commission has collected the opinions and needs of different EU bodies, such as the Fundamental Rights Agency (FRA), the European Union Agency for Network and Information Security (ENISA), EUROJUST and EUROPOL, as well as national stakeholders, and has published specific recommendations. For him it was clear from this process that encryption is a necessary tool for securing fundamental rights such as privacy and secure communication. However, he also saw a need to address the problems encryption poses for law enforcement agencies, namely that they cannot access information necessary for conducting investigations.
For these reasons, Willmott explained, the Commission proposed different types of measures. A first set of measures concerned enhancing the technical capabilities of existing agencies and law enforcement bodies, e.g. providing equipment for guessing the password of a device. To improve the exchange of skills, EUROPOL would now provide a forum for experts to exchange best practices and experiences. Additionally, the Commission recommended training for police forces and provided funding for it. As a next step, the Commission was attempting to set up an observatory that would monitor technological developments and assess how they affect law enforcement in practice. Finally, the Commission would set up a dialogue with civil society to build trust and exchange best practices across sectors, e.g. communicating private-sector solutions to police forces. Eventually this dialogue should also include academics. However, Willmott made clear that the Commission was not currently planning any legislation on encryption, though this could change in the future.
The debate continued with Bennett presenting the private-sector viewpoint on “Encryption and back doors”. He argued that for Mozilla, encryption was a crucial part of providing internet security both for the company itself and for its users, which was part of its founding mission. Since encryption was such an essential part of security on the otherwise unsafe internet, Bennett strongly opposed the plea by law enforcement authorities to build back doors into operating systems so that agencies could still access the data they need for their investigations, as this would undermine security overall. As Bennett pointed out, it was impossible to have a back door that could only be used by law enforcement: once one existed, it could also be exploited by criminals operating online. He similarly criticised another proposed solution, namely that providers of encrypted communications should generate an additional key that could be given to law enforcement to decrypt a conversation (in normal encryption, only the end user holds such a key). For Bennett, this would open similar vulnerabilities to the first approach: in addition to the engineering problems involved in creating such a key, it would be equally difficult to prevent the wrong hands from gaining access to it.
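The key-escrow idea Bennett criticised can be illustrated with a toy symmetric cipher (a minimal sketch using only Python's standard library, not a real encryption scheme): an "additional key" for law enforcement is simply another copy of the secret, and any party holding that copy decrypts as easily as the intended recipient.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad-style cipher: XOR is its own inverse,
    # so the same key both encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
user_key = secrets.token_bytes(len(message))  # random key held by the end user
ciphertext = xor_cipher(message, user_key)

# End-to-end model: only the user's key recovers the plaintext.
assert xor_cipher(ciphertext, user_key) == message

# Key-escrow model: the escrowed key is a copy of the same secret,
# so whoever obtains it (lawfully or not) decrypts with equal ease.
escrow_copy = user_key
assert xor_cipher(ciphertext, escrow_copy) == message
```

The sketch makes Bennett's point concrete: the cipher itself cannot distinguish a law enforcement agency from an attacker; both are just holders of the key material.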
Bennett argued that the debate needed to be reframed: it is currently portrayed as a conflict between privacy and security, while in reality the two are intertwined, since weak encryption would also weaken data security and data integrity and hence privacy protection. He emphasised again that it was impossible to create back doors that would work only for law enforcement and not for criminals. For him, a better solution would be closer cooperation between technology designers and law enforcement, so that there is a legitimate way for agencies to receive investigation data; this is why Mozilla is heavily involved in the e-evidence proposal put forward by the Commission. In the end, while encrypted content data might be out of reach for law enforcement, agencies would still be able to access a range of metadata that could provide equally useful information. For Bennett, it was clear that law enforcement had more tools at its disposal today than at any moment in the past.
In the following debate with the other panellists and the audience, Kerkhofs reacted to Bennett’s remarks by highlighting that law enforcement agencies in fact had fewer and fewer tools at their disposal, as the data retention rules had been invalidated by the Court of Justice of the European Union (CJEU) in Digital Rights Ireland and encryption had advanced considerably. Moreover, as he pointed out, additional problems stem from the fact that many service providers are located in the United States (US), where there are hardly any tools to access data. For Kerkhofs, it therefore remained a conflict between security and privacy: where new encryption methods leave no possibility for law enforcement to intercept data, agencies increasingly have to resort to more privacy-invasive methods, monitoring whole groups of people surrounding a potential suspect in the hope that someone will receive a decrypted message. Encryption, for Kerkhofs, took away law enforcement’s opportunity to choose a precise, least privacy-invasive method of investigation. While he agreed with some of the arguments against back doors, he proposed that encryption could instead be offered at different levels, with ordinary citizens only accessing a lower level that could still be decrypted by law enforcement. Alternatively, providers could work more closely with law enforcement and offer the possibility of installing malware on targeted phones in order to intercept communications, e.g. those of a suspected terrorist.
To transition to the second debate, on “Encryption and self-incrimination”, Van de Heyning raised the issue of production orders, whereby suspects are forced by court order to provide the password to decrypt their data. So far, case law on this has been mixed, both in the US and in Europe. Within Europe, the prohibition of self-incrimination could limit such production orders. Van de Heyning illustrated the diverging opinions by pointing out that Belgium has so far seen decisions both for and against the use of production orders, so that the Belgian Supreme Court must now make a final decision.
In his intervention, Meese focused on the question of whether forcing a potential suspect to unlock his or her data for law enforcement conflicts with the prohibition of self-incrimination inherent in both the European Convention on Human Rights (ECHR) and the Charter of Fundamental Rights of the European Union (CFREU). He explained that this issue arises especially in countries, such as the United Kingdom (UK) or Belgium, where legislation provides for the possibility of forcing a suspect to unlock his or her data by providing, for example, the password of a phone. Such legislation usually foresees sanctions for non-compliance; in the UK, these can amount to up to five years of imprisonment. For Meese, such prison sentences could violate the right not to self-incriminate.
So far there has been no case on this issue before the higher courts in Europe. The only comparable case, for Meese, arose at the European Court of Human Rights (ECtHR) in O’Halloran and Francis v UK, where the Court was asked to rule on a UK law that allowed the authorities to ask owners of vehicles caught on speed cameras whether or not they were driving the car at that moment. If the owners refused to answer, the law allowed them to be convicted of speeding for failing to cooperate. The ECtHR analysed whether such a conviction for non-cooperation interfered with the right not to self-incriminate and found an interference that was, however, justified, as the fine for speeding was small, there was no alternative method of establishing the truth, and traffic safety was a worthy public interest. As Meese highlighted, a case involving a production order would be somewhat different, as the penalty (imprisonment) is much higher and the interference with the right not to self-incriminate therefore more severe. For him, considering this case and the circumstances surrounding production orders, it seems likely that they would not be permissible under the prohibition of self-incrimination upheld by European human rights law.
Finally, Meese pointed out that the issue might concern not so much the right not to self-incriminate as the right to remain silent, since in the end it concerns the possibility for a suspect not to reply to a question (the request for the password or code) posed to him or her. Fair trial rules would not allow a suspect to be forced to answer a question under threat of imprisonment; the issue is therefore clearer when viewed through the right to remain silent.
A different viewpoint on self-incrimination was presented by Kerkhofs. He argued that the debate surrounding production orders was too theoretical: in practice, asking suspects for their passwords also risks them providing the so-called kill code, a code that erases everything on the device. Production orders would therefore not be such a useful tool, as it would always be necessary to also be able to access the data without the suspect’s cooperation. Regarding the question of whether production orders pose an unjustified interference with the prohibition against self-incrimination, Kerkhofs argued that a password or log-in is not proof in itself, so a suspect would not be giving evidence against him- or herself but would merely lead investigators to potential evidence. However, he admitted that in some cases, e.g. when an encrypted device contains child pornography, the password itself could be assimilated to proof. He nevertheless concluded that an order to provide a code or password cannot be considered a violation of the prohibition of self-incrimination, as the data might not necessarily concern the person giving the password or code but others. He proposed that the right not to self-incriminate could be better preserved in a second step, where the judge surveying all the data at trial could exclude any self-incriminating data from evidence.
Concerning the right to remain silent, Kerkhofs noted that it is not an absolute right. While it is important to prevent forced or tainted evidence, e.g. evidence obtained via torture or undue pressure, those issues are not at stake when asking a suspect for a code or password. Kerkhofs praised a recent French judgment that made a clear distinction in this regard, stating that asking for a code or password does not taint the truth, as it is a request for a fact, and is hence not an application of the right to silence. For him, this was comparable to other instances in the law, e.g. when the police stop someone and ask for identification: the name of a person is also a fact, and lying about it to law enforcement is a criminal offence. For Kerkhofs, a password or code should be treated similarly.
The event concluded with an intense debate with the audience on all the issues raised. One audience member emphasised the importance of strong encryption for safeguarding personal data, for example when processed by health care services.
Copyright © Brussels Privacy Hub