Nuala O’Connor, President and CEO of the Center for Democracy and Technology (CDT), participated in a roundtable on “Current Privacy and Security Challenges in the United States, Belgium, and the EU” during her visit to Brussels at the invitation of the German Marshall Fund of the United States, as part of a program with the US Embassy to strengthen the Belgium-United States (US) dialogue on data protection. The roundtable was opened by Prof Paul De Hert (LSTS, VUB) and moderated by Laura Drechsler (Brussels Privacy Hub). The following summary gives an overview of the questions and answers that followed.
In her initial statement O’Connor explained the aims of her organisation, CDT, which focuses inter alia on the areas of privacy and data, freedom of expression, security and technology, platforms, and net neutrality and copyright. For O’Connor the Snowden revelations introduced a new notion into her area of advocacy, namely that citizens must be better informed about the use of their data by the government. To this end, the CDT is a long-term proponent of a federal privacy law for the US, an idea which seems to be back in discussion after the recent Cambridge Analytica scandal. This is all the more urgent as the current US approach of sectoral laws has led to fragmentation and confusion.
As for the perceived differences between the US and EU approaches to data protection, O’Connor pointed out that this divide is overstated: governments everywhere share the same aim of keeping their citizens safe, which leads to an insatiable demand for data, fostered by the belief that more data will increase security. This demand, however, is short-sighted in both the US and the EU. What is needed is not more data but a more targeted approach, meaning a narrow scope for the legitimate use of information limited to the most serious national security challenges (which, as the Snowden revelations showed, cannot be comprehensive). The divide in this debate is ultimately not so much between EU and US data protection experts as between security and data protection experts globally.
Asked about the current debate on encryption in the US, O’Connor pointed out that the debate is not new and largely resembles the one that took place in the 1990s. CDT has always taken a hard line on encryption, as it seems clear that breaking encryption does more harm than good: most critical infrastructure, including nuclear reactors, relies on encryption to protect its data, and permanent back doors would simply paint bullseyes on such systems. So far, US courts have recognised this danger and ruled in favour of encryption, as was seen in the Apple case concerning the phone of the San Bernardino shooter.
On the current US administration’s stance towards privacy and data protection, O’Connor noted that there has not been much movement yet. However, following the scandal that erupted after Cambridge Analytica, which led to the Senate hearing of Facebook CEO Mark Zuckerberg, activity could rise, as citizens are becoming concerned about their data and have already prompted a considerable number of state-level bills. As the Facebook revelations concern both political parties in the US, privacy might also be a theme in the upcoming midterm elections, and if there is an overhaul in either the House or the Senate, a federal privacy law might even be proposed. This development is further driven by the facts that, with the GDPR, EU citizens enjoy more protection than their US counterparts, and that companies, increasingly annoyed by the patchwork of applicable rules, seem open to a more unified approach.
Responding to questions on the threats algorithms pose to democracy, O’Connor highlighted that the current internet offers free content because of advertising, which is built on personal data profiles. This basic constellation is unfit for democracy, as it creates bubbles of information that people want to see, creating different “facts” for different people. The claim of platforms that their algorithms are neutral is especially harmful in that context: no algorithm is neutral; algorithms always reflect the values and worldviews of their creators. Platforms need to accept the responsibility they have towards democracy and be more transparent in their dealings. O’Connor also suggested that algorithms be audited to better account for bias. Discrimination can then best be tackled using a risk-based approach (unlike the procedure-based approach prominent in the EU); such an approach would focus on the harms and be more narrowly tailored. In the end, different policy levels will be needed to adequately address these challenges, from the federal level to public opinion to the engagement of the CEOs of the big platforms.
On the issue of enforcement of data protection and privacy rules, O’Connor argued that the reason the GDPR is now being taken seriously is the threat of big fines. For companies, compliance is always a risk analysis, and as soon as the risk of non-compliance becomes too big, they will comply. In this way, the EU is similar to the US, where fines have proven an effective deterrent, whether they come through class actions or are imposed by the Federal Trade Commission (FTC). Hopefully, stronger compliance will lead to the realisation that everyone dealing with personal data has duties. The CDT in this context does not favour a radical approach, in the sense of prohibiting or outlawing certain big platforms, but rather supports a response from within business. A first step would be for all actors to recognise their responsibility and implement it accordingly in their structures, with the help of engineers, editors and other experts.
Reacting to concerns that freedom of speech is being controlled by big platforms, O’Connor maintained that with the right amount of transparency, freedom of speech can be preserved and even fostered. The most important point is that platforms must not become the custodians of history, able to amend it according to their financial or reputational needs. Algorithms in that context are not fit to play the role of human editors, as they lack the required transparency and follow the wrong basic model, promoting content with many views rather than applying qualitative criteria. Platforms and companies need to better acknowledge their responsibility towards free speech and democracy in that regard.
Finally, reacting to the main concern that emerged during the roundtable, namely that platforms have too much power online and are not complying with the rules, O’Connor stated that she believes real regulatory change will come for platforms. For her, it is clear that sooner or later platforms will face a choice between lesser profits and no profits, as stricter rules render their current business model unfeasible. Platforms will need to adhere to values to remain relevant.
Copyright © Brussels Privacy Hub