Press Release
Forum Privatheit illuminates the European path

23 November 2020

At the annual conference of the "Forum Privatheit", experts looked back on seven years of research and asked: How can we lead a self-determined digital life? The main focus was on how the path to such a life can be shaped in Europe. Exciting answers to these questions came from economist Aline Blankertz, computer scientist Lorrie Cranor and philosopher Judith Simon, among others.


Data trustees should act in the interest of consumers

Economist Aline Blankertz, for example, spoke about the relationship between competition among companies and the protection of privacy. Because a few platform providers such as Google, Facebook and Amazon hold so much market power, consumers have little choice. This, she said, undermines data protection. Consumers are also often overwhelmed when it comes to making competent decisions about their privacy settings, which explains the "privacy paradox": in surveys, consumers state that they are very concerned about privacy, while at the same time they give away their personal data very freely on the Internet. Blankertz, Project Lead at the think tank Stiftung Neue Verantwortung, suggested collective negotiators as a possible remedy for this overload: "I would like services that act in the interest of consumers, a kind of data trustee to whom I can entrust the protection of my privacy." Blankertz welcomed the fact that many protective measures are already laid down in the GDPR. As an example, she cited privacy icons – easy-to-understand symbols that users can use to quickly configure their data protection settings. However, Blankertz criticized the lack of practical implementation: when it comes to privacy icons, little more than a few drafts currently exist.

Privacy icons should simplify the handling of data protection settings

Lorrie Cranor, in turn, highlighted how difficult it is to translate a legal requirement into technology. The computer scientist, who teaches at Carnegie Mellon University in Pennsylvania, explained why it is so important that privacy settings are not only useful but also usable for consumers. According to her, the settings should be easy to find, understand and use. In practice, she said, they are often the opposite: hard to find, tedious to understand and time-consuming to read and use. Some are misleading or even deliberately steer users in the wrong direction. In a lengthy design and testing process, Cranor's team therefore developed privacy icons that combine symbols with accompanying keywords. With regard to the GDPR, Cranor also emphasized that users need to understand the data protection settings that affect them in order to protect their privacy effectively. The use of symbols could be helpful here as well. In her opinion, however, an institution is needed to introduce and standardize such generally understandable data protection symbols, so that every user quickly learns which data protection settings are predefined – and which they can choose themselves.

Values should be considered in the development of technology

Judith Simon, Professor for Ethics in Information Technology at the University of Hamburg, raised questions that at first glance seemed less practical than fundamental. She explained how rights and values can be taken into account in complex data ecosystems and asked: "When are digital technologies morally good – and who actually makes that decision?" Research has long assumed that technology is not neutral and that values must therefore be considered in its design. According to Simon, co-author of the report of the German Data Ethics Commission, this is precisely what makes technology development so challenging: "We all agree that justice is important. But if I want to implement justice in technology, it gets very difficult." This also applies to automated face recognition, which performs and evaluates very differently depending on gender and skin color. Inequities that already exist in society are thus inscribed in the software, and it is necessary to look carefully at how this can be avoided. "Our understanding of what is just differs according to context," says the philosopher, who has been a member of the German Ethics Council since 2018. The design of technology can therefore never be a purely technical decision but must also take the context into account. Simon also considers the question of who gets to make decisions about technologies to be very important: "We should not forget that software engineers often do not have the power to make decisions. The big platforms have a lot of leverage to demand – or even prevent – privacy protection." Simon's conclusion: "To develop technologies that meet our ethical requirements, we need an interplay of forces from legislation, self-regulation, investment in research and education, and a change in the underlying business models."

"With the interdisciplinary research of the "Forum Privatheit", business models can be developed in which data protection is a selling point," said Ingo Hoellein from the German Federal Ministry of Education and Research. He praised the work of the committee, which is highly regarded in expert circles, and gave the following outlook: "We know that we want to pursue a European path of digitization. For us, the research association "Forum Privatheit" is a kind of lantern that illuminates this path well. After all, in addition to purely technological research, we also need a discussion framework that reflects both our European values and our ethical ideas.”

In the Forum Privatheit, experts from seven scientific institutions address questions concerning the protection of privacy in an interdisciplinary, critical and independent manner. The project is coordinated by Fraunhofer ISI. Further partners are Fraunhofer SIT, the University of Duisburg-Essen, the Scientific Center for Information Technology Design (ITeG) at the University of Kassel, the Eberhard Karls University of Tübingen, the Ludwig-Maximilians-University of Munich (LMU) and the Independent State Center for Data Protection Schleswig-Holstein (ULD). The Federal Ministry of Education and Research funds the Forum Privatheit in order to stimulate public discourse on privacy and data protection.

For all press and image enquiries please contact:

Barbara Ferrarese, M.A.
Fraunhofer-Institut für System- und Innovationsforschung ISI
+49 (0) 721 / 6809 – 678
barbara.ferrarese@forum-privatheit.de
www.forum-privatheit.de
https://twitter.com/ForumPrivatheit


