Securing Software With Privacy-Preserving Enablers


01 February 2021


The right to privacy is one of the fundamental rights included in more than a hundred national constitutions. It sets boundaries that protect individuals from external interference. The debate around privacy has gained traction since Edward Snowden’s revelations about governmental mass surveillance programmes and, more recently, with the advent of artificial intelligence and data mining.


In this context, the term Privacy by Design broadly refers to the application of data protection best practices to system design. It is based on the idea that building privacy into a product or a service from the beginning of the design process is preferable to adding it to an already existing system as an afterthought. Similarly, Privacy by Default means that the default settings of a product or a service already protect the user against privacy risks, without requiring any additional configuration or other changes.

These principles mandate clearly stating:

  • the purposes for which data is being processed (purpose specification)
  • the limitations on what data can be collected (collection limitation)
  • the minimisation of the collected data (data minimisation)
  • the limitations on use, retention, and disclosure of the data (use, retention, and disclosure limitation)
  • the notion that there should be a presumption of privacy, meaning that the default settings should provide the best possible privacy protection for users.
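
As a rough illustration (not part of the CyberSec4Europe report), these principles can be translated into ordinary application code. The sketch below, in Python and using hypothetical setting names and purposes, shows privacy-protective defaults (Privacy by Default) alongside a purpose-bound minimisation step that keeps only the fields permitted for a declared purpose:

    from dataclasses import dataclass

    # Hypothetical application settings: every default is the most
    # privacy-protective option, so the user needs to change nothing
    # to be protected (Privacy by Default / presumption of privacy).
    @dataclass
    class PrivacySettings:
        analytics_enabled: bool = False   # no tracking unless the user opts in
        personalised_ads: bool = False    # no profiling by default
        location_sharing: bool = False    # location data stays on the device
        data_retention_days: int = 30     # retention limitation: short default window

    # Purpose specification and data minimisation: only the fields needed
    # for a declared purpose are kept; everything else is dropped before storage.
    ALLOWED_FIELDS_BY_PURPOSE = {
        "account_creation": {"email", "display_name"},
        "order_fulfilment": {"email", "shipping_address"},
    }

    def minimise(record: dict, purpose: str) -> dict:
        """Keep only the fields permitted for the declared purpose."""
        allowed = ALLOWED_FIELDS_BY_PURPOSE.get(purpose, set())
        return {k: v for k, v in record.items() if k in allowed}

    if __name__ == "__main__":
        raw = {"email": "ada@example.org", "display_name": "Ada",
               "birthday": "1990-01-01", "phone": "+49 000 0000"}
        print(minimise(raw, "account_creation"))  # birthday and phone are never stored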

All of these issues remain subjects of debate in political and research circles.

CyberSec4Europe’s report, Definition of Privacy by Design and Privacy Preserving Enablers, focuses on privacy. It identifies a set of challenges in today’s research, presented in three broad categories: data privacy challenges, identity privacy challenges, and legal and development challenges. These are open problems that CyberSec4Europe’s enablers aim to address over the course of the project’s lifetime.

For the most part, the document presents CyberSec4Europe’s privacy-preserving enablers and the privacy-preserving architecture of which they are critical components, along with detailed explanations of each enabler’s functionality. This is followed by a three-part discussion of their relationship to the project’s core research and development work: their relation to the lines of research, their place within the research roadmap, and how the demonstrator use cases could leverage their functionalities.

Therefore, this document is of interest to anyone looking for an overview of CyberSec4Europe’s portfolio of privacy-preserving technologies, as well as of the project’s plans to address today’s privacy research challenges.

This work is especially important today because privacy is at the centre of a convoluted debate. There are governments and corporations that harvest user data indiscriminately, using national security and “services tailored to your needs” as justification. These practices gained support by leveraging users’ psychological state, such as their fear of terrorist attacks or the comfort of a recommender system. But as time goes by, they are increasingly perceived as dubious at best, and as contrary to human rights at worst.

And then there are those users who are worried about being tracked. Because of their protests, organisations are under more scrutiny than ever for their adherence to privacy rules, with consequences for their public image if they are found to have breached their users’ privacy or to lack adequate data security protocols.

Perhaps the biggest obstacle to a satisfying conclusion to this debate is the myth that privacy and usability are mutually exclusive. Today’s software tends to shower users with security warnings and pop-ups – often full of technical jargon – whenever there is a security incident. Lacking the understanding or the patience to wade through what they have been bombarded with, users more often than not ignore the warnings, close the pop-ups and carry on. It is therefore crucial that researchers and industry leaders collaborate to create software that prevents such incidents altogether, while giving users the simplest and most direct way to achieve their goals, through actions designed to preserve the privacy and security of their data.

Cross-border regulations, such as the GDPR, are important steps to further promoting best practices and shielding users from malicious third parties; but they are a complement to secure software, not a substitute. Together with increasing efforts to educate users on the steps they can take to secure their data, such measures could lead to a future in which everyone’s online well-being is secured.

Alessandro Sforzin, NEC Laboratories Europe GmbH