In the e-SIDES project, the ethical, legal, societal and economic implications of big data applications are examined in order to complement the research on privacy-preserving big data technologies and data-driven innovation. The first step in the project was to identify important ethical, legal, societal and economic issues raised by big data applications. The second step was to provide an overview of existing technologies that may address some of these issues.
To draw the attention of the big data community and foster a collaborative discussion on emerging challenges in the field of privacy-preserving technologies, e-SIDES hosted the workshop “Technology solutions for privacy issues: what is the best way forward?”, held on 14 May from 17:00 to 18:30 at the Big Data Value Meet-Up in Sofia, Bulgaria.
The workshop, chaired by Gabriella Cattaneo from IDC (e-SIDES coordinator), built on the project’s analysis and classification of the main Privacy-Enhancing Technologies (PETs) to assess the state of the art with participants, discuss emerging challenges, and collaboratively explore possible guidelines for responsible research and innovation in this field.
The workshop was centred on four main questions:
- Can technology guarantee the anonymization of personal data without losing the value added of analytics?
- What is the best approach to design privacy-aware solutions and services without falling into ethical traps, such as fostering discrimination and unfairness?
- Which data technologies are best to preserve privacy and security?
- Can we move from technology as the problem (violating privacy) to technology as the solution?
Cattaneo opened the session with the presentation “Privacy-enhancing technologies: do no evil?”. The presentation, drawing on the outcomes of the deliverable D3.1 Overview of Existing Technologies, focused on the e-SIDES classification of PETs, under which technologies considered privacy-enhancing or privacy-preserving were identified and assigned to eleven classes: anonymisation, sanitisation, encryption, multi-party computation, access control, policy enforcement, accountability, data provenance, transparency, access and portability, and user control. During the presentation, Cattaneo stressed the weak points of today’s privacy-enhancing solutions and underlined the main technological, organisational and policy issues in the field, including the insufficient implementation of privacy by design and the need to develop an appropriate regulatory framework.
The keynote was followed by a panel that included a presentation by Rigo Wenning from the ICT-18 project SPECIAL (Scalable Policy-aware Linked Data Architecture for Privacy, Transparency and Compliance), who focused on GDPR compliance for big data, and a discussion involving Eliot Salant from the H2020 project RestAssured. At the beginning of the discussion, the participants focused on privacy policies and shared their views on the best way to ask for user consent. Many agreed that sticky policies can be considered the way forward and that users should be asked for consent in context.
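The core idea behind sticky policies is that machine-readable usage conditions travel with the data itself and are checked before every access, rather than being recorded once in a separate consent database. As a minimal illustration of that idea (not taken from the SPECIAL or RestAssured projects; all class and function names here are hypothetical), a record can be bundled with its policy, and a gatekeeper function can refuse to release the payload when the requested purpose is not covered by the user’s consent:

```python
from dataclasses import dataclass

@dataclass
class StickyPolicy:
    """Machine-readable conditions that travel with the data."""
    allowed_purposes: set   # purposes the user consented to, e.g. {"research"}
    consent_obtained: bool = False

@dataclass
class DataRecord:
    payload: dict
    policy: StickyPolicy    # the policy is attached to the data itself

def access(record: DataRecord, purpose: str) -> dict:
    """Release the payload only if the attached policy permits it."""
    if not record.policy.consent_obtained:
        raise PermissionError("no user consent recorded")
    if purpose not in record.policy.allowed_purposes:
        raise PermissionError(f"purpose '{purpose}' not permitted")
    return record.payload

# Usage: a record whose owner consented to research use only
record = DataRecord(
    payload={"age": 34, "city": "Sofia"},
    policy=StickyPolicy(allowed_purposes={"research"}, consent_obtained=True),
)
access(record, "research")     # returns the payload
# access(record, "marketing")  # would raise PermissionError
```

In a real deployment the policy would be expressed in a standard vocabulary and cryptographically bound to the data; the sketch only shows the structural point that consent checks follow the data wherever it flows.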
After the panel, all the participants took part in a voting session using the online app Mentimeter, during which they were asked to express their opinions on which PETs are most promising and/or effective and which actions could be considered the best way forward. User control and policy enforcement emerged as the most promising PETs according to the audience. The answers, which were shown in real time, triggered a discussion on data provenance, as some participants underlined the unexpectedly low score earned by this technology. As regards the most relevant actions to be implemented, the highest scores went to:
Putting privacy by design into action:
- Employ multidisciplinary and diverse teams to leverage different viewpoints

Focus on responsibility in data use:
- Design your data and systems for auditability

Keep transparency, trust and user control at the centre:
- Develop algorithmic transparency
After the voting session, the participants also pointed out the need to raise awareness and the importance of education in this field.
Find more at: