Task 3.7 focused on the design of best practices for an innovative and GDPR compliant user experience and on the investigation of interoperability issues in identity technologies (eIDAS and GDPR). We also investigated the legitimacy of the technologies used and the provision of trust services in a cross-border dimension.
Task 3.7 resulted in two deliverables, each briefly introduced in a subsection below. The content and the results of the research are discussed in more detail in the remainder of this document, where we chose to split the second deliverable, on interoperability and cross-border compliance issues, into two sections because each part represents independent research (although with a similar goal).
Guidelines for GDPR Compliant User Experience is an asset/deliverable that was produced as D3.6 in the CyberSec4Europe project. As its name implies, it is a collection of guidelines, best practices and recommendations for achieving GDPR compliance. It consists of two parts. The first is the Guidelines for the General Data Protection Regulation (GDPR), which present the regulation's requirements through the GDPR principles. The second part of the enabler is the Data Protection Impact Assessment (DPIA) template. As the name suggests, this part of the enabler guides the user through the process of performing a DPIA and serves as documentation of the performed analysis.
Regulation (EU) 2016/679 of the European Parliament and of the Council, more commonly known as the General Data Protection Regulation (GDPR), is a legal framework that sets guidelines for the collection and processing of personal information. The regulation was designed to strengthen the rights of individuals across the EU and ensure uniform and coordinated action across the Member States. The GDPR has caused a significant amount of panic and confusion among businesses. This can be attributed to multiple factors, such as high fines, the fact that the regulation applies equally to all organisations that process personal data regardless of their size or the amount of data they process, and requirements that are often vague or open to interpretation. Therefore, the goal was to create something to help organisations, especially smaller ones, understand the requirements the GDPR places on them and support them in carrying out some of the required tasks.
The DPIA template is like a to-do list with guidelines on performing specific tasks and some pre-prepared structures to support the user. It combines a guide with pre-prepared content in the form of table templates that personal data controllers can use to perform the DPIA. This solution aims to be of use primarily to smaller organisations that have problems performing, or questions about, specific steps of the assessment, by giving them a starting point on which they can build.
The Guidelines for GDPR compliant user experience combine and summarise known guidelines and opinions in the form of an actionable to-do list, supported by integrated checklists and concrete guidelines with explanations. They present a baseline for performing a data protection impact assessment when needed or required by the regulation. There are alternative GDPR guides and tools for performing a DPIA; however, this asset specifically addresses the needs of smaller organisations with limited resources. The DPIA template is purposefully kept simple, has no additional requirements for its use, and encourages users to change, expand, and upgrade the given template to better suit their own organisational requirements and circumstances. This flexibility, together with the pre-prepared list of potential risks that have to be addressed in the assessment and the self-assessment form with which users can check and evaluate their work, is what distinguishes the asset from other similar tools; these elements also correspond to the main challenges in ensuring GDPR compliance that were addressed.
Main deliverable:
- D3.6 Guidelines for GDPR compliant user experience
Auxiliary deliverable:
- D3.16 Security requirements and risks conceptualisation
Demonstrators:
- D3.13 Updated version of enablers and components
- D3.17 Integration to demonstration cases
Videos:
This research builds on the work already completed by the European Commission and ENISA as a basis for monitoring the current eIDAS network and proposing eIDAS 2. Additionally, it is an extension of the research described in the previous section, where, in the process of creating the guidelines for the GDPR, we noticed differences between the EU Member States that could potentially cause problems in situations involving more than one Member State. At the same time, this caused us to wonder whether similar situations are possible in eIDAS, where interoperability between countries is even more important. The results are therefore split between the two regulations.
In the first section, we focused on a specific sample of selected eIDAS network cross-border use-cases with the intent to identify any deficiencies hidden under the umbrella of the already proposed global renovation of the regulation. Consequently, the study focused on specific real-world scenarios and was not based on administrative review, as most existing reports are. By analysing real-world implementations of the chosen use-cases, we identified detailed shortcomings of the current regulation in the following areas: organisational independence, remote access to banking services, remote video identification, electronic signatures in public administration, commercial access to the eIDAS network, biometrics as an authentication mechanism, and technical authentication and onboarding security.
The second part is dedicated to the GDPR, where we first identified some potential areas of difference between the EU Member States and then used a survey to query supervisory authorities about those areas in their respective countries. The results of the survey show GDPR-related heterogeneity in the EU.
Although there is an eIDAS side and a GDPR side to the interoperability and cross-border issues research we have done in this task, the underlying purpose and, by extension, the challenges addressed in both sections are very similar. In both cases, we identify areas of the two regulations, and of their implementations in different countries, that could cause issues in the long run. Given the nature of the problem, the detected issues that we have not found discussed elsewhere mostly centre on how trust services and data protection could be abused or cause difficulties for service providers or data protection providers. In some cases, we also offer recommendations on how a specific issue might be alleviated.
Main deliverable:
- D3.18 Analysis of interoperability and cross-border compliance issues
Demonstrator:
- D3.13 Updated version of enablers and components
- D3.20 Final cybersecurity enablers and underlying technologies components
Video:
Dynamic map:
Scientific publications:
- M. Hölbl, B. Kežmah, and M. Kompara, "Data Protection Heterogeneity in the European Union," Applied Sciences, vol. 11, no. 22, p. 10912, Nov. 2021, doi: 10.3390/app112210912
The 4 Pilots and ECSO designed priority Cybersecurity Research Focus Areas. The challenges addressed and the work done on the assets developed in this task fall into the following categories and specific areas.
| | Governance and Capacity Building | Trustworthy Ecosystems of Systems | Trust-Building Blocks | Disruptive Emerging Development |
|---|---|---|---|---|
| Guidelines for GDPR compliant user experience | - | - | ✔️ | ✔️ |
| Analysis of interoperability and cross-border compliance issues | ✔️ | - | - | - |
| | Collaborative Networks | Education & Training | Certification | Secure Platforms of Platforms | Infrastructure Protection | Holistic Data Protection | AI-based Security | Systems Security & Security Lifetime Management | Secure Architectures for Next Generation Communication | Secure Quantum Technologies | Secure AI Systems | Personalized Privacy Protection |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Guidelines for GDPR compliant user experience | - | - | - | - | - | ✔️ | - | - | - | - | - | ✔️ |
| Analysis of interoperability and cross-border compliance issues | ✔️ | - | - | - | - | - | - | - | - | - | - | - |
The GDPR sets forth numerous requirements, and fulfilling them to ensure compliance is a challenge for businesses. The requirements are sometimes vague or too open and consequently subject to interpretation, which is where businesses struggle in their compliance endeavours. Achieving compliance can also be more demanding for smaller organisations, which typically do not employ staff with all the necessary knowledge, while outsourcing presents a high cost.
To support such businesses, we have assembled privacy guidelines for developing new information services. The asset combines and summarises known guidelines and opinions in the form of an actionable to-do list, supported by integrated checklists and concrete guidelines with explanations. It presents a baseline for performing a data protection impact assessment when needed or required by the regulation. It is primarily targeted at small and medium organisations with personnel restrictions and at organisations that use guidelines, best practices and standards other than the ISO 27000 family, e.g. COBIT, ITIL and similar. Since these guidelines are implementation-oriented, they can also serve as additional best-practice guidance and control lists alongside ISO 27701.
By following these guidelines, data controllers and processors can execute a data protection impact assessment and/or obtain combined guidelines for GDPR compliance. The combined guidelines follow the latest standards, methods and frameworks for risk analysis and include a simple-to-follow methodology. This partially includes the WP29 Guidelines that were endorsed by the European Data Protection Board (EDPB). The guidelines include a baseline of identified risks for conducting threat analysis during the Data Protection Impact Assessment exercise, together with easy-to-follow instructions where additional information is needed to explain the decisions taken and document the assessment process. The guidelines end with the required Data Protection Officer consultation template and the optional self-assessment.
Special emphasis is given to the Data Protection Impact Assessment (DPIA). The DPIA is a key part of complying with the GDPR where high-risk data processing is planned. We wish to provide data controllers with a DPIA template that combines a guide and pre-prepared content they can use to perform the assessment easily. The DPIA template should be simple in construction so that those less familiar with similar risk assessment processes can easily use it. It should also allow data controllers to freely change the template to better suit their needs and circumstances. The content is aimed primarily at small and medium-sized organisations, which find it harder to comply with the GDPR requirements due to cost and personnel limitations.
When processing personal data, there are nine GDPR principles one should be aware of: lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability. Therefore, this methodology expands the basic CIA triad that the ISO 27000 family is based on, with additional dimensions specific to privacy and GDPR compliance, making it more suitable for privacy-oriented endeavours.
The essential changes to the new personal data protection framework in Europe are reflected in the basic principles of GDPR. Legal principles of this kind are important criteria that guide the substantive definition of legal rules and how they are enforced. Personal data must be collected, processed and used lawfully (based on one of the foreseen legal bases) in a fair and transparent manner (in relation to the data subject).
They may only be collected for specified, explicit and legitimate purposes (the so-called purpose limitation), which means that the collected personal data may not be processed for other purposes for which they were not obtained (most often for marketing purposes) or processed in a manner incompatible with the purposes for which they were collected.
The data minimisation principle restricts organisations from collecting personal information: if the data is not necessary to achieve the goal, it is not appropriate to collect it. This principle also requires organisations to prefer less sensitive information over data whose nature or potential misuse carries more weight (e.g. pseudonyms are better than a person's full name) and to make the data available only to those persons in the organisation who actually need it.
The principle of accuracy is the obligation to check the accuracy of data and to keep it up to date.
The principle of storage restriction states that personal data are only stored for as long as is necessary for the purposes for which the personal data are processed.
Integrity and confidentiality impose an obligation to process in a way that ensures adequate security of personal data, including protection against unauthorised or unlawful processing.
However, the most important principle is the principle of accountability, which requires organisations to comply with the fundamental principles and to be able to demonstrate that their processing complies with them.
Controllers and processors should, inter alia, achieve this by implementing appropriate technical and organisational measures to ensure compliance. These measures may include, but are not limited to, internal rules on the protection of personal data, additional training of employees, internal audits of processing activities, etc. Other possible measures include the minimisation of personal data collection, pseudonymisation, transparency, allowing individuals to monitor processing, and the establishment and continuous upgrading of security measures. Privacy Enhancing Technologies (PETs) are closely linked with the concepts of privacy by design and privacy by default, which are required under the GDPR (Article 25), and can help achieve all the mentioned security measures.
Processing shall be lawful only if at least one of the following applies according to Article 6 of the GDPR:
- the data subject has given consent to the processing of their personal data for one or more specific purposes
- processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract
- processing is necessary for compliance with a legal obligation to which the controller is subject
- processing is necessary in order to protect the vital interests of the data subject or of another natural person
- processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller
- processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child (does not apply to processing carried out by public authorities in the performance of their tasks).
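The Article 6 condition above can be read as a simple disjunction: processing is lawful if at least one recognised basis applies. A minimal sketch of that check, where the basis identifiers and the `processing_is_lawful` helper are our own illustration and not part of the deliverable's template:

```python
# Sketch of the Article 6 lawfulness condition: processing is lawful only if
# at least one of the six recognised legal bases applies. The short basis
# names below are illustrative labels, not official GDPR terminology.

ARTICLE_6_BASES = (
    "consent",               # Art. 6(1)(a)
    "contract",              # Art. 6(1)(b)
    "legal_obligation",      # Art. 6(1)(c)
    "vital_interests",       # Art. 6(1)(d)
    "public_task",           # Art. 6(1)(e)
    "legitimate_interests",  # Art. 6(1)(f), subject to the balancing test
)

def processing_is_lawful(claimed_bases: set) -> bool:
    """True if at least one claimed basis is a recognised Article 6 basis."""
    return any(basis in ARTICLE_6_BASES for basis in claimed_bases)

# Relying on consent alone satisfies the condition:
assert processing_is_lawful({"consent"})
# A made-up justification does not:
assert not processing_is_lawful({"marketing_convenience"})
```

Note that in practice each basis carries its own further conditions (e.g. the legitimate-interests balancing test), which this disjunction check deliberately leaves out.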
A Data Protection Impact Assessment (DPIA) must be performed before the processing in question is carried out and is an ongoing process that has to be regularly reviewed and kept up to date. A finished and properly performed DPIA will also help an organisation evaluate, document, and later demonstrate how it complies with the personal data protection requirements. However, performing a DPIA can be a lot of work, especially for smaller organisations that do not employ people with the required knowledge and do not have much capital to spend on outsourcing the work. The produced template was primarily designed to help such organisations do a DPIA by themselves.
A DPIA is meant to identify and minimise personal data protection risks by systematically analysing the processing of personal data. Unlike most other risk analyses, a DPIA concentrates on the prevention of harm to data subjects, individuals, and society at large rather than on the risk to the organisation itself. A DPIA is a legal requirement under the GDPR when the processing is likely to result in a high risk to natural persons' rights and freedoms. This is an excellent example of a condition set by the GDPR for which it is difficult to instinctively know whether it applies, because there is no definition of "likely to result in a high risk"; this is exactly the type of issue the enabler is meant to resolve.
The major elements of the DPIA template are presented in the figure below. The DPIA template aids with the initial decision on the necessity of performing a DPIA. If the circumstances require the organisation to perform the assessment, the template describes and provides guidelines for the DPIA steps. "Conduct the self-assessment" (bottom left in the figure) is the optional last step of the DPIA. Before the solution/process can be implemented in the organisation, it is also important to make sure all other GDPR requirements are met, which is the purpose of the broader GDPR compliant user experience enabler. The DPIA template contains all the basic information about the assessment and many recommendations and good practices on how to perform it. Next, we briefly describe the DPIA template sections whose content the users can directly utilise to perform their own assessment.
The first step of performing a DPIA is determining whether a DPIA is required. The DPIA template provides a list of criteria based on the GDPR and on recommendations given by the Article 29 Working Party and endorsed by the European Data Protection Board. The template requires the users to answer a few questions about the type of processing they intend to perform. Based on the answers, the template enables the user to make an easy judgment about the necessity of the DPIA given their circumstances. This decision process can also be beneficial for showing that an organisation has performed the DPIA voluntarily. Performing a DPIA when not necessary can improve the trustworthiness and reputation of an organisation, assist in ensuring that best practices for data security and privacy are being utilised, and help minimise the organisation's liability.

The next major component of the DPIA is focused on risks arising from the processing of personal data. The first step is to establish what type of personal data will be processed, whether the processing is proportional/necessary, for how long the data will be stored, and on what legal basis it will be collected. From this information, compliance with the GDPR can be determined, and the template helps establish compliance levels based on the collected data. The DPIA template provides instructions on measuring risk based on the severity and probability of threats. The risk assessment methodology is aligned with ISO 31000:2018 and ISO 31010:2011 and can be directly used to assign risk levels to all identified threats. The DPIA template provides the users with a template to fill this data into, but more importantly, it already includes a long list of personal data processing threats related to the GDPR requirements. Users can freely add other threats specific to their organisations, circumstances, processes used, etc. Finally, the risk to individuals' rights and freedoms is evaluated.
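The severity-times-probability scoring described above can be sketched as follows. The 1-3 scales, the level boundaries, and the example threats are hypothetical illustrations; the template's own scales and threat list may differ.

```python
# Hypothetical sketch of a severity x probability risk scoring, in the spirit
# of the ISO 31000-aligned methodology described above. Scales and thresholds
# are illustrative assumptions, not the template's actual values.

SEVERITY = {"low": 1, "medium": 2, "high": 3}
PROBABILITY = {"unlikely": 1, "possible": 2, "likely": 3}

def risk_level(severity: str, probability: str) -> str:
    """Combine qualitative severity and probability into a risk level."""
    score = SEVERITY[severity] * PROBABILITY[probability]
    if score >= 6:
        return "high"    # mitigate before processing starts
    if score >= 3:
        return "medium"  # mitigation measures recommended
    return "low"         # document and monitor

# Example threats (illustrative) with assessed severity and probability:
threats = {
    "unauthorised access to stored personal data": ("high", "possible"),
    "excessive retention of contact details": ("medium", "unlikely"),
}
for threat, (sev, prob) in threats.items():
    print(f"{threat}: {risk_level(sev, prob)}")
```

The same pattern extends naturally to the template's pre-prepared threat list: each threat row gets a severity and probability, and the derived level indicates which threats need mitigation measures before processing may begin.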
The GDPR requires that the Data Protection Officer (DPO), if one is appointed, provides an opinion on the assessment. The template suggests when a consultation with a DPO might be beneficial. These opinions should be documented.
The final part of the template provides a form for self-assessment. This step of the DPIA is optional and not required by the GDPR. The prepared self-assessment can help organisations track the work they have done and learn from it. Based on their performance, they can improve future work on DPIAs for other processes/services.
The eIDAS regulation provides a common foundation for secure electronic interaction between citizens, businesses and public authorities. The regulation aims at increasing the effectiveness of public and private online services, electronic business and electronic commerce in the Union. To this end, it includes provisions for electronic identification and trust services. However, after more than five years, only 14 Member States (representing 59% of the EU's population) have notified at least one eID scheme. This and other issues have caused the EU to review the eIDAS (electronic IDentification, Authentication and trust Services) regulation and change it to reduce the issues and improve the adoption of eIDs and trust services across the Member States. On 3 June 2021, the European Commission published a Proposal for amending Regulation (EU) No 910/2014 to establish a framework for a European Digital Identity, which will someday become the so-called eIDAS 2. In our analysis of interoperability and cross-border compliance issues, we considered some problems with the current situation, namely how eIDAS compliant implementations differ across the EU Member States and how that could be a problem for interoperability and the cross-border cooperation of service providers and users.
In the research, we focused on a specific sample of common eIDAS network cross-border use-cases to identify additional shortcomings of the current framework, with the intent to suggest further additions to the regulation and in this way advance the EU's long-term cybersecurity goals as well as promote the use of eIDs in the commercial sector of the European Single Market. By analysing real-world implementations (from Italy, Slovenia, Spain, and Switzerland) of the chosen use-cases, we identified detailed shortcomings of the current regulation in the following areas.
- Organisational independence between supervisory authorities and service providers
The results show a pattern where supervisory authorities are simultaneously providing trust services. This contradicts the families of standards (International Standard on Auditing 200, the IESBA Code, ISO 19011) that define the independence of auditors and other types of professional reviewers. Since cybersecurity incurs costs, combining the functions of supervision and service provision in the same entity could affect independence, at least in appearance. This arrangement could also affect competition in the market, as the main entity offering trust services and defining legislation, or even access to the eIDAS network, is also competing with other service providers in a closed market.

- Remote access to the banking services across borders
When setting up eIDAS, one of the use cases was to enable the remote opening of bank accounts across borders. However, banking is heavily regulated, and much local legislation (different across the Member States) could hinder the envisioned seamless connection with foreign banks. The examples from the included countries showed that each country has a different system in place: every Member State has its own solution and its own requirements for the remote opening of a bank account. Currently, according to our sample, even if the banks embraced the eIDAS network, opening a bank account in another Member State would not yet be possible.

- Remote video identification
With the digital interoperability of the eIDAS network, borders are fading away. A citizen obtaining an identity in one Member State could use the acquired identity in any other Member State if eIDAS is followed strictly. In marginal scenarios, a citizen of one Member State could even obtain a digital identity using remote video identification in another Member State and later use that identity in their own state. This could already be the case if, for example, a bank in Spain decided to provide banking services and eIDAS trust services to be freely used by its customers. Consequently, questions of regulating the requirements for remote video identification are encroaching on the regulatory area of the eIDAS network.
Again, the results from the selected countries show that, where remote video identification is allowed, each state has its own requirements regarding the security of remote identification. This will potentially lead to different levels of trust in the obtained identities and difficulties in their cross-border recognition. Some Member States have already put limitations on the use of identities acquired through video identification for this reason.

- The use of electronic signatures in public administration and remote access to the EU Digital COVID certificate
eIDAS provides different levels of assurance and different kinds of electronic signatures. Every level of assurance and every kind of electronic signature brings additional costs and complexity into the information system and the user experience. Businesses are therefore reluctant to use higher levels of assurance than necessary. Since eIDAS was primarily targeted at the public sector, we were interested in understanding whether the assurance levels and the kinds of signatures used in the public sector are comparable across the Member States.
Even though different levels of assurance and different kinds of signatures are defined in the eIDAS, there is little convergence in understanding what levels should be used in specific use-cases. The surveyed countries support different authentication methods, with no clear indication that they prefer or promote the highest assurance level electronic signatures. When governments are not promoting the use of the highest assurance levels and qualified electronic signatures with their services, the technology support and acceptance in the population is lower, and this spills over to the commercial market. Commercial services that require higher levels of assurance (e.g. banking, insurance) either by the law or because they are not prepared to take the risk of lower assurance levels are left alone to promote the use of higher assurance level technologies with the citizens.
In the case of the EU Digital COVID certificates, where the implementation was more consistent across the Member States, the minimum assurance level required to access the certificate was the same across all surveyed Member States.

- Commercial access to the eIDAS network
Since eIDAS does not require that the eIDAS network be accessible to private entities, this is left to the regulation of the Member States. The results from the selected Member States vary: some do not allow access for the private sector (Spain), in some access is envisioned but not implemented (Slovenia), and some allow access even for foreign businesses (Italy).
With some governments providing access to foreign entities, competition between local regulations will also start to build. Companies will have the option to choose an eIDAS network entry point and consequently control their costs. That will put pressure on public providers to stay competitive, or local nodes may start to lose interest. This may have a negative impact: if there is too much open competition between trust service providers in different Member States, it may directly affect the security of the network in those Member States. Security incurs costs, and if the security requirements in one Member State are lower than in another, local providers gain a competitive advantage. Consequently, care has to be taken to require the same basic security requirements in all Member States.

- Biometrics as an authentication mechanism
In the private sector, especially banking and the large consumer identity providers (e.g. Google, Microsoft), we see a rise in the use of biometrics. The use of biometrics is currently "a grey area". The use cases are primarily based on the biometric capabilities of current mobile devices. That means the service providers (e.g. banks) are not processing biometric data themselves and are consequently not subject to the corresponding GDPR requirements. There is no certification scheme in place, and there are no specific requirements for the use of such devices. Even though these devices have a direct impact on the security of the service for the end user, the service providers do not have contracts with the "biometric security device providers" (e.g. Apple, Samsung), even though such contracts were the norm when banks bought authentication solutions on the market to meet their needs and the needs of their customers. Consequently, users are left to their own selection of mobile device, and the final security of the service will vary depending on the device selected. For this kind of authentication, we propose the term "Bring Your Own Authentication Device" (BYOAD).
The overview of the selected countries showed that Switzerland (a member of the European Free Trade Association, not the EU) is the only one that supports biometric authentication with eIDAS services.

- Technical authentication and onboarding security mechanisms/requirements
The authentication mechanisms in the eIDAS network across the Member States use technical solutions with questionable security attributes, such as SMS one-time password (OTP) codes. Some Member States are even turning to security questions, which are deemed obsolete. Considering the stakes at play, ease of use (which, as far as we can tell, is the main reason for their use) should not be traded for actual security. This is especially true because alternatives exist, and steering away from weak authentication would also improve users' trust in the system. Current standards may therefore not be detailed or current enough to reflect the latest findings in cybersecurity. This finding is another indication that a call to increase the speed of security standards development is justified.
When analysing the security requirements of electronic signatures, it also became evident that eIDAS uses very strict wording regarding the capabilities of the qualified electronic signature, namely that "any subsequent change in the data is detectable". However, with the current state-of-the-art technologies used to create electronic signatures, this is not strictly feasible (there is an extremely small chance of different messages resulting in identical signatures). To avoid differences in how courts in different Member States may interpret this term, we suggest that the wording be amended to refer to how existing technology works.
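The point above can be illustrated with a small sketch: signature schemes detect changes by comparing cryptographic digests of the data, so "detectability" is a computational guarantee rather than an absolute one. The example messages below are our own illustration.

```python
# Sketch of why "any subsequent change is detectable" holds only
# computationally: change detection in electronic signatures reduces to
# comparing cryptographic hash digests of the signed data. Two different
# messages with the same digest (a collision) would go undetected; such
# collisions exist in principle, even though finding one for SHA-256 is
# infeasible in practice.

import hashlib

def digest(data: bytes) -> str:
    """Hex digest of the data, as a stand-in for the signed hash."""
    return hashlib.sha256(data).hexdigest()

original = b"Contract: the price is 100 EUR."
tampered = b"Contract: the price is 900 EUR."

# In practice, any change to the signed data changes the digest:
assert digest(original) != digest(tampered)

# But the guarantee is probabilistic: detection compares fixed-length
# digests, so a colliding pair of messages would defeat it in theory.
print(digest(original)[:16], "vs", digest(tampered)[:16])
```

This is why rewording the regulation to refer to how the technology actually works (computational infeasibility of undetected change) would remove the gap between the legal text and the cryptographic reality.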
Given the findings, here are the final eIDAS related recommendations:
- eIDAS 2 should follow the best practices of other certification and supervisory schemes regarding the organisational independence of the supervisory body.
- Essential services for the single market (e.g. banking) should be explicitly allowed in all Member States under the provisions of eIDAS 2 to avoid local limitations and even prohibition of the use of eIDAS services.
- Security baseline should be established for the remote identification services to avoid degradation of remote identification because of the market competition and to avoid exclusion of specific services or even the Member States from the network based on inadequate security standards.
- We should achieve higher market penetration of the highest level of assurance to empower citizens to use any service at any time without additional effort. Promoting or even requiring the use of a substantial level of assurance in the public sector wherever possible would support this effort.
- Access to the eIDAS network should be explicitly allowed to the private sector in all the Member States. Any limitation on accessing the eIDAS network through another Member State should at least be discouraged to promote competition between the Member States.
- We should build a strategy for using "Bring Your Own Authentication Device" solutions, as this approach is gaining traction. At the same time, it represents a "grey area," at least when combined with biometrics. We propose further research into current state-of-the-art use cases with the intent to identify best practices and the definition of a feasible legal framework for such use of biometric devices.
- An increase in the speed of security standards development is justified. Current standards are falling behind the latest cybersecurity developments and even in referencing the latest sibling security standards.
- The definition of the capabilities of the qualified electronic signature should be changed to reflect actual state-of-the-art technologies used to create electronic signatures to avoid different interpretations.
This work addresses the differences in GDPR implementation between the Member States. The GDPR allows the Member States to define or adapt some parts of the regulation as they wish. The prime example is the age of consent (GDPR, Article 8, paragraph 1), set at 16 in the GDPR (persons aged 16 years and older do not require parental consent). However, the regulation allows individual countries to change this and go as low as 13 years. Member States can also have additional legislation that builds on top of the GDPR. To get a better picture of the current situation in the EU, we gathered information from Supervisory Authorities (SA; also known as Data Protection Authorities - DPA). Each Member State's own SA is best equipped to provide information on its national legislation and policies on data protection. The results of this work provide users with an easy tool to compare the rules on selected data protection topics in individual Member States.
In the collection of data, we focused on topics that could affect how data protection is implemented differently between the Member States regardless of the GDPR. One such example is the collection of biometric data in electronic signatures. When signing on an electronic device, sensors can measure the pressure of the pen, the speed, the tilt, etc., of the signing process. All of this data is considered biometric data because it results from the technical processing of a natural person's physical, physiological, or behavioural characteristics. Similar characteristics can be obtained from close examination of actual physical signatures, which is why merely mimicking the look of a signature does not make a convincing forgery (at least to an expert). For the same reason, all the previously mentioned biometric data is collected during an electronic signature. However, some countries do not allow the processing of biometric data for this purpose, meaning that in those countries electronic signatures are nothing more than images of signatures. Such differences between the Member States have the potential to cause problems with the legitimacy of signatures, where a signature could be valid in one country but invalid in another (either because it does not contain biometric data, or because it does and is consequently a case of illegal processing of biometric data).
Some important aspects of data protection that often involve personal information are not discussed much in the GDPR and could become troublesome to implement under its requirements. Here we are primarily thinking of the processing of personal data in audit trails and the problems surrounding the processing of personal data in backups. Therefore, we were interested in whether individual Member States have enacted legislation to define the requirements more clearly and how they can be met. Note that the results are limited to legislation and do not include any guidelines or rulings that Supervisory Authorities might have made on how personal data should be handled in audit trails and backups.
The inclusion of anonymisation as a way of taking data outside the scope of the GDPR, and of pseudonymisation as a method of complying with the GDPR, is very interesting, especially given the open questions of when personal data becomes truly anonymous and how we can tell that it has. Therefore, we were interested in whether any of the Member States have additional legislation on these two topics that might explain the requirements in more detail. Finally, as discussed in the related literature, we also included some of the topics covered in previous studies.
As already mentioned, to collect data of the best possible quality, we chose to collect it directly from the national Supervisory Authorities. An SA is an independent public authority that supervises the application of European data protection law, including the GDPR. Each EU Member State must have an SA, which has investigative and corrective powers, provides expert advice on data protection issues, and handles raised complaints. However, collecting responses from SAs is difficult because there is only one per Member State, and they might not be inclined to participate in unsolicited research. Even though they are the entity best placed to answer the prepared data protection questions, we did not expect a response from every SA. To obtain the best possible feedback, we repeatedly asked for their participation and collected the data between April 2020 and June 2021.
In the survey, we received feedback from 19 of the 27 Member States (Austria, Belgium, Croatia, Cyprus, Czechia, Denmark, Estonia, Finland, Greece, Germany, Hungary, Latvia, Luxembourg, Malta, Poland, Romania, Slovakia, Slovenia, and Spain). The results show that, in the majority of cases, Member States do not have much additional/specific legislation. We found that only 95 of the 323 possible cases (29.4%) have additional/specific legislation.
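As a quick consistency check of the reported figures, the numbers line up as follows (the per-state topic count is our inference from 323 possible cases across 19 responding states; the deliverable itself defines the actual topic list):

```python
responding_states = 19      # Member States that answered the survey
cases_with_extra_law = 95   # cases where additional/specific legislation exists
possible_cases = 323        # total (state, topic) combinations surveyed

# Inferred number of surveyed topics per responding state (assumption:
# every state answered every topic).
topics = possible_cases // responding_states

share = cases_with_extra_law / possible_cases

print(topics)          # -> 17
print(f"{share:.1%}")  # -> 29.4%
```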
The topics most often covered by legislation in addition to the GDPR are the use of biometry, video surveillance, the processing of genetic data, the use of biometric data for identification, and the processing of health data. At the other end of the spectrum are photography and data backups, each with further legislation in only one Member State. They are closely followed by additional legislation on anonymisation and extensions of the GDPR rules on processing data on sex life and sexual orientation, each present in only two countries.
Luxembourg and Malta are the only countries without any additional legislation on the topics covered in our survey; all other participating Member States have at least one topic with other/additional legislation to the GDPR. Other countries with little additional legislation on the surveyed topics include Czechia, Poland, and Greece.
Based on the feedback from the SAs, the most additional legislation relevant to the discussed topics is found in Finland, Spain, Hungary, Germany, and Latvia. The use of biometrics for the electronic acquisition of handwritten signatures is allowed in 10 of the 19 surveyed countries – a nearly even split. In contrast, only four Member States do not allow biometrics in a work environment. This could indicate that the Member States are interested in limiting the use of biometric data but do not wish to constrain businesses.
The results of the survey have also been integrated into a dynamic map, enabling the users to quickly navigate through the different topics of information and compare the EU Member States at a glance. The map has been published at https://cybersec4europe.eu/heterogeneity-of-data-protection-legislation-in-the-eu/.