The European Commission has published its roadmap for the preparation of the statutory report on the application of the GDPR, which is due in 2020. The roadmap is open for public consultation by stakeholders and interested third parties until 29 April 2020.
The Commission has announced that its report will build on the two communications published by the European Commission to date, “Data protection rules as a trust-enabler in the EU and beyond – taking stock” (24 July 2019) and “Exchanging and Protecting Personal Data in a Globalised World” (10 January 2017). The report will also take into account the contributions received from the Council, the European Parliament, the European Data Protection Board and the GDPR Multi-Stakeholder Group.
The Commission has indicated that the report will cover in particular the following two topics: (i) international transfers of personal data to third countries (Chapter V of the GDPR), with a special focus on existing adequacy decisions, and (ii) the cooperation and consistency mechanism between national data protection authorities (Chapter VII of the GDPR).
Although it may be premature to draw conclusions from an unwritten report, it seems regrettable that the Commission’s roadmap does not place emphasis on other elements of the GDPR that have proven critical in its application. Besides the non-negligible issues surrounding international transfers and cooperation and consistency, companies in Europe have other equally pressing questions on the application of the GDPR. For example, the Commission could also address and aim to resolve issues surrounding the selection of the correct legal basis and exception for cross-border data processing, or the conduct of DPIAs and the implementation of security measures that help companies avoid future liabilities.
Even the recent COVID-19 outbreak in Europe, and the different (and, sometimes, contradictory) positions that supervisory authorities are adopting across the EU, could call for a specific chapter in the report. For example, the Commission could consider reducing the strictures on the processing of health-related personal data and revamping cross-border coordination. The effectiveness of measures of public or private initiative, such as social-distancing orders, COVID-19 tracking apps, or telecommunications apps, may rely on this. A single pan-European approach, rather than several divergent national perspectives on the same issue, would be preferable to address these questions.
Further to the adoption of the GDPR, and as businesses roll out new tools and mechanisms to carry out operations, the question often arises whether these operations are GDPR compliant. Does access to systems that contain personal data have to be protected by two-factor authentication? Do customer databases always have to be pseudonymised? Do drives have to be permanently encrypted?
The obligations contained in the GDPR seemingly set a high bar for businesses to put in place compliant security measures. However, the guidance papers issued by ENISA (“Handbook on Security of Personal Data Processing”, December 2017), the French data protection authority (CNIL) (“Security of Personal Data”, 2018) and the Article 29 Working Party (“Guidelines on DPIA”, October 2017) shed some light on the minimum security measures expected from businesses when they handle personal data in general, sensitive personal data or data of a highly personal nature.
Common errors committed when assessing and remedying security risks
When evaluating risks to personal data or systems, companies tend to assume that the absence of a security measure is in itself a valid parameter for determining risk, regardless of whether the measure in question is actually necessary to address a risk. The resulting risk levels may accordingly be flawed or skewed.
Furthermore, because companies often treat the absence of a security measure as a “risk”, it will always have to be “fixed” by implementing that measure, regardless of the actual risk level. For example, the absence of pseudonymisation or encryption will lead companies to think that they need to implement both pseudonymisation and encryption measures, even if the risk scores and the amount of personal data at risk are low.
The challenge of evaluating risks and implementing appropriate security measures
Risks should be identified and measured as the likelihood of a threat materialising against the impact of that threat on privacy. ENISA, the CNIL and the EU data protection authorities refer to the following sequential steps for carrying out an assessment of, and recommendations on, the security of personal data:
Step 1: Defining the processing operation (e.g., understanding the data processing operation, the kinds of personal data processed).
Step 2: Understanding and evaluating the impact (e.g., minor inconveniences in case of disclosure, or major, significant and irreversible consequences).
Step 3: Defining the possible threats (e.g., data loss, leaks, broad accessibility, etc.).
Step 4: Evaluating the risk level (e.g., combining the likelihood of each threat materialising with its impact).
Step 5: Adopting the appropriate security measures to address the risks.
As can be seen, particularly in Steps 4 and 5, every security measure should address a risk, but the absence of a security measure does not in itself represent an issue, unless a risk is exposed and not addressed.
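The evaluation described in Steps 2 to 4 can be sketched as a simple scoring function. The numeric scales and thresholds below are illustrative assumptions for the sake of the example, not values prescribed by ENISA or the CNIL:

```python
# Illustrative sketch of Steps 2-4: risk is the likelihood of a
# threat materialising combined with its impact on privacy.
# The 1-4 scales and the thresholds are assumptions, not official values.

def risk_level(likelihood: int, impact: int) -> str:
    """Combine a threat's likelihood and impact (each scored 1-4)
    into a low/medium/high risk level."""
    if not (1 <= likelihood <= 4 and 1 <= impact <= 4):
        raise ValueError("scores must be between 1 and 4")
    score = likelihood * impact
    if score <= 4:
        return "low"
    if score <= 8:
        return "medium"
    return "high"

# A likely threat with minor impact still yields a low risk,
# so no additional measure is required merely because one is absent:
print(risk_level(3, 1))   # low
print(risk_level(2, 3))   # medium
print(risk_level(4, 4))   # high
```

The point of the sketch is that the output depends only on likelihood and impact; the presence or absence of any given security measure does not enter the calculation.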
Finding the appropriate security measure to address risks
The question then becomes: what security measures should entities adopt in order to address risks?
If risk levels are based on (a) a threat score and (b) a score for the amount of personal data at risk of infringement, companies can rely on these scores to filter and focus on the higher-risk cases (i.e., larger amounts of personal data, potentially sensitive, being processed and subject to high levels of threat), and apply the general rules indicated by ENISA or the CNIL.
For example, if the personal data score (b) assigned to a certain system is low, then the absence of certain security measures should not prevent the risk from being addressed through other, less onerous measures.
By contrast, there might be a need to implement a security measure if it is necessary to address a (high) risk. For example, if the level of risk is high, one might need to implement appropriate (and, to a certain extent, even overlapping) security measures to ensure that the high level of risk is addressed.
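Following that logic, the measures applied can be scaled to the assessed risk level rather than deployed wholesale. A minimal sketch, in which the mapping of measures to levels is a hypothetical example and not an official ENISA or CNIL catalogue:

```python
# Hypothetical mapping of risk levels to baseline security measures.
# The measure lists are illustrative only; actual selections should
# follow the ENISA and CNIL guidance cited above.

BASELINE_MEASURES = {
    "low": ["access control policy", "regular backups"],
    "medium": ["access control policy", "regular backups",
               "pseudonymisation of identifiers"],
    "high": ["access control policy", "regular backups",
             "pseudonymisation of identifiers",
             "encryption at rest", "two-factor authentication"],
}

def required_measures(level: str) -> list[str]:
    """Return the baseline measures for an assessed risk level."""
    return BASELINE_MEASURES[level]

# A low-risk system is not forced into encryption plus pseudonymisation;
# overlapping measures appear only at the high level:
print("encryption at rest" in required_measures("low"))   # False
print("encryption at rest" in required_measures("high"))  # True
```

The design point is that the measure set grows with the risk level, so the absence of a given measure on a low-risk system is not, in itself, a compliance gap.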
ENISA and the CNIL have provided the following categorisation of risk levels for the following types of personal data processing:
Low risk (low threat and low impact):
Marketing and advertising information (e.g., contact information such as name, postal address, telephone number, email).
Contact details of B2B suppliers of services and goods (e.g., first and last name, contact information, tax and banking information (for suppliers)).
Medium risk (low/medium threat and low/medium impact):
Payroll processing information (e.g., social security number, taxation identifiers, date of employment, salary information).
Recruitment data (e.g., academic education and qualifications, work experience, further professional or academic training, family status, first and last name, address, telephone numbers, date of birth, interview notes/report).
Employee evaluation information (e.g., position within the SME, date of employment, employment history, technical skills, knowledge and behaviour).
E-learning platform information (e.g., date of birth, date of admission, selected courses, evaluation results, grades).
High risk (medium/high threat and medium/high impact):
Health services data (e.g., social insurance number, medical examination results, pathologies, allergies, diagnosis and cure schemas, related administrative and financial information).
As can be seen, only very specific data processes might expose systems to high risks. In other cases, which might concern personal data such as salary information or education, the level of risk is medium or even low. This will have a bearing on the security measures that are necessary in each case.