A&L Goodbody is an Irish law firm providing expert legal advice across every aspect of business law.
The Advocate General of the Court of Justice of the EU (CJEU) has delivered an Opinion in the Planet49 case (Case C-673/17), finding that a pre-ticked checkbox giving consent for cookies does not constitute valid consent under the e-Privacy Directive 2002/58 read in conjunction with the Data Protection Directive 95/46 or the GDPR.
In order to participate in a lottery organised by Planet49, an internet user was presented with two checkboxes which had to be ticked or unticked before he could click the ‘participation button’. The first checkbox, which was not pre-ticked, required the user to accept being contacted by a range of firms with promotional offers. The second checkbox, which was pre-ticked, required the user to consent to cookies being installed on his computer.
The Advocate General gave the following Opinion:
There is no valid consent within the meaning of the e-Privacy Directive 2002/58, in conjunction with the Data Protection Directive 95/46, in a situation such as that in the present proceedings where the storage of information, or access to information already stored in the user’s terminal equipment, is permitted by way of a pre-ticked checkbox which the user must deselect to refuse his consent.
The same applies in regard to the interpretation of the e-Privacy Directive 2002/58 read in conjunction with the GDPR.
The consent requirements for cookies apply regardless of whether or not the information stored and accessed constitutes personal data.
Whilst the Advocate General’s Opinion is not legally binding, it will be of persuasive value to the CJEU in making its decision. The CJEU is expected to issue a final judgment in the coming months.
The Data Protection Commission (DPC) has published its Annual Report for 25 May-31 December 2018. As always, the Report reveals some interesting statistics and case studies. In the coming months, the DPC expects to conclude a number of statutory inquiries, which it launched in 2018, into multinational technology companies with EU headquarters situated in Ireland. The DPC anticipates that the conclusion of those inquiries will provide precedents for better implementation of the principles of the GDPR across key aspects of internet and ad tech services. This briefing note sets out some of the highlights of the Report.
There was a substantial rise in the number of complaints made to the DPC due to greater public awareness of data protection issues and rights (a 56% increase). While the majority of complaints continued to be resolved amicably, the DPC issued a number of formal decisions.
2,864 complaints received between May and December 2018
Largest category of complaints concerned data access requests (977 out of the 2,864 complaints)
4,113 complaints received in the 2018 calendar year (compared with 2,642 in 2017)
136 cross-border processing complaints received through the new one-stop-shop mechanism
32 electronic marketing complaints investigated under the e-Privacy Regulations 2011
18 formal decisions by the DPC (13 upheld the complaint, and 5 rejected the complaint)
Case Studies 1-3 and 4-7, respectively, concern complaints received, and amicable resolutions sought, by the DPC. The Report notes that in many of the complaints the DPC handles, data subjects hold the mistaken belief that, because they have not consented to the processing of their personal data, that processing is unlawful. However, there are a number of legal bases other than consent that can justify processing. The DPC has warned that it will “rigorously interrogate” whether the circumstances of the processing justify reliance on the legitimate interests legal basis.
Case studies 3 and 7 discuss one of the most common data breaches, namely unauthorised disclosure as a result of sending emails to the wrong address. Case Study 3 concerned the disclosure by an airline of a web-chat transcript by email to the wrong customer, as a result of using an auto-fill function in software. The DPC warned that such functions should be used with caution, and safeguards deployed, such as on-screen prompts to double-check recipient details.
Case Study 7 concerned the erroneous disclosure by a car dealership of the complainants’ personal data to the wrong email address. The DPC highlighted that, in such instances, it is not enough to acknowledge to the DPC and/or the data subject that a data breach has occurred. It is also incumbent on the data controller to take all reasonable steps to remedy the breach, including recalling the email (if possible), asking the unintended recipient to confirm they have deleted the email, and putting in place measures to prevent a recurrence.
Data Breach Notifications
Most organisations engage with the DPC and accept its guidance around mitigating losses for affected individuals, communicating any high risks to them and learning lessons from the breach to avoid a repeat. The Report notes that where a breach has been notified to a data subject by the data controller, but not to the DPC, the DPC’s Breach Complaints Unit will ensure the breach is retrospectively reported to the Breach Notifications Unit, accompanied by a clarification from the data controller/processor as to why the DPC was not notified in the first instance.
3,542 valid data security breaches notified to the DPC
145 invalid breach notifications (which did not meet the definition of a ‘personal data breach’ under Article 4(12) of the GDPR)
4,740 valid data security breaches were notified in the 2018 calendar year (compared with 2,795 in 2017)
Largest category of breaches concerned “unauthorised disclosure” of personal data (3,134 breaches)
38 cross-border processing personal data breach notifications were handled by the DPC, involving 11 organisations
Case Studies 9-13 discuss sample data breaches notified to the DPC, including failure to adhere to the data protection policies in place; an unencrypted USB device lost in the post; website phishing; loss of paper files in transit, and a SIM card swap attack.
A number of statutory inquiries have been launched by the DPC under section 110 of the Data Protection Act 2018 (the 2018 Act), which are expected to reach the decision and adjudication stage in 2019. The DPC has not yet commenced any statutory inquiry under section 137, Part 5 of the 2018 Act, which provides additional investigatory powers, including the power of an authorised officer conducting an investigation to hold an oral hearing.
The Report provides a useful flowchart showing the phases of a statutory inquiry, where the DPC is acting as lead supervisory authority in relation to a cross-border processing issue and a complaint has been lodged with the DPC directly, or the DPC has commenced an inquiry of its own volition. However, the sequencing may be subject to change following completion of the first wave of statutory inquiries, and the crystallisation of the inquiry process at national and EU level in those cases.
31 own-volition inquiries opened by the DPC’s Special Investigations Unit (SIU) into surveillance of citizens by the state sector for law-enforcement purposes (concerning surveillance by CCTV, body-worn cameras, automatic number-plate recognition (ANPR) enabled systems, drones and other technologies);
15 statutory inquiries by the DPC under section 110 of the 2018 Act as lead supervisory authority, into GDPR compliance by multinational technology companies, including Facebook and its affiliates (10), Apple (2), Twitter (2), LinkedIn (1);
23 formal requests by the DPC to technology companies seeking information on GDPR compliance.
New Technology Leadership Unit
The DPC has established an advanced technology evaluation and assessment unit, the ‘Technology Leadership Unit’ (TLU). The TLU’s objective is ‘to maximise the effectiveness of the DPC’s supervision and enforcement teams in assessing risks related to the dynamics of complex systems and technology’. The TLU has enabled the DPC to provide enhanced technology-focused internal guidance on ePrivacy, internet protocols and data portability, ad tech and accountability. It is also planning to provide external guidance and training in areas such as Artificial Intelligence and machine learning, ad tech, device ID settings and cybersecurity.
The DPC has received several submissions from privacy advocates concerning the conduct of technology companies in the advertising sector, particularly in relation to behavioural advertising. Issues of concern highlighted to the DPC include: the use of special categories of personal data for profiling purposes; how location data is being used by advertisers; the processing of personal data for advertising purposes without a lawful basis; and individuals not being aware who has access to their personal data.
The DPC received 16 requests (formal and voluntary) for mutual assistance from other EU data protection authorities in relation to the technology sector.
The mutual assistance requests concerned topics such as transparency of processing agreements; privacy notices; the interaction of the GDPR and the ePrivacy Directive, and digital advertising.
Case Study 14 demonstrates use of the DPC’s enforcement powers in its investigation of LinkedIn’s “mentions in the news” feature. LinkedIn was forced by the DPC to suspend the service for European users as a result of complaints that the feature was wrongly associating LinkedIn members with media reports about people who happened to have the same name. The DPC stated that this gave rise to concerns around the lawfulness, fairness and accuracy of the personal data processed.
New DPC Consultation Teams
In 2018, the DPC continued its proactive consultation work. The DPC has set up three new dedicated consultation teams, each headed by an Assistant Commissioner, covering: (i) the public sector and law enforcement; (ii) the health and voluntary sector, and (iii) the private and financial sector.
The Consultation Unit has encouraged the development of Data Protection Officer (DPO) networks whereby groups of DPOs in a related area collaborate to share knowledge and experience. The Unit is open to attending regular roundtable forums with DPO networks to advise on sector-specific data protection issues, and ensure best practices become commonplace.
The DPC has also opened a public consultation on the processing of children’s personal data and the rights of children as data subjects under the GDPR, with a closing date of 5 April 2019. Following the consultation, the DPC will work with industry, government and voluntary-sector stakeholders to encourage the drawing up of Codes of Conduct to promote best practices by organisations that process the personal data of children, in accordance with the DPC’s obligation under section 32 of the Data Protection Act 2018.
The DPC concluded prosecutions against five entities in respect of 30 offences under the e-Privacy Regulations 2011. Case Studies 15-19 discuss prosecutions taken by the DPC for direct marketing offences. These prosecutions were generally taken as a result of the companies failing to heed earlier warnings by the DPC about their direct marketing practices. In the majority of cases, the court ordered, in lieu of a conviction and fine, the company to make a charitable donation and pay the DPC’s prosecution costs.
Litigation in which the DPC was involved
In Nowak v The Data Protection Commissioner [2018] IEHC 443 (12 July 2018), the High Court ruled that where a data subject explicitly limits their data access request, it is legitimate for the organisation to provide only the personal data specified, rather than all the personal data held. In addition, the Court held that, in requesting a copy of specific personal data, it is reasonable for the controller to assume that the data subject is not seeking descriptions of the personal data processed (as provided for in section 4 of the Data Protection Acts 1988 and 2003) (see our previous blog for other Irish DPC litigation from 2018).
Last year, the Court of Justice of the European Union (CJEU) delivered a number of important decisions on the concept of controllership, including in the Facebook Fan Pages case (Case C-210/16), and in the Jehovah’s Witnesses case (Case C-25/17). Those decisions emphasise that the concept of a ‘data controller’ should be interpreted broadly. However, that does not mean that every data controller has equal responsibility, or has to have access to the relevant personal data, to be a ‘data controller’. The CJEU found that joint controllers might be involved at different stages of the processing to different degrees, so that the level of responsibility (and liability) of each controller must be assessed by reference to all the circumstances of the case.
In another case, Ministerio Fiscal (Case C-207/16), the CJEU was asked to interpret its earlier decision in the Tele2 case. In that case, the CJEU held that only the objective of fighting ‘serious’ crime is capable of justifying public authorities’ access to personal data retained by service providers. It stated that that interpretation was based on the principle of proportionality, namely that serious interference can be justified only by the objective of fighting serious crime. By contrast, the CJEU found in Ministerio Fiscal that where the interference is not serious, access may be justified for the purpose of preventing, investigating, detecting and prosecuting criminal offences generally, so long as such access does not constitute a serious infringement of privacy.
What’s ahead in 2019?
Much new salient case law is expected from the CJEU in 2019. In particular, the Irish High Court’s reference on the validity of standard contractual clauses (SCCs) for transferring personal data out of the EEA is expected to be heard and decided by the CJEU this year. The Advocate General’s Opinion and the CJEU’s ruling in the Planet49 case are also eagerly awaited to provide guidance on cookie-based transparency and consent.
In late 2018, the DPC commenced a project to develop a new five-year DPC regulatory strategy, allowing stakeholders input into how the DPC deploys its resources. This will include extensive external consultation during 2019. The strategy will set out the DPC’s regulatory priorities and give insight to organisations and individuals on how the DPC intends to regulate.
Other activities the DPC plans to continue in the coming year include:
To monitor new developments in the fintech industry in the use of blockchain, security and big-data processing. The DPC is currently assessing the impact of the second Payment Services Directive (PSD2) on the banking sector and on the applications that allow third parties to access account information and deliver payment services, with the consent of the customer, via his or her bank account.
To engage with the private and financial sector in relation to transparency standards to ensure customer notices and privacy policies comply with the GDPR; to better understand the application of emerging technologies to data-processing operations, and in regard to legislative proposals to implement national banking and insurance fraud data bases.
To examine the ad tech sector. The DPC has said that the conclusion of some of its statutory inquiries in 2019 should “contribute to answering some of the questions relating to this complex area”.
On 4 March 2019, Minister Richard Bruton TD announced that he will introduce an Online Safety Act to regulate harmful content online and ensure children are safe online. The Act will also implement the revised Audiovisual Media Services (AVMS) Directive (which Member States are required to implement by 19 September 2020). The Minister stated that the era of self-regulation in regard to online safety is over. It is proposed that an Online Safety Commissioner would oversee the new system. The Department of Communications, Climate Action and Environment is seeking views on the proposed legislation, and has launched a six-week consultation period which is open until 15 April 2019.
The proposed Online Safety Act will provide for:
New online safety laws applicable to Irish residents and
Regulation of Video Sharing Platform Services (e.g. YouTube), On-Demand Audiovisual Services (e.g. RTÉ Player, Netflix) and Traditional TV
The Minister acknowledges that putting in place a new national regulatory structure for dealing with the removal of harmful online content is a complex task for a number of reasons, including the existing legislative and regulatory measures in place in relation to specific content (for example in the area of Data Protection, Terrorist Content, Child Sexual Abuse Material or other criminal content); the role of An Garda Síochána in relation to illegal content such as child sexual abuse material and terrorist related content, and the eCommerce Directive which provides that online services are not liable for hosting illegal content of which they are not aware.
New Online Safety Laws applicable to Irish residents
The Minister intends to give the Online Safety Commissioner the powers necessary to ensure that the content which is most harmful can be removed quickly from online platforms. The views of stakeholders are sought in regard to the content that should be considered “harmful content” and which online platforms should be in scope. The Minister asks whether the following examples of “harmful content” are sufficient and appropriate:
Serious cyber-bullying of a child;
Material promoting self-harm or suicide, and
Material designed to encourage prolonged nutritional deprivation.
Online platforms are already required to remove content which is a criminal offence under Irish and EU law to disseminate when they are notified of it, such as material containing incitement to violence or hatred, content containing public provocation to commit a terrorist offence, offences concerning child sexual abuse material or concerning racism and xenophobia.
Powers of Online Safety Commissioner
The Government proposes assigning the Online Safety Commissioner with the following powers:
Certify that the approach of services to operating an Online Safety Code is “fit for purpose”.
Require regular reports from services (e.g. on content moderation, review, and adjudication of appeals).
Assess whether the measures which a service has in place are sufficient in practice (e.g. by conducting audits).
Issue interim and final notices to services in relation to compliance failures, and seek Court injunctions to enforce such notices.
Impose administrative fines in relation to compliance failures.
Publish the fact that a service has failed to comply or cooperate with the regulator.
Require content takedown within a set timeframe, where a user is dissatisfied with the response they have received from a service provider.
Seek to have criminal proceedings brought against the service provider.
Regulation of Video Sharing Platform Services (VSPS)
The revised AVMS Directive requires significant changes to the way Ireland regulates audiovisual content, both online and offline. A VSPS will be regulated in the country where it is established, without the need for further regulation in other EU Member States in which it offers its services. The revised Directive does not prescribe the content which is not permissible; rather, it establishes principles (protection of minors from potentially harmful content; protection against incitement to hatred or violence; and criminal content) and requires a national regulator to be appointed to ensure that services have appropriate measures in place to meet those principles (such as parental controls and age verification). It also requires a VSPS to have a complaints mechanism in place through which a user can make a complaint regarding content hosted on the service. The Minister proposes that the rules applicable to VSPS will also apply to platforms in respect of other user-generated content for Irish residents (e.g. photos, comments, or other material which is not audiovisual in nature). The consultation seeks views on what type of regulatory relationship should exist between a VSPS established in Ireland and the regulator; how the Irish regulator should determine whether the methods put in place by a VSPS are sufficient to meet the relevant principles; and on what basis the regulator should monitor and review the measures which VSPS have in place.
Regulation of On-Demand Audiovisual Services and Traditional TV
The revised AVMS Directive requires a number of changes in regard to regulation of on-demand audiovisual services and traditional TV, including closely aligning the rules and requirements for traditional TV and on-demand audiovisual media services. The consultation seeks views on what type of regulatory relationship should exist between an on-demand audiovisual media service established in Ireland and the relevant Irish regulator, and whether the same content rules should apply to both traditional TV and on-demand services.
The Minister seeks views on the most appropriate regulatory structure. At present the Broadcasting Authority of Ireland (BAI) is the National Regulatory Authority under the AVMS Directive for traditional TV, including the publicly funded broadcasters (RTÉ and TG4), and commercial television broadcasters such as Virgin Media Television. The BAI also regulates the traditional radio broadcasters who are established in Ireland.
The Minister proposes two ways in which an Online Safety Commissioner may be established:
Restructuring and reforming the BAI as a Media Commission, along the lines of the multi-Commissioner Competition and Consumer Protection Commission. The Online Safety Commissioner could operate within that structure.
Two regulatory bodies, one of which would involve restructuring the BAI and assigning it responsibility for content which is subject to editorial control (traditional television and radio broadcasting and on-demand services). The second, an online safety regulator, would be a new body responsible for online content that is not subject to editorial controls (such as social media and video sharing platforms).
Once the public consultation has concluded, the Minister will review the submissions and finalise a proposal. The Minister will then bring draft heads of bill to government for approval and publication.
The Data Protection Commission (DPC) has published the results of the annual Global Privacy Sweep for 2018, which examined how well organisations are implementing the concept of accountability. The Global Privacy Enforcement Network members made contact with 356 organisations in 18 countries during the Sweep. It found that while there were examples of good practice reported, a number of organisations had no processes in place to deal with complaints and queries raised by data subjects, and were not equipped to handle data security incidents appropriately.
In Ireland, 30 randomly-selected organisations across a range of sectors (including pharmaceutical, multinational, Government / Local Government, transport, charity, education and finance) were contacted. The organisations were asked to complete a table of questions relating to ‘Privacy Accountability’. The DPC reported the following trends in Ireland:
86% of organisations have a contact for their DPO listed on their website, and all have privacy policies which are easily accessible from the homepage.
The majority of organisations reported that they have policies and procedures in place to respond to requests and complaints from individuals.
75% of organisations reported that they have adequate data breach policies in place.
All organisations reported that they provide some form of data protection training for staff. However, only 38% of those organisations provided evidence of training programmes for all staff, including new entrants and refresher training.
In most cases, organisations reported that they undertake some data protection monitoring / self-assessment (e.g. internal audits), but not to a sufficiently high level.
One third of organisations failed to provide evidence of documented processes to assess risks associated with new products and technology (e.g. Data Protection Impact Assessments). However, many reported that they are in the process of documenting appropriate procedures.
30% of organisations failed to demonstrate that they had an adequate inventory of personal data while almost half failed to maintain a record of data flows.
The DPC is currently assessing what follow-up actions are necessary based on the responses.
By any measure, 2018 was a historic year for data protection law with the coming into effect of the GDPR on 25 May 2018. Ireland plays an important role in the regulation and enforcement of data protection law and decisions of the Irish courts have had a disproportionate impact on European data protection jurisprudence. With the introduction of the one-stop-shop mechanism under the GDPR it is to be expected that this trend will continue in the years ahead. This briefing note highlights the key data protection legislative developments and Irish court decisions over the past year.
The European Parliament, Council and Commission have reached a compromise on the text of the new Copyright Directive (previously discussed here). The proposed Directive targets digital use of press publications by information society service providers, such as news aggregators and media monitoring services. As discussed below, the two most controversial provisions are Articles 11 and 13, known respectively as the “link tax” and “upload filtering” provisions. The Commission has issued a press release, but not an official copy, of the compromise text.
Article 11: Link Tax – What was agreed?
Article 11 is a new press publisher’s right. It requires service providers to pay a copyright royalty to press publishers, including journalists, for digital use of their news articles, for a period of two years after publication of the press publication. Despite being known as the “link tax” provision, Article 11 provides that hyperlinking, or re-using individual words or very short extracts of press publications (i.e. snippets of articles), will be excluded from the new right granted to press publishers. This means that service providers should be able to continue to use such parts of a press publication without requiring authorisation. However, it remains to be seen how the courts will construe “very short extracts”. The Commission has merely stated that “when assessing what very short extracts are, the impact on the effectiveness of the new right will be taken into account”. It is likely that service providers will face some uncertainty in their compliance efforts until further guidance is provided by the courts and/or the Commission. In the meantime, in order to mitigate the risk of copyright infringement, service providers may face the choice of paying press publishers for a licence to ensure any snippets are authorised, or of ceasing to use snippets, which would inevitably have a detrimental impact on internet users, who would see links to news stories without any headlines or summaries. The new press publishers’ right will not apply to individual users’ private or non-commercial use of press publications, with the result that internet users can continue to share content on social media and link to websites and newspapers.
Article 13: Upload Filtering – What was agreed?
Article 13 requires a service provider that stores large amounts of copyright protected works uploaded by users, which it promotes for profit-making purposes, to reach licensing agreements with rightholders for use of their works. A service provider will not be able to benefit from the ‘hosting’ liability exemption, in Article 14 of the e-Commerce Directive 2000/31/EC, where it hosts copyright-infringing material on its platform.
If authorisation from rightholders is not obtained, a service provider will be liable for any infringing content on its platform, unless it can demonstrate it has taken the following steps:
made “best efforts” to obtain authorisation from rightholders;
made, “in accordance with high industry standards of professional diligence”, best efforts to ensure the unavailability of copyright protected works and other subject matter which have been identified to them by rightholders, and
acted expeditiously, upon notification by the rightholder, to remove unauthorised content, and made best efforts to prevent their future uploads.
It remains to be seen how the courts will interpret “best efforts”, and the extent to which service providers will have to police their sites. Whilst the compromise text provides that the application of Article 13 shall not lead to any general monitoring obligation, it is clear that if a court finds a service provider’s efforts to prevent copyright infringement are not strong enough, that service provider will be directly liable for infringements, as if it had committed them itself. It therefore seems likely that service providers will have no choice but to implement some form of content-filtering technology, which will be a costly exercise. The compromise text indicates that the Commission shall, in consultation with service providers, rightholders and other relevant stakeholders, issue guidance on the application of Article 13, which will hopefully provide further clarity to service providers on the precise scope of their new obligations.
New, smaller service providers will be subject to a lighter regime in cases where no authorisation has been granted by rightholders. This concerns service providers that have been active for less than three years, have an annual turnover below €10 million, and have a website with fewer than 5 million monthly visitors. In order to avoid liability for unauthorised works, these new small service providers will only have to prove that they have made their best efforts to obtain an authorisation, and that they have acted expeditiously to remove the unauthorised works notified by rightholders from their platform. However, where average monthly visitors exceed 5 million, these new service providers will also have to demonstrate that they have made their best efforts to prevent further uploads of copyright-protected works notified by rightholders.
In addition, Article 13 will not apply to providers of services such as not-for-profit online encyclopedias, not-for-profit educational and scientific repositories; open source software development platforms; electronic communication service providers as defined in Directive 2018/1972; online marketplaces, and cloud storage services.
The compromise text must next receive final approval by the European Council and Parliament, which is expected by March or April 2019. In the meantime, lobbying is set to continue. Once the Directive is approved, it will be published in the Official Journal, and Member States will then have 24 months to transpose the new rules into their national legislation.
The Information Note on Data Transfers warns, once again, that the UK will be a ‘third country’ from 30 March 2019. As a result, personal data cannot be transferred from the EEA to the UK unless organisations implement a data transfer mechanism under the GDPR, such as standard contractual clauses; ad hoc contractual clauses (authorised by the competent supervisory authority, following an opinion by the EDPB); binding corporate rules (BCRs); codes of conduct and certification mechanisms, or a derogation. In regard to data transfers from the UK to the EEA, the UK Government has confirmed that the current practice, which permits personal data to flow freely from the UK to the EEA, will continue in the event of a no-deal Brexit.
The EDPB sets out 5 steps that organisations should take to prepare for a no-deal Brexit, including:
1. Identify what processing activities involve a personal data transfer from the EEA to the UK;
2. Determine the appropriate data transfer mechanism;
3. Implement the transfer mechanism by 30 March 2019;
4. Indicate in your internal documentation that transfers will be made to the UK; and
5. Update privacy notices to inform individuals that transfers will be made to the UK.
The Information Note on BCRs provides guidance for companies which have the UK ICO as their BCR lead supervisory authority. The EDPB recommends such companies take the following steps:
Groups headquartered in the UK wishing to apply for BCRs should identify the most appropriate BCR lead supervisory authority in an EU Member State;
Groups whose BCR applications are under review by the ICO should identify a new BCR lead supervisory authority. That new authority will take over the application and initiate a new approval procedure at the time of a no-deal Brexit;
If a draft ICO decision approving BCRs is pending before the EDPB at the time of a no-deal Brexit, the group should identify a new BCR lead supervisory authority. The new authority will take over and resubmit a draft decision for approval of the BCRs to the EDPB; and
Holders of ICO-authorised BCRs should identify their new BCR lead supervisory authority.
The EDPB highlights that the supervisory authority approached to act as the new lead authority will consider, in cooperation with the other concerned authorities, whether it is the appropriate BCR lead on a case-by-case basis.
The European Data Protection Board (EDPB) has published its work programme for the next two years. The programme lists the guidelines, consistency opinions, and other activities the EDPB intends to carry out. It is based on the needs the EDPB has identified as priorities for individuals and stakeholders, as well as on the EU legislator's planned activities. The Guidelines due to be published over the coming two years include:
Guidelines on reliance on Article 6(1)(b) in the context of online services (i.e. the contractual necessity legal basis)
Guidelines on concepts of controller and processor (Update of the WP29 Opinion)
Guidelines on the notion of legitimate interest of the data controller (Update of the WP29 Opinion)
Guidelines on the Territorial Scope of the GDPR (finalisation after the public consultation)
Guidelines on Codes of Conduct and Monitoring Bodies
Guidelines on delisting
Guidelines on PSD2 and GDPR
Guidelines on international transfers between public bodies for administrative cooperation purposes
Guidelines on Certification and Codes of Conduct as a tool for transfers
Guidelines on Connected vehicles
Guidelines on Certification (finalisation after the public consultation)
Guidelines on video surveillance
Guidelines on Data Protection by Design and by Default
Guidelines on Targeting of social media users
Guidelines on children’s data
Guidelines on the powers of DPAs in accordance with Art. 47 of the Law Enforcement Directive
Guidelines on data subjects rights
The EDPB has also indicated that it may publish guidelines on other topics, such as cross-border requests for e-evidence, and the interpretation of Article 48 GDPR (concerning requests from a court, tribunal or administrative authority in a third country to an EU controller or processor to transfer personal data). In addition, the EDPB intends to publish an Opinion on the interplay between the GDPR and the e-Privacy rules, an issue which has been causing companies some confusion.
The European Data Protection Board (EDPB) has adopted an Opinion (3/2019) on the interplay between the EU Clinical Trials Regulation (536/2014) (CTR) and the GDPR, following a request from the European Commission to review its Q&A on the topic. The CTR, which is expected to enter into force in 2020, aims to harmonise the rules for conducting clinical trials throughout the EU. It does not contain any derogations from the GDPR and will therefore apply simultaneously with the GDPR.
The EDPB’s Opinion focuses on: (1) the legal basis under the GDPR for processing personal data in the course of a clinical trial protocol (primary use), and (2) further use of clinical trial data for other scientific purposes (secondary use). Some highlights of the EDPB’s Opinion are set out below.
(1) Primary Use
Primary use of personal data includes all processing operations during the lifecycle of a clinical trial protocol, from the start of the trial to deletion of data at the end of the archiving period.
Not all processing operations relating to the primary use of clinical trial data pursue the same purposes or rely on the same legal basis.
There are two main categories of processing activities during the lifecycle of a clinical trial: (i) processing operations relating to the protection of health and setting standards of quality and safety for medicinal products by generating reliable and robust data (reliability and safety purposes) and (ii) processing operations related to research activities.
The appropriate legal basis for processing operations relating to reliability and safety purposes is Article 6(1)(c) GDPR (processing necessary for compliance with a legal obligation to which the controller is subject) and, in regard to special categories of data, Article 9(2)(i) (processing necessary for reasons of public interest in the area of public health). This is because the CTR imposes specific legal obligations related to safety reporting, archiving and disclosure.
The EDPB identifies three alternative legal bases for processing operations related to research activities (depending on the circumstances of the specific clinical trial):
Article 6(1)(a) in conjunction with Article 9(2)(a) (explicit consent);
Article 6(1)(e) (a task carried out in the public interest) in conjunction with Article 9(2)(i) or (j) (processing necessary for reasons of public interest in the area of public health, or necessary for archiving purposes in the public interest, scientific or historical research or statistical purposes) or
Article 6(1)(f) (legitimate interests of the controller) in conjunction with Article 9(2)(j) (necessary for archiving purposes in the public interest, scientific or historical research or statistical purposes).
For the processing of special categories of data (e.g. health data), a legal basis identified under Article 6 may be relied on only if Article 9 GDPR provides a specific derogation from the general prohibition on processing special categories of data.
The EDPB considers that, in most cases, consent will not be an appropriate legal basis for processing clinical trial data for research purposes, due to an imbalance of power between the sponsor/investigator and the trial participants (for example, where the participant is not in good health or is in a situation of institutional dependency), and that alternative lawful grounds for processing should be considered.
The EDPB notes that informed consent under the CTR must not be confused with the notion of consent as a legal basis for the processing of personal data under the GDPR. The provisions of Chapter V of the CTR (primarily Article 28) that relate to informed consent primarily seek to protect the rights to human dignity and integrity of individuals, and were not conceived as an instrument for data protection compliance.
(2) Secondary Use
Article 28(2) of the CTR specifically addresses the issue of secondary use. It provides that, at the time a clinical trial subject gives informed consent to participate in a clinical trial, the sponsor may ask the participant to consent to the use of his or her data outside the protocol of the clinical trial, exclusively for scientific purposes. Such consent is not the same as consent to the processing of personal data under the GDPR.
If a sponsor or investigator would like to make further use of the personal data gathered for any scientific purpose other than those defined in the clinical trial protocol, the sponsor or investigator should have a specific legal basis under the GDPR for doing so. The chosen legal basis may be the same or different from the legal basis of the primary use.
However, pursuant to Article 5(1)(b) GDPR, further processing of clinical trial data for archiving purposes in the public interest, or for scientific or historical research or statistical purposes, shall not be considered incompatible with the initial purpose, provided appropriate safeguards are in place. In such a case, the controller may further process the data without the need for a new legal basis.
The EDPB recommends that the European Commission modify its Q&A on the lawful basis for processing clinical trial data under the GDPR, to distinguish between processing activities related to reliability and safety purposes and those related to research activities. The EDPB’s Opinion will now be transmitted to the European Commission for further consideration.
In addition to having an appropriate legal basis to process clinical trial data under the GDPR, organisations carrying out health research activities in clinical trials, and operating in Ireland, should ensure they have put in place appropriate data protection safeguards as required by the Data Protection Act 2018 (Section 36(2)) (Health Research) Regulations 2018 (SI 314/2018) (discussed here).
The European Commission has published an infographic on compliance with and enforcement of the GDPR from May 2018 to January 2019. The infographic reveals some interesting statistics, including:
95,180 complaints have been made to EU national data protection authorities (DPAs) by individuals who believe their rights under the GDPR have been violated. The majority of these complaints concerned telemarketing, promotional emails, and video surveillance/CCTV.
An increase in breach notifications as organisations comply with the 72-hour GDPR breach notification deadline: 41,502 data breaches have been notified to EU DPAs.
The EU DPAs have initiated 255 cross-border investigations, following individuals’ complaints and on their own initiative.
Three fines have been issued by EU DPAs under the GDPR – Germany imposed a €20,000 fine on a social network operator for failing to protect users’ personal data. Austria imposed a €5,280 fine on a sport betting café for unlawful video surveillance. France imposed a €50 million fine on Google for alleged lack of consent to personalised ads.
23 EU Member States have adapted their national legislation to ensure compliance with the GDPR. Five Member States are still in the process of doing so, namely Bulgaria, Greece, Slovenia, Portugal and the Czech Republic.
At a conference in Brussels last week, the Irish Data Protection Commissioner confirmed that the DPC had initiated a number of statutory investigations into tech companies with their headquarters in Ireland. She indicated that the investigations were mostly at an advanced stage, but that it would likely be June or July before she received an investigation report from her authorised officers and would be in a position to make a final decision on sanctions in the case of an infringement.