A&L Goodbody | Ireland IP & Technology Law Blog | Cyber Risk & Data Privacy
The Ireland IP & Technology Law Blog provides all the information you need to know about intellectual property and technology law in Ireland. A&L Goodbody is an Irish law firm providing expert legal advice across every aspect of business law. The firm advises a broad domestic and international client base in both the private and public sectors, across the island of Ireland.
A recent survey of regional data protection authorities in Germany has revealed 75 cases of reported personal data breaches since the GDPR came into effect on 25 May 2018. As a result, German authorities have imposed punitive fines totalling €449,000.
Germany differs from Ireland in that responsibility for monitoring and ensuring compliance with the GDPR and national data protection laws is delegated to each of the 16 German states, each of which has its own supervisory authority. A committee of representatives from each regional authority (the 'Data Protection Conference') has also been appointed to ensure that a consistent approach is taken throughout the states.
So far, fines have been imposed in six of the sixteen federal states. The highest fines have been reported in the Baden-Wurttemberg region (€203,000 across seven cases), Rhineland-Palatinate region (€124,000 across nine cases) and Berlin (€105,600 across eighteen cases). Examples of commonly reported GDPR violations include inadequate technical or organisational security measures (e.g. storing user passwords in unencrypted form), non-compliance with information duties (e.g. lack of transparency around processing activities) and unauthorised marketing emails.
Recent data breach investigations
We have set out below a summary of two recent cases investigated by the German authorities.
Storage of unencrypted passwords
The German chat and dating service "Knuddels" was fined €20,000 in November 2018, following a data breach in which hackers were able to steal the personal data of approximately 300,000 users. After being alerted by its own users, the company promptly reported the breach to the relevant state authority in line with its obligations under Article 33 of the GDPR.
As a result of the subsequent investigation, it was found that the company had stored passwords of its users in an unencrypted plain text format. This amounted to a significant breach of its obligation to implement appropriate technical safeguards for the protection of its users' data in line with Article 32 of the GDPR.
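By way of illustration, storing only a salted, slow hash of each password (rather than the password itself) is a baseline technical measure of the kind Article 32 contemplates. The sketch below uses Python's standard library; the function names and iteration count are illustrative and are not drawn from the case.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; tune to current security guidance

def hash_password(password: str):
    """Store a salted PBKDF2 digest instead of the plain-text password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

With this approach, even if the credential database is stolen, an attacker obtains only salts and digests rather than usable passwords.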
The German authority noted the company's willingness to engage with the investigation, and to undertake significant improvements to its IT security architecture. In a statement released by the State Commissioner for Data Protection and Freedom of Information, Dr Stefan Brink, it was noted that companies which are willing to learn from such incidents and to act transparently in rectifying data protection shortcomings can emerge from a hacker attack stronger as a company. Dr Brink concluded that national authorities should avoid "a competition for the highest possible fines", and instead focus on "improving privacy and data security for the users".
Inadequate data processing contract
A €5,000 penalty was imposed on a small shipping company for failing to have an adequate contract in place governing the data processing activities carried out on the company’s behalf by third party contractors. Article 28(3) of the GDPR requires such contracts to impose certain mandatory obligations on processors.
Following investigation, the German authority concluded that sensitive personal data had been transmitted to a third party unlawfully due to the absence of a contract governing the controller / processor relationship. It was also noted that both the company and the contractor had failed to take their obligations seriously, and had instead attempted to evade responsibility rather than cooperate with the authority to rectify their shortcomings.
This case demonstrates that the GDPR is not only of concern for multinational corporations with large-scale data processing activities. Small to medium sized enterprises must also be fully aware of their obligations to comply with data protection legislation in order to avoid investigatory action by data protection authorities and potentially punitive fines.
These cases serve as a reminder of the importance of proactive cooperation with regulators at all levels of the organisation in order to mitigate the adverse effects of a personal data breach, and of ensuring that policies, systems and the company culture are designed with GDPR compliance in mind. It is vital not to overlook the basics, such as ensuring that proportionate security measures are implemented to protect particularly sensitive personal data.
The European Commission’s High Level Expert Group on Artificial Intelligence has released a new set of guidelines for ensuring that AI is “trustworthy”, following a public consultation with feedback from over 500 contributors.
The updated guidelines set out the EU's guidance for assisting developers and deployers in achieving "trustworthy AI", maximising the benefits and minimising the risks associated with this emerging area of technology.
Following its European strategy on AI (published in April 2018), the guidelines were drafted by an independent expert group, comprising 52 representatives from academia, industry and society.
How do you make sure AI is trustworthy?
The guidelines provide that “trustworthy AI” should be lawful, ethical and robust from a technical and social perspective. They recognise that AI systems do not operate in a vacuum and do not aim to replace any existing laws or regulations applicable to AI. They largely focus on the ethical aspects of AI and call particular attention to protecting vulnerable groups, such as children.
Based on fundamental rights and ethical principles, the guidelines list seven key requirements that AI systems should meet in order to be considered trustworthy:
Human agency and oversight: AI systems should enable equitable societies by supporting human agency and fundamental rights, and not decrease, limit or misguide human autonomy.
Technical Robustness and safety: algorithms should be secure, reliable and robust enough to deal with errors or inconsistencies during all life cycle phases of AI systems.
Privacy and data governance: citizens should have full control over their own data, while data concerning them will not be used to harm or discriminate against them.
Transparency: the traceability of AI systems should be ensured.
Diversity, non-discrimination and fairness: AI systems should consider the whole range of human abilities, skills and requirements, and ensure accessibility.
Societal and environmental well-being: AI systems should be used to enhance positive social change and enhance sustainability and ecological responsibility.
Accountability: mechanisms should be put in place to ensure responsibility and accountability for AI systems and their outcomes.
Of particular interest to AI developers and deployers should be the non-exhaustive “AI trustworthiness assessment list”, which can be used as a practical checklist in AI risk assessments (for example to assess the appropriate level of human control for an AI system).
What does this mean for AI architects?
Although the principles are somewhat abstract and the guidelines aren’t legally binding, they provide a good starting point for AI developers and deployers to determine whether their new AI technologies are ethical. The guidelines also demonstrate the EU’s approach to regulating this emerging technology and will likely form the basis of any future laws on AI. Businesses should continue to comply with existing laws and regulations while being mindful of this changing landscape and keeping abreast of all guidance being published in the AI sphere.
The EC is now inviting all interested businesses to participate in a pilot phase of the “assessment list” (in June 2019) to provide practical feedback on how best to implement and verify the group’s recommendations. The EU has also launched a forum for the exchange of best practices and wants businesses interested in participating to join the European AI Alliance. Following this pilot and based on the feedback received, the expert group will propose a revised version of the assessment list in early 2020.
The Information Commissioner’s Office (ICO) has launched a consultation on a code of practice for online services to ensure they adequately safeguard children’s personal data. This follows on from the UK consultation for new online safety laws (discussed here). The Irish government has also recently launched guidance in relation to online safety (discussed here). The UK Data Protection Act (DPA) 2018 also requires the ICO to produce an age-appropriate design code of practice to give guidance to organisations about the privacy standards they should adopt when offering online services and apps that children are likely to access and which will process their personal data.
The code of practice aims to be a global benchmark, setting out 16 standards that online services, such as apps, social media platforms and streaming services, must meet to protect children's privacy. It is not restricted to services specifically directed at children.
The draft code states that the best interests of the child should be the primary consideration when developing online services. The code will ensure greater transparency in relation to published terms, policies and community standards. It has taken account of the principles and protections of both the GDPR and the United Nations Convention on the Rights of the Child (UNCRC) to provide practical guidance for online services.
Summary of code standards
The 16 standards that online services must meet when designing and developing services likely to be accessed by children include:
Best interests of the child: The best interests of the child should be a primary consideration when designing and developing online services likely to be accessed by a child.
Age-appropriate application: Online services should consider the age range of their audience and the needs of children of different ages.
Transparency: The privacy information provided to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child.
Detrimental use of data: Children’s personal data should not be used in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.
Policies and community standards: Online services must uphold their published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).
Default settings: Settings must be ‘high privacy’ by default (unless there is a compelling reason for a different default setting, taking account of the best interests of the child).
Data minimisation: Only the minimum amount of personal data should be collected and retained. Children should be given separate choices over which elements they wish to activate.
Data sharing: Children’s data should not be disclosed unless online services can demonstrate a compelling reason to do so, taking account of the best interests of the child.
Geolocation: Geolocation options should be switched off by default (unless there is a compelling reason for geolocation, taking account of the best interests of the child), and an obvious sign should be provided to children when location tracking is active.
Parental controls: If parental controls are provided, children should be given age-appropriate information about this. If an online service allows a parent or carer to monitor their child's online activity or track their location, an obvious sign should be provided to children when they are being monitored.
Profiling: Profiling options should be switched off by default (unless there is a compelling reason for profiling, taking account of the best interests of the child). Profiling should only be allowed where appropriate measures are in place to protect the child from any harmful effects.
Nudge techniques: Nudge techniques should not be used to lead or encourage children to provide unnecessary personal data, weaken or turn off their privacy protections, or extend their use.
Connected toys and devices: If a connected toy or device is provided, effective tools should be included to enable compliance with the code.
Online tools: Prominent and accessible tools should be provided to help children exercise their data protection rights and report concerns.
Data protection impact assessments: DPIAs should be conducted to specifically assess and mitigate risks to children who are likely to access online services. DPIAs should build in compliance with the code.
Governance and accountability: Online services should ensure policies and procedures are in place demonstrating how they comply with their data protection obligations, including data protection training for all staff involved in the design and development of services likely to be accessed by children.
Failure to comply with the Code
The code aims to help online services comply, and demonstrate that they comply, with their data protection obligations. Failure to comply with the code may result in regulatory action. In accordance with section 127 of the UK DPA 2018, the ICO must take the code into account when considering whether an online service has complied with its data protection obligations. The code may also be used in evidence in court proceedings, and the courts must take its provisions into account wherever relevant.
The code is open for consultation until 31 May 2019. The final version should come into effect before the end of 2019.
On 17 April 2018, the European Commission proposed new rules in the form of a Regulation and an accompanying Directive, which aim to improve law enforcement authorities’ cross-border access to e-evidence.
The proposed Regulation on European Production and Preservation Orders enables a judicial authority in a Member State to obtain electronic evidence in criminal matters directly from a service provider in another Member State. The Directive complements the Regulation, as it sets out the rules for the appointment of service providers’ legal representatives, whose role is to receive and respond to judicial orders. The new rules will ensure swift access to e-evidence, with service providers being required to respond to judicial orders within 10 days and in emergency cases within 6 hours, compared to 10 months under the current Mutual Legal Assistance process.
Will Ireland opt in?
Under the Lisbon Treaty, Ireland has an option to opt in to European laws relating to freedom, security and justice. As a result, Ireland has discretion as to whether to adopt the new e-evidence rules. Irish support for the new rules is clear, and the Government has indicated its intention to take part in the adoption and application of the new rules (see here and here).
The draft Directive
Last month, the Council reached its position on the draft Directive, which lays down precise rules for the appointment of legal representatives by internet service providers for the purposes of gathering evidence in criminal proceedings. The Directive is an essential tool for the application of the Regulation, on which the European Council adopted its position in December 2018. The creation of legal representatives was necessary because of the lack of a general legal requirement for non-EU service providers to be physically present in the EU when providing services within the EU. The legal representatives designated under the Directive could also be used for domestic procedures.
The rules will affect internet service providers in two key ways:
Internet service providers will be required to publicly designate legal representatives for receiving, complying with and enforcing orders issued by competent EU authorities.
Internet service providers and their legal representatives will be held jointly and severally liable for non-compliance.
Current Problems with Cross-Border Access to Evidence
The proposed legislation represents a welcome development for law enforcement agencies, which have for years battled against outdated mechanisms for accessing e-evidence stored on servers across the EU. As it stands, the main method for accessing e-evidence in Ireland is the Mutual Legal Assistance Treaty (MLAT) process, governed by the Criminal Justice (Mutual Assistance) Act 2008.
The MLAT process requires a criminal investigation to be underway in a Member State, a lawful interception warrant, and a nominated authority in the Member State to make a request for interception to the Minister for Justice, Equality and Law Reform. This process takes, on average, 10 months to complete.
The enforcement of MLATs is further complicated by the fact that there is no requirement for non-EU internet service providers to be present in the Union when providing services within it. This dramatically increases the time it takes to gain access to e-evidence and the likelihood that it will be disposed of or destroyed in the interim.
How will the new rules help?
The key point for these new rules is how they work in tandem. The creation of European Production Orders under the proposed Regulation will permit law enforcement authorities in one Member State to order the production and preservation of e-evidence in another Member State for the investigation of criminal matters.
The legal representatives established by internet service providers under the Directive will be compelled to respond to a European Production Order within ten days of receiving it, although under an emergency provision replies will be required within six hours. Only SMEs will be granted some relief.
Under the Regulation, SMEs will be able to pool together and have the same legal representative for receiving requests. Additionally, any financial sanctions imposed on SMEs must take into account the enterprise's financial capacity to pay.
This legislation will significantly increase the already substantial burden on private companies (often portrayed in the media as too lax in their handling of personal data) to verify requests and determine what information to provide, while conforming to the protections afforded to personal data under the GDPR.
The exchange of e-evidence is becoming increasingly important as major technology firms continue to locate their European headquarters in Ireland. The rise in demand for the gathering of e-evidence will only continue as criminal and terrorist organisations become more technologically advanced, spreading themselves across multiple EU states to avoid detection. The requirement of the proposed Directive that internet service providers appoint legal representatives to directly respond to their EPOs will present a major opportunity for Irish law firms to continue to be at the cutting edge of the European IP/IT market. This may become invaluable as the uncertainty of Brexit looms ever closer.
The European Council is ready to start trilogue negotiations on both the draft Regulation and Directive as soon as Parliament has adopted its position. It is unlikely that these negotiations will take place before the European elections in May 2019.
On 17 April 2019, the European Parliament approved a new Regulation on platform-to-business trading practices. It requires online platforms and online search engines to comply with certain legal obligations and also encourages them to take voluntary complementary steps. The Regulation aims to ensure that businesses using online intermediation services and general online search engines have greater certainty and clarity with respect to the rules governing their relationships with these platforms and how to resolve potential disputes.
The text adopted by the European Parliament has not yet been formally approved by the Council of the EU. Once approved, the Regulation will enter into force 12 months after its publication in the Official Journal.
Online search engines and platforms generate the vast majority of internet traffic for big businesses as well as small and medium-sized enterprises (SMEs). The intermediary role of these online platforms may allow them to engage in unfair trading practices which cause economic harm to the businesses that use them, and the online visibility of small businesses can depend on their position in search results. The Regulation is therefore designed with SMEs in mind, to ensure that they will no longer be faced with "unexplained account suspensions, opaque rankings in search results, unaffordable dispute resolutions and many other unfair practices".
Scope of the Regulation
The Regulation covers online platform intermediaries and online search engines providing services to businesses established within the EU, and that offer goods or services to consumers located in the EU. Online platform intermediaries include third-party e-commerce market places (e.g. Amazon Marketplace, eBay etc.), app stores (e.g. Google Play, Apple App Store etc.), social media for business (e.g. Facebook pages, Instagram used by makers/artists etc.) and price comparison tools (e.g. Skyscanner, Google Shopping etc.). Online search engines that facilitate web searches based on a query and provide results in a format corresponding to the request (e.g. Google, Bing etc.) are also covered by the new Regulation.
The Regulation excludes certain services, including payment services, online advertising, and search engine optimisation. It also excludes online retailers, such as supermarkets, and retailers of brands to the extent that they sell only their own products.
Impact of the New Rules
The key obligations of online platform intermediaries and online search engines under the new Regulation are:
Terms and conditions must be easily available and provided in clear and plain language. Changes must be announced in advance.
Platforms should not prevent the business user from making its identity visible.
Platforms and search engines will have to inform businesses how they treat and rank goods or services offered by themselves or by businesses they control compared to third party businesses, either in the terms and conditions or in a publicly available document. Businesses should also be informed how online platforms can influence their ranking position. Search engines will need to inform consumers if the ranking result has been influenced by any agreement with the website user.
If a platform decides to restrict, suspend or terminate a business’ account, it must provide a statement of reasons to the business concerned, give 30 days prior warning in most instances of termination and preserve the data associated with business users’ account.
The Regulation provides an effective and quick means to resolve disputes between businesses and online platforms including an internal complaint handling system, and naming specialised mediators.
Member States will need to take sufficient deterrent measures to ensure that platforms and search engines comply with the requirements of the Regulation. Associations or organisations representing businesses can take action in national courts in order to stop or prohibit non-compliance with the Regulation.
Online platforms will need to review their terms and conditions to ensure they are clear and unambiguous, and make them easily available to business users. Otherwise the terms and conditions, or at least specific provisions, may be null and void, creating business disruption issues for the online platform.
Businesses will have greater legal certainty and clarity with respect to the rules governing their relationships with these platforms and how to resolve potential disputes.
On 1 May 2019, Ms Helen Dixon, the Data Protection Commissioner (DPC), appeared before the US Senate Committee on Commerce, Science and Transportation. She was invited to testify on Ireland’s implementation of the GDPR, as the US is considering introducing a federal data privacy framework. California has already passed a new data privacy law, the California Consumer Privacy Act, which is due to come into effect on 1 January 2020. This note sets out some of the highlights of the DPC’s testimony.
Data Subject Complaints
The DPC shared her office’s experience in dealing with data subjects’ complaints under the GDPR. She noted, in particular, that:
In the 11 months since the GDPR came into application, the DPC has received 5,839 complaints from individuals. These complaints frequently come from individuals as a means of pursuing further litigation or action (e.g. ex-employees seeking access to their personal data as part of an unfair dismissals case, or individuals seeking access to CCTV footage to pursue personal injuries cases).
Many issues arising for individuals are being resolved directly through the intervention of the mandatorily appointed Data Protection Officer in the company, before there is a need to file a complaint with the DPC.
Overall, the most complained against sectors in a commercial context are: retail banks, telecommunications companies and internet platforms.
In the case of retail banks and telecommunications providers, the main issues arising relate to consumer accounts, over-charging, failure to keep personal data accurate and up-to-date (resulting in the misdirection of bank or account statements), and the processing of financial information for charging purposes after the consumer has exercised their right to opt out during the cooling-off period.
Other complaints concerned the requirement for financial lenders to notify details to the Irish Central Bank of credit given to individuals. Certain lenders notified the details twice, resulting in adverse credit ratings for the individuals, as they appeared to have two or three times the number of loans they actually had.
In regard to internet platforms, individuals, or not-for-profit organisations acting on their behalf, have raised complaints about the validity of consent collected for processing on sign-up to an app or service, the transparency and adequacy of the information provided, and non-responses from the platforms when they seek to exercise their rights or raise a concern.
The most frequent category of complaint relates to access requests where an individual considers they have been denied access to a copy of the personal data they requested from an organisation. Most of these complaints are resolved amicably by the DPC, with the individual receiving all of the personal data to which they’re entitled. This may be less than they originally sought, as an organisation may apply statutory exemptions where it is lawful to do so.
Clearer standards of data protection expected to evolve in coming years
The DPC expects that much of the GDPR’s success over the coming years will derive from the evolution of clearer, objective standards to which organisations must adhere. Ms Dixon said that these standards will evolve in the following ways:
Through the embedding of new features of the GDPR, such as Codes of Conduct, Certification, and Seals, that will drive up specific standards in certain sectors. Typically, Codes of Conduct that industry sectors prepare for the approval of EU data protection authorities (DPAs) will have an independent body appointed by the industry sector to monitor compliance with the code.
Through enforcement actions by the DPC where the outcome, while specific to the facts of the case examined, will be of precedential value for other organisations. The DPC currently has 51 (domestic and cross-border) investigations open. The first set of investigations is expected to conclude during the summer of 2019.
Through case law in the national and EU courts, where EU DPA decisions are appealed or in circumstances where individuals use their right of action under the GDPR to claim compensation for material or non-material damage suffered as a result of a GDPR infringement.
Through the provision of further regulatory guidance, particularly through published case studies of individual complaints the DPC has handled, or following consultations with stakeholders.
The DPC also enforces e-Privacy laws, pursuant to the e-Privacy Directive, which applies in parallel to the GDPR. Her office annually prosecutes a range of companies for multiple offences. The majority of these prosecutions concern companies targeting mobile phone users with marketing SMS messages without their consent, and/or without providing the user with an opt-out from the marketing messages. Equally, a number of companies are prosecuted annually where they offer an opt-out but fail to apply it to their database, resulting in users continuing to receive SMS messages without their consent. As a result of several years of high-profile prosecutions in this area, the DPC considers the rate of compliance is improving.
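The compliance step at issue in these prosecutions can be sketched in a few lines: the opt-out (suppression) list must be applied to the send list before every campaign. The function and variable names below are hypothetical, for illustration only.

```python
def eligible_recipients(subscribers, opted_out):
    """Return only the numbers that have not opted out.
    The prosecutions described above typically arise when this
    filtering step is skipped or the suppression list is stale."""
    suppressed = set(opted_out)  # fast membership test
    return [number for number in subscribers if number not in suppressed]
```

The key design point is that suppression is re-applied at send time, every time, rather than relying on the opt-out having been recorded correctly at some earlier stage.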
The DPC has also devoted considerable resources to a series of investigations into the “Private Investigator” sector. In the last 4-5 years, the DPC has prosecuted five companies and four company directors for bribing or “blagging” government officials, or utility company staff, to unlawfully procure personal information about individuals.
In order to secure damages for data breaches, individuals have a right of action under Article 82 of the GDPR where they, or a not-for-profit representing them, can bring a case through the courts to seek compensation for material or non-material damage they have suffered as a result of infringements of the GDPR. The DPC noted that no Article 82 actions for compensation by individuals in the Irish courts have been heard yet, but such actions will provide further clarification on how the courts view the GDPR and its application.
While there are reports, particularly from the UK, that representative actions are being lined up by some law firms on a "no win no fee" basis following the notification of large-scale breaches, the DPC stated that nothing of significance has yet materialised.
As we approach the GDPR’s one-year anniversary, we are starting to see more enforcement activity by the EU Data Protection Authorities (DPAs) as they complete their initial investigations into data breaches. This blog looks at two recent fines issued by the Polish and Danish DPAs, which demonstrate the type of conduct likely to lead to enforcement activity.
Polish DPA issues first fine for failure to fulfil information obligation
The Polish DPA recently imposed its first fine of €220,000 on a company which aggregates personal data from official publicly available registers for commercial purposes. The DPA concluded that the company had failed to inform data subjects about how it processes their personal data, as required under Article 14 of the GDPR.
The company fulfilled its obligation under Article 14 of the GDPR in respect of those data subjects whose email addresses it had at its disposal. However, the company failed to comply insofar as it did not contact the remaining 6 million people, whose email addresses it did not have. Even though the company had the postal addresses, and in some cases telephone numbers, of those remaining data subjects, the company argued that sending information by registered post would have involved a disproportionate effort, as the cost of mailing letters would be over PLN 30 million (€6.9 million), which was more than the company’s annual turnover. Instead, the company displayed a notice about the processing on its website, in an effort to comply with Article 14.
In the DPA’s opinion, displaying the information on the company’s website was insufficient where the company had the data subjects’ contact information, enabling it to inform them directly. In addition, the DPA noted that Article 14 does not impose an obligation to provide the necessary information by registered post (or other specific medium), so the expense of doing so was not a valid excuse. The DPA concluded that the infringement was intentional, as the company was aware of its duty to directly inform the data subjects, and had made a conscious decision not to inform them on costs grounds. The DPA seems to have taken the view that the company, in conducting its business, should have taken into account the costs necessary to comply with its legal obligations. The company is reportedly challenging the DPA’s decision in the Polish courts.
The DPA’s decision has been criticised as being unduly harsh, as Article 14(5)(b) provides an exemption from a controller’s information obligation to the extent that the provision of such information would involve a “disproportionate effort”. In such cases the controller is required to take appropriate measures to protect the data subjects’ rights, including making the information publicly available. We await further clarification from the EU DPAs, EDPB and/or courts on the scope of the “disproportionate effort” exemption, and how much effort a data controller is expected to expend to inform data subjects that it is processing their data. In the meantime, companies carrying out data-scraping for commercial purposes should carefully consider how to do so in compliance with Article 14, and factor in the cost implications of such compliance.
Danish DPA issues first fine for failure to delete customer phone numbers
Denmark’s DPA has also recommended its first fine, of DKK 1.2 million (approximately €160,754), against a taxi company. The DPA found the taxi company had retained customer data (namely customer telephone numbers) relating to approximately 9 million taxi rides for longer than necessary, in breach of the GDPR’s data minimisation obligation. The DPA recommended the fine after it discovered that the taxi company deleted only the names of data subjects after a two-year retention period, but continued to hold individuals’ phone numbers for a further three years. The DPA dismissed the taxi company’s argument that telephone numbers were an essential part of its IT database and could not be deleted within a shorter time span. The DPA also found that the taxi company was unable to demonstrate (beyond a manually updated deletion log) how and when personal data was deleted from its systems and backup recovery files. It remains to be seen whether the Danish court will approve and impose the fine recommended by the DPA.
The fine serves as a reminder to companies of the importance of having a comprehensive data retention policy in place, being able to justify relevant retention periods, and deleting data when it is no longer needed for the purpose for which it was collected.
Other enforcement activity
Earlier this year, the European Commission published an infographic (discussed here) highlighting GDPR fines issued by the German, Austrian and French DPAs, respectively, for failure to secure users’ data (€20,000 fine), unlawful video surveillance (€5,280 fine) and lack of consent for ads (€50m fine). Other enforcement activities are set out on the EDPB’s website.
The Irish DPC has not yet issued any GDPR fines, as the 16 statutory inquiries it is conducting (as lead supervisory authority) into multinational technology companies are ongoing. Those investigations are apparently at an advanced stage, but are subject to the consistency and cooperation process which will take some time to conclude.
The UK Supreme Court has granted supermarket chain Morrisons permission to appeal against a landmark UK Court of Appeal ruling that found it vicariously liable for a deliberate data breach carried out by a former employee (previously discussed here).
Mr Skelton, an internal auditor at Morrisons, maliciously disclosed his co-workers’ personal data (including payroll data) on the internet. The UK Court of Appeal found Morrisons vicariously liable for the rogue employee’s actions, even though the data breach was targeted at harming Morrisons. In a class action suit, over 5,500 employees sued Morrisons for compensation for loss caused by the data breach, including non-pecuniary loss such as distress.
The Court of Appeal acknowledged that data breaches caused by individuals acting in the course of their employment may lead to a large number of claims against companies for “potentially ruinous amounts” but that the solution is to insure against such catastrophes. However, although insurance may help mitigate the consequences of a data breach, it is not a magic solution. In particular, it may be challenging for organisations to price a potential data breach. It is vital therefore that companies also take all appropriate technical and organisational measures to prevent the accidental or unauthorised disclosure of personal data, and to respond quickly once a breach has occurred to minimise any damage.
The appeal will be watched closely by employers and legal practitioners in Ireland, as the UK Supreme Court’s decision on the scope of an employer’s vicarious liability for data breaches may be of persuasive authority in the Irish courts. No date has yet been given for the appeal hearing.
The EDPB has released new draft guidelines 2/2019 on the contractual necessity legal basis for processing personal data in the context of the provision of online services to data subjects. The guidelines emphasise the narrow scope of the contractual necessity legal basis. A controller must be able to demonstrate that the processing is ‘objectively necessary’ for a purpose that is ‘integral’ to the delivery of a contractual service to the data subject in order to rely on this legal basis. If a controller cannot demonstrate such necessity it must consider another legal basis for processing the personal data. This note considers the key highlights of the guidelines.
Article 6(1) of the GDPR provides that processing shall be lawful only on the basis of one of six specified conditions set out in Article 6(1)(a) to (f). Article 6(1)(b) of the GDPR sets out the contractual necessity legal basis. It provides that the processing of personal data shall be lawful to the extent that “processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract”. This legal basis reflects the fact that sometimes contractual obligations cannot be performed without the data subject providing certain personal data.
Scope of the Guidelines
The EDPB notes that ‘online services’ as used in the guidelines refers to ‘information society services’, which are defined as “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services” (Directive (EU) 2015/1535, cross-referenced in Article 4(25) GDPR). The EDPB confirms that this definition extends to services that are not paid for directly by the persons who receive them, such as online services funded through advertising (see also Recital 18 of the e-Commerce Directive 2000/31/EC).
The Article 29 Working Party previously expressed views on the contractual necessity legal basis under the Data Protection Directive 95/46/EC in its opinion on the notion of legitimate interests of the data controller (Opinion 06/2014), and the EDPB has indicated that that opinion remains relevant when assessing the application of Article 6(1)(b) of the GDPR.
Compliance with the GDPR as a whole
The contractual necessity legal basis in Article 6(1)(b) must be considered in the context of the GDPR as a whole, including the data protection principles. The EDPB highlights that the fair and transparent processing, purpose limitation and data minimisation obligations are particularly relevant in contracts for online services, insofar as technological advancements make it possible for controllers to easily collect and process more personal data than ever before.
Where processing is not considered ‘necessary for the performance of a contract’, the EDPB recognises that another lawful basis may be applicable, such as consent or legitimate interests. However, the EDPB warns that where a controller is relying on consent as a legal basis, it is important to distinguish between entering into a contract and consent to the processing of personal data under Article 6(1)(a). Data subjects should not be given the impression that they are giving their consent in line with Article 6(1)(a) when signing a contract or accepting terms of service.
Necessity of Processing
When assessing whether Article 6(1)(b) is an appropriate legal basis for an online contractual service, regard must be given to the particular aim, purpose, or objective of the service. Article 6(1)(b) will not cover processing which is “useful but not objectively necessary” for performing the contractual service or for taking relevant pre-contractual steps at the request of the data subject, even if it is necessary for the controller’s other business purposes. Other legal bases, such as the controller’s legitimate interests, may be available for those other business purposes.
(i) Necessary for the performance of a contract with the data subject
Where a controller seeks to establish that the processing is necessary for the performance of a contract with the data subject, the EDPB expects the controller to be able to demonstrate how the main object of the specific contract with the data subject cannot be performed if the specific processing of the personal data in question does not occur.
The EDPB suggests that online service providers ask the following questions when assessing whether Article 6(1)(b) is applicable:
What is the nature of the service being provided to the data subject? What are its distinguishing characteristics?
What is the exact rationale of the contract (i.e. its substance and fundamental object)?
What are the essential elements of the contract?
What are the mutual perspectives and expectations of the parties to the contract? How is the service promoted or advertised to the data subject? Would an ordinary user of the service reasonably expect that, considering the nature of the service, the envisaged processing will take place in order to perform the contract to which they are a party?
The guidelines provide examples of when it is or is not appropriate for online services to rely on Article 6(1)(b) to process personal data. Example 1 (quoted below) illustrates the narrow scope of this legal basis.
“A data subject buys items from an online retailer. The data subject wants to pay by credit card and for the products to be delivered at home. In order to fulfil the contract, the retailer must process the data subject’s credit card information and billing address for payment purposes and the data subject’s home address for delivery. Thus, Article 6(1)(b) is applicable as a legal basis for these processing activities. However, if the customer has opted for shipment to a pick-up point, the processing of the data subject’s home address is no longer necessary for the performance of the purchase contract and thus a different legal basis than Article 6(1)(b) is required.”
(ii) Necessary for taking steps prior to entering into a contract
The alternative condition for the application of Article 6(1)(b) is where processing is necessary in order to take steps at the request of the data subject prior to entering into a contract. This provision reflects the fact that preliminary processing of personal data may be necessary prior to entering into a contract in order to facilitate actually entering into that contract.
Example 5 (quoted below) demonstrates when processing would not fall within the remit of the contractual necessity legal basis.
“In some cases, financial institutions have a duty to identify their customers pursuant to national laws. In line with this, before entering into a contract with data subjects, a bank requests to see their identity documents. In this case, the identification is necessary for a legal obligation on behalf of the bank rather than to take steps at the data subject’s request. Therefore, the appropriate legal basis is not Article 6(1)(b), but Article 6(1)(c)”.
Applicability of Article 6(1)(b) in Specific Situations
(i) Processing for ‘service improvement’
The EDPB does not consider that Article 6(1)(b) would generally be an appropriate lawful basis for processing for the purposes of improving a service, as such processing cannot usually be regarded as being objectively necessary for the performance of the contract with the user.
(ii) Processing for ‘fraud prevention’
In the EDPB’s view, processing for fraud prevention purposes is likely to go beyond what is objectively necessary for the performance of a contract with a data subject. Such processing could however still be lawful under another basis in Article 6(1), such as compliance with a legal obligation or legitimate interests.
(iii) Processing for ‘online behavioural advertising’
The EDPB does not view Article 6(1)(b) as providing a lawful basis for online behavioural advertising simply because such advertising indirectly funds the provision of the service. The EDPB states that: “Although such processing may support the delivery of a service, it is separate from the objective purpose of the contract between the user and the service provider, and therefore not necessary for the performance of the contract at issue.” Furthermore, in line with the e-Privacy requirements, controllers must obtain data subjects’ prior consent to place the cookies necessary to engage in behavioural advertising.
(iv) Processing for ‘personalisation of content’
The EDPB notes that personalisation of content may constitute an essential element of certain online services, and therefore may be regarded as necessary for the performance of the contract with the service user in some cases.
The guidelines are open to public consultation until 24 May 2019. Comments should be sent to EDPB@edpb.europa.eu.