
We're now on the home stretch in our run-down of the CCPA's core rights. So far, we've covered the Notice, Access and Opt-out requirements. Now, we'll be looking at the two final rights under the CCPA – Deletion and Non-discrimination. Compared to some of the other rights, these provisions may seem relatively straightforward. However, a closer look raises some tricky questions – in particular how the deletion exemptions under the CCPA tie in with the GDPR's Article 17 grounds for deletion, and how the right to non-discrimination sits alongside the CCPA's allowance for 'financial incentives'.

The Deletion requirements

The right to deletion under the CCPA is set out as follows:

"A consumer shall have the right to request that a business delete any personal information about the consumer which the business has collected from the consumer." (Section 1798.105(a))

The first point to note relates to the scope of personal information covered by the deletion right. It isn't clear whether the wording "which the business has collected from the consumer" is intentional. Elsewhere, with respect to other rights, the CCPA refers more broadly to personal information "about the consumer". If the distinction is intentional, then the right to deletion only extends to personal information provided voluntarily by the consumer and, possibly, data collected from consumers automatically (e.g. device data). However, other information – such as data from third-party sources, inferential data, and passively observed and recorded data (like CCTV footage) – would seemingly fall outside its scope.

Exemptions

Assuming the right to deletion does apply, then there are a number of pretty broad exemptions to keep in mind. For example, a business can continue to retain the data where it is necessary to:

  • complete a transaction, provide a good or service or perform a contract with the consumer,

  • detect security incidents,

  • protect against fraud and illegal activity,

  • debug and repair errors,

  • enable solely internal uses that are reasonably aligned with the expectations of the consumer, or

  • enable other internal uses that are lawful and compatible with the context in which the consumer provided the information.

A business that has already implemented data deletion processes under the GDPR will be in a good position to respond to requests, as it will already have both the technical capability to permanently delete personal data within its systems and internal procedures for handling and responding to requests.

However, a mental shift is required for the CCPA. Under the GDPR, a data controller must delete data only if one of the preconditions set out in Article 17(1) applies – for instance, if the personal data is no longer needed (Art 17(1)(a)) or where a data subject objects to processing based on legitimate interests and there are no "overriding legitimate grounds" for the processing (Art 17(1)(c)). In addition, the right does not apply if the business needs the data for certain purposes – such as to comply with a legal obligation (Art 17(3)(b)) or to establish or defend legal claims (Art 17(3)(e)).

The upshot of this is that if a business receives two deletion requests from an EU individual and a Californian individual then it will need to consider those requests quite differently, depending on the use of the data. For example, if the data is being used by the business for internal analytics then under the GDPR it's likely the request would need to be honoured (under Art 17(1)(c)), whereas under the CCPA the request could be refused (due to the exemption for internal uses reasonably aligned with the expectations of the consumer).

Each request will, of course, turn on its own facts – but lawyers will need to consider different legal thresholds and exemptions under the two pieces of legislation.
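
To make the comparison concrete, here is a minimal, purely illustrative sketch of how a request-triage tool might encode the two tests for the internal-analytics example above. The purpose labels and simplified exemption sets are our own assumptions, not statutory terms, and any real assessment would still need legal review.

```python
# Illustrative only: toy triage of deletion requests under the CCPA and GDPR.
CCPA_EXEMPT_PURPOSES = {
    "complete_transaction",
    "detect_security_incidents",
    "protect_against_fraud",
    "debug_errors",
    "internal_use_aligned_with_expectations",
}

GDPR_ART_17_3_EXEMPTIONS = {"legal_obligation", "legal_claims"}

def must_delete_under_ccpa(purpose: str) -> bool:
    # CCPA: delete unless an exemption (like those listed above) applies.
    return purpose not in CCPA_EXEMPT_PURPOSES

def must_delete_under_gdpr(purpose: str, art_17_1_ground_applies: bool) -> bool:
    # GDPR: delete only if an Article 17(1) ground applies AND no Article 17(3)
    # exemption (e.g. legal obligation, legal claims) applies.
    return art_17_1_ground_applies and purpose not in GDPR_ART_17_3_EXEMPTIONS

# The internal-analytics example from the text:
purpose = "internal_use_aligned_with_expectations"
print(must_delete_under_gdpr(purpose, art_17_1_ground_applies=True))  # True - likely honour
print(must_delete_under_ccpa(purpose))                                # False - may refuse
```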

The Non-discrimination requirements

To complement and help reinforce the other consumer rights, the CCPA contains a non-discrimination provision:

"A business shall not discriminate against a consumer because the consumer exercised any of the consumer’s rights under this title…" (Section 1798.125(a)(1))

This means that a business cannot treat a consumer differently simply because they have chosen to exercise any of their rights under the CCPA – for instance, if they requested their information be deleted or they opted out from the sale of their personal information.

The CCPA contains a non-exhaustive list of discriminatory practices, which includes:

  • denying goods or services to the consumer,

  • charging different prices or rates for goods or services (including through the use of discounts, other benefits or penalties),

  • providing a different level or quality of goods or services to the consumer, and

  • merely suggesting that the consumer will receive a different price or rate or a different level or quality.

Financial incentives

As an exception to the non-discrimination requirements, the CCPA allows a business to offer 'financial incentives' relating to the collection, sale or deletion of personal information (Section 1798.125(b)). This means that a business may, for example, encourage consumers (through monetary or other valuable consideration) to allow the business to sell the consumer's information or, similarly, discourage consumers from requesting their information be deleted. These types of incentives would not fall within the scope of non-discrimination even though they would clearly involve the use of discounts, benefits and/or penalties.

An obvious example of a financial incentive is where an individual signs up to a mailing list and receives a free e-book or download – a simple quid pro quo where the individual provides their email address in exchange for free content. Another example could be a wellness app that sends the user offers or discount codes if they share information with the app about their daily step count or weekly class attendance. Similarly, financial incentives could include a broad range of loyalty schemes.

The CCPA also allows a business to offer a different price or quality of goods or services if the difference "is directly related to the value provided to the consumer by the consumer’s data" (Section 1798.125(b)(1)). This implies that where the value of the service is tied to the value of the consumer's data, then the business can justify setting a different price or withholding a service depending on whether, for example, the consumer opted out from the sale of their personal information or requested their information be deleted.

In theory, this could potentially cover a number of different use cases – for instance, where a service is funded entirely through advertising and the consumer's data is needed to deliver and target advertising to the consumer. There is, however, some uncertainty as to how broadly this allowance will be interpreted – this is certainly one area where further clarification or guidance would be welcome.

In any event, the CCPA places conditions around the offering of financial incentives – firstly, a business must not offer incentives in a way that is "unjust, unreasonable, coercive, or usurious in nature" and, secondly, the business must:

  • notify consumers about the use of incentives in a way that clearly describes the material terms of the program, and

  • obtain the consumer's prior opt-in consent (which can be revoked at any time).
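
By way of illustration only, those two conditions could be captured in a simple pre-enrolment check. The field names below are hypothetical, and the sketch obviously cannot test whether a programme is "unjust, unreasonable, coercive, or usurious" – that remains a legal judgement.

```python
# Illustrative only: a minimal pre-flight check before applying a financial
# incentive to a consumer. Field names are hypothetical, not statutory terms.
from dataclasses import dataclass

@dataclass
class IncentiveProgram:
    material_terms_notice_given: bool   # consumer notified of the material terms
    opt_in_consent_obtained: bool       # prior opt-in consent from the consumer
    consent_revoked: bool               # consent may be revoked at any time

def may_apply_incentive(program: IncentiveProgram) -> bool:
    return (
        program.material_terms_notice_given
        and program.opt_in_consent_obtained
        and not program.consent_revoked
    )

print(may_apply_incentive(IncentiveProgram(True, True, False)))  # True
print(may_apply_incentive(IncentiveProgram(True, True, True)))   # False - consent withdrawn
```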

Non-discrimination and the GDPR

Interestingly, the GDPR doesn't contain an equivalent provision to the CCPA's right of non-discrimination. However, Article 5(1)(a) of the GDPR states that processing of personal data must be ‘fair’ – which arguably prohibits discriminatory treatment of a data subject based on their choice to exercise their rights under the GDPR.

The GDPR also doesn't explicitly address financial incentives, but these types of exchanges involving personal data pose certain difficulties. For instance, if a controller is relying on the individual's consent to process personal data, then the individual's consent must be "freely given" – as is clear from Article 7(4) of the GDPR and the European Data Protection Board's Guidelines on Consent, consent is not "freely given" if the service is conditional on consent or the individual would suffer a detriment by not providing it. In other words, an individual cannot consent to a quid pro quo to receive a good or service in exchange for their personal data, as such consent would not be considered valid. So the controller would need to rely on one of the other lawful grounds for processing under the GDPR, such as "contractual necessity" or "legitimate interests".

That's a wrap…

We've made it through the CCPA's core rights – phew! But there is still a lot to talk about. Keep checking in for more thoughts and updates about the CCPA, both in terms of how the CCPA will impact your business and what you should be doing to stay compliant.

 

Read Full Article
  • Show original
  • .
  • Share
  • .
  • Favorite
  • .
  • Email
  • .
  • Add Tags 

What is the Regulation and why is it deemed necessary?

The Regulation is a draft EU legislative instrument which is intended to replace the Privacy and Electronic Communications (EC) Directive 2002/58/EC (the Directive) which is implemented in the UK through the Privacy and Electronic Communications (EC Directive) Regulations 2003, SI 2003/2426.


The European Commission’s intention in bringing forward the Regulation is to ‘reinforce trust and security in the digital single market’ by updating the legal framework on ePrivacy. This step is part of the European Commission’s project to modernise the EU’s data protection framework. It will also make the ePrivacy legislation consistent with the provisions of the General Data Protection Regulation (EU) 2016/679 (GDPR). The GDPR sets out a broad framework for the processing of personal data. The Regulation, by contrast, sets out specific rules for the processing of personal data in the context of electronic communications.


Examples of the changes the Regulation makes include: removing breach notification requirements, as these are now covered by the GDPR; aligning the fines regime with the GDPR (up to 4% of annual worldwide turnover); harmonising cookie consent rules throughout the EU and creating slightly wider exemptions (for example, in the context of analytics); harmonising and broadening the communications data processing rules; and harmonising direct marketing consent requirements.
 
What is the current status of the draft legislation?
The European Parliament set out its position on the Regulation in October 2017. However, the Council of the EU, which is made up of ministers of the Member States, has not yet come to a position on the legislation. The Regulation cannot be adopted until the Council of the EU has come to a position and the Council of the EU and the European Parliament have agreed on a text.


The European Parliament is due to hold elections on 23–26 May 2019 and will thereafter appoint a new European Commission, which will begin its term of office from 1 November 2019. It is therefore likely that any adoption of the Regulation will not take place before 2020. It is also possible that the new European Parliament will decide not to continue negotiations on the Regulation and the instrument will fall. The more likely outcome, however, is that the new European Parliament picks up where the previous European Parliament left off, and the Regulation will eventually proceed to adoption.
 
When is it likely that Regulation will be adopted and come into force?
If the Regulation is adopted, it will come into force a few days later. However, the coming-into-force date is different from the date from which the legislation applies – it is only from the date of application that the Regulation becomes enforceable law. The latest draft text from the Council of the EU says that the Regulation will apply two years from its coming-into-force date. However, the European Parliament will need to agree the timeframe – its 2017 draft envisaged a much shorter period before the legislation would apply.
 
Has the UK government indicated its position on ePrivacy?
The government has welcomed the opportunity to update the Directive, to address technological developments and the evolving digital landscape. The government’s central policy aim is to ensure that the proposals protect the confidentiality of electronic communications while still encouraging digital innovation. In terms of the relevant timeframes, the government considers that due to the importance of the Regulation, the quality of the text should be prioritised over speed.
 
Which ePrivacy rules will apply in the UK after Brexit?
If the UK leaves the EU without the withdrawal agreement in place, SI 2003/2426 will be retained in domestic law by virtue of section 2 of the European Union (Withdrawal) Act 2018 (EU(W)A 2018). SI 2003/2426 will continue to be interpreted in the same way as it was before the UK's exit from the EU subject to any subsequent amendments (see below).


Domestic and EU case law that applies to the legislation will also continue to be relevant (EU(W)A 2018, s 6(3)). Amendments will be made to SI 2003/2426 under the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations to ensure that they continue to operate effectively after the UK's withdrawal from the EU.


If the Withdrawal Agreement is approved and enters into force on the UK’s exit from the EU, then EU law will continue to apply to the UK during the transition period (currently due to last until 31 December 2020, although it is likely that it could be extended to either December 2021 or December 2022). If the Regulation applies (as opposed to merely coming into force) during the transition period, it will become UK national law automatically, by virtue of section 2(1) of the European Communities Act 1972. This provision enables EU regulations to flow directly into UK law without further implementation and will be retained in domestic law through the European Union (Withdrawal Agreement) Bill which will be introduced if Parliament approves the Withdrawal Agreement, in order to implement it.


At the end of the transition period, if the Regulation is indeed applicable in the UK, the provisions of EU(W)A 2018 will come into force at that point in order to save the Regulation and turn it into domestic law. 

 
Is it too early for businesses to be preparing for change?
Yes. The text of the Regulation is far from agreed at EU level. In all likelihood, given the upcoming European Parliamentary elections and the change of the European Commission, the final text of the Regulation will not be agreed until 2020. At that point businesses should start preparing, but, as with the GDPR, there may be a long lead-in time before the Regulation actually applies.


It is also possible that the Regulation will never become law in the UK if its date of application falls after the end of the transition period. If the UK exits the EU without a deal, then the Regulation will not apply to the UK either, although the adoption of substantially similar rules may be necessary in order for the UK to gain an EU adequacy decision.


Eleonor Duhs advises on GDPR and ePrivacy law. Prior to joining Fieldfisher, Duhs worked as a senior government lawyer. She was the UK government’s lead lawyer in negotiations on the GDPR. While working in the Department for Exiting the European Union, she was the legal lead on the provisions of EU(W)A 2018. She also led on aspects of the Withdrawal Agreement and the framework for the UK-EU future relationship.

This article was first published on Lexis®PSL TMT on 21 March 2019


This article was co-authored by Paola Heudebert, legal trainee at Fieldfisher, and Olivier Proust, Partner in the Privacy, Security & Information Law department at Fieldfisher.

On April 15 2019, the French Data Protection Authority (the "CNIL") released its 2018 Annual Report (hereinafter the "Report"). The key practical insights of this Report are summarized in this article.

 

  • 2018: from theory to practice

2018 was an exceptional year, marked by the entry into application of the GDPR. The CNIL considers that 2018 ushered in a new era of awareness of data protection issues among professionals and individuals. As rightly stated by Jean Lessi, the CNIL's secretary general, this realization is "all the more important because, in addition to the novelty effect of the GDPR, there is also an undeniable 'spotlight' effect on pre-existing obligations and rights".

 

  • A record number of complaints

The new legal framework has been widely publicized. Six months after its implementation, 66% of the French population stated they were more concerned about the processing of their personal data, according to an IFOP survey conducted for the CNIL. As a result, in 2018 the CNIL received more than 11,000 complaints from data subjects, an increase of 32% compared to 2017. 9,000 of these complaints were considered "complex", meaning that the CNIL had to either contact the controller directly in writing or conduct on-site inspections to enquire about the conditions under which the data processing was carried out. In other words, around 80% of individual complaints resulted in the CNIL's involvement. This shows once more the importance of taking data subject requests seriously and answering them in a timely manner (the CNIL being allowed to intervene within one month of the controller's refusal or failure to respond).

The Report interestingly highlights the sectors that are most subject to complaints:

  • 35.7% of complaints concern the IT and telecom industry, with the erasure of data accessible online being the primary concern of data subjects. The CNIL also acknowledges growing concern about data used in smartphone apps;
  • 21% of complaints concern the sales/marketing sector, especially direct marketing by text message and email without the data subjects' prior consent, and the retention of consumers' bank details;
  • 16.5% of complaints are employment-related, including the use of CCTV and other monitoring practices in the workplace, which remain under the CNIL's strict scrutiny;
  • 8.9% of complaints pertain to the banking and credit sector, especially the listing of individuals in the French National Database on Household Credit Repayment Incidents and the difficulties experienced by data subjects in effectively exercising their right of access;
  • 4.2% of complaints relate to the health and social sector, more precisely to issues around accessing personal medical records.

It is also worth noting the growing interest of data subjects in exercising their right to data portability, particularly in the banking sector and in relation to online services.

Lastly, 20% of the complaints involved cross-border processing activities and required EU cooperation with other supervisory authorities.

 

  • Advisory and consultation powers

In 2018, the CNIL sought to provide professionals with guidelines and documentation, taking into account the need for legal certainty in a context of increased sanctions and the demand for greater simplification for smaller businesses. The French DPA was especially prolific, producing a plethora of guidelines and articles on a wide variety of topics, ranging from template records of processing activities and clarifications of key concepts such as consent and profiling, to guidelines on the privacy challenges of blockchain technology and artificial intelligence.

The CNIL also released 120 legal opinions, including on the amended French Data Protection Act or its implementation decree and on a number of other legal instruments.

 

  • Supervision and enforcement

In 2018, the CNIL also flexed its GDPR muscles. It is, however, important to underline that most enforcement measures were still taken under the old Data Protection Directive and the corresponding version of the French Data Protection Act.

Regarding its investigating powers, the CNIL conducted 204 on-site inspections (including 20 on-site inspections of CCTV devices); 51 online inspections; 51 controls on a document production basis, and 4 hearings.

As regards its corrective powers, the CNIL sent 49 cease and desist letters (13 of which were made public), specifically targeting the insurance sector (5 decisions) and targeted advertising sector (4 decisions). The Report acknowledges that in the vast majority of cases, the simple intervention of the CNIL resulted in the organization's compliance. Indeed, of the 310 controls carried out, only 11 sanctions were adopted by the Restricted Committee, including 10 financial penalties (9 of which were made public), one non-public warning and one dismissal.

The majority of the financial penalties imposed in 2018 concerned security incidents (7 out of 10), in particular personal data breaches, which underlines more than ever the need to evaluate the risks inherent in the processing and to implement measures to mitigate those risks. Several big players were sanctioned, including Uber, Bouygues Télécom, Dailymotion and Optical Centre. As acknowledged in the Report: "it is not the incident as such that the CNIL sanctioned, but the deficiencies and inadequacies in the security measures of which this incident was only a symptom".

 

  • What to expect in 2019?

In 2019, the CNIL's actions will focus on three priorities:

    • Successfully implementing the GDPR for individuals and professionals;
    • Developing its legal, technical and ethical expertise capacity on a range of subjects such as cloud computing and virtual assistance;
    • Maintaining its leading role at the European and international level. The CNIL is currently lead supervisory authority in 40 on-going investigations and is involved in 609 other cases.

    The CNIL will also follow its annual inspections program, which represents around one fourth of its planned investigations. The French DPA aims to focus its controls on the practical exercise of data subject rights and on cross-sector issues such as the allocation of responsibility between controllers and processors, and children's privacy.


    This is Part 3 of our CCPA Blog Series.

    In Parts 1 and 2 we considered two of the core rights introduced by the CCPA – Notice and Access. This time, we'll be looking at the third core right – Opt out from the sale of personal information (and the related Opt in requirements for children). In particular, we will consider when the right to opt out applies, how it compares to similar rights under the GDPR and the practical steps businesses should take to stay compliant.

    The Opt out requirements

    The right of opt out is described in the CCPA as follows:

    "A consumer shall have the right, at any time, to direct a business that sells personal information about the consumer to third parties not to sell the consumer’s personal information. This right may be referred to as the right to opt-out." (Section 1798.120(a) – emphasis added)

    The first important point to mention, which may seem obvious but is worth reiterating, is that the right to opt out does not apply to processing generally – it is in fact a very specific right that only applies where a business sells personal information relating to Californian consumers. Whether a business sells personal information according to the CCPA, however, is not necessarily a straightforward question. We'll explore this issue first before discussing the actual substance of the requirements.

    Do you sell personal information?

    The CCPA definition of "sell" essentially includes any transfer of personal information to another business or third party for "monetary or other valuable consideration". This includes "renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating" personal information to another party (whether orally, in writing, or by electronic or other means). Importantly, the location of the sale (and whether the sale took place in California) is not relevant here – the key questions are (1) whether you are a business caught by the scope of the CCPA (which could include a business anywhere in the world), and (2) whether you "sell" personal information relating to Californian consumers (as defined by the Act).

    In some cases, whether you sell personal information will be quite obvious – for instance, where you are a data broker, a lead / prospect generation service, or a company that sells marketing lists as a business.

    However, in other cases it may be less obvious. Consider the following potential scenarios:

    • an online publisher disclosing visitor data to advertisers, ad networks, and other ad tech intermediaries to display targeted advertising for revenue,

    • identity verification services,

    • credit reporting services,

    • fraud detection services, or

    • disclosing telematics or machine learning data to affiliates for product development.

    Can you rely on any exemptions?

    The CCPA specifies that a business does not sell personal information in four scenarios:

    1) Communicating opt out preferences: The first exemption is relatively straightforward and applies where a business shares personal information with a third party to alert them of the consumer's opt out preferences. This would include, for example, where a website transmits a user's cookie choices to an advertiser or ad tech intermediary or where a company provides a suppression list to a third-party marketing agency.

    2) Intentional interaction with a third party: A business does not sell personal information if the consumer has directed the business to intentionally disclose their information or uses the business to intentionally interact with a third party. The CCPA does not define how a consumer may "direct" a business to disclose their personal information but does clarify that "an intentional interaction occurs when the consumer intends to interact with the third party, via one or more deliberate interactions" which would not include "hovering over, muting, pausing, or closing a given piece of content". This suggests that the consumer must take some form of affirmative action that is clearly linked to the instruction (e.g. not by merely closing or choosing to ignore a cookie banner). However, this is not the same as "opt-in" consent under the GDPR and would not necessarily require an unticked check box.

    3) Sharing personal information for a business purpose: The third exemption is the broadest in scope and applies wherever information is used or shared with a service provider for a "business purpose", which is defined as "a business’s or a service provider’s operational purposes, or other notified purposes". The CCPA provides a list of business purposes which covers a whole host of standard business activities such as security and fraud prevention, auditing, internal research and service improvement, marketing, analytics, as well as mere "short-term, transient use". It also includes performing services on behalf of a business, such as maintaining customer accounts, processing orders or providing advertising or marketing services. The words "or other notified purposes" suggests the exemption could include other purposes not listed by the CCPA - but further regulation or guidance will be needed.

    Applying this to one example, if you are a vendor providing fraud detection services you could argue that you fall within the exemption, as you are protecting against fraud or illegality on behalf of a business. However, if you act as a controller for some of the personal information (e.g. you use visitor data obtained from one of your customers to inform other customers about the risk presented by that same visitor) and use it for your own commercial purposes, then you will no longer qualify as a "service provider" and cannot rely on the business purpose exemption. Companies like these will need to consider what stance to take, as it will be difficult for them to offer opt out rights (they typically have no interface with the end user) and doing so would also undermine the nature of their services.

    4) Mergers, acquisitions and other corporate sale transactions: This exemption applies where a third party takes control of all or part of the business, and personal information is transferred as an asset as part of that transaction. If the acquirer materially changes or alters the way it uses or shares a consumer's personal information, it must provide prior notice which must be sufficiently prominent and robust to ensure existing consumers can exercise their right to opt out.
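
To illustrate how these exemptions might be operationalised, here is a rough, non-authoritative helper that classifies a disclosure. The category labels are our own shorthand for the four scenarios above, not terms from the Act, and edge cases (such as a vendor acting outside its service-provider role) would still need case-by-case analysis.

```python
# Illustrative only: a rough decision helper reflecting the four exemptions
# described above. Category names are shorthand, not statutory terms.
EXEMPT_DISCLOSURES = {
    "communicating_opt_out_preferences",   # e.g. passing a suppression list
    "consumer_directed_interaction",       # intentional interaction with a third party
    "service_provider_business_purpose",   # sharing with a service provider for a business purpose
    "merger_or_acquisition",               # assets transferred in a corporate transaction
}

def is_sale(disclosure_type: str, for_valuable_consideration: bool) -> bool:
    """Return True if a disclosure is likely to be a 'sale' under the CCPA."""
    if not for_valuable_consideration:
        return False
    return disclosure_type not in EXEMPT_DISCLOSURES

print(is_sale("ad_tech_targeting_data", True))              # True - likely a sale
print(is_sale("service_provider_business_purpose", True))   # False - exemption applies
```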

    What are the Opt out requirements?

    If you sell personal information and cannot rely on one of the above exemptions, then you must comply with the Opt out requirements. These require that you:

    • provide a "Do Not Sell My Personal Information" ("DNSMPI") link on (i) your homepage, (ii) any webpage where you collect personal information, (iii) your mobile app's platform or download page and within the app itself, (iv) your privacy notice, and (v) wherever else you describe Californian consumers' rights under the CCPA,

    • stop selling personal information as soon as a consumer exercises their right to opt out, unless the consumer subsequently provides express authorization for you to do so, and

    • wait at least 12 months before requesting authorization from the consumer to sell their personal information again.
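
A minimal sketch of those mechanics, assuming a simple in-memory store and hypothetical identifiers, might look like this – it is intended only to show the stop-selling and 12-month rules, not to be a complete compliance implementation:

```python
# Illustrative only: minimal opt-out bookkeeping. All names are hypothetical.
from datetime import datetime, timedelta

opt_outs: dict[str, datetime] = {}  # consumer_id -> date of opt-out

def record_opt_out(consumer_id: str) -> None:
    opt_outs[consumer_id] = datetime.utcnow()

def may_sell(consumer_id: str, reauthorised: bool = False) -> bool:
    # Selling must stop once the consumer opts out, unless they later
    # expressly re-authorise the sale.
    return consumer_id not in opt_outs or reauthorised

def may_request_reauthorisation(consumer_id: str) -> bool:
    # At least 12 months must pass before asking the consumer again.
    opted_out_at = opt_outs.get(consumer_id)
    return opted_out_at is None or datetime.utcnow() - opted_out_at >= timedelta(days=365)

record_opt_out("consumer-123")
print(may_sell("consumer-123"))                     # False
print(may_request_reauthorisation("consumer-123"))  # False (too soon)
```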

    If you are not a business but a "third party" who has been sold personal information by a business, you must not sell the information unless the consumer has received explicit notice and been provided with an opportunity to exercise the right to opt-out.

    Some businesses will need to give careful thought to how they will grant these rights, as doing so may not be easy in every situation. Returning to our previous example, if you are a fraud detection service that retains personal information only in the form of IP addresses and other device data, you may have difficulties in matching the particular consumer who made the request with all of the relevant data in your systems – you may need additional information from the consumer about all of their different devices in order to identify their data.

    Equally, if you are an intermediary in the ad tech ecosystem then it may not be sufficient to place the DNSMPI link on your own website – the opt out may also need to be provided through each publisher's website where the information is actually being collected. However, would the existing Ad Choices icon be sufficient or does the CCPA envisage that there must be a separate DNSMPI link on these sites? Again, further guidance would help to address some of these uncertainties.

    The Opt in requirements

    The CCPA also contains more restrictive "opt in" rights for children:

    "A business shall not sell the personal information of consumers if the business has actual knowledge that the consumer is less than 16 years of age, unless the consumer, in the case of consumers between 13 and 16 years of age, or the consumer’s parent or guardian, in the case of consumers who are less than 13 years of age, has affirmatively authorized the sale of the consumer’s personal information." (Section 1798.120(c))

    This means that a business can only sell the personal information of a child between the ages of 13 and 16 with the child's consent and can only sell the personal information of a child under 13 with the consent of the child's parent or guardian. This applies where the business has "actual knowledge" of the consumer's age, although the CCPA is clear to state that any business that wilfully disregards the consumer’s age is deemed to have actual knowledge.

    So where sites or services are obviously attractive to or targeted at children, then this provision is likely to apply. For example, an online kids TV channel will most likely need to switch off all further sales of data by default (i.e. no ad tracking), unless they can obtain clear opt-in consent.
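
As a purely illustrative sketch, the default position could be encoded as follows, assuming the business has actual knowledge of the consumer's age (the age boundaries follow Section 1798.120(c); everything else is hypothetical):

```python
# Illustrative only: default position for sales of children's data where the
# business has actual knowledge of the consumer's age.
def may_sell_childs_data(age: int, child_opt_in: bool = False, parental_consent: bool = False) -> bool:
    if age >= 16:
        return True            # the opt-in rule no longer applies; the general opt-out regime governs
    if 13 <= age < 16:
        return child_opt_in    # the child must affirmatively authorise the sale
    return parental_consent    # under 13: a parent or guardian must authorise

print(may_sell_childs_data(14))                         # False - no opt-in yet
print(may_sell_childs_data(10, parental_consent=True))  # True
```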

    So what should you do to ensure compliance with the Opt out / Opt in requirements?

    The GDPR does not focus on the "sale" of personal information – and there are no direct provisions relating to such sales – so it's likely that businesses will need to implement most of the requirements for this right from scratch.

    To ensure compliance with the Opt out / Opt in requirements, a business should:

    • Identify whether you "sell" personal information: Carry out a data mapping exercise to ascertain all situations in which you disclose personal information to third parties. You will then need to consider carefully whether this may amount to a "sale" as per the guidance above.

    • Provide notice to consumers: If you do sell personal information, ensure that your privacy notice is updated to inform consumers of their right to opt out. You must also ensure consumers are given the opportunity to opt out before their information is sold.

    • Create a DNSMPI link: Create a DNSMPI link on your homepage and any other web pages and apps where personal information is collected. Note that you may place the link on your homepage or on a separate page dedicated specifically to California consumers.

    • Identify the age of your consumers: Consider whether you collect any children's personal information and whether you would be deemed to have knowledge of age. If so, ensure that you turn off sales by default and only sell such personal information if you obtain appropriate consent.

    • Train your staff: The CCPA also requires that you train any staff who handle consumer inquiries to ensure that they are aware of the Opt out requirements and know how to handle consumer requests. This could be provided as part of more general privacy training (for instance, alongside broader CCPA and/or GDPR training) or as shorter standalone training.

    Next up…

    We'll be looking at the rights to deletion and non-discrimination – watch this space!

     

     

     


    As promised, we return with Part 2 of our CCPA Blog Series.

    Last time, we looked at the CCPA's scope of applicability, the key definitions/concepts, and the first of the 5 core consumer rights under the CCPA, the Notice requirements. In this second post, we'll be delving more deeply into the next consumer right – the right to access – which, as explained below, comes hand in hand with the right to data portability.

    We'll consider how these rights differ from their counterparts under the GDPR, and what this means from a practical implementation standpoint.

    The Access requirements

    The consumer's right to access is scattered over several different sections of the CCPA. The main access-related provision states that:

    "A consumer shall have the right to request that a business that collects a consumer's personal information disclose to that consumer the categories and specific pieces of personal information the business has collected." (Section 1798.100(a) - emphasis added)

    So, just like the GDPR, the CCPA provides consumers with the right to obtain a copy of the personal information that a business has collected about them.

    In subsequent sections, the CCPA stipulates that a business must also disclose to the consumer, upon request:

    • The categories of personal information collected,

    • The categories of sources from which the personal information was collected,

    • The business or commercial purposes for collecting or selling the personal information,

    • The categories of third parties with whom the business shares the personal information,

    • The categories of personal information sold and the categories of third parties to whom the personal information was sold (by category of personal information for each third party) and

    • The categories of personal information disclosed for a business purpose.

    (at 1798.110(a) and 1798.115(a))

    The CCPA states that the "categories of personal information" required to be disclosed above should follow the Act's definition of personal information – for example, "internet information", "geolocation data", "education information" etc. Aside from this, it isn't clear from the legislation how specific each of these disclosures needs to be, or whether a business could seek to refer the consumer to its online privacy notice which provides these disclosures (as they pertain to consumers generally). Just like the GDPR, businesses will no doubt take a variety of different approaches and levels of transparency on this.

    So what additional work is required to meet the CCPA's access rights?

    Many companies will already have processes in place under the GDPR – and the Privacy Shield - to respond to subject access requests from EU individuals. In order to expand these processes to accommodate Californian residents' access rights under the CCPA, a business should consider the following:

    • Designated methods for exercising rights: The CCPA mandates a toll-free number and web address for submitting CCPA access requests. Businesses should therefore register a US toll free number for this purpose, if they don't have one already.

    • Verifying identity: Unlike the GDPR, which encourages verification of the individual's identity but ultimately leaves it to the business's discretion, the CCPA states that a business is only obligated to disclose information in response to a "verifiable consumer request", so carrying out verification checks is mandatory – this should be incorporated into your access procedures.

    • Timescale: The copy of the personal information and the information set out above must be delivered to the consumer within 45 days. The deadline can be extended once by an additional 45 days (or 90 days, as stated elsewhere in the Act – an apparent drafting inconsistency). The 45-day timescale is the same as for complaints under the Privacy Shield; the GDPR, however, has a tighter timescale of one month (with scope for a further two-month extension). It probably makes sense to simply align all internal processes to one month to be safe (see the sketch after this list).

    • Scope of the information: The CCPA only requires disclosure of personal information collected, sold or disclosed in the 12 months preceding the request. The GDPR, however, requires disclosure of all of the personal information that a business processes about an individual. A business should consider whether it will apply this cut-off date or simply take the more extensive GDPR approach.

    • Nature of the disclosures: The CCPA and the GDPR require businesses to make slightly different disclosures to the consumer. Under the CCPA, a business must disclose the categories of third party recipients to whom the information has been "sold" or "disclosed for a business purpose". The GDPR does not require this level of granularity, simply requiring disclosure of the categories of third party recipients more generally. On the other hand, the GDPR requires businesses to disclose certain information not required under the CCPA – such as data retention periods, recipients located outside the EU, details of automated decision-making, and the appropriate safeguards in place for international transfers. Given these subtle differences, it makes sense to prepare different template responses and search parameters for access requests.
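
As a rough illustration of the "align to the stricter deadline" point made under Timescale above, the following sketch computes both deadlines and picks the earlier one; it simplifies the GDPR's one-month and two-month periods to 30 and 90 days, which is an assumption rather than a legal rule:

```python
# Illustrative only: computing access-request deadlines under the two regimes
# and aligning internal processes to the stricter one.
from datetime import date, timedelta

def ccpa_deadline(received: date, extended: bool = False) -> date:
    # 45 days, extendable once by a further 45 days.
    return received + timedelta(days=90 if extended else 45)

def gdpr_deadline(received: date, extended: bool = False) -> date:
    # Roughly one month, extendable by two further months (simplified to 30/90 days).
    return received + timedelta(days=90 if extended else 30)

received = date(2020, 1, 2)
internal_target = min(ccpa_deadline(received), gdpr_deadline(received))
print(internal_target)  # the GDPR date - the safer single internal deadline
```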

    The bundled up right to data portability

    The right to data portability is worth a separate mention here because it is one of the more striking differences between the CCPA and the GDPR.

    Under the CCPA, the right to access has been merged with the right to data portability. See Section 1798.100(d), which provides that where a business responds to an access request "electronically", it is required to provide the personal information to the consumer in "a portable and, to the extent technically feasible, in a readily useable format that allows the consumer to transmit this information to another entity without hindrance".

    This means that a business must automatically provide data to the consumer in a format that is "readily useable" by other competing services, regardless of whether the consumer has requested it. The only circumstance in which a business will not need to do this is where it is not "technically feasible" (which sounds like a surprisingly high hurdle). Potentially, this could be quite burdensome, especially on smaller businesses.
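
For illustration, a "portable and readily useable" export could be as simple as serialising the consumer's record to JSON, as in the sketch below; the field names are hypothetical and whether any given format is sufficient will be a judgement call.

```python
# Illustrative only: exporting a consumer's collected data in a machine-readable
# format (JSON here). Field names are hypothetical.
import json

consumer_record = {
    "identifiers": {"email": "consumer@example.com", "mobile_ad_id": "abc-123"},
    "internet_activity": {"pages_viewed": 42, "last_seen": "2019-06-01"},
    "inferences": {"preferences": ["hiking", "podcasts"]},
}

def export_portable_copy(record: dict) -> str:
    # JSON (or CSV) is commonly treated as portable and readily useable;
    # whether it suffices in a given case is a judgement call.
    return json.dumps(record, indent=2)

print(export_portable_copy(consumer_record))
```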

    By contrast, under the GDPR the right to data portability is a separate right from the right of access, and one which only sophisticated consumers are likely to invoke. In addition, the right only applies in limited circumstances – in particular, to data that the individual has provided to the controller and which has been processed on the grounds of consent or contractual necessity. The CCPA's right to data portability contains no such limitations and seemingly applies to all data collected by a business – which, in principle, could include anything from analytics data and marketing data to profiling or inferential data. If so, this will certainly raise a few eyebrows, and further guidance from the Attorney General would be most welcome.

    Businesses which have not yet implemented technical measures for data portability under the GDPR (for example, because they do not process on the grounds of consent or contractual necessity) will need to address this right head on under the CCPA.

    Next up…

    We'll follow up soon with more on the remaining consumer rights (deletion, opt out and non-discrimination). Stay tuned!

     


    Just when you thought you were seeing daylight again after those years of GDPR compliance work… and what does the State of California go and do? That's right - enact the California Consumer Privacy Act 2018 (CCPA) – a new privacy law designed specifically to protect Californian residents that has sort of the same but not the same requirements as the GDPR. Excellent!

    So, as an EU lawyer with GDPR fatigue, you may be thinking "We just got through GDPR… here we go again?" You've already spent months (if not years), plus the entire decade's legal budget on GDPR compliance. What you want to know is the bare minimum and the additional compliance obligations the CCPA gives rise to for a GDPR compliant(ish) business.

    This is the first of a number of blog posts intended to do precisely that – to condense the key takeaways and compliance actions for the CCPA but directed, in particular, to companies that want to leverage the work they have already done for GDPR to meet the CCPA's compliance obligations.

    This first blog post focuses on introducing the CCPA and its key concepts, its general scope of application, and the Notice requirements.

    What is the CCPA?

    The CCPA was passed by the California State Legislature and signed into law by Governor Jerry Brown on June 28, 2018.

    It's famous for having been hastily passed by the Legislature after only a week of debate. As a result, there is still uncertainty around the scope of the CCPA, and many of the provisions within it require further clarification.

    The Attorney-General is required to adopt "regulations" (essentially, more detailed guidance on how businesses can comply) before July 1, 2020 – and everyone is waiting with bated breath.

    The CCPA is effective on January 1, 2020 with enforcement to begin six months after the adoption of the Attorney-General's regulations, or July 1, 2020, whichever is sooner.

    First things first…

    There is one very significant difference between the GDPR and the CCPA that should be called out upfront. The GDPR is intended to be a holistic and completely overarching framework that governs the handling of all EU personal data. The CCPA, however, is really something much smaller: a limited set of rights given to Californian residents covering some of their personal data. Many of these rights may look very much like the GDPR's, but the CCPA requirements are nowhere near the same scale and scope as the GDPR.

    So don't worry – this isn't GDPR all over again. The CCPA is something that can hopefully be managed on a much smaller scale.

    Step 1: Does the CCPA apply to you?

    • The CCPA has worldwide effect and applies to any company "doing business in California". This concept isn't defined in the CCPA but it's generally understood by Californians (based on interpretation of similar language by the Californian Franchise Tax Board) that it refers to companies that "actively engage in any transactions for financial or pecuniary gain."

    • A bit like the GDPR, the CCPA applies mainly to businesses that are controllers of personal information. Although it doesn't use the word "controller", the CCPA's definition of a "business" is of an entity that "determines the purposes and means of the processing of consumers' personal information" (just like the definition in the GDPR).

    • In order for the CCPA to be applicable, the business should meet one of three thresholds:

      • Has annual gross revenue of over $25m;

      • Buys, receives, sells or shares the personal information of 50,000 or more Californian residents, households or devices per year; or

      • Derives 50% or more of its annual revenue from selling California consumers' personal information.

    • Note that companies that share "common branding" with such a business will also end up being subject to the CCPA (presumably, regardless of whether that business itself meets the above threshold requirements).
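
A simple, purely illustrative way to express the threshold test is sketched below; it deliberately ignores the "doing business in California" and common-branding questions, which need case-by-case assessment.

```python
# Illustrative only: a rough applicability check based on the thresholds above.
def ccpa_applies(annual_gross_revenue_usd: float,
                 ca_records_per_year: int,
                 share_of_revenue_from_selling_pi: float) -> bool:
    return (
        annual_gross_revenue_usd > 25_000_000
        or ca_records_per_year >= 50_000
        or share_of_revenue_from_selling_pi >= 0.5
    )

print(ccpa_applies(10_000_000, 60_000, 0.0))  # True - record-volume threshold met
print(ccpa_applies(5_000_000, 1_000, 0.1))    # False - no threshold met
```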

    Step 2: What personal information is caught by the CCPA?

    The CCPA's definition of personal information is very similar to – in effect, practically the same as – the GDPR's. It defines "personal information" as:

    "…information that identifies, relates to, describes or is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household".

    I suppose one difference is that the CCPA's definition captures personal information relating to a "household", which the GDPR doesn't do expressly. However, it's difficult to imagine many scenarios where a business would be collecting personal information about a "household" (think a video streaming service shared by a family, or a Nest thermostat tracking the activities of people in a house) that wouldn't also be caught by the GDPR. There might be an argument that, if the household had so many individuals within it that it was practically impossible to know which particular person's activity was being tracked, the data would not amount to personal data under the GDPR but would under the CCPA – but that's probably a very limited, unlikely exception.

    One of the interesting things about the CCPA is that it gives a very comprehensive list of examples of personal information (which the GDPR doesn't). Although it is all data that would be caught by the GDPR, the legislation leaves it in no doubt as to what kinds of identifiers and device data would fall within its remit. Examples expressly cited include:

    • Identifiers: including unique personal identifiers (which includes cookies, beacons, pixel tags, mobile adIDs, unique pseudonyms, probabilistic identifiers, a telephone number); online identifiers; IP addresses; account names; etc.

    • Biometric data: such as DNA for the purposes of identification; face, retina, fingerprints; voice recordings; keystroke patterns; sleep, health and exercise data,

    • Internet or other electronic network activity information: such as browsing history; search history; clickstream data; a consumer's interaction with an online ad; etc.

    • Geolocation data

    • Inferences drawn from any of the information to create a profile about a consumer: including their preferences, characteristics, psychological trends, predispositions, behaviour, attitudes, intelligence, abilities, and aptitudes.

    Certain data is excluded from the CCPA – such as personal information made available in federal, state or local government records (i.e. "publicly available data"), de-identified or aggregated data, and information covered by other US privacy legislation (such as medical information under HIPAA, and information protected by the Gramm-Leach-Bliley Act and the Driver's Privacy Protection Act).

    Step 3: Understanding the 3 key concepts: "collection", "sale" and "disclosure for business purposes"

    The CCPA's rights and obligations center around 3 key concepts, so it's worth taking a little time to understand these first:

    • The concept of "collection" is simple enough and includes any kind of receipt or access to personal information, including receiving data "actively or passively, or by observing the consumer's behaviour".

    • "Sale" is defined as essentially any kind of disclosure to another business or third party "for monetary or other valuable consideration".

    • "Disclosure for a business purpose" is a very broad concept, referring to disclosures to a third party for a range of standard operational purposes - like performing services, detecting security incidents, protecting against fraud / illegality, debugging, analysing ad impressions, maintaining or servicing customer accounts, customer services, processing orders, payments, marketing, analytics, internal technological research, QA and so on.

    So "disclosure for a business purpose" seems to capture all disclosures to third party vendors providing services as a pure processor. On the other hand, the "sale" of personal information potentially catches any disclosure to a third party falling outside of that. For example, it could include collecting personal information through cookies for targeted advertising purposes, sharing personal information with a third party for marketing partnerships, or sharing data with a service provider who uses personal information to enrich their own data-sets, training machine learning models, or for technological research.

    Step 4: Understanding the 5 core rights under the CCPA

    Let's park those concepts for a minute. An understanding of "collection", "sale" and "disclosure for business purposes" is essential for understanding the 5 core consumer rights introduced by the CCPA:

    1. Notice

    2. Access

    3. Deletion

    4. The right to opt out

    5. The right to non-discrimination

    In this first blog post, we will explore the Notice requirement. The other rights will be explored in posts to come.

    The Notice requirements

    The CCPA requires businesses to include specific information in their Privacy Notices. Many of these items are typical transparency items - things you are likely to have already included under GDPR such as: the categories of personal information collected, the purposes for the collection, the categories of third parties with whom you share personal information, the categories of sources of the personal information, etc.

    The good news is that businesses will be able to leverage the Privacy Notices they have already put in place for GDPR. There are a few additional CCPA-specific disclosures – and you'll most likely want to include these in a section of your Privacy Notice directed specifically to Californian residents.

    So, presuming you already have a GDPR-standard Privacy Notice, the following CCPA-specific disclosures will likely need to be added (one way of tracking them is sketched after this list):

    • A description of the Californian consumer’s CCPA rights (i.e. access, deletion, right to opt out) and the designated methods for submitting such requests (i.e. businesses must have a toll free number and web address).

    • The CCPA appears to require more granular lists relating to the categories of personal information “collected”, “sold” and “disclosed for a business purpose” in the past 12 months. The level of detail expected here is yet to be seen/clarified (e.g. whether three separate lists need to be provided breaking the personal information down into each category, or whether more generic disclosures will suffice);

    • The specific business or commercial purposes for the collection and sale of personal information (again, the level of granularity required is unclear but it could require setting out each business purpose and the PI collected, or sold, for each);

    • If no personal information is "sold" or "disclosed for a business purpose", then the Privacy Notice must state so expressly; and

    • A separate link to the “Do Not Sell My Personal Information” internet webpage (which will be discussed in later blog posts) should be included.
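
One way to keep these CCPA-specific disclosures manageable is to hold them in a structured form from which the Californian section of the Privacy Notice can be generated and refreshed for the rolling 12-month window. The sketch below is illustrative only, with hypothetical keys and placeholder values.

```python
# Illustrative only: a structured record of the CCPA-specific notice content.
# All keys and values are hypothetical placeholders.
ccpa_notice_section = {
    "consumer_rights": ["access", "deletion", "opt-out of sale"],
    "request_methods": {"toll_free_number": "1-800-000-0000",
                        "web_form": "https://example.com/ccpa-requests"},
    "categories_collected_last_12_months": ["identifiers", "internet activity", "geolocation data"],
    "categories_sold_last_12_months": [],            # must be stated expressly if none are sold
    "categories_disclosed_for_business_purpose": ["identifiers"],
    "do_not_sell_link": "https://example.com/do-not-sell-my-personal-information",
}

if not ccpa_notice_section["categories_sold_last_12_months"]:
    print("The notice must state expressly that no personal information is sold.")
```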

     

    More to come…

    That's it for our first blog post. In the next ones, we will be exploring the rights to access, deletion and opt out, and how they differ to the GDPR data subject rights. Watch this space :)

     


    As time goes by, the prospect of the United Kingdom leaving the European Union on 29th March 2019 with no deal seems more and more realistic. While the UK government strives for a better deal that can be adopted by the UK Parliament, the European institutions, for their part, have started planning and preparing for the UK's departure from the EU without any deal. Following its seventh plenary session, held in Brussels on 12th February 2019, the European Data Protection Board ("EDPB") adopted two information notes: a general note dealing with data transfers under the GDPR in the event of a "no-deal" Brexit; and a specific note dealing with companies which have the Information Commissioner's Office ("ICO") acting as the lead Data Protection Authority ("DPA") for their Binding Corporate Rules ("BCR").

    1. What will happen to companies who are transferring personal data to the UK in case of a no-deal Brexit?

    On 30th March at 00:00 CET, if the UK Parliament has not adopted the Withdrawal Agreement negotiated between the EU and UK representatives, the UK will leave the EU without a deal and become a third country (see our infographic for a visual summary of the situation). That outcome would most likely be irreversible. What does this mean from a GDPR standpoint, and what are the consequences of a "no-deal" Brexit for transfers of personal data between the EU and the UK?

    From the moment the UK becomes a third country, this will trigger article 44 of the GDPR, whereby "any transfer of personal data which are undergoing processing or are intended for processing after transfer to a third country (…) shall take place only if (…) the conditions laid down in this Chapter [Chapter 5 – Transfers of personal data to third countries or international organisations] are complied with by the controller and processor".

    What this means is that companies in the EU (including the EEA) that are currently freely sending their personal data to the UK will have to implement appropriate safeguards in accordance with article 46 before transferring any personal data to the UK. These appropriate safeguards include:

    • Standard Data Protection Clauses adopted by the European Commission or a DPA
    • Ad Hoc Data Protection Clauses agreed between the EU-based company (data exporter) and the UK-based recipient of the data (data importer)
    • Binding Corporate Rules
    • Codes of Conduct
    • Certification Mechanism

    In the absence of an adequacy decision by the European Commission, or of appropriate safeguards, such companies may only transfer personal data to the UK if one of the legal derogations listed under article 49 applies. The EDPB highlights that these legal derogations must be interpreted restrictively and mainly relate to processing activities that are occasional and non-repetitive. Needless to say, companies which have been transferring personal data massively and regularly to the UK for years will not be able to rely on these legal derogations, or may only do so on a case-by-case basis. The risk is therefore that, if the UK crashes out of the EU, hundreds of companies will be left in limbo, without any suitable legal basis for transferring personal data to the UK.

    Does this look like déjà vu? Indeed, US companies will remember the turmoil that followed the invalidation of Safe Harbour by the Court of Justice of the EU in 2015. But unlike Safe Harbour, which was eventually renegotiated into a new legal framework renamed the "EU-US Privacy Shield", there is currently no plan for a similar framework between the EU and the UK.

    Furthermore, there has been no official announcement that the UK has opened discussions with the European Commission to obtain adequacy status, even though this is considered by many as a logical next step. Given that the GDPR is the law today in the UK, one would hope that the UK could easily obtain adequacy status. Unfortunately, things are not so simple. First, the UK needs to leave the EU before it can apply (as a third country) for adequacy status. Second, the European Commission will need to assess the adequate level of protection offered by the UK in light of the GDPR's requirements, which includes assessing the rule of law and the protection of human rights in the fields of public security, defence, national security and criminal law, as well as the access of public authorities to personal data. Lastly, very few countries have acquired adequacy status and this is usually a very long process (as illustrated by Japan's recent accession to the magic circle of adequate third countries after years of negotiation). One can only hope that the process would be accelerated for the UK, but we're still talking a minimum of two years before an adequacy decision is pronounced. In the meantime, where does that leave companies?

    It is not difficult to see why a "no-deal" Brexit will put companies in a very, very difficult position. They will be required to implement appropriate measures as quickly as possible if they do not want to find themselves in violation of the GDPR. One can easily see how painful an exercise this will be. Unfortunately, the EDPB does not mention anything about a grace period in its information note. Therefore, unlike US companies, who benefited from a grace period of several months to give them time to certify under the Privacy Shield, with Brexit one should assume that there will be no grace period, and therefore companies will be expected to have put such measures in place by March 30th.

    It is completely unrealistic to think that all companies will have implemented appropriate measures by March 30th (most of them are still in "wait and see" mode, wishfully thinking that Brexit is just a bad dream). It is even more unrealistic to think that companies will stop transferring their data to the UK simply because Brexit has removed the legal basis they were previously relying on to do so freely and without any restriction. Hopefully, the EU DPAs will adopt a pragmatic approach and will not investigate or issue sanctions against companies in the months that follow Brexit, to give them time to implement the necessary measures.

    2. What will happen to companies which have the ICO as the lead DPA for their BCR?

    The EDPB's second note deals specifically with companies who have BCR or are considering putting BCR in place. In order to understand the consequences of Brexit on BCR, it is important first to understand that, as part of the BCR approval procedure, one DPA must act as the "lead supervisory authority". This Lead DPA reviews the applicant's draft BCR before sharing it with the other DPAs concerned and finally submitting it to the EDPB for approval. In the event of a "no deal" Brexit, the ICO will no longer be able to act as a Lead DPA because it will no longer represent an EU or EEA Member State. In fact, the ICO will lose all its voting and decision-making powers as a member of the EDPB and will no longer be authorized to attend the plenary meetings as a permanent member of the EDPB.

    As a result, what will happen to BCR applications for which the ICO is the Lead DPA? Several scenarios must be addressed:

    1. Your organization already has its BCR approved: If you've already obtained the approval of your BCR and the ICO acted as the Lead DPA, then you must identify a new BCR Lead DPA in accordance with the criteria set out in WP 263. This may require you to update your internal procedure for notifying the lead DPA about any material changes that have been made to your BCR Policy and to the list of group entities within your organization that are bound by the BCR.
    2. Your organization is considering submitting an application for BCR to the ICO: If you are considering whether to submit your BCR application to the ICO but have not done so yet, then you should submit your BCR application to another DPA. This will concern in particular organizations that have their European headquarters in the UK or that are considering designating a UK-based affiliate as the entity within the group with delegated data protection responsibilities. In such cases, organizations should identify a new BCR Lead DPA in another EU Member State in accordance with the criteria set out in WP 263. If you have already submitted your BCR to the ICO but the review process has not yet started, then it is likely that the ICO will tell you that post March 30th it can no longer act as the Lead DPA and that you should identify another DPA.
    3. Your organization has applied for BCR and is in the process of obtaining approval: Organizations which will be most impacted by a "no-deal" Brexit are those which have already submitted their BCR to the ICO and whose application is under review by the ICO. Such companies must identify a new BCR Lead Supervisory Authority according to the criteria laid down in WP 263. The new BCR Lead Supervisory Authority will take over the application and formally initiate a new procedure at the time of a "no deal" Brexit (presumably starting on 30th March 2019). As a consequence, those companies will see their BCR application handed over to a new DPA, which will start the review process all over again. This change is likely to extend the duration of the BCR approval process and delay the approval of an organisation's BCR. One can only hope that the new Lead DPA which takes over a BCR application will adopt a pragmatic approach and take into account the review previously carried out and the comments already provided by the ICO, as opposed to starting entirely from scratch.

    Companies whose draft BCRs have already been submitted to the EDPB will be less impacted. If a draft ICO decision approving BCRs is pending before the EDPB at the time of a "no-deal" Brexit, the BCR applicant needs to identify a new BCR Lead Supervisory Authority according to the criteria laid down in WP 263. The new BCR Lead will take over and re-submit a draft decision for the approval of the BCRs to the EDPB, presumably without starting the review process over.

    Organisations that are currently in the BCR review process with the ICO should do everything possible to obtain the approval of their BCR before March 30th, bearing in mind that the ICO is one of the DPAs with the most pending BCR applications and is unlikely to get them all approved before March 30th.

    In summary, a "no-deal" Brexit will open up a period of uncertainty during which companies which transfer personal data to the UK will be required to implement appropriate measures until new permanent measures (such as an adequacy decision) are adopted. Unavoidably, this will put a strain on businesses and so the sooner they get started, the better.


    On 25 and 28 May 2018, the French Data Protection Authority (the "CNIL") received group complaints from the associations None Of Your Business and La Quadrature du Net. La Quadrature du Net was mandated by 10,000 people to refer the matter to the CNIL. The associations claimed that Google did not have a valid legal basis to process the personal data of its users, particularly for ads personalization purposes. The complaints focused specifically on Android's set-up process, where users need to create a Google account in order to use their device.

     

    On 21 January 2019, the CNIL's Restricted Committee – which is responsible for imposing sanctions – found two types of GDPR infringement: a lack of transparency and information regarding the processing operations carried out by the tech giant, and a lack of legal basis for the processing of personal data for advertising purposes. The CNIL considered that these substantial breaches undermined the legitimate aspirations of individuals who wish to maintain control over their own personal data.

     

    This article summarizes the CNIL's decision on these two fundamental issues and draws some practical observations that may potentially concern other Internet players.

     

    For the sake of this article, relevant portions of the CNIL's decision were translated. Please be advised that these are neither official nor certified translations.

     

     

    1. Violation of the transparency principle and the obligation to inform users

    On the merits, the CNIL found that the information Google had provided to its users did not comply with the principles of accessibility, clarity and intelligibility set out in Article 12 of the GDPR. The Restricted Committee also stressed the fact that some of the mandatory information listed in Article 13 of the GDPR had not been provided to the data subjects. 

     

    • On accessibility: the importance of structuring the information provided

    Article 12 of the GDPR states that information must be provided in an "easily accessible form". The CNIL states that the obligation of accessibility is partly based on the ergonomic choices made by the controller.

     

    In this case, the French DPA noted that the information was "scattered" across several documents, thus making it difficult for data subjects to easily access the entirety of the information. The CNIL further considered that these documents "contain buttons and links that must be activated to obtain additional information", which leads to a "fragmentation of information", thus forcing users to multiply the "number of clicks" necessary to access the various documents. On this point, the CNIL especially found that information about ads personalization processing was retrievable after "five actions of the data subjects". All in all, this makes it difficult to find the information, even for privacy practitioners.

     

    It is worth noting that the CNIL assessed the "overall layout of information" put in place by Google, suggesting that, more than ever, the transparency principle must be embedded into the user experience. In particular, data controllers must pay attention to the first level of information that is provided to data subjects (see below).

     

    • On clarity and intelligibility

    • Higher scrutiny for massive and intrusive processing

    The CNIL considered that the obligation of clarity and intelligibility must be assessed in light of the nature of each processing operation and taking into account its concrete impact on data subjects.

     

    In this case, the CNIL regarded the processing of personal data carried out by Google as “massive and intrusive in nature”. This qualification arose from a number of factors detailed with great precision by the French DPA:

    • The significant number of services offered by the company;
    • The wide variety of sources the data originated from (e.g., Gmail account, YouTube, Google Analytics, etc.);
    • The very nature of some of the data obtained individually (e.g., geolocation data, browsing history or other data likely to reveal with a “significant degree of accuracy many of the most intimate aspects of data subjects' lives”);
    • The combination of the said data.

    Consequently, the particularly massive and intrusive nature of Google's processing triggered higher scrutiny. The CNIL considered in particular that the principle of clarity and intelligibility must be assessed in light of the particular characteristics of the processing operations at issue. In this case, users were not able to sufficiently understand the specific consequences of the processing. The descriptions of the purposes of the processing and of the data collected were deemed too generic, too vague and incomplete. Such descriptions did not allow users to measure the extent of the processing and the degree of intrusion into their private life. The CNIL appears to apply here a balancing test: the more invasive the processing, the more comprehensible and clearer the information must be.

    • Assessment of the clarity and intelligibility of the legal basis

    The CNIL also considered that the lack of clarity and intelligibility applied to the legal basis for the ads personalization processing. While Google claimed to rely only on consent as the legal basis, the CNIL found that the company also relied on its legitimate interests in its privacy policy, in particular to carry out marketing activities. The Restricted Committee found that the distinction between ads personalization processing and marketing processing was rather unclear and did not allow users to understand which processing relied on consent and which on legitimate interests. The CNIL highlighted the importance of defining a clear and distinct legal basis for each processing operation and, more explicitly, of clearly defining the nature of the processing operations envisaged and their respective legal bases.

    • On the information to be provided to data subjects

    Citing Article 13 of the GDPR, the CNIL recalled that data subjects must receive fair processing information, confirming its view of transparency as a key component of the European data protection framework. Some interesting developments are worth highlighting.

     

    • No distinction between the transparency obligations set out in Article 13(1) and Article 13(2)

    Data protection practitioners have long agreed that it is unclear why the information to be provided to data subjects under Articles 13(1) and 13(2) is set out in two different provisions. As a reminder, in its detailed guidance on the right to be informed, the UK's Information Commissioner's Office (ICO) implies that both sets of information shall be given to data subjects in all cases.

     

    In this instance, the CNIL heavily sanctioned Google for failure to specify the period for which the personal data will be stored. The French DPA specifically stated that "this information is one of the mandatory items of information to be provided to the persons concerned pursuant to Article 13(2)(a) of the Regulation".

     

    Thus, the position of the French DPA supports the view that there is no practical distinction between Articles 13(1) and 13(2) and, consequently, no discretion for a data controller to distinguish the type of information to be provided to data subjects.

     

    • Clarification on the amount of information to be provided: "just the right amount"

    In its defense, Google argued that the right to be informed must be assessed in light of all the information tools made available to users at the time of the creation of their accounts and thereafter. Here, and unsurprisingly, the CNIL recalled that compliance with Article 13 of the GDPR shall be fully achieved at the time of the creation of the account, or at the time when personal data are obtained.

     

    However, the CNIL pointed out that the “provision of comprehensive information in the context of the very first layer of information would be counterproductive and would not comply with the transparency requirement”. Thus, the Restricted Committee seemed to distinguish between a “first layer of information” (i.e., provided at the time of the creation of the account) where data subjects should be enabled to grasp the “number and scope of the data processing operations undertaken”, and “further layers of information” (i.e., after the account has been created) where more comprehensive information should be provided.

     

    As a result, it appears that practitioners must strike a balance to determine just the right amount of information to be provided to data subjects: saying too much too soon may turn out to be counterproductive and saying too little too late may be deemed an infringement of the obligation of transparency and information. This promises to be a complex balancing exercise for internet players who offer a plethora of interrelated online services.

     

    2. Violation of the obligation to have a legal basis for ads personalization processing: consent not validly obtained

    On the merits, the CNIL found that Google violated its obligation to have a legal basis for the processing as set out in Article 6 of the GDPR or, more precisely, that valid consent had not been obtained, as it was neither sufficiently informed, specific nor unambiguous.

     

    • Not sufficiently informed

    The Restricted Committee's decision enshrined the principle that consent and transparency go hand in hand. In line with the EDPB's guidelines on consent, the CNIL recalled that for consent to be informed, data subjects must be clearly told what they are consenting to. In this case, due to the fragmented nature of the information and the lack of clarity on the exact nature of the processing, data subjects could not have a fair and informed perception of the nature and amount of data collected. Thus, consent was not sufficiently informed.

     

    • Neither specific nor unambiguous

    The CNIL also contended that Google's chosen user experience led to blanket consent. Indeed, even though users could modify some options associated with their account by clicking on the 'more options' button, the account personalization settings were pre-checked by default, which reflected, unless otherwise specified, users' consent to ads personalization processing. The fact that users' positive action was necessary to opt out of such settings meant that consent was not given by means of a clear affirmative action and thus was not unambiguous. Furthermore, where users did not click the 'more options' button, they had to agree to Google's terms of service and to the processing of their personal data as detailed therein. In so doing, users accepted all data processing as a whole. This blanket acceptance resulted in non-specific and thus invalid consent.

    On this last point, the Restricted Committee offered an interesting observation. The CNIL contemplated that, to some extent, a more generalized consent could legally be obtained for different but related purposes. For such generalized consent to be allowed, data subjects must be informed in advance about the different purposes of the processing and given the possibility to give consent for each purpose separately. Only then can data subjects be offered the choice to accept or refuse all data processing operations as a whole. The Restricted Committee specified that this should be the case without them having to take any particular action to access the information, such as clicking on a 'more options' button.

    Once more, the CNIL's analysis enshrines the importance of designing a clear notice mechanism. More generally, this shows the importance of raising awareness of privacy issues among the developers who design the user experience.
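    To make that design point more concrete, below is a minimal sketch of how per-purpose consent might be modelled so that nothing is pre-checked and every affirmative action is recorded separately. It is purely illustrative: the purpose names, fields and functions are hypothetical and are not taken from the CNIL's decision or from any particular product.

```typescript
// Hypothetical illustration only: a per-purpose consent model in which nothing
// is pre-selected and consent is recorded separately for each purpose.

type Purpose = "ads_personalization" | "marketing_emails" | "analytics";

interface ConsentRecord {
  purpose: Purpose;
  granted: boolean;       // defaults to false; only a user action can set it to true
  grantedAt?: string;     // ISO timestamp of the affirmative action, if any
  noticeVersion: string;  // which information notice the user saw when deciding
}

// All purposes start unchecked: absence of action means absence of consent.
function initialConsents(noticeVersion: string): ConsentRecord[] {
  const purposes: Purpose[] = ["ads_personalization", "marketing_emails", "analytics"];
  return purposes.map((purpose) => ({ purpose, granted: false, noticeVersion }));
}

// Records the user's explicit choice for one specific purpose.
function recordChoice(
  consents: ConsentRecord[],
  purpose: Purpose,
  granted: boolean
): ConsentRecord[] {
  return consents.map((c) =>
    c.purpose === purpose
      ? { ...c, granted, grantedAt: granted ? new Date().toISOString() : undefined }
      : c
  );
}

// A processing operation may proceed only if consent was affirmatively given for that purpose.
function mayProcess(consents: ConsentRecord[], purpose: Purpose): boolean {
  return consents.some((c) => c.purpose === purpose && c.granted);
}
```

    In such a model, an "accept all" shortcut could only ever be applied after each purpose has been presented and recorded separately; it never replaces the per-purpose records.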

     

    3. What now?

    Unsurprisingly, in the days that followed the CNIL's decision, Google announced that it would appeal the decision before France's Supreme Administrative Court ("Conseil d'État"). From a procedural standpoint, this case is far from over, and privacy practitioners will have their eyes riveted on the Court's ruling.

     

    In the meantime, the CNIL has given us some significant takeaways to chew on. First, transparency and lawfulness are essential components of any data processing activity. If you get them wrong, your entire processing activity may be flawed. Second, while the CNIL did consider that Google had not obtained valid consent, it did not analyse in detail whether Google could rely on its legitimate interests for some of its (less intrusive) processing activities. This may come as a disappointment for many companies in the ad tech sector who were hoping that the CNIL's decision would provide clarity on the possible legal grounds on which they can (or should) rely to run their business. For better or for worse, this leaves the door open to future litigation on the legitimate interests ground.

     

    Lastly, this decision does finally answer one question: who will the EU regulators go after first? It comes as no surprise that the first massive fine under GDPR was pronounced against Google. This is only the beginning of a likely series of DPA actions against US tech giants across Europe. However, companies in other business sectors that are less in the spotlight should not underestimate the risk of a sanction that could be taken against their own business if they fail to comply with the GDPR. This is only the first sanction and there will be many more to come…

     

    With special thanks to Paola Heudebert for her valuable contribution to this article.


    On December 28 2018, the French Data Protection Authority (the "CNIL") released guidance on the disclosure of data to business partners for direct marketing purposes. In essence, the CNIL enumerates five rules that companies who collect data directly from data subjects must comply with when disclosing such information to business partners and other organizations. The essential provisions of the said guidelines are summarized below. 

    • In what context do these rules apply?

    The rules set out below are meant to apply to: 

    - companies who collect personal data directly from their clients;

    - via online or paper forms;

    - who share this data with business partners or other organizations who may then use such data for their own direct marketing campaigns. The CNIL does not say whether these rules only apply to third-party companies and organizations. Presumably, that is how these rules are meant to apply, and so intra-group data sharing would not be covered by these rules.

    - where the marketing is carried out by SMS or email (other forms of marketing, such as telemarketing, accordingly seem to be excluded).

     

    • What are the rules set out by the CNIL?

    The CNIL recalls that to be valid, the disclosure of data must comply with the GDPR provisions in order to allow data subjects to exercise control over their own personal data. In particular, the CNIL has set out the following rules:

    Rule 1: the data subject must give consent prior to any disclosure of his/her data to a business partner and/or other organization who intends to use the data for the purposes described above.

    Rule 2: the data subject must be able to identify the recipients of data (i.e. business partners) via the form used to collect the data. 

    On this point, the CNIL provides for two possible modalities:

    • Either the data controller gives access to an exhaustive list of all the data recipients via the form itself and this list must be regularly updated;
    • Or, if the list is too long, the data controller should provide a link to such a list and to business partners' respective privacy policies.  

    Rule 3: the data subject must be informed about any changes in the list of recipients, in particular when new business partners have been added.

    As a general rule, the information provided to the data subjects must specify the name of the company who initially collected the data and the rights of the data subjects (in particular the right to object to the marketing). In addition, the CNIL considers that the data controller must provide up-to-date information about the list of recipients to the data subjects in the following manner:

    • First,  the initial data controller should provide an updated list of all the recipients in each e-mail or marketing communication that is sent to the data subjects.
    • Second, whenever a business partner receives the data from the data controller, it must, at the time of the first communication with the data subjects and at the latest within one month, inform the data subjects of the processing of their data.

    According to the CNIL, this two-step process will enable data subjects to follow the entire data life cycle more accurately and allow them to exercise their rights more effectively. 
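    As a rough illustration of rules 2 and 3 above (again using hypothetical names and structures rather than anything prescribed by the CNIL), a controller might maintain a single versioned list of recipients and derive both the collection-form disclosure and the per-communication update from it:

```typescript
// Hypothetical sketch: one authoritative, versioned list of business partners,
// reused for the collection form and for each subsequent marketing communication.

interface Partner {
  name: string;
  privacyPolicyUrl: string;
  addedOn: string; // date the partner was added to the list
}

interface RecipientList {
  version: number;
  updatedOn: string;
  partners: Partner[];
}

// Adds a new partner and bumps the version, so later communications can flag the change.
function addPartner(list: RecipientList, partner: Partner): RecipientList {
  return {
    version: list.version + 1,
    updatedOn: partner.addedOn,
    partners: [...list.partners, partner],
  };
}

// Text to embed in each marketing communication: the current recipients, with a flag
// when partners have been added since the version the data subject last saw.
function recipientsNotice(list: RecipientList, lastSeenVersion: number): string {
  const names = list.partners
    .map((p) => `${p.name} (${p.privacyPolicyUrl})`)
    .join(", ");
  const changed =
    list.version > lastSeenVersion
      ? " The list of recipients has changed since it was last shown to you."
      : "";
  return `Your data may be shared with: ${names}.${changed}`;
}
```

    Keeping one authoritative list of this kind makes it easier to flag, in each communication, that new partners have been added since the data subject last saw the list.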

    Rule 4: consent of the data subject must be obtained by the initial data controller and is only valid for the processing activities carried out by the business partners with whom it shares the data. In other words, if a recipient of the data (i.e. a business partner) shares the data with another third party who intends to use the data for its own marketing campaigns, the first recipient must obtain the data subjects' consent before doing so and inform them about the recipients of the data. Consequently, the obligations to provide notice and obtain consent flow from one recipient to another, but consent itself is not "transferable" and has to be renewed by every subsequent recipient of the data.

    Rule 5: Each business partner who is a recipient of the data and who in turn contacts the data subjects, must indicate, at the time of their first communication, how data subjects may exercise their rights, in particular their right to object as well as the source of the data. There are two ways for data subjects to exercise their right to object:

    • Either they exercise this right directly with the recipient of the data;
    • Or they exercise this right with the data controller who initially collected their data, who then has an obligation to notify all recipients that an individual has exercised this right to object.

    There are different ways in which companies can comply with these rules, depending on the means used to communicate with the data subjects, the manner by which consent is obtained as well as the interface used to provide notice to the data subjects.
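    Purely as an illustration of the second route under rule 5 (the data subject objecting via the controller who initially collected the data), the fan-out of an objection to every recipient could look like the hypothetical sketch below; the names and the notification channel are assumptions, not part of the CNIL's guidance:

```typescript
// Hypothetical sketch: when the initial controller receives an objection, it
// suppresses its own marketing and notifies every partner that received the data.

interface Recipient {
  name: string;
  // How the partner is told to stop marketing to this person; in practice this
  // would be whatever channel the data-sharing agreement provides for.
  notifyObjection: (email: string) => Promise<void>;
}

async function handleObjection(
  email: string,
  suppress: (email: string) => Promise<void>, // the controller's own suppression list
  recipients: Recipient[]
): Promise<void> {
  await suppress(email);                                               // stop the controller's own marketing
  await Promise.all(recipients.map((r) => r.notifyObjection(email)));  // fan out to every recipient
}
```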

    • What is the impact for companies?

    The principles above are not new; on the contrary, they derive from the GDPR and the EDPB's guidelines on consent and transparency. For example, Article 14(2)(f) of the GDPR requires data controllers to indicate the source of the personal data whenever data is not obtained directly from the data subject. However, the CNIL's interpretation of these rules is useful because it provides a more practical understanding of how companies must comply with the notice and consent principles of the GDPR. For example, the combination of rules 2 and 3 is likely to make life more burdensome for companies who share data with third parties because it will require them to review and update their lists of recipients periodically. Furthermore, it will require more collaborative work between the controller who collects the data from the customer and the business partner with whom it shares the data, particularly with respect to the handling of data subjects' requests. Inevitably, this will mean adding specific terms to agreements between controllers and their business partners to ensure that such recipients provide the necessary assistance to allow the controller to comply with its obligations under the GDPR.

    Lastly, while the CNIL's guidelines do not target data brokers specifically, they will inevitably have an impact on this business sector. Data brokers have been under significant scrutiny by the CNIL since it published its list of data processing operations for which a data protection impact assessment (DPIA) must be carried out. Indeed, the CNIL considers that profiling activities that rely on data obtained indirectly from third-party sources (e.g. data obtained by data brokers) are likely to result in a high risk. As a result, companies who rely on data brokers for their marketing campaigns are going to have to reassess their marketing strategy in light of this guidance.

    Special thanks to Paola Heudebert for her valuable contribution to this article.


    Ever since the 25th May last year, the privacy community has been on tenterhooks, waiting to see whether European DPAs would take advantage of the significant fining powers afforded to them under the GDPR.  That question has been answered today, with news that the CNIL, the French data protection authority, has imposed a whopping fine of EUR50M on Google.

    The fine concerns a complaint made to the CNIL by the not-for-profit association NOYB (“None of Your Business”) founded by Max Schrems - the lawyer and privacy activist noted for his complaints which eventually led to the collapse of the EU-US Safe Harbor regime.  The complaint was reportedly the first GDPR complaint ever made (in parallel with other complaints made by NOYB against Facebook, Instagram and WhatsApp on the same day), being filed on 25th May 2018 - the very day that the GDPR became applicable.  The complaint concerned the validity of consents obtained by Google for its data processing, alleging that Google “forced” the consent of its users.

    The CNIL has seen fit to agree with the complaint, and today reported that it has levied a fine of EUR50M on Google.  Details of the fine, and the CNIL's reasoning, are available here.  While you can read details of the complaint for yourself, a few key points are particularly worthy of note:

    1. European DPAs were given some whopping great big new fining powers under GDPR - up to 4% of annual worldwide turnover, or EUR20M, whichever is the higher (and the CNIL clearly took the words "whichever is the higher" to heart, choosing to impose EUR50M on Google).  Today's news is a clear statement of intent that they will use them - in other words, they'll walk the walk, not just talk the talk.
    2. There’s been a lot of misunderstanding about the functioning of the lead supervisory authority.  In the wake of the GDPR, many companies have talked about “selecting” their lead authority, or assumed that because they have an EU HQ in one Member State it must follow that the DPA in that Member State will always be their lead supervisory authority, whatever the issue.  Today’s message is don't get complacent about who your lead authority is!  Google had its EU headquarters in Ireland, but the CNIL still led this investigation and enforcement (not the Irish DPC).
    3. Personalised ads are firmly in DPAs' crosshairs at the moment - in recent months, the CNIL has issued decisions against mobile ad tech vendors Vectaury, Fidzup, Teemo and Singlespot, in each case emphasising the need for clearer transparency and consent.  And now we have this.  A big focus of the CNIL's decision was on the need for unambiguous consent for ad personalisation.  It's time to get serious about unambiguous consent for targeted ads - reliance on navigational or implied consent mechanisms will seemingly no longer cut it with EU DPAs.
    4. The complaints criticised Google for requiring users to “agree” to its privacy policy in order to use its services.  While asking users to “agree” to a privacy policy is still common practice for many companies, privacy notices are too long and too complex to be something that users can realistically understand and “agree” to.  Under GDPR consent needs to be freely given and specific, and must not be bundled - the user must be able to freely consent to specific activities on a case-by-case basis, e.g. consent to receive e-mails, or consent to use of their photograph within a promotional brochure etc.  Privacy notices are still needed for transparency of course - but they should serve as just that: informational notices, not catch-all consent-gathering documents.
    5. Will this decision lead to an appeal?  It would be naive not to expect one.  The size of the fine, and the significance to online advertising revenues (and for certain business models), means an appeal is all but inevitable. 
    6. Longer term, query what impact this will have on the future of tech, data collection and ad personalisation - is this the beginning of the revolution, or will fines simply be seen as a cost of doing business...?