The Department of Health and Human Services (“HHS”) recently published two advance notices of proposed rulemaking that address the accounting of disclosures and the potential distribution of civil monetary penalties to affected individuals.

The first advance notice would solicit the public’s views on modifying the HIPAA Privacy Rule as necessary to implement the accounting of disclosures provisions of the Health Information Technology for Economic and Clinical Health (“HITECH”) Act. HHS had previously published a notice of proposed rulemaking on these provisions in 2011, which is now being withdrawn.

The second advance notice would solicit the public’s views on “establishing a methodology under which an individual who is harmed by an offense punishable under HIPAA may receive a percentage of any civil money penalty or monetary settlement collected with respect to the offense.” This distribution methodology is required under Section 13410(c)(3) of the HITECH Act but has not yet been implemented. HHS has collected almost $40 million since it began imposing civil monetary penalties, a considerable sum that could be distributed to affected individuals.

On April 11, 2018, Arizona amended its data breach notification law (the “amended law”). The amended law will require persons, companies and government agencies doing business in the state to notify affected individuals within 45 days of determining that a breach has resulted in, or is reasonably likely to result in, substantial economic loss to affected individuals. The prior law required notification only “in the most expedient manner possible and without unreasonable delay.” The amended law also broadens the definition of personal information and requires regulatory notice and notice to the consumer reporting agencies (“CRAs”) under certain circumstances.

Key provisions of the amended law include:

  • Definition of Personal Information. Under the amended law, the definition of “personal information” now includes an individual’s first name or initial and last name in combination with one or more of the following “specified data elements”: (1) Social Security number; (2) driver’s license or non-operating license number; (3) a private key that is unique to an individual and that is used to authenticate or sign an electronic record; (4) financial account number or credit or debit card number in combination with any required security code, access code or password that would allow access to the individual’s financial account; (5) health insurance identification number; (6) medical or mental health treatment information or diagnoses by a health care professional; (7) passport number; (8) taxpayer identification or identity protection personal identification number issued by the Internal Revenue Service; and (9) unique biometric data generated from a measurement or analysis of human body characteristics to authenticate an individual when the individual accesses an online account. The amended law also defines “personal information” to include “an individual’s user name or e-mail address, in combination with a password or security question and answer, which allows access to an online account.”
  • Harm Threshold. Pursuant to the amended law, notification to affected individuals, the Attorney General and the CRAs is not required if the breach has not resulted in, and is not reasonably likely to result in, substantial economic loss to affected individuals.
  • Notice to the Attorney General and Consumer Reporting Agencies. If the breach requires notification to more than 1,000 individuals, notification must also be made to the Attorney General and the three largest nationwide CRAs.
  • Timing. Notifications to affected individuals, the Attorney General and the CRAs must be issued within 45 days of determining that a breach has occurred.
  • Substitute Notice. Where the cost of making notifications would exceed $50,000, the affected group is larger than 100,000 individuals, or there is insufficient contact information for notice, the amended law now requires that substitute notice be made by (1) sending a written letter to the Attorney General demonstrating the facts necessary for substitute notice and (2) conspicuously posting the notice on the breached entity’s website for at least 45 days. Under the amended law, substitute notice no longer requires email notice to affected individuals and notification to major statewide media. (A simplified sketch of how these notification thresholds fit together appears after this list.)
  • Penalty Cap. The Attorney General may impose up to $500,000 in civil penalties for knowing and willful violations of the law in relation to a breach or series of related breaches. The Attorney General also is entitled to recover restitution for affected individuals.
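
For readers who find it easier to see these triggers laid out as decision logic, the Python sketch below models how the amended law’s notification thresholds, as summarized above, fit together. It is an illustrative simplification only, not legal advice or a restatement of the statute; the function and field names are hypothetical.

from dataclasses import dataclass

@dataclass
class BreachAssessment:
    # Facts a breached entity would assess under the amended law (simplified).
    substantial_economic_loss_likely: bool  # actual or reasonably likely economic loss
    affected_individuals: int
    notification_cost_usd: int
    sufficient_contact_info: bool

def arizona_notification_duties(b: BreachAssessment) -> dict:
    """Return a rough map of notification duties under the amended Arizona law."""
    duties = {
        "notify_individuals": False,
        "notify_attorney_general_and_cras": False,
        "substitute_notice_available": False,
        "deadline_days": 45,  # runs from determining that a breach occurred
    }
    # Harm threshold: no notification is required absent actual or reasonably
    # likely substantial economic loss to affected individuals.
    if not b.substantial_economic_loss_likely:
        return duties
    duties["notify_individuals"] = True
    # Breaches affecting more than 1,000 individuals also require notice to the
    # Attorney General and the three largest nationwide CRAs.
    if b.affected_individuals > 1_000:
        duties["notify_attorney_general_and_cras"] = True
    # Substitute notice (a letter to the Attorney General plus a 45-day website
    # posting) applies where costs exceed $50,000, more than 100,000 individuals
    # are affected, or contact information is insufficient.
    if (b.notification_cost_usd > 50_000
            or b.affected_individuals > 100_000
            or not b.sufficient_contact_info):
        duties["substitute_notice_available"] = True
    return duties

# Example: a breach affecting 5,000 individuals with likely economic loss triggers
# individual notice plus Attorney General and CRA notice, all within 45 days.
print(arizona_notification_duties(BreachAssessment(True, 5_000, 20_000, True)))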

On May 8, 2018, Senator Ron Wyden (D–OR) demanded that the Federal Communications Commission investigate the alleged unauthorized tracking of Americans’ locations by Securus Technologies, a company that provides phone services to prisons, jails and other correctional facilities. Securus allegedly purchases real-time location data from a third-party location aggregator and provides the data to law enforcement without obtaining judicial authorization for the disclosure. The third-party location aggregator, in turn, obtains the data from wireless carriers. Federal law restricts how and when wireless carriers may share certain customer information, including location data, with third parties such as law enforcement: carriers are prohibited from sharing that information unless the carrier has obtained the customer’s consent or the sharing is otherwise required by law.

To access real-time location data from Securus, Senator Wyden’s letter alleges, correctional officers can enter any U.S. wireless phone number and upload a document purporting to be an “official document giving permission” to obtain real-time location data about the wireless customer. According to the letter, Securus does not take any steps to verify that the documents actually provide judicial authorization for the real-time location surveillance. The letter requests that the FCC investigate Securus’ practices and the wireless carriers’ failure to maintain exclusive control over law enforcement access to their customers’ location data. The letter also calls for a broader investigation into the customer consent that each wireless carrier requires from other companies before sharing customer location information and other data. Separately, Senator Wyden also sent a letter to the major wireless carriers requesting an investigation into the safeguards in place to prevent the unauthorized sharing of wireless customer information.

In response, the FCC confirmed that it has opened an investigation into LocationSmart, reportedly the third-party vendor that sold the location data to Securus. Senator Wyden provided comment to the website Ars Technica that the “location aggregation industry” has functioned with “essentially no oversight,” and urged the FCC to “expand the scope of this investigation and to more broadly probe the practice of third parties buying real-time location data on Americans.”


On May 16, 2018, the Irish Data Protection Bill 2018 (the “Bill”) entered the final committee stage in Dáil Éireann (the lower house and principal chamber of the Irish legislature). The Bill was passed by the Seanad (the upper house of the legislature) at the end of March 2018. In the current stage, final statements on the Bill will be made before it is signed into law by the President.

The Bill sets out Ireland’s national rules in areas where the EU General Data Protection Regulation (“GDPR”) leaves a margin of maneuver to Member States, and specifies the investigative and enforcement powers of the Irish Data Protection Commission. The Bill also transposes Directive 2016/680 (the Law Enforcement Directive) into Irish law.

Key highlights of the Bill include:

  • Data Protection Commission: The Bill establishes the Data Protection Commission, which replaces the current Office of the Data Protection Commissioner. The Bill permits the appointment of three Commissioners, one of whom will act as Chair and will have the deciding vote where a Commission decision is tied.
  • Children’s Data: The Bill provides that, for data protection purposes in Ireland, a child is a person under 18 years of age. The initial draft of the Bill specified 13 years as the age of digital consent for the purposes of Article 8 of the GDPR; in the previous committee stage, however, the age was amended to 16 years. A review of the provision is to take place three years after it comes into operation. The Bill also makes processing children’s data for purposes of direct marketing, profiling or micro-targeting an offense punishable by administrative fines.
  • Common Travel Area: The Bill provides that the processing and disclosure of personal data for purposes of preserving the Common Travel Area (between Ireland, the United Kingdom of Great Britain and Northern Ireland, the Channel Islands and the Isle of Man) is lawful where the controller is an airline or ship operator.
  • Further Processing: The Bill states that processing of personal data or sensitive data for a purpose other than that for which the data was originally collected is lawful where the processing is necessary to (1) prevent a threat to national security, defense or public security; (2) prevent, detect, investigate or prosecute criminal offenses; (3) provide or obtain legal advice or for legal claims and proceedings; or (4) establish, exercise or defend legal rights.
  • Sensitive Data: The Bill outlines circumstances additional to those of Article 9 of the GDPR where the processing of special categories of data is permitted. These include the processing of (1) special categories of data for purposes of providing or obtaining legal advice, for legal claims and proceedings or to establish, exercise or defend legal rights; (2) political opinion data carried out in the course of electoral activities for compiling data on people’s political opinions by a political party, a candidate for election or a holder of elective political office in Ireland, and by the Referendum Commission in the performance of its functions; (3) special categories of data where necessary and proportionate for the administration of justice or the performance of a function conferred on a person by or under an enactment or by the Constitution; and (4) health data where necessary and proportionate for insurance, pension or property mortgaging purposes.
  • Right to Access Results of Examinations and Appeals: The Bill specifically provides for a right of access to examination results, examination scripts and the results of an examination appeal.
  • Enforced Access Requests: The Bill provides that a person who requires an individual to make an access request in connection with that individual’s recruitment as an employee, the individual’s continued employment, or a contract under which the individual provides services to that person commits an offense punishable by a fine or imprisonment.
  • Right to Object to Direct Marketing: The Bill exempts direct mailing carried out in the course of electoral activities, subject to certain conditions, from the right to object to direct marketing.
  • Administrative Fines: The Bill specifies that where the Commission decides to impose an administrative fine on a controller or processor that is a public authority or public body, but is not a public authority or public body acting as an undertaking within the meaning of the Competition Act 2002, the amount of the administrative fine shall not exceed €1,000,000. Previous drafts of the Bill exempted such public authorities and public bodies from administrative fines altogether.
  • Representative Actions: The Bill permits a data protection action to be brought on behalf of a data subject by a non-profit body, organization or association, and the court hearing the action shall have the power to grant the data subject relief by way of injunction, declaration or compensation for the damage suffered by the plaintiff as a result of the infringement. Previous editions of the Bill did not permit recovery in the form of damages.

On May 2, 2018, the Belgian Privacy Commission (the “Belgian DPA”) published its Annual Activity Report for 2017 (the “Annual Report”), highlighting its main accomplishments for the past year.

In 2017, the Belgian DPA focused on the following topics:

EU General Data Protection Regulation (“GDPR”). The Belgian DPA issued a number of documents to help companies and organizations with their GDPR compliance efforts, including a Recommendation on the internal register of processing activities (together with a template register), FAQs on Codes of Conduct, and a Recommendation on the appointment and role of a Data Protection Officer.

Facebook case. Despite the changes Facebook made to its cookie policy and practices following the 2015 Recommendation, the Belgian DPA considered that the social networking site still did not obtain valid consent from the individuals concerned. In April 2017, the Belgian DPA issued a new set of Recommendations providing additional guidelines for external website operators that use Facebook technologies and services, as well as several recommendations for Internet users who wish to protect themselves against Facebook’s tracking practices.

Other issues. Among others, the Belgian DPA also worked on the following issues:

  • The Belgian DPA issued an unfavorable Opinion on the long-term retention of fingerprints, recalling that data minimization must remain the norm.
  • The Belgian DPA criticized the mass screening of festival visitors and the screening of asylum-seekers using data available on social media platforms, highlighting the lack of free consent and insufficient privacy safeguards.
  • In addition, the Belgian DPA published a Report on Big Data with recommendations for assessing big data projects in practice in light of the GDPR.

Finally, the Annual Report states that the Belgian DPA processed 4,934 requests or complaints in 2017 (an increase of 443 from 2016), including requests for information, mediation and control. Most requests for information related to the use of CCTV, applicable privacy principles, data subjects’ rights, the right to one’s image and notifications via the public register.

Read the Annual Activity Report for 2017 (in French and Dutch) and the press release (in French and Dutch).


On May 14, 2018, the Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth LLP published a study on how the proposed ePrivacy Regulation will affect the design and user experiences of digital services (the “Study”). The Study was prepared by Normally, a data product and service design studio, which CIPL had asked for an independent expert opinion on user experience design.

Using examples, the Study examines established user experience design principles and potential new principles that support designing with user privacy in mind. The Study further details how Articles 6, 8 and 10 of the proposed ePrivacy Regulation will affect user experience design, and emphasizes that greater flexibility in how the ePrivacy Regulation is formulated would enable designs that enhance end users’ options and experience.

Key findings of the study include:

  • While principles for good user experience design already exist (i.e., designs must be timely, efficient, personal and convenient), more work is needed to create privacy-centered design principles.
  • Design principles that enable designing for privacy could include (1) transparent designs, which ensure a user is sufficiently informed to engage meaningfully; (2) empowering designs, which enable users to make active choices and control their personal data; and (3) conscientious designs, which recognize that users can be lax about their privacy and proactively remind individuals of their choices and their ability to control or adjust them.
  • Article 6 of the ePrivacy Regulation, which requires that communications content be processed only with consent for a specific purpose, creates specific design challenges for many digital services (e.g., obtaining group consent for group message chats and the ePrivacy Regulation’s applicability to smart messaging).
  • Article 8 of the ePrivacy Regulation, which provides that service providers may collect information from a user’s device, or use the device’s processing and storage capabilities, only where technically essential or where the user has consented for a specific purpose, creates specific design issues around obtaining cookie consent. Cookie walls risk causing consent fatigue among users and can hinder the “open web.”
  • Article 10 of the ePrivacy Regulation requires software providers to allow users to prevent third parties from collecting information from their device or using the device’s processing and storage capabilities, and suggests that the best moment for providers to exercise this responsibility is at installation. Frontloading such choices at installation asks users to make blanket privacy decisions before they interact with the digital services those decisions will affect, inhibiting fully informed choice.
  • In order to find solutions to the problems posed by the proposed ePrivacy Regulation, designers need more freedom to select and sequence privacy controls throughout the user experience, not just upfront. Distributing the controls across the user journey avoids overloading the onboarding experience, helps users engage with privacy settings through contextual relevance and allows for user understanding to build over time.

To read more about the Study’s key findings, along with all its other conclusions, please see the full Study.


On April 27, 2018, the Federal Trade Commission issued two warning letters to foreign marketers of geolocation tracking devices for violations of the U.S. Children’s Online Privacy Protection Act (“COPPA”). The first letter was directed to a Chinese company, Gator Group, Ltd., which sold the “Kids GPS Gator Watch” (marketed as a child’s first cellphone); the second was sent to a Swedish company, Tinitell, Inc., which markets a child-directed app that works with a mobile phone worn like a watch. Both products collect a child’s precise geolocation data, and the Gator Watch includes geofencing “safe zones.”

Importantly, in commenting on its ability to reach foreign companies that target U.S. children, the FTC stated that “[t]he COPPA Rule applies to foreign-based websites and online services that are involved in commerce in the United States. This would include, among others, foreign-based sites or services that are directed to children in the United States, or that knowingly collect personal information from children in the United States.”

In both letters, the FTC warned that it had specifically reviewed the foreign operators’ online services and had identified potential COPPA violations (i.e., a failure to provide direct notice or obtain parental consent prior to collecting geolocation data). The FTC stated that it expected the companies to come into compliance with COPPA. In the case of Tinitell, which had stopped marketing the watch, the FTC noted that COPPA imposes an ongoing obligation to keep previously collected children’s data secure.


On May 4, 2018, St. Kitts and Nevis’ legislators passed the Data Protection Bill 2018 (the “Bill”). The Bill was passed to promote the protection of personal data processed by public and private bodies.

Attorney General the Honourable Vincent Byron explained that the Bill is largely derived from the Organization of Eastern Caribbean States model and “seeks to ensure that personal information in the custody or control of an organization, whether it be a public group like the government, or private organization, shall not be disclosed, processed or used other than the purpose for which it was collected, except with the consent of the individual or where exemptions are clearly defined.”

Read more about the Bill.


On May 1, 2018, the Information Security Technology – Personal Information Security Specification (the “Specification”) went into effect in China. The Specification is not binding and cannot be used as a direct basis for enforcement. However, enforcement agencies in China can still use the Specification as a reference or guideline in their administration and enforcement activities. For this reason, the Specification should be taken seriously as a best practice in personal data protection in China, and should be complied with where feasible.

The Specification constitutes a best practices guide for the collection, retention, use, sharing and transfer of personal information, and for the handling of related information security incidents. It includes (without limitation) basic principles for personal information security, notice and consent requirements, security measures, rights of data subjects and requirements related to internal administration and management. The Specification establishes a definition of sensitive personal information, and provides specific requirements for its collection and use.

Read our previous blog post from January 2018 for a more detailed description of the Specification.


On April 30, 2018, the Federal Trade Commission announced that BLU Products, Inc. (“BLU”), a mobile phone manufacturer, agreed to settle charges that the company allowed ADUPS Technology Co. Ltd. (“ADUPS”), a third-party service provider based in China, to collect consumers’ personal information without their knowledge or consent, notwithstanding BLU’s promises that it would keep the relevant information secure and private. The relevant personal information allegedly included, among other things, text message content and real-time location information.

The FTC’s complaint alleged that BLU falsely claimed that the company (1) limited third-party collection of data from users’ devices to information needed to perform requested services, and (2) implemented appropriate physical, technical and administrative safeguards to protect consumers’ personal information. The FTC alleged that BLU in fact failed to implement appropriate security procedures to oversee the security practices of its service providers, including ADUPS, and that as a result, ADUPS was able to (and did in fact) collect sensitive personal information from BLU devices without consumers’ knowledge or consent. ADUPS allegedly collected text message contents, call and text logs with full telephone numbers, contact lists, real-time location data, and information about applications used and installed on consumers’ BLU devices. The FTC alleged that BLU’s lack of oversight allowed ADUPS to collect this information notwithstanding the fact that ADUPS did not need this information to perform the relevant services for BLU. The FTC further alleged that preinstalled ADUPS software on BLU devices “contained common security vulnerabilities that could enable attackers to gain full access to the devices.”

The terms of the proposed settlement prohibit BLU from misrepresenting the extent to which it protects the privacy and security of personal information and require the company to implement and maintain a comprehensive security program. The company also must undergo third-party assessments of its security program every two years for 20 years and is subject to certain recordkeeping and compliance monitoring requirements. The proposed settlement is open for public comment through May 30, 2018.
