The 3 critical privacy issues most companies face on May 25, 2018

Many companies have exactly one year to get their privacy house in order.  On May 25, 2018 the European Union’s General Data Protection Regulation (found here in its entirety; the regulation itself, without precursors, is here: GDPR regulation only) goes into effect.  It brings tremendous changes to the previous data protection rules.  In this short post I discuss what I consider to be the “Big 3” issues the new rule presents, and why, even though US privacy law is almost nonexistent in the general consumer privacy context, these EU rules will become more and more important even for smaller companies operating solely in the US, due to the globalization of data exchange.  Ok, the Big 3:

  1. Huge fines for small errors.  The GDPR allows for fines of up to the greater of 20,000,000 Euros or 4% of annual global turnover (a short illustration of this “greater of” calculation appears after this list).  And there is every indication that the privacy regulators will be very harsh in doling out these fines, even for fairly innocuous errors.  That has certainly been the trend in the U.S. for sensitive data like protected health information.
  2. The information covered by the rule is almost everything.  The regulation (Article 4(1)) defines “personal data” to mean “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”  This definition clearly encompasses far more than just “identifying” information – for example, an “online identifier” covers just about any technology that tracks a user.
  3. Extra-territorial scope.  The regulation (Article 3) extends the reach of the GDPR well beyond the borders of the EU.  First, it states that it “applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.”  So, any data processed in the context of a controller or processor established in the EU is subject to the rule, even if the data subject is not an EU resident.  Next, it states “This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or (b) the monitoring of their behaviour as far as their behaviour takes place within the Union.”  So, regardless of where a business is located, if it offers goods or services (paid or unpaid) to data subjects in the EU, or monitors their behaviour there, the GDPR applies.  Finally, “[t]his Regulation applies to the processing of personal data by a controller not established in the Union, but in a place where Member State law applies by virtue of public international law.”  The US has not adopted public international law that accedes to this rule, but other countries may do so, and operating in those countries would impose the rule on the controller or processor.
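To make the “greater of” fine ceiling from item 1 concrete, here is a minimal, purely illustrative sketch (the function name and the turnover figures in the examples are my own, not drawn from the regulation):

```python
# Illustrative only: the higher GDPR fine tier is capped at the greater of
# EUR 20,000,000 or 4% of total worldwide annual turnover.
def gdpr_max_fine_eur(annual_global_turnover_eur: float) -> float:
    """Return the ceiling of the higher-tier administrative fine, in euros."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# A company with EUR 2 billion in global turnover faces a ceiling of EUR 80 million;
# a small company is still exposed to the full EUR 20 million ceiling.
print(gdpr_max_fine_eur(2_000_000_000))  # 80000000.0
print(gdpr_max_fine_eur(5_000_000))      # 20000000.0
```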

There are many other significant issues raised by the GDPR.  For example, in the EU one of the 6 core principles is data subject control and access.  Article 12(3) states: “The controller shall provide information on action taken on a request under Articles 15 to 22 to the data subject without undue delay and in any event within one month of receipt of the request.”  A company must ask itself – can it do that?  And not just if one data subject asks, but if thousands or hundreds of thousands do.  Clearly, no “big data” holder will be able to meet this standard using humans alone – it will need an automated system.  And see my first big issue above – failure to meet this requirement is going to result in a fine for sure; the question will be how big, and that will likely depend on what effort went into at least trying to meet the standard.  Another example: the GDPR is crystal clear that consent to use personal data cannot be obtained through an ambiguous “privacy policy” or buried in terms of service.  The opt-in request must be plain, unambiguous and intelligible to the data subject, so disclosures of how a company tracks a data subject in a privacy policy are not sufficient consent.
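As a rough illustration of why automation matters here, below is a minimal sketch of how a controller might track the one-month response deadline for each incoming request under Articles 15 to 22 (the class and field names are hypothetical; a real system would also have to handle the extensions Article 12(3) permits for complex or numerous requests):

```python
from dataclasses import dataclass
from datetime import date

from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

@dataclass
class DataSubjectRequest:
    """Hypothetical record of a request under GDPR Articles 15 to 22."""
    data_subject_id: str
    received: date

    def response_due(self) -> date:
        # Article 12(3): respond "without undue delay and in any event
        # within one month of receipt of the request".
        return self.received + relativedelta(months=1)

# Example: a request received on 2018-05-30 must be answered by 2018-06-30.
req = DataSubjectRequest("subject-123", date(2018, 5, 30))
print(req.response_due())  # 2018-06-30
```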

There is a separate issue about whether the EU could enforce the GDPR against a US-based entity in the EU, or whether it would have to come to the US and file such a claim; and there is also a separate question of whether a US court would enforce a foreign law against a US-based business without an enabling treaty or other enabling statute.  However, a company that operates solely in the US would probably have to play ball with the EU authorities if it ever wanted to do direct business in the EU.  Most large companies have already made that decision.  Smaller companies that are wholly located in the US will have to consider whether they want to take the risk of GDPR enforcement, and whether they ever want to expand direct services into the EU.

One year seems like a long time, but the GDPR has been known for some time (it was adopted in 2016), and the time is now short.  Companies that might be subject to it need to be well on their way toward assessing what data they are collecting, how they are using it, what efforts they have made to obtain consent to that use, and how they will meet the 6 principles in a timely fashion.

For more information, contact Mike Oliver.

FTC issues privacy guide for facial recognition technology

The FTC released a study and guide on facial recognition technology, providing guidance on the notice, transparency and choices required when using, storing and sharing facial recognition information.  The case studies included a basic use (for example, a face is scanned and the user may then make changes to see what hair, clothes, jewelry or other items would look like), a more advanced use (an interactive kiosk that takes a picture of a consumer, assesses their age and gender, and presents an advertisement targeted to that consumer), and finally the use of facial recognition in social media and the sharing of those images (a la Facebook).

Anyone making use of facial recognition technology should consult this guidance as they would any other FTC advertising or privacy guide before collecting, using or sharing facial recognition images.

For more information on privacy law compliance, contact Mike Oliver or Kimberly Grimsley.

FTC Privacy Report

In April 2012, the Federal Trade Commission issued its report entitled “Protecting Consumer Privacy in an Era of Rapid Change.”  You can read that here.

The Report, while a comprehensive review of hundreds of undoubtedly conflicting filings by the various extreme factions on privacy issues, ultimately boils down to the FTC complaining that Congress has still not taken any action to normalize privacy rules.  Let’s face it, privacy law is a mess – a hodgepodge of state laws; specific federal laws covering financial accounts, children, protected health information and education; and a morass of case law and regulatory rules, most of which derive from other laws (like the Lanham Act) that were not really intended to address privacy.  For example, many of the actions the FTC has brought to enforce so-called privacy rights really involve false advertising – a company saying one thing to a consumer and doing another, or offering some ability to control a privacy setting and then ignoring the user’s setting.

The Report sets forth the FTC’s objectives and the scope of its proposed framework, summarized here:

  • the framework does not apply to companies that collect only non-sensitive data from fewer than 5,000 consumers a year, provided they do not share the data with third parties
  • companies need not provide consumers with choice for “commonly accepted” information collection and use practices (product fulfillment, internal operations, fraud prevention, legal compliance and public purpose, and first-party marketing)
  • companies should provide consumers with reasonable access to the data they maintain about them, proportionate to the sensitivity of the data and the nature of its use
  • companies should respect browser and consumer “do not track” elections
  • companies should disclose privacy practices in mobile applications (the major platform providers also recently signed an agreement with California requiring all apps on their platforms to link to a privacy policy)
  • consumers should have access to, and the ability to correct, information held by so-called “data brokers”
  • industry self-regulation should be meaningful (“no lip service”)

In terms of the actual principles, they are:

  • Companies should incorporate substantive privacy protections into their practices, such as data security, reasonable collection limits, sound retention and disposal practices, and data accuracy
  • Companies should maintain comprehensive data management procedures throughout the life cycle of their products and services
  • Companies should simplify consumer choice (Companies do not need to provide choice before collecting and using consumer data for practices that are consistent with the context of the transaction or the company’s relationship with the consumer, or are required or specifically authorized by law)
  • For practices requiring choice, companies should offer the choice at a time and in a context in which the consumer is making a decision about his or her data. Companies should obtain affirmative express consent before (1) using consumer data in a materially different manner than claimed when the data was collected; or (2) collecting sensitive data for certain purposes
  • Privacy notices should be clearer, shorter, and more standardized to enable better comprehension and comparison of privacy practices
  • Companies should provide reasonable access to the consumer data they maintain; the extent of access should be proportionate to the sensitivity of the data and the nature of its use
  • All stakeholders should expand their efforts to educate consumers about commercial data privacy practices

From the perspective of a lawyer for small to medium-sized businesses, national, pre-emptive legislation that provided clear guidance and safe harbors would be very helpful, so that businesses do not have to worry that they are violating some esoteric rule buried in a regulation, order or arcane state law.  Unlikely to happen, though . . .

For more information, contact Mike Oliver.