First CCPA Fine heralds bad news for many websites

The first fine under the California Consumer Privacy Act was issued this week against Sephora U.S.A., Inc. The complaint alleged, in part: “The right to opt-out is the hallmark of the CCPA. This right requires that companies follow certain straightforward rules: if companies make consumer personal information available to third parties and receive a benefit from the arrangement—such as in the form of ads targeting specific consumers—they are deemed to be ‘selling’ consumer personal information under the law.”

There are three important observations arising from this allegation and from Sephora’s consent to judgment:

  • “Selling” personal information does not just mean literally collecting personal information and selling it. According to the California Attorney General, it includes collecting essentially any information about a website visitor (for example, what browser they are using) and providing it to a third party, which then uses it to track that visitor across its own network of customers – even if the tracking company does not actually know who the person is;
  • Websites that use ANY tracking technology must comply with the fairly onerous rules requiring disclosure that the site sells personal information – Sephora, for example, had stated (as most websites do today) that it “does not sell personal information”; and
  • For all practical purposes, any website visitor has the right to opt out of essentially any “tracking,” and the site must both provide that ability to opt out and respect it.

The other significant point from the complaint and the resulting consent judgment is that if a user instructs their browser to send a do-not-track signal (also known as the Global Privacy Control, or GPC), the website must honor it.
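For context on what honoring the signal involves: the GPC specification defines an HTTP request header, Sec-GPC: 1, that participating browsers send with every request (plus a navigator.globalPrivacyControl property in JavaScript). Below is a minimal sketch of server-side detection, assuming a Node/Express stack; the middleware and flag names are my own illustration, not drawn from the CCPA or the GPC specification.

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Per the GPC spec, a participating browser sends "Sec-GPC: 1" on
// every request when the user has enabled the control.
function hasGpcSignal(req: Request): boolean {
  return req.header("Sec-GPC") === "1";
}

// Hypothetical middleware: record the opt-out so downstream handlers
// (ad tags, analytics beacons, etc.) can suppress any "sale" or
// sharing of this visitor's personal information.
app.use((req: Request, res: Response, next: NextFunction) => {
  res.locals.doNotSell = hasGpcSignal(req);
  next();
});

app.get("/", (_req: Request, res: Response) => {
  res.send(res.locals.doNotSell ? "GPC honored: no sale" : "No GPC signal");
});

app.listen(3000);
```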

Finally, Sephora was unable to establish that the analytics providers were “service providers” – which would have meant the transfers were not sales – because it did not have valid service provider agreements with them. Indeed, the complaint goes to great lengths to note that Sephora exchanged personal information for free or reduced-price analytics services.

Under the CCPA, a service provider agreement must:

“(1) Specif[y] that the personal information is sold or disclosed by the business only for limited and specified purposes.

(2) Obligat[e] the third party, service provider, or contractor to comply with applicable obligations under this title and obligate those persons to provide the same level of privacy protection as is required by this title.

(3) Grant[] the business rights to take reasonable and appropriate steps to help ensure that the third party, service provider, or contractor uses the personal information transferred in a manner consistent with the business’ obligations under this title.

(4) Require[] the third party, service provider, or contractor to notify the business if it makes a determination that it can no longer meet its obligations under this title.

(5) Grant[] the business the right, upon notice, including under paragraph (4), to take reasonable and appropriate steps to stop and remediate unauthorized use of personal information.”

CCPA, § 1798.100(d).

Virtually no analytics provider online terms of service meet these requirements.

The whole matter is also strange in that Sephora was given 30 days’ notice of the violations and for some reason chose not to comply. Did it decide to contest the claims, and then later decide not to? If so, it was a costly decision.

As a result of this decision, every website using any form of third-party (data-sharing) analytics provider needs to review its agreement with the analytics company carefully to see whether it meets the above requirements. If not, the site needs to either obtain such an agreement or stop using that provider. It also needs to disclose fully what data is shared with (sold to) the analytics provider and provide a full opt-out notice – and, of course, ensure that the site respects GPC and any opt-out request. This is going to be very challenging for many reasons – in part because, in most cases, these analytics providers do not actually know who the person is; they just have the data that identifies the electronic interaction, so they will have to devise a system to scan their identifiers for opt-out requests. In short, this decision is going to make using web analytics all but impossible except where the analytics are limited solely to the website operator.

Please stop putting cookie pop-ups on your website

Almost every major website you visit today pops up a banner to warn you that it uses “cookies.” This is not legally required in the U.S. or in most other places, and where it is required, the vast majority of sites do not comply with the legal requirements. From a policy perspective, cookie pop-ups are just dumb – (virtually) no one reads them. There are vastly better ways to deal with the issue they present, both legally and from a site-usability perspective.

First, no current U.S. law requires cookie pop-ups. Some sites that are available in the European Union are required to post them – sites that use so-called “tracking cookies.” I discuss below a recent EU case that makes this issue even worse than one would originally have thought.

Second, an anecdotal review of websites shows that the vast, vast majority of them – in my experience, all of the “U.S.” sites – utterly fail to comply with the so-called EU “cookie law.” Why? Because they store the cookie before consent (which the cookie law does not permit), and they simply state, “This site uses cookies,” and present an “OK” button (and/or an X to close the pop-up) with a link to the privacy policy. See, for example, www.abajournal.com which, as of the date of this post, simply provides an OK button – no option to reject or manage the cookies – and a link to the privacy policy. Just a useless and legally insufficient user-interface distraction.
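By contrast, a compliant design would not store any tracking cookie or load any tracking script until the user affirmatively opts in. Here is a minimal browser-side sketch; the storage key and analytics URL are hypothetical placeholders, and a real implementation would also need to present the reject option as prominently as the accept option.

```typescript
// Browser-side sketch: nothing tracking-related runs, and no tracking
// cookie is stored, until the user affirmatively consents.
const CONSENT_KEY = "analytics-consent"; // hypothetical storage key

function loadAnalytics(): void {
  // Only after consent is the third-party script injected (and only
  // then may it set its cookies). The URL is a placeholder.
  const s = document.createElement("script");
  s.src = "https://analytics.example.com/tracker.js";
  s.async = true;
  document.head.appendChild(s);
}

export function onAccept(): void {
  localStorage.setItem(CONSENT_KEY, "granted");
  loadAnalytics();
}

export function onReject(): void {
  localStorage.setItem(CONSENT_KEY, "denied");
  // Nothing loads; the site should still function normally.
}

// On each page load, honor a previously recorded choice.
if (localStorage.getItem(CONSENT_KEY) === "granted") {
  loadAnalytics();
}
```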

Finally, except in very, very limited cases, these cookie pop-ups do not in any way increase user privacy protection. Why? Even if a site complies with the notice and consent requirements, it is not legally required to provide the service to a user who declines tracking cookies. The site can simply withhold functionality. So in many cases it is not really a choice – the choice is either not to use the site or to consent to tracking. This is made worse because many governments and third parties use these sites for information dissemination. A truly privacy-focused law would at least require that the site function if a person elected no tracking.

The whole cookie problem was started by our friends in Europe when they promulgated the ePrivacy Directive 2002/58/EC. However, no U.S. company really started focusing on compliance with the “cookie issue” presented in the ePrivacy Directive until the European Union’s General Data Protection Regulation (GDPR), Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, came into effect. The GDPR applies in Europe, not the U.S.; however, so many U.S. companies either do business in, or ostensibly could be regulated by, EU member states that they attempt to comply with both U.S. and EU law.

Many “cookies” – the ones necessary to actually operate a website – are “exempt”: they need not be identified and are not subject to consent. However, sites that use tracking cookies and other tracking technology – even for anonymized data – are required under EU law to obtain prior consent before even storing the cookie or other technology that enables such tracking.

In my opinion, this system has been an utter failure in both policy and actual impact. It has not stopped companies from incessant user tracking. The companies that rely on user tracking have the power to force the choice: allow tracking, or do not use the service. The privacy policies remain mostly unintelligible – at the very least, it is all but impossible to tell exactly what tracking a company does, primarily because companies either disclose only the types of tracking or disclose so excessively that the cookie disclosure is indecipherable.

But the EU is doubling down on the concept . . .

In a recent decision (File number: DOS-2019-0137) of the Dispute Chamber of the Data Protection Authority of Belgium, issued 2/2/2022, that regulator held that the “pop-up” framework of the European arm of the Interactive Advertising Bureau (IAB) – used by most of its members and intentionally designed to comply with the GDPR – in fact did not comply. The decision is lengthy (my machine-translated English version is 139 pages long) and will undoubtedly be appealed. As an overview: IAB created a real-time bidding system (RTB), an automated system of bidding for advertising. That is its framework in the U.S. and many other countries, but in Europe it created the “Transparency and Consent Framework” (TCF). At issue in this case was a subset of the TCF, which the Board described as follows: “Specifically for the TCF, there are also the companies that use so-called “Consent Management Platforms” (CMPs) to offer. Specifically, a CMP takes the form of a pop-up that appears on the first connection to a website appears to request permission from the internet user to collect cookies and other identification data” Para. 40. (Note: all English translations here are machine-created by Google’s translation service.) The original decision in Dutch is here (and I can post the English-translated version if someone requests it): https://www.gegevensbeschermingsautoriteit.be/publications/beslissing-ten-gronde-nr.-21-2022.pdf.

The basic idea is that IAB manages a “consensu” cookie that indicates whether the web user has already consented to (or rejected) cookies. A participating site would somehow take information from a user’s initial browser session, send it off to IAB, and IAB would send back a text string indicating whether that user had already consented to accept cookies. If not, a “cookie pop-up” would be presented to the user. The Board found that the IAB maintains a database of users and preferences, which can be used “in order to create an advertising profile of data subjects and to show them personalized advertising.” Para. 50. It therefore concluded the IAB was a data controller (a point the IAB disputed). From there, the Board found nearly every conceivable violation of the GDPR. Among them: that “IAB Europe [] failed to observe the principles of due regard for transparency and fairness with regard to data subjects,” in part because some of the information that can be sucked up into the preference model includes “special categories of personal data … For example, participating organizations could become acquainted with the websites previously visited by a data subject, including the political opinions, religious or philosophical beliefs, sexual orientation, health data or also trade union memberships of the data subjects be inferred or disclosed.” Para. 51. It also found the IAB’s privacy policy insufficient because, among other reasons, it was available only in English and used unclear, vague terms like “services” and “other means.” Para. 54. Nor did the Board like that the terms “partners” and “third parties” were not sufficiently explained.

To me this is just evidence that no one really understands the law – or that the regulators think it says one thing and the industry thinks it says another. Not good either way. But after that decision, it seems all but impossible to have a centralized “cookie consent” service – or, to comply, the service would have to be so intrusive as to make the web experience intolerable.

The solution? In my view, just stop with the cookie pop-ups. They are stupid and ineffective. Enact a law that requires a service to respect the do-not-track signal from a browser (currently entirely voluntary) and not store any tracking cookies, clear GIFs, or other trackers – and require that a site not “discriminate” against users who elect no tracking – basically, provide all functions to users whether or not they consent. I would also prohibit any government organization from using a site that tracks users as a service for information dissemination.

DHHS fine for HIPAA Computer Security Violations held arbitrary and capricious

In University of Texas M.D. Anderson Cancer Center v. U.S. Dept. of Health and Human Services, No. 19-60226 (5th Cir. 1/14/2021), the Fifth Circuit held that the DHHS’ fine for violating the HIPAA Security Rule was “arbitrary, capricious, and contrary to law.” To say that the government lost this case is an understatement – its arguments were roundly rejected in broad language, so much so that the government is going to regret ever having brought this case . . .

In brief, University of Texas M.D. Anderson Cancer Center (UT) had three computer security lapses in the early 2010s – one laptop and two thumb drives, each of which stored electronic Protected Health Information (ePHI), were not encrypted and were lost or stolen. The DHHS originally fined UT over $4 million for violating rules that in most cases require ePHI to be encrypted, and that prohibit disclosure of ePHI to unauthorized persons. UT’s administrative appeals were unsuccessful, but when it petitioned for court review, the DHHS admitted that the maximum fine it could impose was $450,000. UT objected even to that fine on two grounds: that a state instrumentality is not a “person” under the HIPAA enforcement provisions, and that the fine was arbitrary and capricious under the Administrative Procedure Act. The court did not address the first argument and assumed UT was a person subject to HIPAA enforcement.

Under the HIPAA Security Rule, a HIPAA-covered entity must “[i]mplement a mechanism to encrypt and decrypt electronic protected health information.” 45 C.F.R. § 164.312(a)(2)(iv) (emphasis by court). UT had done so – it had policies requiring portable and mobile devices to be encrypted, it provided employees technology (dongles) to encrypt those devices, and it trained them how to do so. DHHS argued that the mere fact that three devices were not encrypted meant that UT had violated the rule. The court disagreed:

[T]he Government argues that the stolen laptop and the two lost USB drives were not encrypted at all. That appears undisputed. But that does not mean M.D. Anderson failed to implement “a mechanism” to encrypt ePHI. It means only that three employees failed to abide by the encryption mechanism, or that M.D. Anderson did not enforce that mechanism rigorously enough. And nothing in HHS’s regulation says that a covered entity’s failure to encrypt three devices means that it never implemented “a mechanism” to encrypt anything at all.

UT v. DHHS, slip op. at 7.

The court went on to provide numerous examples of scenarios in which unauthorized disclosure of unencrypted ePHI would likely not violate the regulation, primarily because the regulation is not written to make data loss a strict-liability offense.

The same result was reached under the Disclosure Rule. That rule generally prohibits a Covered Entity from “disclosing” PHI except as permitted by the rule, and defines “disclosure” to “mean[] the release, transfer, provision of access to, or divulging in any manner of information outside the entity holding the information.” 45 C.F.R. § 160.103. The administrative law judge held that the loss of data on unencrypted devices was a “release”; the court disagreed, stating: “That interpretation departs from the regulation HHS wrote in at least three ways. First, each verb HHS uses to define ‘disclosure’—release, transfer, provide, and divulge—suggests an affirmative act of disclosure, not a passive loss of information. One does not ordinarily ‘transfer’ or ‘provide’ something as a sideline observer but as an active participant. The ALJ recognized as much when he defined ‘release’ as ‘the act of setting something free.’ But then he made the arbitrary jump to the conclusion that ‘any loss of ePHI is a “release,”’ even if the covered entity did not act to set free anything. It defies reason to say an entity affirmatively acts to disclose information when someone steals it.”

Finally, the court was particularly troubled that the DHHS took the position that it “can arbitrarily and capriciously enforce the CMP rules against some covered entities and not others.” UT had argued that in other similar cases either no fine was imposed, or the fines were much smaller than the fine imposed on UT. It also argued that DHHS refused to consider factors expressly stated in its own regulations – factors the DHHS could not prove were present (for example, that any individual suffered financial harm).

This case is an incredible loss for the DHHS. It will need to completely overhaul its regulatory enforcement structure, it will most likely need to rewrite regulations, and it will need to train its ALJs better on how to handle administrative appeals in light of the arguments petitioners make. Finally, the case is incredibly helpful to Covered Entities and Business Associates in their efforts to avoid civil money penalties for small and inadvertent infractions (as long as they otherwise meet data security requirements).

Importantly, all entities that store and process PHI should be careful in drafting their Business Associate Agreements and related agreements to distinguish between regulatory violations (which under this case are not strict liability in many scenarios) and contractual liability. Many Business Associate Agreements are written as if *any* “loss” of PHI outside of the entity is a breach. Business Associates should review these agreements carefully so as not to undertake greater liability than the regulations impose.

For more information, contact Mike Oliver

Privacy law is a mess

The title says it all – what should smaller companies do to comply with privacy laws?

California has now finalized the California Consumer Privacy Act (CCPA), Cal. Civ. Code §§ 1798.100 to 1798.199 – well, at least for now (please note that this link does not include all of the recent amendments as of the posting of this article). It goes into effect 1/1/2020. Regulations under it will not be issued until December at the earliest and are likely to change over time. While the law is a net gain for California consumers, it is complex, with many incidental effects and traps for the unwary business. How does a small business deal with this mess? Before we address that, some background:

Why is CCPA important?

The CCPA is important because so many businesses deal with California consumers that California law is the “highest common denominator” – meaning that instead of trying to comply with disparate laws in 50 states, a business can target compliance with the most onerous law (typically California’s, in the pro-consumer sense) and hope that such compliance will also satisfy other states’ laws. This does not always work – for example, Illinois has a much harsher biometric privacy law than California, and New York has very detailed personal information protection laws and rules as well, particularly in the financial/banking sector. So a slight modification of the above strategy is to target the “top 3” laws (California, Illinois, and New York) and, again, hope for the best in other states. Finally, there is the modified “top 3” strategy of adding compliance with the EU’s General Data Protection Regulation (GDPR).

What many larger companies have done is simply target worldwide compliance with the GDPR, assuming it is the most onerous pro-privacy law. However, the CCPA has provisions that differ from, and add to, the GDPR – for example, its regulation of businesses that sell personal information, and of those that broker those sales, has no direct counterpart in the GDPR.

How does this affect small business?

Many small businesses may think they are not subject to these privacy laws because they do not “do business” in a particular state or in the European Union. Those are interesting issues: whether a state can constitutionally impose liability under a privacy law on a remote business that interacts solely electronically with a resident of that state, and whether GDPR regulators can fine, and enforce fines against, a small U.S. business that has no offices, employees, or other contacts in the EU. The practical problem, however, is that many small businesses will contract with larger businesses that are subject to those laws and regulators, and those contracts will require the small business to comply with those laws indirectly. This is particularly true in heavily regulated industries such as banking and health care, where regulators apparently force their member banks to impose liability on third-party vendors.

In addition, when a small business goes to sell or merge – typically with a larger entity – the acquirer may apply these rules and regulations in analyzing the level of data risk it will carry post-transaction. If the small business has not thought about basic data privacy and data security, this can negatively impact the value of the business.

So, what should a small business do (and, what are the key provisions of CCPA)?
  1. First, do not ignore remote state laws like the CCPA or the GDPR. Someone in the organization should be assigned the responsibility, and be given reasonable time, to make a true assessment of the company’s data privacy and security risks. Ideally that person would have a “C” designation (CIO, CTO, CPO, etc.) and be incentivized to complete such tasks diligently. The business should neither marginalize nor minimize the role.
  2. Second, do a data inventory – what data is the business collecting? Why? Does it really need it? Where is it collecting this data from (the web, forms, manual entry, data harvesting/scraping, third-party lists, etc.)? What agreements are there with those data sources (this includes terms of service)? Is the information personal information? What type of personal information is it (i.e., is it sensitive personal information)? Basically, this is a review of all inbound data flow (see the sketch after this list) . . .
  3. Third, determine where the collected data is being disclosed or shared. This can be the critical step – because if the information being collected is personal information, and particularly if it is sensitive information, a data breach can have significant impacts. Do adequate agreements cover each data exchange? Is the data encrypted? Should it be? Has any audit or review of the recipient been done to determine the adequacy of its data protection systems? Basically, this step involves tracing all outbound data flows and determining the business need for each disclosure and the risk level it presents.
  4. Fourth, assess the computer systems used to capture, store, and transmit data to determine weaknesses in security where a data breach can occur. Computer security is hard, period. It is far harder today, when it is not just your own computer security at issue but the security of every link in the data disclosure chain.
  5. Fifth, consider what tools to use to address risk. Do you throw technology at the issue – better intrusion prevention and detection systems, unified threat management devices, etc.? Do you hire experts, which may be costly? Both? Do you have proper agreements in place? Indemnity? Does the downstream recipient have insurance? Are you sure about that? Do you have insurance? Do your contracts require insurance? Be especially careful with insurance – a good insurance broker will be up on all the various changes in insurance policies that claim to cover “cyber liability” (a generic term that is meaningless without specific context). You simply cannot determine what insurance you need if you have not performed steps 1-4 above.
  6. Sixth, engage in continuous review and management. Hackers are not static; they evolve. Your systems must be maintained and modified to address new threats and issues, be updated and patched, and be monitored for threats. See item 1 – this is why a dedicated C-level person and/or team needs to be in place to really address these issues.
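To make steps two and three concrete, the inventory itself can be as simple as one structured record per data flow. The sketch below is illustrative only – the field names are mine, not drawn from the CCPA – but it captures the kind of who/what/where/why detail those steps call for.

```typescript
// One record per data flow; all field names are illustrative only.
interface DataFlowRecord {
  dataElements: string[];   // e.g. ["email", "IP address"]
  sensitive: boolean;       // sensitive personal information?
  source: string;           // "web form", "manual entry", "third-party list", ...
  businessPurpose: string;  // why is it collected at all?
  recipients: string[];     // third parties it is disclosed or sold to
  agreementOnFile: boolean; // valid service-provider agreement in place?
  encryptedInTransit: boolean;
  encryptedAtRest: boolean;
  retentionPeriod: string;  // e.g. "24 months"
}

// A hypothetical entry for a newsletter signup form.
const newsletterSignup: DataFlowRecord = {
  dataElements: ["email", "IP address"],
  sensitive: false,
  source: "web form",
  businessPurpose: "newsletter delivery",
  recipients: ["EmailServiceCo (hypothetical)"],
  agreementOnFile: true,
  encryptedInTransit: true,
  encryptedAtRest: true,
  retentionPeriod: "24 months",
};
```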

The key provisions of CCPA as they relate to small businesses are below:

  1. It applies only to certain “businesses” – namely, a business that has annual gross revenues in excess of $25,000,000; that annually buys, receives for commercial purposes, sells, or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices; or that derives 50 percent or more of its annual revenues from selling consumers’ personal information.
  2. A consumer has a right to an accounting – essentially, a consumer can request that a business disclose to the consumer the categories and specific pieces of personal information the business has collected.
  3. A consumer has the right to request that a business delete any personal information about the consumer which the business has collected from the consumer, subject to several exceptions.
  4. A consumer has the right to an accounting of personal information that has been transferred by the business to third parties, subject to several exceptions. These rights are enhanced if the business “sells” that personal information.
  5. A consumer has the right to terminate the sale of their personal information. It is important to note that the law does not force a business to offer a service to a consumer who makes such an opt-out – in other words, a business can condition its service on the right to sell the information. However: (a) a business cannot sell the personal information of consumers under 16 (if it knows they are under 16) without an express opt-in, and for consumers under 13, the parent or guardian must opt in; and (b) a business cannot discriminate against a consumer who exercises any of their rights under the law.
  6. A business must provide two methods of contacting the business to exercise these rights – one of which is a toll free number, unless the business transacts business solely online and has a direct relationship to the consumer, in which case such online business need only provide an email address to send such requests. There are affirmative disclosure requirements for websites of businesses that are subject to the law. Businesses that sell personal information have additional affirmative disclosure requirements for their websites.
  7. A consumer whose personal information has been breached now has an affirmative damage remedy. Previously there was uncertainty in the law as to whether actual damage or harm would have to be shown to recover, or just the risk of future harm. In general the cases have held that actual harm is required, but vary in what they view as “actual harm.”
  8. Significant daily penalties can be assessed for noncompliance after a 30-day notice period.

The above is only a general overview. However, some of those rights – for example, the right to an onward-transfer accounting, the right to delete information, and the right to opt out – present not only legal compliance issues but significant technical hurdles. Many small businesses’ systems were not designed this way, and/or they have so many disparate systems in which data is duplicated that it might be hard or nearly impossible to comply. If a small business runs through the above checklist and gets a handle on the who, what, where, when, and why questions, it will be easier to assess the “how hard to comply” question.

For more information or assistance in data security and privacy law compliance, please contact Mike Oliver

FTC Privacy Report

In April 2012, the Federal Trade Commission issued its report entitled “Protecting Consumer Privacy in an Era of Rapid Change.” You can read that here.

The Report, while a comprehensive review of hundreds of undoubtedly conflicting filings by the various extreme factions on privacy issues, ultimately boils down to the FTC complaining that Congress has still not taken any action to normalize privacy rules. Let’s face it, privacy law is a mess – a hodgepodge of state laws; some specific federal laws in the areas of financial accounts, children, protected health information, and education; and a morass of case law and regulatory rules – rules that mostly derive from other laws (like the Lanham Act) not really intended to address privacy. For example, many of the actions the FTC has brought to enforce so-called privacy really involve false advertising – a company saying one thing to a consumer and doing another, or offering some ability to control a privacy setting and then ignoring the user’s setting.

The Report sets forth the FTC’s objectives and scope, summarized here:

  • the framework does not apply to companies that collect only non-sensitive data from fewer than 5,000 consumers a year, provided they do not share the data with third parties;
  • “commonly accepted” information collection and use practices do not require companies to provide consumers with choice (product fulfillment, internal operations, fraud prevention, legal compliance and public purpose, and first-party marketing);
  • companies should provide consumers with reasonable access to the data the companies maintain about them, proportionate to the sensitivity of the data and the nature of its use;
  • companies should respect browser and consumer “do not track” elections;
  • companies should disclose privacy practices in mobile applications (also, the major platform providers recently signed an agreement with California to require all apps on their platforms to link to a privacy policy);
  • consumers should have access to, and the ability to correct, information held by so-called “data brokers”; and
  • industry self-regulation should be meaningful (“no lip service”).

In terms of the actual principles, they are:

  • Companies should incorporate substantive privacy protections into their practices, such as data security, reasonable collection limits, sound retention and disposal practices, and data accuracy
  • Companies should maintain comprehensive data management procedures throughout the life cycle of their products and services
  • Companies should simplify consumer choice (Companies do not need to provide choice before collecting and using consumer data for practices that are consistent with the context of the transaction or the company’s relationship with the consumer, or are required or specifically authorized by law)
  • For practices requiring choice, companies should offer the choice at a time and in a context in which the consumer is making a decision about his or her data. Companies should obtain affirmative express consent before (1) using consumer data in a materially different manner than claimed when the data was collected; or (2) collecting sensitive data for certain purposes
  • Privacy notices should be clearer, shorter, and more standardized to enable better comprehension and comparison of privacy practices
  • Companies should provide reasonable access to the consumer data they maintain; the extent of access should be proportionate to the sensitivity of the data and the nature of its use
  • All stakeholders should expand their efforts to educate consumers about commercial data privacy practices

From the perspective of a lawyer for small to medium-sized businesses, it would be very helpful to have national, preemptive legislation that gave real guidance and safe harbors to businesses, so that they do not have to worry that they are violating some esoteric rule buried in a regulation, order, or arcane state law. Unlikely to happen, though . . .

For more information, contact Mike Oliver.