In University of Texas M.D. Anderson Cancer Center v. U.S. Department of Health and Human Services, No. 19-60226 (5th Cir. 1/14/2021), the Fifth Circuit held that the DHHS's fine for violating the HIPAA Security Rule was “arbitrary, capricious, and contrary to law.” To say that the government lost this case is an understatement – the government's arguments were roundly rejected in broad language – so much so that the government is going to regret ever having brought this case . . .
In brief, University of Texas M.D. Anderson Cancer Center (UT) had three computer security lapses in the early 2010s – one laptop and two thumb drives, each of which stored electronic Protected Health Information (ePHI), were not encrypted and were lost or stolen. The DHHS originally fined UT over $4 million for violating rules that in most cases require ePHI to be encrypted, and that prohibit disclosure of ePHI to unauthorized persons. UT's administrative appeals were unsuccessful, but when it petitioned for review by the court, the DHHS admitted that the maximum fine it could impose was $450,000. UT, however, objected to even that fine on two grounds: that a state instrumentality is not a “person” under the HIPAA enforcement provisions, and that the fine was arbitrary and capricious under the Administrative Procedure Act. The court did not address the first argument and assumed UT was a person subject to HIPAA enforcement.
Under the HIPAA Security Rule, a HIPAA-covered entity must “[i]mplement a mechanism to encrypt and decrypt electronic protected health information.” 45 C.F.R. § 164.312(a)(2)(iv) (emphasis by court). UT had done so – it had policies that required portable and mobile devices to be encrypted, it provided employees technology (dongles) to encrypt those devices, and it trained them how to do so. DHHS argued that the mere fact that three devices were not encrypted meant that UT had violated the rule. The court disagreed:
[T]he Government argues that the stolen laptop and the two lost USB drives were not encrypted at all. That appears undisputed. But that does not mean M.D. Anderson failed to implement “a mechanism” to encrypt ePHI. It means only that three employees failed to abide by the encryption mechanism, or that M.D. Anderson did not enforce that mechanism rigorously enough. And nothing in HHS’s regulation says that a covered entity’s failure to encrypt three devices means that it never implemented “a mechanism” to encrypt anything at all.
UT v. DHHS, slip op. at 7.
The court went on to provide numerous examples of scenarios in which unauthorized disclosure of unencrypted ePHI would likely not violate the regulation, primarily because the regulation is not written to make data loss a strict-liability offense.
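For readers on the technical side, the kind of “mechanism” the regulation contemplates can be as simple as a documented, enforced process for encrypting files before they touch portable media. Below is a minimal illustrative sketch in Python, assuming the third-party `cryptography` package; the file names and ad hoc key handling are hypothetical, and a real deployment would use centrally managed keys and full-disk encryption.

```python
# Minimal sketch (not HHS guidance) of one encryption "mechanism":
# encrypt a file's contents before copying it to a portable device.
# Assumes the third-party "cryptography" package is installed.
from cryptography.fernet import Fernet

def encrypt_file(plain_path: str, cipher_path: str, key: bytes) -> None:
    """Read plain_path, encrypt it, and write the ciphertext to cipher_path."""
    fernet = Fernet(key)
    with open(plain_path, "rb") as f:
        ciphertext = fernet.encrypt(f.read())
    with open(cipher_path, "wb") as f:
        f.write(ciphertext)

if __name__ == "__main__":
    # In practice the key would come from a managed key store, not be
    # generated ad hoc; "patient_record.txt" is a placeholder file name.
    key = Fernet.generate_key()
    encrypt_file("patient_record.txt", "patient_record.enc", key)
```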
The court reached the same result under the Disclosure Rule. That rule generally prohibits a Covered Entity from “disclosing” PHI except as permitted by the rule, and defines “disclosure” to “mean the release, transfer, provision of access to, or divulging in any manner of information outside the entity holding the information.” 45 C.F.R. § 160.103. The administrative law judge held that the loss of data on unencrypted devices was a “release”; the court disagreed, stating: “That interpretation departs from the regulation HHS wrote in at least three ways. First, each verb HHS uses to define ‘disclosure’—release, transfer, provide, and divulge—suggests an affirmative act of disclosure, not a passive loss of information. One does not ordinarily ‘transfer’ or ‘provide’ something as a sideline observer but as an active participant. The ALJ recognized as much when he defined ‘release’ as ‘the act of setting something free.’ But then he made the arbitrary jump to the conclusion that ‘any loss of ePHI is a “release,”’ even if the covered entity did not act to set free anything. It defies reason to say an entity affirmatively acts to disclose information when someone steals it.”
Finally, the court was particularly troubled that the DHHS took the position that it “can arbitrarily and capriciously enforce the CMP rules against some covered entities and not others.” UT had argued that in other similar cases either no fine was imposed, or the fines were much smaller than the fine imposed on UT. It also argued that the DHHS refused to consider factors expressly stated in its own regulations – for example, whether any individual suffered financial harm (which the DHHS could not show).
This case is an incredible loss for the DHHS. It will need to overhaul its entire regulatory enforcement structure, it will most likely need to rewrite regulations, and it will need to better train its ALJs on how to handle administrative appeals in light of the arguments petitioners make. Finally, the case is incredibly helpful to Covered Entities and Business Associates in their efforts to avoid civil money penalties for small and inadvertent infractions (as long as they otherwise meet data security requirements).
Importantly, all entities that store and process PHI should be careful in drafting their Business Associate Agreements and related agreements to distinguish between regulatory violations (which under this case are not strict liability in many scenarios) and contractual liability. Many Business Associate Agreements are written as if *any* “loss” of PHI outside of the entity is a breach. Business Associates should review these agreements carefully so as not to undertake greater liability than the regulations impose.
The title says it all – what should smaller companies do to comply with privacy laws?
California has now finalized the California Consumer Privacy Act (CCPA), Cal. Civ. Code §§ 1798.100 to 1798.199 – well, at least for now (please note that this link does not have all of the law changes in it as of the posting of this article). It goes into effect 1/1/2020. Regulations under it will not be issued until December at the earliest and are likely to change over time. While it is a net gain for California consumers, it is a complex law with many incidental effects and traps for the unwary business. How does a small business deal with this mess? Before we address that, let’s discuss some background:
Why is CCPA important?
The CCPA is important because so many businesses do business with California consumers that California law becomes the “highest common denominator” – meaning that, instead of trying to comply with disparate laws in 50 states, a business can target compliance with the most onerous law (typically California's, in the pro-consumer sense) and then hope that such compliance will also satisfy other laws. This does not always work – for example, Illinois has a much harsher biometric privacy law than California, and New York has very detailed personal information protection laws and rules as well, particularly in the financial/banking sector. So a slight modification of the above strategy is to target the “top 3” laws (i.e., California, Illinois, and New York) and again hope for the best in other states. And finally, there is the modified “top 3” strategy of adding compliance with the European Union's General Data Protection Regulation (GDPR).
What many larger companies have done is simply target compliance with the GDPR worldwide, assuming it is the most onerous pro-privacy law. However, the CCPA has provisions that differ from, and add to, the GDPR – for example, its rules on businesses that sell personal information, and on brokers of those sales, have no direct counterpart under the GDPR.
How does this affect small business?
Many small businesses may think they are not subject to these privacy laws because they do not “do business” in a particular state or in the European Union. Those are interesting issues – whether a state can constitutionally impose liability under a privacy law on a remote business that interacts solely electronically with that state's residents, and whether GDPR regulators can impose and enforce fines against a small US business that has no offices, employees, or other contacts in the EU. The practical problem, however, is that many small businesses will contract with larger businesses that are subject to those laws and regulators, and those contracts will require the small business to comply with those laws indirectly. This is particularly true in heavily regulated industries such as banking and health care, where regulators effectively force regulated entities to impose liability on their third-party vendors.
In addition, when a small business goes to sell or merge – typically with a larger entity – the acquirer may apply these rules and regulations in analyzing the level of data risk it will carry post-transaction. If the small business has not thought about basic data privacy and data security, this can negatively impact the value of the business.
So, what should a small business do (and, what are the key provisions of CCPA)?
First, do not ignore remote state laws like the CCPA or the GDPR. Someone in the organization should be assigned the responsibility, and be given reasonable time, to make a true assessment of the data privacy and security risks of the company. Ideally that person would have a “C” designation (CIO, CTO, CPO, etc.) and be incentivized to diligently complete such tasks. The business should neither marginalize nor minimize that role.
Second, do a data inventory – what data is the business collecting? Why? Does it really need it? Where is it collecting this data from (the web, forms, manual entry, data harvesting/scraping, third-party lists, etc.)? What agreements are there with those data sources (this includes terms of service)? Is this information personal information? What type of personal information is it (i.e., is it sensitive personal information)? Basically, this is a review of all inbound data flows . . .
Third, determine where the collected data is being disclosed or shared. This can be the critical step – because if the information being collected is personal information, and particularly if it is sensitive information, a data breach can have significant impacts. Do adequate agreements cover that data exchange? Is the data encrypted? Should it be? Has any audit or review of the recipient been done to determine the adequacy of its data protection systems? Basically, this step involves tracing all outbound data flows and determining the business need for each disclosure and the risk level it presents. (A simple sketch of an inventory record covering these two steps appears below.)
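For businesses that want to operationalize steps two and three, even a very simple structured record per data flow is a useful start. The sketch below is purely illustrative; the field names and the example flow are our own assumptions, not anything mandated by the CCPA or any other law.

```python
# Hypothetical sketch of a data-inventory record covering inbound sources
# (step two) and outbound disclosures (step three). Field names are ours.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataFlow:
    name: str                      # e.g., "newsletter signup form"
    source: str                    # web form, scraping, third-party list...
    personal_info: bool            # is it personal information at all?
    sensitive: bool                # health, financial, biometric, etc.
    business_need: str             # why is it collected?
    agreements: List[str] = field(default_factory=list)  # ToS, contracts
    recipients: List[str] = field(default_factory=list)  # outbound sharing
    encrypted_in_transit: bool = False

inventory = [
    DataFlow(name="e-commerce checkout", source="web form",
             personal_info=True, sensitive=True,   # payment card data
             business_need="order fulfillment",
             agreements=["payment processor agreement"],
             recipients=["payment processor"],
             encrypted_in_transit=True),
]

# Flag the highest-risk flows: sensitive data leaving the business unencrypted.
for flow in inventory:
    if flow.sensitive and flow.recipients and not flow.encrypted_in_transit:
        print(f"Review: {flow.name} shares sensitive data without encryption")
```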
Fourth, assess the computer systems used to capture, store, and transmit data to determine where security weaknesses could lead to a data breach. Computer security is hard, period. It is far harder today, when it is not just your own computer security at issue, but the security of every link in the data disclosure chain.
Fifth, consider what tools to use to address risk. Do you throw technology at the issue, like better intrusion prevention and detection systems, unified threat management devices, etc.? Do you hire experts, which may be costly? Both? Do you have proper agreements in place? Indemnity? Does the downstream recipient have insurance? Are you sure about that? Do you have insurance? Do your contracts require insurance? Be especially careful with insurance – a good insurance broker will be up on all the various changes in insurance policies that claim to cover “cyber liability” (a generic term that is meaningless without specific context). You simply cannot determine what insurance you need if you have not performed steps one through four above.
Sixth, engage in continuous review and management. Hackers are not static; they evolve. Your systems must be maintained and modified to address new threats and issues, be updated and patched, and be monitored for threats. See item one – this is why a dedicated C-level person and/or team needs to be in place to really address these issues.
The key provisions of CCPA as they relate to small businesses are below:
It only applies to certain “businesses” – namely, a business that: has annual gross revenues in excess of $25,000,000; or that alone or in combination annually buys, receives for the business's commercial purposes, sells, or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices; or that derives 50 percent or more of its annual revenues from selling consumers' personal information. (These thresholds are restated as code in the sketch following this list.)
A consumer has a right to an accounting – essentially, a consumer can request that a business disclose to the consumer the categories and specific pieces of personal information the business has collected.
A consumer has the right to request that a business delete any personal information about the consumer which the business has collected from the consumer, subject to several exceptions.
A consumer has the right to an accounting of personal information that has been transferred by the business to third parties, subject to several exceptions. These rights are enhanced if the business “sells” that personal information.
A consumer has the right to terminate the sale of their personal information. It is important to note that the law does not force a business to offer a service to a consumer who opts out – in other words, a business can condition its service on the right to sell the information. However: (a) a business cannot sell the personal information of consumers under 16 (if it knows they are under 16) without an express opt-in, and for consumers under 13, the parent or guardian must opt in; and (b) a business cannot discriminate against a consumer who exercises any of their rights under the law. (A sketch of this age-tiered logic also appears after this list.)
A business must provide two methods for consumers to contact the business to exercise these rights – one of which must be a toll-free number – unless the business transacts business solely online and has a direct relationship with the consumer, in which case the online business need only provide an email address for such requests. There are affirmative disclosure requirements for the websites of businesses that are subject to the law. Businesses that sell personal information have additional affirmative website disclosure requirements.
A consumer whose personal information has been breached now has an affirmative damage remedy. Previously there was uncertainty in the law as to whether actual damage or harm would have to be shown to recover, or just the risk of future harm. In general the cases have held that actual harm is required, but vary in what they view as “actual harm.”
Significant daily penalties can be assessed for noncompliance after a 30-day notice period.
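As promised above, the applicability thresholds reduce to a simple either/or test. The sketch below restates them as code; the function and argument names are our own, and the dollar and record figures come from the statutory text quoted above.

```python
def ccpa_applies(annual_gross_revenue: float,
                 records_handled_per_year: int,
                 pct_revenue_from_selling_pi: float) -> bool:
    """Illustrative restatement of the CCPA 'business' thresholds:
    a business qualifies if ANY one prong is met."""
    return (annual_gross_revenue > 25_000_000
            or records_handled_per_year >= 50_000
            or pct_revenue_from_selling_pi >= 50.0)

# A shop with $2M in revenue that handles data on 60,000 consumers,
# households, or devices per year still qualifies under the second prong.
print(ccpa_applies(2_000_000, 60_000, 0.0))  # True
```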
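The age-tiered sale rules can likewise be summarized as a small decision function. Again, this is our own hedged restatement of the provisions described above, not statutory text; a real implementation would need to handle “actual knowledge” of age with care.

```python
from typing import Optional

def may_sell_personal_info(known_age: Optional[int],
                           opted_out: bool,
                           consumer_opted_in: bool,
                           parent_opted_in: bool) -> bool:
    """Sketch of the CCPA sale rules described above; known_age is None
    when the business has no knowledge of the consumer's age."""
    if known_age is not None and known_age < 13:
        return parent_opted_in        # parent or guardian must opt in
    if known_age is not None and known_age < 16:
        return consumer_opted_in      # consumer must affirmatively opt in
    return not opted_out              # default: sale allowed absent opt-out

print(may_sell_personal_info(15, False, False, False))  # False: no opt-in
print(may_sell_personal_info(40, False, False, False))  # True: no opt-out
```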
The above is only a general overview. Some of these rights – for example, the right to an accounting of onward transfers, the right to delete information, and the right to opt out – present not only legal compliance issues but significant technical hurdles. Many small businesses' systems were not designed for this, and/or they have so many disparate systems in which data is duplicated that it may be hard or nearly impossible to comply. If a small business runs through the above checklist and gets a handle on the who, what, where, when, and why questions, it will be easier to then assess the “how hard to comply” question.
For more information or assistance in data security and privacy law compliance, please contact Mike Oliver
Many companies have exactly one year to get their privacy house in order. On May 25, 2018, the European Union's General Data Protection Regulation (found here in its entirety; the regulation itself, without precursors, is here: GDPR regulation only) goes into effect. It brings tremendous changes to the previous data protection rules, but in this short post I discuss what I consider to be the “Big 3” issues the new rule presents, and why, even though US privacy law is almost nonexistent (in the general consumer privacy context), these EU rules will become more and more important even for smaller companies operating solely in the US, due to the globalization of data exchange. OK, the Big 3:
Huge fines for small errors. The GDPR allows for fines of up to the greater of €20,000,000 or 4% of annual global turnover. And there is every indication that the privacy regulators will be very harsh in doling out these fines, even for fairly innocuous errors. That has certainly been the trend in the U.S. for sensitive data like protected health information.
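To put that ceiling in concrete terms, the formula is simply the greater of the two figures. Here is the arithmetic as a short sketch (the function name is ours):

```python
def max_gdpr_fine_eur(annual_global_turnover_eur: float) -> float:
    """Greater of EUR 20,000,000 or 4% of annual global turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

print(max_gdpr_fine_eur(1_000_000_000))  # 40,000,000.0 for a EUR 1B business
```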
The information covered by the rule is almost everything. The regulation (Article 4, Section (1)) defines “personal data” to mean “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.” It is clear this definition encompasses far more than just “identifying” information – for example, an “online identifier” is just about any technology that tracks a user.
Extra-territorial scope. The regulation (Article 3) extends the reach of the GDPR well beyond the borders of the EU. First, it states that it “applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.” So, any data processed by a controller or processor located in the EU is subject to the rule, even if the data subject is not an EU resident. Next, it states: “This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or (b) the monitoring of their behaviour as far as their behaviour takes place within the Union.” So, regardless of where a business is located, if it offers goods or services (paid or unpaid) to data subjects in the Union, or monitors their behaviour within the Union, the GDPR applies. Finally, “[t]his Regulation applies to the processing of personal data by a controller not established in the Union, but in a place where Member State law applies by virtue of public international law.” The US has not adopted public international law that accedes to this rule, but other countries may do so; operating in those countries would impose the rule on the controller or processor.
There is a separate issue as to whether the EU could enforce the GDPR against a US-based entity in the EU, or whether it would have to come to the US and file such a claim; there is also a separate question of whether a US court would enforce a foreign law against a US-based business without an enabling treaty or statute. However, a company that operates solely in the US would probably have to play ball with the EU authorities if it ever wanted to do direct business in the EU. Most large companies have already made that decision. Smaller companies wholly located in the US will have to weigh the risk of GDPR enforcement against whether they ever want to expand direct services into the EU.
One year seems like a long time, but the GDPR has been known for some time (it was adopted in 2016), and the time is now short. Companies that might be subject to it really need to be well on their way to assessing what data they are collecting, how they are using it, what efforts they have made to obtain consent to that use, and how they will meet the six principles in a timely fashion.
The FTC released a study and guide on facial recognition technology, providing guidance on the notice, transparency, and choices required when using, storing, and sharing facial recognition information. The case studies included a basic use (for example, a face is scanned and the user may then make changes to see what hair, clothes, jewelry, or other items would look like); a more advanced use (an interactive kiosk that takes a picture of a consumer, assesses their age and gender, and presents an advertisement targeted to that consumer); and finally the use of facial recognition in social media and the sharing of those images (à la Facebook).
Anyone making use of facial recognition technology should consult these guides, as they would any other FTC advertising or privacy guide, before commencing to collect, use, or share facial recognition images.
In April 2012, the Federal Trade Commission issued its report entitled “Protecting Consumer Privacy in an Era of Rapid Change.” You can read that here.
The Report, while a comprehensive review of hundreds of undoubtedly conflicting filings by the various extreme factions on privacy issues, ultimately boils down to the FTC complaining that Congress has still not taken any action to normalize privacy rules. Let's face it, privacy law is a mess – a hodgepodge of state laws; specific federal laws covering financial accounts, children, protected health information, and education; and a morass of case law and regulatory rules – rules that mostly derive from other laws (like the Lanham Act) not really intended to address privacy. For example, many of the actions the FTC has brought to enforce so-called privacy really involve false advertising – a company saying one thing to a consumer and doing another, or offering some ability to control a privacy setting and then ignoring the user's setting.
The Report sets forth the FTC’s overview of its objectives and scope summarized here:
the framework does not apply to companies that collect only non-sensitive data from fewer than 5,000 consumers a year, provided they do not share the data with third parties
“commonly accepted” information collection and use practices for which companies need not provide consumers with choice (product fulfillment, internal operations, fraud prevention, legal compliance and public purpose, and first-party marketing)
a recommendation that companies provide consumers with reasonable access to the data the companies maintain about them, proportionate to the sensitivity of the data and the nature of its use
respect for browser and consumer “do not track” elections (see the sketch after this list)
consumer access to, and correction of, information held by so-called “data brokers”
industry self-regulation (“no lip service”)
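On the “do not track” point flagged in the list above: browsers that enable the setting send a real `DNT: 1` request header, so detecting the election is technically straightforward. Here is a hypothetical sketch using Flask; the framework choice and the handler logic are our own assumptions, not anything prescribed by the FTC.

```python
# Hypothetical sketch of honoring the browser "Do Not Track" signal.
# The DNT request header is real; the application logic is illustrative.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Browsers with the setting enabled send the header "DNT: 1".
    if request.headers.get("DNT") == "1":
        return "Welcome. Analytics and tracking are disabled for this visit."
    # Otherwise, proceed with tracking as disclosed in the privacy notice.
    return "Welcome."

if __name__ == "__main__":
    app.run()
```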
In terms of the actual principles, they are:
Companies should incorporate substantive privacy protections into their practices, such as data security, reasonable collection limits, sound retention and disposal practices, and data accuracy.
Companies should maintain comprehensive data management procedures throughout the life cycle of their products and services.
Companies should simplify consumer choice. (Companies do not need to provide choice before collecting and using consumer data for practices that are consistent with the context of the transaction or the company's relationship with the consumer, or are required or specifically authorized by law.)
For practices requiring choice, companies should offer the choice at a time and in a context in which the consumer is making a decision about his or her data. Companies should obtain affirmative express consent before (1) using consumer data in a materially different manner than claimed when the data was collected; or (2) collecting sensitive data for certain purposes.
Privacy notices should be clearer, shorter, and more standardized to enable better comprehension and comparison of privacy practices.
Companies should provide reasonable access to the consumer data they maintain; the extent of access should be proportionate to the sensitivity of the data and the nature of its use.
All stakeholders should expand their efforts to educate consumers about commercial data privacy practices.
From the perspective of a lawyer for small to medium-sized businesses, it would be very helpful to have national, preemptive legislation that provides real guidance and safe harbors, so that businesses do not have to worry that they are violating some esoteric rule buried in a regulation, order, or arcane state law. Unlikely to happen, though . . .