by Mike Oliver | Sep 9, 2022 | Privacy, Privacy Law
The first fine under the California Consumer Privacy Act was issued this week against Sephora U.S.A., Inc. The complaint alleged, in part: “The right to opt-out is the hallmark of the CCPA. This right requires that companies follow certain straightforward rules: if companies make consumer personal information available to third parties and receive a benefit from the arrangement—such as in the form of ads targeting specific consumers—they are deemed to be ‘selling’ consumer personal information under the law.”
There are three important observations arising from both this allegation, and the consent by Sephora:
- “Selling” Personal Information does not just mean literally collecting actual personal information and selling it – it means, according to the California Attorney General, collecting essentially any information about a website visitor (for example, what browser they are using) and providing it to a third party, who then uses it to track that visitor across its own network of customers – even if the tracking company does not actually know who that person is;
- Websites that use ANY tracking technology must comply with the fairly onerous disclosure rules that require notifying visitors that the site “sells” personal information – for example, Sephora had stated (as most websites do today) that it “does not sell personal information”; and
- For all practical purposes, any website visitor has the right to completely opt out of essentially any “tracking,” and the site must provide this ability to opt out and respect it.
The other matter of significance from the complaint and resulting consent fine is that if a user instructs their browser to send an opt-out preference signal – the Global Privacy Control (GPC), a successor to the older do-not-track signal – the website must honor it.
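As a purely technical aside on what honoring the signal involves: browsers that support GPC send a “Sec-GPC: 1” request header and expose the preference to page scripts as navigator.globalPrivacyControl. The sketch below is a hypothetical illustration only – it assumes a Node/Express server, and the “treatAsOptOut” flag is made up – but it shows the signal is easy to detect and act on.

```typescript
// Minimal sketch: detect the Global Privacy Control signal on an Express
// server and expose it to later handlers. "treatAsOptOut" is a made-up flag;
// downstream ad/analytics code would have to act on it.
import express, { NextFunction, Request, Response } from "express";

const app = express();

app.use((req: Request, res: Response, next: NextFunction) => {
  // Per the GPC proposal, "Sec-GPC: 1" means the visitor opts out of the
  // sale or sharing of their personal information.
  res.locals.treatAsOptOut = req.get("Sec-GPC") === "1";
  next();
});

app.get("/", (_req: Request, res: Response) => {
  // Only include third-party tracking tags when the visitor has not opted out.
  const trackers = res.locals.treatAsOptOut
    ? ""
    : "<!-- third-party analytics tag would go here -->";
  res.send(`<html><body>${trackers}<p>Hello</p></body></html>`);
});

app.listen(3000);
```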
Finally, Sephora was unable to establish that the analytics providers were “service providers” – which would have meant the transfers were not “sales” – because Sephora did not have valid service provider agreements with these providers; indeed, the complaint goes to great lengths to note that Sephora exchanged personal information for free or reduced-price analytics services.
Under the CCPA, a service provider agreement must:
“(1) Specif[y] that the personal information is sold or disclosed by the business only for limited and specified purposes.
(2) Obligat[e] the third party, service provider, or contractor to comply with applicable obligations under this title and obligate those persons to provide the same level of privacy protection as is required by this title.
(3) Grant[] the business rights to take reasonable and appropriate steps to help ensure that the third party, service provider, or contractor uses the personal information transferred in a manner consistent with the business’ obligations under this title.
(4) Require[] the third party, service provider, or contractor to notify the business if it makes a determination that it can no longer meet its obligations under this title.
(5) Grant[] the business the right, upon notice, including under paragraph (4), to take reasonable and appropriate steps to stop and remediate unauthorized use of personal information.”
CCPA, § 1798.100(d).
Virtually no analytics provider online terms of service meet these requirements.
The whole matter is also strange in that Sephora was given 30 days’ notice of the violations and for some reason chose not to comply. Did they decide to contest the claims, and then later decide not to? If so, it was a costly decision.
As a result of this decision, all websites using any form of third-party (data-sharing) analytics provider need to review their agreement with the analytics company carefully to see whether it meets the above requirements. If not, they need to either obtain such an agreement or stop using that provider. They also need to make full disclosure about what data is shared (sold) to the analytics provider, provide a full opt-out notice – and, of course, ensure that the site respects GPC and any opt-out request. This is going to be very challenging to accomplish for many reasons – in part because, in most cases, these analytics providers do not actually know who the person is; they just hold the data that identifies the electronic interaction, so they will have to devise a system to scan their identifiers for opt-out requests. In short, this decision is going to make using web analytics all but impossible except where the analytics are limited solely to the website operator.
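As a rough sketch of what respecting GPC and opt-out requests can mean in the page itself – the cookie name, analytics URL, and loadAnalytics function below are hypothetical placeholders, not any vendor’s actual tag:

```typescript
// Browser-side sketch: only load a third-party analytics tag if the visitor
// has neither sent a GPC signal nor previously opted out on this site.
// "og_optout" and the script URL are illustrative placeholders.

function visitorHasOptedOut(): boolean {
  // GPC is exposed to scripts in supporting browsers; treat "missing" as false.
  const gpc = (navigator as any).globalPrivacyControl === true;
  const siteOptOut = document.cookie.split("; ").includes("og_optout=1");
  return gpc || siteOptOut;
}

function loadAnalytics(): void {
  const s = document.createElement("script");
  s.src = "https://analytics.example.com/tag.js"; // placeholder URL
  s.async = true;
  document.head.appendChild(s);
}

if (!visitorHasOptedOut()) {
  loadAnalytics();
}
```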
by Mike Oliver | Aug 18, 2022 | Firm Matters, In the News
We are extremely excited and honored to announce that Mike Oliver and Kim Grimsley have been recognized in The Best Lawyers in America® for 2023. Best Lawyers is an international lawyer ranking and referral source that is currently celebrating its 40th anniversary, and today it announced the 29th Edition of The Best Lawyers in America® for 2023, which will include Mike Oliver and Kim Grimsley. To be featured, lawyers are nominated, evaluated by currently recognized lawyers on the caliber of their work, and selected accordingly.
Mike Oliver has been recognized in this publication consecutively since 2006 – he is being recognized in the fields of Copyright Law, Information Technology Law, Trade Secrets Law and Trademark Law. Mike has been practicing corporate, business and intellectual property law for over 30 years. His knowledge as a computer programmer has been a valuable asset for those clients in the software and technology industry.
Kim Grimsley has been recognized in this publication for the past 3 years for her professional excellence by her peers – she is being recognized in the fields of Copyright Law and Trademark Law. Kim has been practicing intellectual property law for over 20 years, and she has enjoyed working with clients – from start-up businesses to publicly traded companies in all industries – in building and protecting their intellectual property in the United States and worldwide.
Everyone at Oliver & Grimsley would like to congratulate Kim and Mike on their continued hard work and excellence.
by Mike Oliver | Jun 8, 2022 | Copyrights, Intellectual Property, Litigation
The Copyright Small Claims Court will be commencing operations in a few weeks (late June, 2022), and Oliver & Grimsley is pleased to announce that we will be providing both plaintiff and defense services for copyright small claims actions.
Copyright small claims actions should be a cost effective way of enforcing copyrights in the United States, if the copyright holder is primarily seeking a determination of infringement, and willing to receive an award of no more than $30,000. There are some considerations to keep in mind, however.
One advantage is that Copyright small claims actions can be filed without having previously received a certificate of registration, and without filing an application for special expedited status (which is expensive). However, an application for a certificate of registration must have at least been filed at the time of filing a small claims action.
The ability to file small claims efficiently should also provide a slightly better basis for pre-litigation resolution, as prior to this, it has always been a bit of a poker game to figure out whether an actual full suit would be filed in Federal court. Federal cases are very expensive, and if the copyright was not timely registered (see note 1), no statutory remedies or attorneys fees are available. With the ability to file claims informally, for much less cost, and without significant risk of years of discovery, a defendant receiving a cease and desist letter will have to more carefully consider whether a small claims action might be filed. However, the defendant receiving a small claims complaint can treat that claim as a true case or controversy, opt out of the proceeding, and commence a declaratory judgment action in some remote location, so this risk is not mitigated with the small claims process.
The biggest problem with the small claims process is that the small claims court is not mandatory – it is elective. If a defendant has such a claim filed against it, it can “opt out” of the proceeding, in which case “If you opt out, the CCB will dismiss the claim against you, but the claimant can still bring the same claim in federal court.” See https://ccb.gov/respondent/. Therefore, a plaintiff could go to the trouble of filing the small claim, spending money on preparation and filing fees, only to have the defendant opt out, forcing the plaintiff to start all over again in Federal court. It is virtually never cost effective to file a Federal court claim in the $30,000 range, so it will be easy for defendants who determine their risk is only at or around that number to opt out and bet that the plaintiff will not follow through.
On the other hand, if a defendant believes that the claim is higher than $30,000, and there is real risk of plaintiff winning and also collecting fees (see note 1) – then opting in might make sense for the defendant.
In short, there is no one answer to whether a plaintiff should file in small claims, and no one right answer to whether a defendant should opt out. However, as the process is currently set up, it is generally going to be more likely that a defendant elects to opt out, especially where the plaintiff failed to timely register their copyright, and cannot seek statutory damages and the collection of attorney fees.
Note 1: Under 17 U.S.C. § 412, statutory remedies and attorneys fees are not available to a plaintiff/copyright holder unless the effective date of registration is either within 3 months of first publication of the work, “or 1 month after the copyright owner has learned of the infringement,” https://www.copyright.gov/title17/92chap4.html#412
by Mike Oliver | Mar 4, 2022 | Privacy, Privacy Law
Almost every major website you visit today pops up a banner to warn you that it uses “cookies.” This is not legally required in the U.S. or in most places, and where it is, the vast majority of sites do not comply with legal requirements. From a policy perspective: cookie pops are just dumb – (virtually) no one reads them. There are vastly better ways to deal with the issue they present – legally and from a site usability perspective.
First, no current U.S. law requires cookie pop-ups. Some sites that are available in the European Union are required to post cookie pop-ups – sites that use so-called “tracking cookies.” I discuss below a recent EU case that makes this issue even worse than one would have originally thought.
Second, an anecdotal review of websites shows that the vast, vast majority of them – all of them in my experience that are “U.S.” sites – utterly fail to comply with the so-called EU “cookie law.” Why? Because they store the cookie before consent (which is not permitted under the cookie law) and they simply state, “This site uses cookies” and present an “OK” button (and/or an X to close the pop-up) with a link to the privacy policy. See for example www.abajournal.com which, as of the date of this post, simply provides an OK button – no option to do anything like reject or manage the cookies – and a link to the privacy policy. Just a useless and legally insufficient user interface distraction.
Finally, except in very, very limited cases, these cookie pops do not in any way increase user privacy protection. Why? Even if a site does comply with the notice and consent requirements, it is not legally required to provide the service to a user who declines tracking cookies. The site can simply withhold functionality. So in many cases, it’s not really a choice – the choice is either not to use the site, or consent to tracking. This is made worse because many governments and third parties use these sites for information dissemination. A truly privacy-focused law would at least require that the site function if a person elected no tracking.
The whole cookie problem was started by our friends in Europe when they promulgated the ePrivacy Directive 2002/58/EC. However, no U.S. company really started focusing on compliance with the “cookie issue” presented in the ePrivacy Directive until the General Data Protection Regulation (GDPR) of the European Union, Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, came into effect. The GDPR applies in Europe, not the U.S.; however, many U.S. companies either do business in, or could ostensibly be regulated by, EU member states – so they attempt to comply with both U.S. and EU law.
Many “cookies” – the ones necessary to actually operate a website – are “exempt”: they need not be identified and are not subject to consent. However, sites that use tracking cookies and other tracking technology – even where the data is anonymized – are required under EU law to obtain prior consent before even storing the cookie or other technology that allows such tracking.
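To make the exempt/non-exempt distinction concrete, here is a hedged, browser-side sketch – the cookie names and the banner prompt are illustrative placeholders, not any consent vendor’s API – of setting a strictly necessary cookie immediately while deferring any tracking cookie until an explicit opt-in:

```typescript
// Sketch of the distinction the ePrivacy rules draw: "strictly necessary"
// cookies may be set immediately, while tracking cookies may only be stored
// after an explicit opt-in.

function setCookie(name: string, value: string, days = 365): void {
  const expires = new Date(Date.now() + days * 86_400_000).toUTCString();
  document.cookie = `${name}=${value}; expires=${expires}; path=/; SameSite=Lax`;
}

// Placeholder consent prompt: a real banner would render UI with clear
// accept/reject choices; window.confirm() just keeps the sketch runnable.
async function askForTrackingConsent(): Promise<boolean> {
  return window.confirm("Allow advertising/analytics cookies?");
}

// Exempt: needed to operate the site, so no prior consent is required.
setCookie("session_id", crypto.randomUUID(), 1);

// Not exempt: nothing tracking-related is stored (and no tracking tag is
// loaded) until the visitor affirmatively consents. Storing first and asking
// later is the pattern the post criticizes.
askForTrackingConsent().then((consented) => {
  if (consented) {
    setCookie("ad_id", crypto.randomUUID());
  }
});
```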
In my opinion, this system has been an utter failure in policy and actual impact. It has not stopped companies from incessant user tracking. The companies that rely on user tracking have the power to force the choice of “allow tracking” or do not use the service. The privacy policies remain mostly unintelligible, or at the very least, it is all but impossible to tell what exact tracking a company does, primarily because companies either disclose only the types of tracking, or disclose so excessively that the cookie disclosure is indecipherable.
But the EU is doubling down on the concept . . .
In a recent decision (File number: DOS-2019-0137) of the Dispute Chamber of the Data Protection Authority of Belgium issued 2/2/2022, that regulator held that the “pop up” framework of the European arm of the Interactive Advertising Bureau (IAB), used by most of its members – and intentionally designed to comply with the GDPR – in fact did not comply. The decision is lengthy (my machine-translated version into English is 139 pages long), and undoubtedly will be appealed. As an overview, IAB created a real time bidding system (RTB) – an automated system of bidding for advertising. This is their framework in the U.S. and many other countries, but in Europe, they created the “Transparency and Consent Framework” (TCF). At issue in this case was a subset of the TCF, which the Board described as follows: “Specifically for the TCF, there are also the companies that use so-called “Consent Management Platforms” (CMPs) to offer. Specifically, a CMP takes the form of a pop-up that appears on the first connection to a website appears to request permission from the internet user to collect cookies and other identification data” Para. 40 (Note, all English translations here are machine created by Google’s translation service). The original decision in Dutch is here (and I can post the English translated version if someone requests it): https://www.gegevensbeschermingsautoriteit.be/publications/beslissing-ten-gronde-nr.-21-2022.pdf.
The basic idea is that IAB manages a “consensu” cookie that indicates whether the web user has already consented to (or rejected) cookies. So, a participating site would somehow take information from a user’s initial browser session, send it off to IAB, and IAB would send back a text string indicating whether that user had already consented to accept cookies or not. If not, a “cookie pop up” would be presented to the user. The Board found that the IAB maintains a database of users and preferences, which can be used “in order to create an advertising profile of data subjects and to show them personalized advertising.” Para. 50. It therefore concluded that the IAB was a data controller (a point the IAB disputed). From this point forward the Board essentially found nearly every conceivable violation of the GDPR that could be found. Among them, that “IAB Europe [] failed to observe the principles of due regard for transparency and fairness with regard to data subjects,” in part because some of the information that can be sucked up into the preference model includes “special categories of personal data … For example, participating organizations could become acquainted with the websites previously visited by a data subject, including the political opinions, religious or philosophical beliefs, sexual orientation, health data or also trade union memberships of the data subjects be inferred or disclosed.” Para 51. It also found the IAB’s privacy policy insufficient because, among other reasons, it was only available in English and used unclear, vague terms like “services” and “other means.” Para. 54. It also did not like that the terms “partners” and “third parties” were not explained sufficiently.
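For readers curious how a TCF pop-up is actually consumed by a participating page: sites typically query the CMP through the IAB’s “__tcfapi” JavaScript interface before firing tracking code. The snippet below is a simplified sketch, not IAB reference code – the typing is loosened, and the exact command set has changed across TCF versions:

```typescript
// Sketch: before firing any ad or tracking code, a participating site asks
// the CMP (via the IAB TCF v2 JavaScript API) whether the visitor consented.
// Purpose 1 is "store and/or access information on a device" in the TCF
// purpose list.

type TcfCallback = (tcData: any, success: boolean) => void;

function runIfConsented(purposeId: number, fn: () => void): void {
  const tcfapi = (window as any).__tcfapi as
    | ((command: string, version: number, cb: TcfCallback) => void)
    | undefined;

  if (!tcfapi) return; // no CMP on the page – do nothing rather than track

  tcfapi("getTCData", 2, (tcData, success) => {
    if (success && tcData?.purpose?.consents?.[purposeId] === true) {
      fn();
    }
  });
}

runIfConsented(1, () => {
  // e.g. set the tracking cookie or load the ad tag here
  console.log("consent recorded for purpose 1");
});
```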
To me this is just evidence that no one really understands the law – or that the regulators think it says one thing and the industry thinks it says another. Not good either way. But after that decision, it seems all but impossible to have a centralized “cookie consent” service – or, to comply, the service would have to be so intrusive as to make the web experience intolerable.
The solution? In my view, just stop with the cookie pop ups. They are stupid and ineffective. Enact a law that requires a service to respect the do-not-track signal from a browser (currently entirely voluntary), and not store any tracking cookies, clear gifs or other trackers – and require that a site not “discriminate” against users who elect no tracking – basically, provide all functions to users whether or not they consent. I would also prohibit any government organization from using a site that tracks users as a service for information dissemination.
by Mike Oliver | Feb 3, 2022 | Business Law, Intellectual Property
NFTs are unique, effectively non-destructible tokens stored in the blockchain – a decentralized ledger system that uses computing resources to validate the holder of cryptographically unique data without reliance on a single source of truth such as a bank or government [further reading]. The NFT references a link to a resource – typically on the internet or in a game – where some content is available. An NFT has a single owner (which can be an entity), and generally NFTs cannot be subdivided once they are created, though they can be transferred.
An NFT can represent anything – digital art, a book, a page from a movie script, a signature, a title document to a car or house or real estate, an in-game “skin” or custom article, a representation of something in a virtual reality construct (currently being referred to as the metaverse), like clothing, shoes and so on. It can also link to something that is itself a representation of something tangible – for example, an NFT can link to a digital object that might be used in a game, as an avatar or in the metaverse, but that is also created tangibly (for example, this shoe created using artificial intelligence), or it can be an electronic representation of part of the notes to a very famous song. A decent list of many potential uses of NFTs is set out in this article, 15 NFT Use Cases That Could Go Mainstream.
NFTs were originally born from a desire to find a way to establish the “provenance” (title) of digital art. See NFTs Weren’t Supposed to End Like This. As pointed out in that article, the blockchain cannot actually store the thing it points to – for example, an image – there is not enough space. It only has the space to hold a link to that image. As NFT use becomes widespread – including NFTs referencing property that the NFT holder might not own – many legal issues are now coming to light.
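To see just how little the chain itself holds, here is a hedged sketch – assuming ethers.js v5 and an ERC-721 style contract, with a placeholder RPC URL – of reading a token’s metadata pointer. All the contract can return is a short URI:

```typescript
// Sketch: what an ERC-721 NFT actually stores on-chain is (at most) a short
// URI – the image itself lives elsewhere.
import { ethers } from "ethers";

const ERC721_ABI = [
  "function tokenURI(uint256 tokenId) view returns (string)",
];

async function whereDoesTheTokenPoint(contractAddr: string, tokenId: number) {
  const provider = new ethers.providers.JsonRpcProvider("https://rpc.example.com");
  const nft = new ethers.Contract(contractAddr, ERC721_ABI, provider);

  // Typically returns something like "ipfs://.../metadata.json" – a pointer,
  // not the artwork, and nothing in it conveys copyright.
  const uri: string = await nft.tokenURI(tokenId);
  console.log("token metadata lives at:", uri);
}

// Usage (placeholder values):
// whereDoesTheTokenPoint("0x0000000000000000000000000000000000000000", 1);
```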
Who owns an NFT and exactly what do they own?
A case filed a few days ago, on February 1st, challenges the sale of the “very first NFT” by Sotheby’s for 1.47 million dollars. Free Holdings Inc. v McCoy et al., case number 1:22-cv-00881 (S.D.N.Y). In that case the plaintiff claims that the sale of the NFT, which is apparently a copy of the original token the founder of NFTs created, violated its ownership rights in the actual NFT, which it claims to have in its wallet.[1] The defendants have apparently asserted that the actual NFT (the digital token itself) was “removed” or “burned” from the Namecoin blockchain where it was created, and thus does not even exist. The plaintiff is claiming that the original owner allowed the token to “expire,” and that it somehow renewed that token and placed it in its wallet.
This case presents a novel issue . . . what does “title” to an NFT really mean? The NFT itself is represented solely electronically (a token stored in a blockchain), and only functions electronically – once retrieved, it points to a resource on the internet and identifies the person holding the token as the true owner of … what? That link? The actual resource that is at that link? Is a picture of the token framed as art actually the NFT (and who owns the token itself, the text string)? That token is publicly viewable by anyone – the blockchain just limits how it can be transferred, and identifies the NFT holder as at least someone who has digital rights to the link. Is the token itself a copyright? We may get some guidance in the Free Holdings case.
A person who purchases an NFT should be aware that they are only really purchasing a public and decentralized proof system establishing that they are the owner of the token, because that token is in their wallet. The actual thing displayed at that resource location is probably under copyright protection, and the owner of the copyright is probably not transferring ALL of the rights to the copyright in whatever that resource depicts. However, there is probably at least an implied license to publicly display whatever the NFT points to on the internet (or in a game, or the metaverse, or wherever that content can be displayed). The result is very similar to what happens when someone buys a physical painting or art object. While they have title to that thing (the tangible thing, the painting), in the absence of a very specific agreement they do not own the copyright embodied in that thing.
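Concretely, “owning” the token usually reduces to a single on-chain lookup. The sketch below – again assuming ethers.js v5 and an ERC-721 style contract, with placeholder addresses – shows that the entire proof of title is whether ownerOf() returns your wallet address:

```typescript
// Sketch: "owning" an ERC-721 NFT means the contract's ownerOf() points at
// your wallet address – nothing more.
import { ethers } from "ethers";

const ERC721_ABI = ["function ownerOf(uint256 tokenId) view returns (address)"];

async function doIHoldThisToken(
  contractAddr: string,
  tokenId: number,
  myWallet: string
): Promise<boolean> {
  const provider = new ethers.providers.JsonRpcProvider("https://rpc.example.com");
  const nft = new ethers.Contract(contractAddr, ERC721_ABI, provider);

  const holder: string = await nft.ownerOf(tokenId);
  // Equality of addresses is the whole "proof of title" – it says nothing
  // about copyright in whatever the token links to.
  return holder.toLowerCase() === myWallet.toLowerCase();
}
```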
What rights must a person who creates an NFT have to the underlying content or server?
That is a very good question – and it raises several more. Could I create an NFT pointing to any URL on the internet (i.e. to any picture that is publicly viewable) and sell it? Do I have to be the “owner” of that image or have any license to it? Do I have to control the web server that serves that image upon a request? What disclosures do I have to make to a purchaser of the NFT as to what rights they are receiving, whether I am the copyright holder of the image, or whether I control the server? Does the contract imply that I will perpetually, until the end of time, maintain that server and that resource location? What happens if I go bankrupt, or the blockchain service I used goes bankrupt, or the server company does? Can multiple NFTs be sold that point to the same resource? Does the NFT platform owe a duty of due diligence to verify rights in the underlying resource? There is nothing built into the NFT or blockchain system that requires unique resource links. Even if an NFT provider limited links to be unique, other NFT providers need not respect that prior link and can create NFTs in their own blockchains pointing to the same resource.
Most NFTs are sold using “smart contracts” – essentially a series of pre-made instructions in the blockchain that, when triggered, simply execute. See How smart contracts work. No human sees, reviews, or approves them, or checks whether they were carried out. The whole idea is that the blockchain system itself verifies that the “transaction” occurred, without human involvement and without a centralized verification system (such as an intermediary bank, certificate signer, government title repository, etc.). Smart contracts are not the proper place to agree to whatever license rights and obligations are connected to the underlying resource represented in an NFT. Even if they were, the smart contract process will not meet electronic signature requirements under UETA or ESIGN, which are applicable in the U.S. Those terms would have to be in the underlying terms of service of the NFT provider and the NFT seller.
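By way of illustration, here is a hedged sketch – ethers.js v5 assumed, with a placeholder RPC endpoint and contract address – of watching the standard ERC-721 Transfer event. The chain records that the token moved, and nothing more:

```typescript
// Sketch: observing ERC-721 Transfer events. Once a smart contract's
// pre-programmed steps run, the transfer simply happens and is recorded –
// no human reviews it, and no license terms travel with it.
import { ethers } from "ethers";

const ERC721_ABI = [
  "event Transfer(address indexed from, address indexed to, uint256 indexed tokenId)",
];

function watchTransfers(contractAddr: string): void {
  const provider = new ethers.providers.JsonRpcProvider("https://rpc.example.com");
  const nft = new ethers.Contract(contractAddr, ERC721_ABI, provider);

  nft.on("Transfer", (from: string, to: string, tokenId: ethers.BigNumber) => {
    // The chain tells us the token moved; it says nothing about what rights,
    // if any, moved with it.
    console.log(`token ${tokenId.toString()} moved from ${from} to ${to}`);
  });
}
```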
These issues will play out under traditional legal principles – in the author’s view predominantly under contract law (based on what the terms of service of the NFT provider and seller say); under advertising law (what disclosures must an NFT seller make so that the sale is induced by truthful, non-misleading and fair representations about the NFT?); under the law of publicity rights (use of a person’s likeness to sell a product or service); and, of course, finally under intellectual property law, principally copyright law.
On that last point, at least under U.S. law, a question arises whether the Digital Millennium Copyright Act notice and takedown provisions will apply to NFT transactions. For example, suppose person X sells an NFT pointing to a resource at location Y, and was not the owner of the copyright in the image there. If the purchaser does nothing else (such as displaying that image embedded in a game or web page), can the copyright owner force the NFT platform to take down that link – assuming the copyright owner does not control that resource? Is the NFT itself a violation of 15 USC 1125(a), indicating a false association with the owner of the copyright (or perhaps implicitly stating that the NFT owner owns the copyright)? Under certain cases in the U.S. (e.g. Dastar), misrepresentation as to authorship is often not actionable.
Are NFT providers liable for NFT sales that violate the rights of a third party?
Most third parties will not have agreed to the NFT provider’s terms of service, which undoubtedly will disclaim liability for any claims arising from acts of the NFT seller. If the third party’s rights are violated, can they sue the NFT provider? In the US, the NFT provider may have immunity under Section 230 of the Communications Decency Act if the NFT is “information provided by another information content provider”. But is it? The NFT provider actually creates the NFT and provides the functionality. However, the NFT owner is the person creating the information that is stored in the NFT.
Summary and some recommendations
There are more questions than answers today; however, in any NFT sale transaction, at least the following should be closely reviewed:
- The terms of service of the NFT platform provider. A buyer will be agreeing to these terms. They are likely not favorable to the buyer, and also likely not negotiable. As a result, the value of the NFT is highly dependent on the reputation of that platform and the likelihood of it staying in business.
- The terms and conditions of the sale from the seller. Does the seller represent that it has the IP rights to the resource? That the NFT is unique? That the resource will not be resold in a different NFT? Is the seller’s liability or the buyer’s remedy limited?
- Some diligence into the actual art/resource/item should be done – and this may be very difficult. There are no real regulations in this area (outside of general unfair and deceptive consumer protection laws) – so even finding the true author or owner of a work may be very difficult, even in the U.S. where we sort of have a registration system. The less able a buyer is to verify the provenance of the underlying resource, the more strongly worded the representations, warranties and consequences of breach should be. In the worst case, an escrow should be set up so that some post-transaction verification can occur before all, or at least some, of the actual transfer of the cryptocurrency occurs.
For more information, contact Mike Oliver or Kim Grimsley.
[1] – Etherscan defines a wallet as follows: “A wallet address is a publicly available address that allows its owner to receive funds from another party. To access the funds in an address, you must have its private key.” Link