The Privacy Hacker – LexBlog

Addressing Cross-Border Transfers from the EU Following the Schrems II Ruling Tue, 25 Aug 2020 01:10:02 +0000

As we all know, the EU-U.S. Privacy Shield framework, the cross-border transfer mechanism relied upon by over 5,000 U.S. entities until just over a month ago, was recently invalidated by the CJEU in the Schrems II case (see here for our last post following the ruling). So what next?

With Privacy Shield dead and the CJEU reaffirming that truly adequate safeguards must be coupled with the Standard Contractual Clauses (SCCs), organizations must determine how to properly transfer personal data outside of the EEA to non-adequate jurisdictions (including to companies in the U.S.). Although the SCCs and other more limited mechanisms remain valid for transfers out of the EEA per the CJEU ruling, each underlying data transfer must be assessed on a case-by-case basis: first, to determine whether personal data will be transferred outside of the EEA at all, and if so, whether data transferred to a third country not otherwise deemed adequate by the EU can nonetheless be adequately protected under EU data protection law. If the third country and/or the recipient organization cannot provide equivalent safeguards, EU data protection law mandates that the personal data not be transferred. To be clear, the requirement of adequate safeguards was already law in the EU; the CJEU merely re-affirmed it, loudly. In fact, over the years, many companies entered into SCCs as part of data processing agreements without much thought given to whether all of the adequate safeguards were or could be met.

Recent developments post-Schrems II

Following Schrems II, most companies small and large – especially those that had self-certified under Privacy Shield – took a “wait-and-see” approach given the massive confusion that ensued. However, not much guiding clarity has emerged since the ruling – other than some official statements and comments, such as the FAQs issued by the EDPB. All in, no groundbreaking developments, and certainly no uniform guidance on what should come next or how impacted companies might proceed. This may soon change. As of last week, Max Schrems and his privacy watchdog organization, NOYB, shook things up and filed over 100 complaints in 30 EEA countries. The complaints were filed against European companies that continued – post Schrems II – to transfer personal data about their online visitors to Google and Facebook in the U.S. More detailed information about those complaints (including a list of those companies and the individual complaints) can be found here.

In a nutshell, the complaints state that the transfers of personal data by these various companies to Facebook and/or Google are unlawful because they are either (a) still based on an invalidated adequacy decision (i.e., Privacy Shield) or (b) reliant on the Standard Contractual Clauses (SCCs), the use of which is prohibited under the GDPR if the third country to which personal data is transferred does not allow for the same standard of adequate protection as under EU law. As summarized on NOYB’s website, this is because, with respect to U.S. companies, the CJEU found that transfers to recipients that fall under U.S. surveillance laws – namely the Foreign Intelligence Surveillance Act (“FISA”) – violate data subjects’ data protection rights (among others). Because Google and Facebook qualify as electronic communication service providers within the meaning of FISA (50 U.S. Code § 1881(b)(4)) and as such are subject to U.S. intelligence surveillance under FISA, it follows, per NOYB, that transfers of personal data outside of the EU to those recipients are unlawful – regardless of the relied-upon mechanism. What’s more, NOYB’s complaints would presumably require member state supervisory authorities to intervene and stop the transfers if indeed unlawful.

What steps should organizations take with respect to transfers of personal data out of the EEA?

Now that it is clearly time for companies that have put off re-assessing the validity and bases of their cross-border transfers to get busy, what does this mean concretely? First, conducting transfer assessments. Then, once transfers and their bases have been validated, making any necessary adjustments to cross-border transfer agreements and privacy notice(s).

At a high level, these transfer assessments require organizations transferring personal data within the scope of the GDPR to:

  • Determine the third country (or countries) to which personal data is transferred and the basis of such transfer(s) (e.g., SCCs, Privacy Shield, etc.) – keeping in mind that the broad notion of “transfer” also includes access from abroad, and that some transfers may be to related entities within a corporate family as well as to third parties or their sub-contractors;
  • Review the domestic laws of each such third country not deemed adequate by the EU, in order to determine whether those laws enable public authorities to access the personal data of EU (EEA) data subjects;
  • Assess whether the recipient(s) of personal data within such a third country are in fact subject to the domestic laws granting access to public authorities – and whether such access is limited to what is necessary and proportionate;
  • Determine what additional safeguards, if any, might be applied to protect the personal data (e.g., encryption), and whether the domestic law at issue provides effective remedies and redress for data subjects.
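For organizations with many transfers, it can help to capture each assessment in a structured, auditable record. A minimal Python sketch of one possible record format – all field names and the decision logic are illustrative assumptions, not a regulatory template:

```python
from dataclasses import dataclass, field

# Hypothetical structure for documenting a cross-border transfer assessment.
# Field names are illustrative only; they are not drawn from any official form.
@dataclass
class TransferAssessment:
    recipient: str                 # importing entity (affiliate, vendor, sub-processor)
    destination_country: str       # third country receiving (or accessing) the data
    transfer_mechanism: str        # e.g., "SCCs", "BCRs", "derogation"
    includes_remote_access: bool   # "transfer" also covers access from abroad
    government_access_laws: list = field(default_factory=list)
    supplementary_safeguards: list = field(default_factory=list)

    def requires_further_action(self) -> bool:
        # If local law permits public-authority access and no supplementary
        # safeguards are in place, the transfer needs further review.
        return bool(self.government_access_laws) and not self.supplementary_safeguards

assessment = TransferAssessment(
    recipient="ExampleCloud Inc.",
    destination_country="US",
    transfer_mechanism="SCCs",
    includes_remote_access=True,
    government_access_laws=["FISA 702"],
)
print(assessment.requires_further_action())  # True: safeguards still needed
```

Keeping assessments in a structured form like this makes it straightforward to produce the records an auditor or supervisory authority might later request.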

Once these assessments are conducted and properly recorded (in anticipation of a potential future audit), organizations will want to adjust their data processing agreements in order to ensure, where necessary, that transfers to recipients in third countries that do not make the cut are stopped or, if it can feasibly be accomplished, that additional safeguards (e.g., encryption, notice mechanisms, etc.) are put in place. Importantly, organizations that have relied on Privacy Shield must also adjust their privacy notice(s).
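On the technical side, one supplementary safeguard often discussed alongside encryption is pseudonymization: replacing direct identifiers with keyed tokens before the data leaves the EEA, with the key remaining under the exporter's control. A minimal, illustrative Python sketch using only the standard library – the key handling and data fields are assumptions for demonstration, not legal or security advice:

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed token (HMAC-SHA256).

    The key stays with the EU data exporter; the importer receives only
    tokens it cannot reverse without that key.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical example: the exporter tokenizes e-mail addresses before transfer.
eu_held_key = b"example-key-kept-in-the-eea"  # in practice, a managed secret
record = {"email": "jane.doe@example.eu", "purchases": 3}
transferred = {
    "subject_token": pseudonymize(record["email"], eu_held_key),
    "purchases": record["purchases"],
}
print(transferred)  # the importer never sees the raw e-mail address
```

Because the same key and input always yield the same token, the exporter can still link records over time, while the importer (and any authority compelling disclosure from it) sees no direct identifiers.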

Note that all organizations in the supply chain (controller, processor, sub-processor) are impacted here. If your organization is a processor of EU personal data in a third country that is not deemed adequate (again, including the U.S.), your organization must be capable of (a) addressing any controller/exporters’ concerns and (b) ensuring that its own onward transfers to sub-processors provide adequate safeguards, even though the primary responsibility falls upon the controller and/or exporter of personal data to perform assessments before allowing any personal data to be transferred out of the EEA. Processors should address this issue now so that they are prepared when an EU controller reconsiders using service providers in non-adequate countries as a result of the Schrems II decision.

In other words, there is a lot to be done.



Data Security and the New York SHIELD Act: Going Beyond New York Companies Mon, 03 Aug 2020 17:47:10 +0000 With the COVID-19 crisis, many companies that may have traditionally done business only offline are transitioning and expanding into e-commerce. Others are starting new businesses and innovating new technologies and platforms. A multitude of considerations go into these new ventures, an important one of which is security.

For any new or established business, the company must evaluate its existing security procedures and policies against both legal requirements and best practices. While many state laws include a rather vague obligation to maintain “reasonable security measures” without a clear definition, some state laws go into more detail as to what security measures are required. In 2010, Massachusetts was the first to set forth more specific security requirements for businesses that maintain electronic data on Massachusetts residents, with its Standards for the Protection of Personal Information of Residents of the Commonwealth. In short, the Massachusetts regulation requires: user authentication; access control measures; encryption; system monitoring; firewalls and security patching; anti-malware and anti-virus mechanisms; and employee training.

Prior to the Massachusetts law going into effect, it was seen as groundbreaking, but language requiring the security measures to be implemented only if “technically feasible” provided a lot of wiggle room, and Massachusetts simply was not a focus for many companies. Now fully effective as of this past March, the New York SHIELD Act has even more detailed data security requirements, which apply to a broader set of companies (even those not located in New York).

The New York SHIELD Act prioritizes the safeguarding and security of the personal information of New York residents. Unlike the California Consumer Privacy Act (“CCPA”), which carries security requirements but does not spell them out specifically, the SHIELD Act’s focus is security, and it provides ample detail on the steps that must be taken in order to comply.

The SHIELD Act’s obligations apply to “[a]ny person or business which owns or licenses computerized data which includes private information” of a resident of New York. This definition is similar to the security breach statutes of most states, which apply to businesses regardless of location if they retain any electronic personal data. Thus, the SHIELD Act reaches businesses that are not located in New York.

The SHIELD Act uses the term “private information” to refer to the key data elements to be protected under the statute; “private information” is a subset of “personal information.”

  • Personal information is any information concerning a natural person which, because of name, number, personal mark, or other identifier, can be used to identify such natural person.
  • Private information is defined as follows (again, this is very similar to the definitions of data subject to security breach laws in other states): personal information consisting of any information in combination with any one or more of the following data elements, when either the data element or the combination of personal information plus the data element is not encrypted, or is encrypted with an encryption key that has also been accessed or acquired:
    • social security number;
    • driver’s license number or non-driver identification card number;
    • account number, credit or debit card number, in combination with any required security code, access code, password or other information that would permit access to an individual’s financial account;
    • account number, credit or debit card number, if circumstances exist wherein such number could be used to access an individual’s financial account without additional identifying information, security code, access code, or password;
    • biometric information, meaning data generated by electronic measurements of an individual’s unique physical characteristics, such as a fingerprint, voice print, or retina or iris image, or other unique physical or digital representation of biometric data used to authenticate or ascertain the individual’s identity; or
    • a username or e-mail address in combination with a password or security question and answer that would permit access to an online account.

Under the SHIELD Act, any person or business that owns or licenses computerized data that includes private information of a resident of New York is required to develop, implement and maintain reasonable safeguards to protect the security, confidentiality and integrity of the private information including, but not limited to, disposal of data. A person or business is deemed to be in compliance if it implements a data security program that includes the following:

(A) Reasonable Administrative Safeguards by which the person or business:

(1) designates one or more employees to coordinate the security program;

(2) identifies reasonably foreseeable internal and external risks;

(3) assesses the sufficiency of safeguards in place to control the identified risks;

(4) trains and manages employees in the security program practices and procedures;

(5) selects service providers capable of maintaining appropriate safeguards, and requires those safeguards by contract; and

(6) adjusts the security program in light of business changes or new circumstances.

(B) Reasonable Technical Safeguards by which the person or business:

(1) assesses risks in network and software design;

(2) assesses risks in information processing, transmission and storage;

(3) detects, prevents and responds to attacks or system failures; and

(4) regularly tests and monitors the effectiveness of key controls, systems and procedures.


(C) Reasonable Physical Safeguards by which the person or business:

(1) assesses risks of information storage and disposal;

(2) detects, prevents and responds to intrusions;

(3) protects against unauthorized access to or use of private information during or after the collection, transportation and destruction or disposal of the information; and

(4) disposes of private information within a reasonable amount of time after it is no longer needed for business purposes by erasing electronic media so that the information cannot be read or reconstructed.

*NOTE that a person or business is also deemed compliant if it is a “compliant regulated entity”, meaning that it is subject to, and in compliance with, certain specified data security requirements, including under GLBA, HIPAA or other enumerated regulations.

Under the SHIELD Act, certain “small businesses” may adjust their data security obligations based on certain factors. A “small business” must still adopt reasonable administrative, technical, and physical safeguards; however, those safeguards can be adjusted according to:

  • The size and complexity of the small business;
  • The nature and scope of the small business’s activities; and
  • The sensitivity of the personal information the small business collects from or about consumers.

However, there are no exceptions for small businesses in the breach notification rule. A small business that experiences a data breach affecting the private information of New York residents must notify the affected persons as would any business.

Businesses that fail to comply with the SHIELD Act’s security requirements face civil penalties of up to $5,000 per violation, with no cap on the total. By contrast, penalties for failing to notify affected persons of a breach are capped at $250,000. The SHIELD Act does not include even a limited private right of action like the CCPA’s. However, unlike the CCPA, which applies only to businesses that meet certain thresholds, all businesses that own or license computerized data that includes “private information” of a New York resident must implement the administrative, technical, and physical safeguards spelled out in the SHIELD Act. Security should always be a top priority, no matter whose personal information is collected, and given the various state security breach and protection laws, care should be taken to comply with all applicable laws, with an emphasis on the most prescriptive and comprehensive ones.


CCPA Enforcement: What to Expect Next Mon, 27 Jul 2020 21:24:11 +0000 During a recent keynote presentation with the IAPP following the July 1 enforcement deadline of the CCPA, Stacey Schesser, Supervising Deputy Attorney General for the State of California (“Deputy AG”), provided a bit of a roadmap for CCPA enforcement actions from the California Attorney General (“AG”) that are both currently underway and expected in the near future.

A first round of notice letters was sent on July 1, 2020 to businesses; the list has not been made public and is unlikely to be provided before the end of the 30-day cure period running from the date of the initial notice letters. However, what we were able to gather from the Deputy AG’s chat is that businesses operating online were principally targeted, across multiple industries, and the notices were generally based on failures to either (a) provide key disclosures required by the CCPA or (b) include the “Do Not Sell My Personal Information” link where the AG deemed it necessary. Targeted businesses were identified through the AG’s review of online policies, but interestingly, others were identified from customer complaints on social media sites such as Twitter. As expected, it is clear that the sale of personal information – as defined under the CCPA, with all of its ambiguities – is going to be one of the main issues and key enforcement points. However, until the CCPA Regulations have been reviewed and approved by the Office of Administrative Law, some of the ambiguities are unlikely to be resolved immediately.

As to future enforcement, we already knew from a statement made by the AG a few months ago that the personal information of children would be high on the priority list, along with sensitive information such as health and financial information. However, the Deputy AG also noted that the interplay of CCPA with California Unfair Competition Law and other laws would also be on the list, as well as repeated consumer complaints including those contained in class actions filed by individuals.

The key takeaway is that enforcement is already underway, albeit “quietly”, and businesses that have put off compliance – or failed to properly comply – may soon find themselves the target of the AG. In addition to ensuring compliance with the basic required disclosures under the CCPA, businesses should ensure that they have performed a complete assessment of any disclosures of personal information to determine whether any such disclosures constitute a sale.

As a reminder, a business is deemed to be in violation of the CCPA if it fails to cure any alleged violation within 30 days of being notified of alleged noncompliance. Any business, service provider, or other person that violates the CCPA is subject to potential injunctions and civil penalties of up to $2,500 for each violation or $7,500 for each intentional violation. Coupled with the potential class actions that may come with security breaches under the CCPA, failure to comply can be costly.

Schrems II: EU Personal Data Transfers to the U.S. and the Invalidation of the Privacy Shield Mon, 20 Jul 2020 14:23:46 +0000 Despite three annual reviews by European Union Commissioners, the Court of Justice of the European Union (CJEU) invalidated the Privacy Shield on July 16 and called into question many transfers of personal data pursuant to the Standard Contractual Clauses. At stake are transfers of EU personal data to the thousands of U.S. companies that rely on personal data transferred from the EU. The case is colloquially known as “Schrems II,” as it is the second case involving Maximillian Schrems (Case C-311/18, Data Protection Commissioner v Facebook Ireland and Maximillian Schrems). Mr. Schrems’ first case resulted in the invalidation of the EU-U.S. Safe Harbor, the Privacy Shield’s predecessor, in 2015.

The CJEU’s rationale in Schrems II hinges primarily on U.S. law enforcement access to EU personal data (i.e., via the Foreign Intelligence Surveillance Act) and EU individuals’ lack of meaningful rights of redress against U.S. authorities. The CJEU also raised the lack of independence of the Privacy Shield Ombudsman, who is appointed by the U.S. Secretary of State (and could be removed at any time) and is not vested with any power to enforce decisions against U.S. intelligence agencies. That the U.S. did not appoint a permanent Privacy Shield Ombudsman until a couple of years after the Privacy Shield came into effect likely did not lend credence to the notion that the U.S. takes privacy seriously.

As to the Standard Contractual Clauses (“SCCs”) promulgated by the European Union, the CJEU found the SCCs to be valid in principle, subject to various factors and conditions. Specifically, the court reiterated that the safeguards to be taken by the controller or processor must be capable of ensuring that data subjects whose personal data are transferred to a third country pursuant to the SCCs are afforded, as in the context of a transfer based on an adequacy decision, “a level of protection essentially equivalent to that which is guaranteed” within the EU. A recipient of EU personal data outside of the EU must therefore inform the data exporter of any circumstance that would prevent it from complying with the SCCs, and in such cases, as highlighted by the CJEU, the data exporter must suspend the transfer. The CJEU also called out the role of the relevant supervisory authority in preventing such transfers, which, practically speaking, is not only unrealistic but likely to lead to a fragmented application of the GDPR. Companies are cautioned not to adopt a “sign it and forget it” approach with SCCs – which is too often the case in practice. As a result, data exporters in the EU must carefully consider and document the laws of the data importer’s country (e.g., the U.S.), along with the type of personal data at issue and the importing company’s history of being subject to U.S. national security requests for data. Certain types of personal data and industries are subject to such requests at higher rates than others. Depending on the case-by-case analysis, if the SCCs are used, additional terms may need to be added. EU exporters will also likely request logs or information as to the importer’s history in relation to U.S. national security requests.

U.S. companies that relied on the Privacy Shield and/or the SCCs must review their data processing contracts and determine how to move forward. At the core of this ruling is the conflict between the U.S. government’s inherent ability to request and access personal data of EU data subjects and the protections of the GDPR – and this conflict will not simply “vanish” when companies switch from relying on Privacy Shield to a different transfer mechanism such as the SCCs. It is also important to note that the court’s analysis applies to any third country or territory outside of the EU that is not deemed adequate. For some companies with complex transfers, this will require quite a bit of renewed data mapping and legal analysis. Aside from beefing up SCCs based on an analysis of the validity of the transfer, derogations or exceptions must be considered. However, derogations are intended for one-off transfers, and so the decision leaves wide gaps for businesses that constantly and consistently transfer data. Other options, for companies that have the resources, include re-locating all personal data to the EU and ensuring that data is only accessed from the EU (lest we forget what constitutes a restricted transfer), or entering into Binding Corporate Rules for multinational corporations, international organizations, and groups of companies making intra-organizational transfers of personal data across borders.

We expect that EU regulators and/or the European Data Protection Board (EDPB) will offer more guidance in due time as to how to strengthen the SCCs. The SCCs have not yet been updated to account for the GDPR, and we hope this ruling will speed up the Commission’s ongoing work on a new, GDPR-era set of SCCs. Thus, it appears likely that companies will end up entering into enhanced SCCs and then have to update them yet again once the EU issues the new set. We also expect to see Data Protection Authorities in the EU indicate those countries where importers should not rely on the SCCs. Indeed, just a day after the Privacy Shield decision, Berlin’s Data Protection Commissioner issued a statement effectively stating that EU controllers should not transfer any data to the U.S. under the SCCs and that data should remain localized in the EU.

It should be noted that the Department of Commerce has stated that U.S. companies will still be required to continue to treat personal data collected under the Privacy Shield in accordance with the Privacy Shield Principles and continue to follow such principles.





CCPA and Web Accessibility Mon, 13 Jul 2020 17:04:48 +0000 The California Attorney General’s final proposed regulations under CCPA (“Regulations”) have been submitted, and pending approval by the California Office of Administrative Law, will soon become enforceable by law. One often overlooked requirement of the CCPA is the obligation of covered businesses to provide notices that are “reasonably accessible.” All drafts of the Regulations have provided more detail about the accessibility requirement contained in the CCPA, and the final Regulations make clear that for notices provided online, businesses must follow generally recognized industry standards, such as the Web Content Accessibility Guidelines, version 2.1 (WCAG) from the World Wide Web Consortium. While companies have largely focused on updating the language or substance of their notices to comply with CCPA, this requirement as to form has, by and large, slipped through the cracks, but is certain to generate some discussion (if not litigation) in coming months.

By way of background, the Americans with Disabilities Act (ADA) requires, among other things, that places of “public accommodation” remove barriers to access for individuals with disabilities. While this has long been considered the rule for physical establishments, including privately-owned, leased or operated facilities like hotels, restaurants, retail merchants, health clubs, sports stadiums, movie theaters, and so on, virtual accessibility has been much less consistent, and generally the exception rather than the norm. In fact, web accessibility hardly ever appears on businesses’ radars, due perhaps to a very short-sighted perception of what, in fact, qualifies as a disability as well as a lack of overall guidance.

Web accessibility means ensuring that websites, mobile applications, and other virtual platforms can be used by everyone, including those with disabilities, such as impaired vision. However, what exactly is required is a source of confusion. In 2019, the Department of Justice (DOJ), which is responsible for establishing regulations pursuant to the ADA, withdrew regulations that had been drafted for website accessibility, and has yet to promulgate replacements. This has left courts with the task of determining how and to what extent web accessibility is required under the ADA when it comes to businesses that offer goods and services online, with varying results.

A number of courts have sided with individuals filing claims against companies that do not offer web accessibility, and this is an area of increasing litigation. By and large, lower courts have found that the statute applies, although there is quite a bit of disagreement as to when and how the ADA applies. One recent case, Robles v. Domino’s Pizza, LLC, provides additional insight. The Robles case involved a claim filed by a blind man who was unable to access Domino’s online services (both web and mobile app) using special screen-reading software for the visually impaired. According to the facts, on at least two occasions the plaintiff had unsuccessfully attempted to order a customized pizza online from a nearby Domino’s because, the plaintiff contended, Domino’s had failed to design its website and app so that his software could read them. The Court of Appeals for the Ninth Circuit ruled in favor of the plaintiff, reversing the District Court, which had dismissed the case on the narrow basis that the DOJ’s failure to promulgate ADA-related guidance and regulations for web accessibility violated Domino’s due process rights. In particular, the Court of Appeals stated that “[w]hile we understand why Domino’s wants DOJ to issue specific guidelines for website and app accessibility, the Constitution only requires that Domino’s receive fair notice of its legal duties, not a blueprint for compliance with its statutory obligations.” The Supreme Court subsequently denied Domino’s petition for review of the Ninth Circuit’s decision, despite major efforts (backed by multiple retailers and associations) to persuade the Court to intervene.

Importantly, web accessibility lawsuits have significantly increased in recent years, and will continue to because of and despite the lack of guidance from the DOJ: 2,250 federal suits asserting ADA violations based on website or mobile application inaccessibility were filed in 2018, nearly triple the number from the year before, according to Domino’s brief. This will be further compounded by the fact that many businesses now only conduct their operations online.

Turning back to the CCPA and the accessibility requirements, the Regulations do in fact appear to have provided some blueprint for compliance, and covered businesses will need to ensure that their notices and all methods by which consumers can exercise their rights are accessible. Not doing so could have some important consequences. Yes, the consumer private right of action under CCPA is currently limited to certain security incidents, which restricts litigation regarding the accessibility of CCPA notices alone. Nevertheless, the fact that such a high-profile law as the CCPA specifically calls out accessibility and provides the means by which compliance may be achieved is likely to generate more discussions on accessibility. This in turn may lead to greater scrutiny of the accessibility of websites AND the privacy notices and legal terms that they contain. Whether the CA Attorney General makes this a priority now that enforcement has begun remains to be seen.

However, one can imagine a number of scenarios where the lack of accessibility may indirectly play out, notably in the event of a dispute. One such scenario could involve the reliance on arbitration clauses typically buried in online terms of use to avert a class action under CCPA in the event of a data breach. It’s no secret that online terms of use (which incorporate by reference a business’ privacy notices) are extremely one-sided against consumers, and many are not accessible. If a court were to determine that a company’s terms of use are not accessible to individuals with disabilities, the validity of the clause may be called into question – and a costly class action under CCPA in the event of a data breach would be able to proceed.

Businesses can refer to generally recognized industry standards, such as the WCAG (version 2.1) available here. These provide an overview of the private industry standards for website accessibility developed by technology and accessibility experts. The WCAG 2.0 have been widely adopted, including by federal agencies, which conform their public-facing, electronic content to WCAG 2.0.  Aside from the fact that the web should be all-inclusive, business owners – and particularly those whose offerings are considered to be made available to the general public – should take steps to ensure that their websites and mobile apps, including all of their legal disclaimers, notices and privacy-focused terms, are accessible to users with disabilities. There is little doubt that we can expect to see an increase in ADA website litigation, and the CCPA may be just the catalyst for this.


Apple’s iOS 14 Transformative Privacy Announcements Mon, 06 Jul 2020 14:31:39 +0000 At the Worldwide Developers Conference on June 23, Apple announced an assortment of new privacy features – some quite significant for developers – that will be included as part of iOS 14. The new privacy features include added protections against user tracking in apps and on websites, as well as transparency measures to prevent apps from using cameras or microphones without a user’s knowledge. How location data is collected will also be impacted: iOS already enables users to block specific apps from collecting data about their location, but with iOS 14 users will be able to share only their approximate location rather than their precise location.

One very significant change is that app developers will now be required to disclose the types of data that their app collects, and importantly, call out specific information that could be used to track users across platforms. Inspired by nutrition labels that are typically affixed to food products, these new disclosure mandates from Apple will require developers to complete a specific form (showcased at the Worldwide Developers Conference). When users search for an app, the summary of collected data will appear alongside other information about the app.

This “privacy label” concept is certainly innovative, although – unsurprisingly – also a clear signal that the ubiquitous privacy policy is really not doing its job of accurately informing users of all data processing activities in an app or on a website. Nothing new here – I say unsurprisingly because many privacy notices are indeed bloated, convoluted, and time-consuming to read, while still not being clear enough about tracking. To be fair, however, the fact that many companies must comply with different laws around the world has done nothing to make privacy policies shorter and more to the point, despite these very same laws mandating plain language and accessibility. On the other end of the spectrum, here in the United States, the patchwork of state, federal, and sectoral rules (unlike the “uniformity” of the GDPR, for instance) has not been very conducive to companies prioritizing privacy, except in certain sectors such as healthcare or under certain state laws such as the CCPA, which only applies to certain businesses. In many ways, with these new requirements, Apple is forcing all developers who want to use its platform to take stock of their data collection and tracking, clearly disclose their practices, and comply with new user controls – where legislation has failed to impose clear, global standards.

Note also that for iOS 14, developers will be required to obtain users’ permission through Apple’s AppTrackingTransparency framework in order to track them or access their device’s advertising identifier. Apple provides examples of what it considers tracking, including:

  • displaying targeted advertisements in the app based on user data collected from apps and websites owned by other companies;
  • sharing device location data or email lists with a data broker;
  • sharing advertising IDs or other IDs with a third-party advertising network that uses that information to retarget those users in other developers’ apps or to find similar users;
  • placing a third-party SDK that combines user data from other apps to target advertising or measure advertising efficiency; or
  • using an analytics SDK that re-purposes the data it collects from the app to enable targeted advertising in other developers’ apps.

No doubt, these requirements will have some major implications for the many players in the adtech ecosystem, as I and others discussed with Ronan Shields in a recent article he wrote for Adweek.
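To make this concrete, here is a minimal sketch of how a developer might request the new permission on iOS 14. The framework and API names below come from Apple’s AppTrackingTransparency and AdSupport frameworks; the usage-description string and function name are illustrative placeholders:

```swift
import AppTrackingTransparency
import AdSupport

// Note: the app's Info.plist must also include the
// NSUserTrackingUsageDescription key with a user-facing explanation, e.g.
// "We use this identifier to deliver more relevant ads."

func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User consented; the advertising identifier (IDFA) is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted:
            // User declined or tracking is restricted; the IDFA is zeroed out.
            print("Tracking not permitted")
        case .notDetermined:
            // The user has not yet been prompted.
            print("Authorization not determined")
        @unknown default:
            break
        }
    }
}
```

Absent the user’s affirmative consent through this prompt, the advertising identifier returns all zeros – which is precisely why these changes carry such weight for the adtech ecosystem.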

So what does this all mean for developers? Apple’s latest privacy move spells the beginning of a “pay-to-play” privacy era of sorts: being transparent about data collection and use can no longer be an afterthought aimed simply at avoiding fines or lawsuits, but is now a precondition to using the very platform without which some developers’ products might not exist. This may be a challenge for many companies, even if it really shouldn’t be. Tracking, in particular, is an area where companies often fail to comprehend the extent to which their own platforms – and more importantly their partners – actually track users. In fact, developers often don’t fully understand how much tracking they are allowing when using third parties (e.g., advertising or analytics SDKs). Apple’s changes will now require developers who have not made privacy a priority, or have dismissed it as a mere risk-based compliance exercise, to really think about all of this. For developers who have not included a privacy policy in their app, as Apple has always required, it’s also time to re-evaluate how data is collected and used.

Many in the privacy world had predicted that privacy regulation would come in the form of self-regulation by private companies in the absence of global laws and proper enforcement, and despite many hits and misses, this is one instance where the private sector may perhaps do what legislation has not done: require companies to be more transparent while offering their users choices.

For more information on iOS 14’s upcoming disclosure requirements:


The California Privacy Rights Act: CCPA Part Two Tue, 23 Jun 2020 21:32:10 +0000 As if businesses did not already have enough to address with the COVID-19 pandemic and compliance with the California Consumer Privacy Act (the “CCPA”), businesses need to consider the California Privacy Rights Act (the “CPRA”), which will almost certainly be on the November ballot. Structured as an amendment to the CCPA and also known as “CCPA 2.0”, the CPRA ballot initiative was spearheaded by Alastair Mactaggart. You may recall Mr. Mactaggart as the real estate developer who submitted a ballot initiative that resulted in a negotiation with the state legislature to replace the initiative with the CCPA. If the CPRA is passed and becomes law, it would be effective and enforceable January 1, 2023, with certain provisions applying on a look-back basis.

The CPRA would establish a new category of “sensitive data” that is reminiscent of the GDPR’s special categories of data, but much broader. The definition is overly inclusive, spanning from race, religion, and sexual orientation to financial account information and government identifiers (e.g., social security numbers). Consumers could choose to limit the use, sale, and sharing of their sensitive data. Businesses may be required to add a “Limit the Use of My Sensitive Personal Information” link to their websites, in addition to the “Do Not Sell My Personal Information” link that some businesses must already include under the CCPA.

Service providers would become directly liable for complying with certain portions of the CPRA. In particular, service providers would be required to cooperate with businesses in honoring consumer rights and to agree to maintain the same level of privacy as required by the CPRA and the CCPA.

The CPRA also adds a definition of “share” to expressly address lingering confusion over “sales” of personal information under the CCPA – and to ward off further arguments that sharing personal information for behavioral advertising in the adtech space is not a “sale” under the CCPA. Setting the record straight once and for all where the CCPA and implementing regulations have failed, “sharing” would include any provision or transfer to a third party for “cross-context behavioral advertising, whether or not for monetary or other valuable consideration.” Consumers would have rights to opt-out of having their information shared or sold and the “Do Not Sell My Personal Information” link would be expanded to “Do Not Sell or Share My Personal Information.”

Security mandates are also increased under the CPRA, which enlarges the scope of data security breaches to include unauthorized access to or disclosure of an email address and password or security question that would allow access to an account. If the breach were attributable to the business not maintaining reasonable security measures, consumers could bring private rights of action for these breaches. Businesses that conduct higher risk processing of certain personal information would be required to undergo annual audits, thereby putting more teeth in the security requirements. There would also be heightened fines for violations involving children under age 16.

Notably, and also similar to the GDPR, a new data protection agency would be created to take some of the enforcement load off the Attorney General. In addition to enforcing the CCPA and the CPRA, the agency would be authorized to prepare rules and regulations. The agency would be able to issue subpoenas and would have audit powers, as well as the right to impose regulatory fines.

Given today’s pro-privacy environment (especially in California), as businesses continue to navigate the CCPA, they would be well served to also consider the CPRA with an eye toward the not-too-distant future.

The Trademark Fork in the Open Source Road Wed, 27 May 2020 17:45:46 +0000 When open source developers call us asking to confirm that they can use the trademark or name of an open source project for their newly forked project, they do not get the black-and-white answer “Yes” that they desire but rather the grey-area lawyer response – “It depends on what you propose to use, how you propose to use it, the license, and whether there is a naming or trademark policy.” My partner, Gail Hashimoto, and I recently authored a client alert discussing the issues in detail and how to work around them. Returning to Work: CCPA Considerations Thu, 21 May 2020 22:02:37 +0000 As cities and states gradually open up, companies have begun to assess under what circumstances they can re-open the workplace – and in particular, what health-related personal information can and should be collected. When it comes to monitoring employees, generally speaking, privacy and employment law are increasingly overlapping as more stringent laws are adopted, and COVID-19 has brought this overlap to the forefront. Our employment team at Hopkins & Carley has provided a number of resources and webinars on the employment-related issues of COVID-19 and what can and cannot be done (available here). Here we will focus on the intertwined privacy implications of allowing individuals – employees and non-employees – back into offices and facilities, particularly with respect to the California Consumer Privacy Act (CCPA).

What are the CCPA’s notice requirements?

The CCPA has been in effect since January 1, 2020, and applies to many businesses across all industries, from tech companies to traditional brick and mortar retailers. To find out if your business is subject to CCPA, please see our prior post available here. Businesses that are subject to the CCPA have certain notice-related obligations to fulfill where they collect and retain certain health-related information. If a covered business measures the temperatures of employees, or otherwise assesses health-related symptoms prior to entry into a facility or office, and collects and retains this information, it must provide notice at the time of collection. These practices will no doubt also apply to visitors and guests who enter the premises. In each case, a notice of collection must be provided. However, depending on whether the individual is an employee or a visitor, this notice (and the related rights) will differ.

  • Employees. In 2019, while the CCPA was still in the legislative process, it was amended on several fronts. In particular, Amendment AB 25 provided some limitations on employee-related information by carving out from certain obligations under the CCPA any information collected “about a natural person acting as a job applicant to, an employee of, owner of, director of, officer of, medical staff member of, or contractor of [a covered] business” but only to the extent that the covered business’s collection and use of the information is solely within the employment context. What does this mean? A business’s compliance obligations with respect to employment-related information are limited because employees and job applicants may not exercise their rights to know, to request deletion, and to opt out of sales. However, the business must still provide notice of collection of the different categories of information collected in the employment context. Importantly, this exemption has a one-year moratorium, meaning that unless the exemption is extended or the CCPA is further amended, beginning January 1, 2021, employees and job applicants will be able to exercise their rights to know, to request deletion, and to opt out of sales under the CCPA.
  • Visitors. For visitors, this employee-specific moratorium does not apply, and any visitor whose information is collected will be able to exercise his/her rights under the CCPA. This means that the notice provided at the time of collection must, in addition to disclosing what is collected and why, explain how the visitor may exercise his/her rights to know and to request deletion (the opt-out presumably will not apply unless a company is actually selling the information it collects, and one would hope that this scenario never unfolds).

In practice, we recommend that a covered business prepare two notices for use at or prior to collection – one for visitors and one for personnel. Note that if the information is collected but not retained or otherwise stored (e.g., the temperature is taken but not recorded), no notice is required. We recommend providing the notice at the entrance of each facility or office, as well as obtaining a signature from the employee or visitor acknowledging receipt and ensuring that a copy is provided to each individual. This will be helpful for complying with documentation requirements in the event of an audit. Whoever administers the collection of information should also be trained on the CCPA as required, and if any service providers have access to the information – even if they are not engaged to actively use the data – care must also be taken to ensure that the relevant service provider agreements are in place. Lastly, businesses must consider how long this information should be retained. If there is no legitimate need to retain the information once the pandemic is behind us, it should be securely deleted.

What security measures should businesses consider?

Leaving aside notice obligations (and rights), it’s also important to remember that health information is considered sensitive information under virtually all data protection laws. What this means in the United States (at least) is that an organization that collects and stores this type of identifiable data must ensure that it has commensurate security measures in place. While many states do not have overarching privacy laws like the CCPA, all have “security incident” rules that trigger data breach notifications to regulators and/or affected individuals where the unauthorized access or loss concerns certain types of personal information, including health information. Moreover, certain laws like the CCPA or the NY SHIELD Act specifically require that preventative measures be taken by organizations that collect and process this type of information. Notably, the CCPA provides for limited consumer actions where a data breach affects health information. Since the CCPA’s effective date, we have seen an uptick in class action lawsuits under this limited right of action. Most of these suits do not actually involve unauthorized access to the types of information singled out by the CCPA, but this increase is a clear indicator that class action litigators have the CCPA on their radars. NY’s SHIELD Act does not carry a private right of action, but fines are sure to be steep where companies are found not to have implemented proper security measures. In sum, the more sensitive the information, the greater the exposure. All companies should take steps to ensure that their security measures reasonably protect the personal information that they collect. But where health information is collected and retained, the risks increase, and what may appear as reasonable security measures for the collection of a name and an IP address, for instance, may not suffice. The risk is all the greater for companies subject to the CCPA.


When it comes to any personal information, compliance with the various laws that may apply requires a comprehensive review of outward-facing and internal policies, both as to transparency and security. Ensuring that notices are provided at the time of collection, maintaining adequate security measures, and evaluating data retention procedures are some of the key components of protecting data. This also requires businesses to monitor service providers that may have access to personal information. When it comes to sensitive health information, these steps are all the more important in order to ensure that individuals understand what is being collected and why it is collected, who may have access to it, and of course that the data is properly secured.

Privacy Issues in Bankruptcy Sales Fri, 15 May 2020 22:21:58 +0000 I recently co-wrote the following client alert with one of my colleagues, Monique Jewett-Brewster. Monique advises creditors, commercial landlords and tenants, and asset purchasers in business bankruptcies and in all other aspects of insolvency law.

As we move closer to a global recession caused by the current pandemic, some companies will find themselves in the unfortunate position of having to seek bankruptcy relief. This may have some important and often overlooked privacy implications. There is no question that in this day and age, one of a business’ most valuable assets is the personal information that it has collected from its customers and/or end-users – often more so than any of its tangible assets. Increasingly, as business shifts online, this is true not only of technology companies but also of “brick and mortar” companies.

However, when a business becomes a debtor, the sale of personal information can be problematic. Section 363(b) of the US Bankruptcy Code provides that a debtor that has a privacy notice prohibiting the transfer of personally identifiable information (“personal information”) may not use, sell or lease such information other than in the ordinary course of business unless (1) the use, sale or lease is consistent with the terms of the privacy notice or (2) after the appointment of a consumer privacy ombudsman (“CPO”) the court finds, after giving due consideration to the facts, circumstances, and conditions, that the sale or lease would not violate applicable non-bankruptcy law. These restrictions only apply if the debtor disclosed to its customers a privacy notice prohibiting the transfer of personal information to persons not affiliated with the debtor and the policy was in effect on the date of the bankruptcy filing.

While there are some existing decisions discussing these consumer privacy protections, because privacy has only recently become such a “hot topic” – certainly as compared to the last big round of corporate bankruptcies – these restrictions are now much more likely to come up in the context of bankruptcy sales. Organizations contemplating bankruptcy proceedings should, therefore, be aware of these restrictions in connection with a Section 363(b) sale motion. Although most technology companies, accustomed to doing business online, have been more proactive about ensuring that they have privacy notices and the proper disclosures posted to their websites, this is often not the case with “brick and mortar” businesses, such as retailers and restaurant groups. Additionally, while many privacy notices do already address sales of information, including in bankruptcy, how the sale language is specifically worded, the nature of the purchaser, and, importantly, the number of prior versions of the privacy notice could become impediments to effortlessly transferring users’ personal information. One question likely to be debated – and an argument likely to be made by consumer privacy advocates – is whether failing to disclose that personal information may be sold in a bankruptcy sale, or otherwise remaining silent on the issue, is tantamount to a prohibition on the transfer of personal information under those circumstances, even absent an outright preclusion.

Indeed, broad disclosures typically made in privacy notices will certainly garner much attention. In a not-so-recent yet very relevant 2010 case involving a debtor magazine, the bankruptcy trustee sought to transfer the debtor’s subscribers’ personal information that had been collected over the course of 11 years. However, the FTC became aware of the negotiations and notified the parties that any sale, transfer, or use of the information would potentially violate the Federal Trade Commission Act’s prohibition against unfair or deceptive acts or practices. In sum, because the debtor’s “simple, explicit, and clear” privacy notice advised subscribers that their personal information “would not be sold, shared, or given away to anybody” – even stating upon sign-up “[p]lease note our amazing privacy policy” – the sale or transfer of the data would potentially constitute an unfair or deceptive trade practice by the debtor. You can read the FTC’s full letter here. Since then, the FTC has stepped up its review of privacy practices outside of the bankruptcy context for unfair or deceptive acts or practices under Section 5 of the FTC Act, especially as companies continue to make overbroad promises (e.g., “we will never sell your information”) or deceptive statements when it comes to their privacy practices – certainly in an attempt to reassure consumers who have become increasingly savvy about their privacy rights. As such, one can only imagine that in this next round of bankruptcies, consumer information is unlikely to go “unnoticed” in the event of a sale.

With respect to how the issue is handled in bankruptcy court, outcomes will differ. A bankruptcy court may approve a debtor’s proposed sale of personal information even where its privacy notice prohibits the transfer of personal information. In the event that a CPO is appointed in the bankruptcy case, the CPO’s task will be to educate the court about the personal information and the privacy implications. As an initial step, the CPO will identify personal information in the debtor’s custody or control and where it is located. As we all know, personal information may be stored in several locations, including company devices, personal devices, cloud-based platforms and on-premise servers, which means the CPO must take stock of all personal information collected and where it is stored. The CPO will then advise the court about the debtor’s privacy notice, evaluate the sensitivity of the personal information, and possibly recommend limits on the sale to protect consumer privacy. In some cases, a debtor may avoid the need for the appointment of a CPO if the proposed buyer is materially in the same line of business as the debtor and agrees to use the personal information for the same purpose specified in the debtor’s privacy notice – as well as comply with that privacy notice. No matter the outcome, the appointment of a CPO, if required, can substantially delay the process and possibly reduce the value of a debtor’s assets.

Leaving aside the US Bankruptcy Code, specific privacy laws such as the GDPR may further impact the conditions under which personal information may be transferred to a purchaser. This could further impact sales of personal information where the company is handling the personal data of individuals located in the European Economic Area. As such, it’s important to review existing privacy notices and assess applicable data protection laws prior to filing, particularly if customer personal information is your organization’s most prized asset.

A Spectrum of Issues in the Time of COVID-19 Wed, 08 Apr 2020 22:47:21 +0000 While this post may not fit under the header of the “Privacy Hacker”, I wanted to step aside from privacy and security and share some insight on common issues and topics with which we are assisting clients during this unprecedented time.

Contract Interpretation and Updates

Clients are seeking our guidance on contract interpretation, including the ability to terminate contracts. With supply chain disruptions flowing through the entire chain, force majeure clauses are now being closely scrutinized. Depending on the law that governs the contract, force majeure events may or may not excuse performance: the analysis hinges on whether the event causing the failure to perform was foreseeable and whether performance is truly impossible (as opposed to merely more difficult or expensive). Notably, an inability to pay is generally not considered a breach that can be excused due to a force majeure event.

While some force majeure clauses are written broadly and refer to “events beyond the reasonable control” of a party to the contract, other clauses refer to an enumerated list of events. It is questionable how courts will interpret the broad clauses and whether or not a pandemic or quarantine will be read into the clause if there is only a broad catch-all statement. Again, this will depend on the law that governs the contract, a fact-specific analysis and of course how the language actually reads. Even when a contract lacks a force majeure clause, common law defenses to performance, such as impossibility, impracticability and frustration of purpose, should be considered. Without question, this is an intertwined, case-by-case analysis.

In any event, for new contracts, we are ensuring that pandemics and epidemics are added to force majeure clauses. One can also add quarantines to the list. When drafting a clause, the language should cover both delays as well as the failure to completely perform. Care should be taken to specify how a party is to notify the other party of a delay or failure due to a force majeure event. If the clause refers to a notice provision in the agreement, you will want to ensure that the notice provision is accurate and reflects reality, and this would, for instance, include removing antiquated facsimile notices.

Additionally, when negotiating new agreements – particularly any supply agreements that require a customer to purchase all of its products or parts from the vendor – the customer should secure the right to purchase those products and parts from a second source. Such a clause serves as a mitigation measure should a catastrophic force majeure event prevent the vendor from supplying its products to the customer.

For online businesses, now is a good time to refresh your standard online contracts and terms in general. With recent cases providing further guidance on the formation and validity of online contracts, businesses should also evaluate the method (including placement) and language employed when requiring users to agree to their terms of use/service and other contract terms. To turn back to privacy for a moment, whether your company’s terms are enforceable may also impact how some weaker aspects of your privacy and security practices are enforced against you.

Employment and Privacy

Our employment group has been extremely busy answering client questions on managing this dynamically changing situation. There are many employment issues surrounding this global pandemic in the workplace, and you can see their posts here for reference.

Of course, privacy and security are extremely important right now – not just with the transition to WFH and the security issues that this creates – but because regulators continue to enforce and lawsuits continue to be filed. If Zoom is any indication of this, now more than ever, assessing where your company stands with privacy and security should be a key priority. Our next post will follow shortly.

Corporate, Financing and M&A

In the past week, our corporate group has been advising clients on the CARES Act, including the paycheck protection program. See here for that post.

Venture-funded companies can obviously expect the board and investors to take a much closer look at the company’s financials. Investors will want to better manage burn rates and implement short- and long-term cost-saving solutions. Among the highest costs for many organizations are outside counsel fees, and many clients have already begun to engage medium-sized law firms, such as Hopkins & Carley. With lawyers of the same (or higher) caliber, lower billing rates, and streamlined teams, mid-sized law firms offer more bang for the buck; companies would be well poised to reevaluate whether they wish to continue with the big-law billing rates and practices that they were willing to put up with in better economic times. For a sense of how much of a difference this can make, first-year associates at some of the largest firms are billed out at higher hourly rates than experienced senior associates and, in some cases, partners of mid-sized firms – many of whom have their own big-law experience.

On the M&A side, looking at the statistics from the Institute of Mergers, Acquisitions and Alliances and elsewhere, we see a trend where M&A takes a sharp downshift in the year of the downturn (e.g. 2001, 2008 and 2020) and then creeps up rather swiftly the following year. With COVID-19 being such an unprecedented triggering event for a down economy, and with some industries at a complete standstill for months to come, it remains to be seen what the effect will be on M&A.

Until next time, be well.

Short and Long Term Privacy Considerations to Navigate Our New Reality Mon, 30 Mar 2020 17:18:54 +0000 As businesses struggle to navigate the new reality created by COVID-19, there are a few things to keep in mind, both in the short and long term, when it comes to privacy and security.

Security & WFH.

With employees working remotely, now more than ever organizations are at risk of cybersecurity incidents. Malicious players will seek to exploit increased vulnerabilities in this age of WFH, and with IT teams scrambling to ensure that all of their employees can connect remotely and remain productive, some of the most obvious risks should not be overlooked:

  • A large number of organizations had not anticipated the need for laptops or other devices for ALL of their employees. As such, many workers across the country are now using their personal devices to perform their jobs, which may include handling proprietary and/or personal information. However, a number of these personal devices will not only lack basic security tools and software (e.g., firewalls or antivirus software) and controls on what can be downloaded, but may also already contain unsavory software or applications that increase the risk of malware distribution. In fact, some personnel may take shortcuts and use personal email accounts to transfer documents, which adds yet another layer of risk, as further noted below. Add to this mix the exchange, transfer, and processing of proprietary and personal information, and this could lead to some very problematic unintended or unauthorized disclosures.
  • To connect and get work done, workers need a WiFi network, and unfortunately, some employees may be using unsecured WiFi networks. This could potentially be a very big problem if employees are accessing information via an unsecured or vulnerable WiFi network – such as a neighbor’s unsecured network. Some of the many risks of using unsecured WiFi networks include eavesdropping – which enables malicious players to access and capture everything remote workers are doing online, including login credentials, emails, and other confidential or proprietary information – as well as exposure to malicious attacks. No doubt, it is important to ensure that employees are using secure WiFi networks coupled with company VPNs to prevent any malicious scanning activity.
  • Many organizations lack policies that specifically warn employees NOT to use personal email or messaging applications lacking encryption when they exchange the organization’s confidential information. Some of these policies, commonly referred to as “BYOD” policies, are intended to inform workers of what they can and cannot do with their devices. Consider Bob sending a personal email to a friend and colleague that Mike in marketing tested positive for COVID-19 (i.e., sensitive health information), or an employee transferring customer lists with personal data via unencrypted messages. WFH devices aside, employees should also be reminded not to toss confidential documents in household garbage bins, to turn off smart devices that are voice-activated, and to take calls that involve confidential information in a “private area” of the home. Failing to clarify these policies with personnel is very risky, and now would be a good time to remind employees of how to minimize these risks.

Ensuring that your organization’s IT and legal teams are working closely together to develop policies and procedures will help identify and minimize these increasing cybersecurity risks.
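For readers who want to see what “encryption before transmission” means in practice, here is a deliberately simplified toy sketch in Python (standard library only). This is an illustration of the concept, NOT production-grade cryptography – a real deployment should rely on a vetted cryptography library and properly managed keys; the function names here are our own invention.

```python
import secrets

def encrypt_otp(plaintext: bytes):
    """Toy one-time-pad encryption: returns (key, ciphertext).

    The key must be random, as long as the message, and never reused.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt_otp(key: bytes, ciphertext: bytes) -> bytes:
    """Recover the plaintext by XOR-ing the ciphertext with the same key."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The document is unreadable to anyone who intercepts it without the key.
key, sealed = encrypt_otp(b"Customer list - confidential")
assert decrypt_otp(key, sealed) == b"Customer list - confidential"
```

The point for policy drafters is simply that the protected document and the means of unlocking it travel separately – which is exactly what sending an unencrypted spreadsheet over personal email fails to do.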

Health Information.

When to collect and how to handle health information – which by definition is considered sensitive in most jurisdictions – is of course front and center with COVID-19. Finding a balance between respecting employees’ privacy and ensuring the safety of other employees (as well as the public) is no easy feat. While the current situation was unexpected and presents new challenges on the privacy front, the rules still apply, and it is important to process any information relating to an employee’s health in compliance with those rules. Organizations should make sure that they understand what laws may apply to them, review their internal policies and procedures, and act based on the particular circumstances. While employers have a duty to safeguard health and safety, this does not mean they may collect any and all information whenever they please, or force employees to hand over information no matter the circumstance. Organizations should also ensure, as a rule of thumb, that sensitive information, such as health data, is stored with added security. This includes limiting access to a “need-to-know” basis and, once the data is no longer needed, deleting it.

Increasing Marketing.

The financial fallout from COVID-19 will no doubt be tremendous. Organizations are scrambling to stay afloat – and we are just a couple of weeks in, here in the United States. As organizations push their marketing and advertising teams to generate leads, whether in the B2B or B2C space, it may be tempting to skirt some of the rules. But again, privacy laws still apply. Some, such as the GDPR and CCPA, have specific rules that affect marketing and advertising – namely, with respect to how personal data is collected and whether individuals receive proper notice or give consent where required. Companies must consider the CAN-SPAM rules and the TCPA in determining whether or not they can email or text individuals who had previously opted out of marketing messages (or who did not previously opt in to text messages) about steps they are taking to address COVID-19 – taking care to determine whether anything in the email or text language could be construed as marketing or would instead be deemed a transactional, relationship, or “emergency” message.

Plans to build databases with personal information, implement lead generation tools, increase ad partners, or send direct marketing communications are just some of the things that should be evaluated in conjunction with applicable privacy laws. For instance, in the EEA, you generally cannot send individuals direct marketing without first obtaining their specific consent (with some exceptions depending on the Member State). Likewise, if you want to start using an ad intermediary to promote advertising in California, there may be some restrictions if your organization is subject to the CCPA; similarly, if your organization purchases information from lead generation databases, the Article 14 notice requirements of the GDPR may kick in. And while courts are closed in many places right now, regulators are still enforcing. Case in point: here in California, a group in the advertising space recently sent a letter to the Attorney General (tasked with enforcing the CCPA beginning July 1) requesting a delay in CCPA enforcement given the current circumstances. Their attempt was rebuffed by the office of the Attorney General, which responded that it remained committed to enforcing the law as planned. Ensure that your organization doesn’t ignore existing privacy rules just because of the current chaos, as this will not be a valid excuse for non-compliance if a regulator comes knocking.

Bankruptcy Considerations.

Unfortunately, some companies will have no choice but to file for bankruptcy once the dust settles, and this may have some privacy implications. In this day and age, one of a business’ most valuable assets is the personal information that it has collected from its customers or end-users – often more so than any of its tangible assets. But when a business becomes a debtor, the sale of personal information can be problematic. Section 363(b) of the US Bankruptcy Code provides that a debtor that has a privacy policy prohibiting the transfer of personally identifiable information (or that fails to disclose that the debtor may sell or transfer such information to third parties) may not sell or lease such information unless (1) the sale or lease is consistent with the terms of the privacy policy or (2) after the appointment of a consumer privacy ombudsman (CPO) the court finds, after giving due consideration to the facts, circumstances, and conditions, that the sale or lease would not violate applicable non-bankruptcy law. These restrictions only apply if the debtor disclosed the privacy policy to persons not affiliated with the debtor and the policy was in effect on the date of the bankruptcy filing. There are some existing decisions discussing these consumer privacy protections, but because privacy has been pushed to the forefront since the last big round of corporate bankruptcies, it is very likely to be a much bigger issue in this next round. Organizations contemplating bankruptcy proceedings should therefore be aware of these restrictions in connection with a Section 363 sale motion. While many privacy notices do already address sales of information, including in bankruptcy, how the sale language is specifically worded, the nature of the purchaser, and, importantly, the number of prior versions of the privacy policy could become impediments to effortlessly transferring users’ personal information. 
Moreover, specific privacy laws such as the GDPR may further impact the conditions under which personal information may be transferred to a purchaser. As such, it’s important to review existing policies and assess applicable laws prior to filing, particularly if customer personal information is the organization’s most prized asset.

These are just some of the many issues that must be considered in these difficult and uncertain times. Organizations may understandably be tempted to put privacy and security on the back-burner for now, but this could have some far-reaching consequences in the long run as our new normal settles in.

Above all, stay safe.

Privacy (& CCPA) In Commercial Real Estate Wed, 11 Mar 2020 22:31:12 +0000 While much of the discussion around the California Consumer Privacy Act (CCPA) has centered around organizations that collect personal information online, less attention has been directed to the requirements that may come into play when personal information is collected offline. We recently wrote about how CCPA applies to the restaurant industry specifically (you can read that blog here), but there is no question that many other industries and businesses really ought to be paying close attention to CCPA and how to comply with the various requirements. One of those is commercial real estate.

On a recent visit to a client’s office in the San Francisco financial district, I arrived in the lobby of a large commercial office building and headed to the security desk. As is common, I was asked for my ID, which I promptly surrendered. I am accustomed to having security personnel look at my ID and hand it back immediately, but this time, the gentleman behind the counter actually wrote down the details of my information before handing back my ID. As it happens, the process was unusually slow enough to give me time to look around for some privacy notice or reference to privacy practices – something that has become a bit of a habit for a privacy practitioner like myself, post-CCPA. Unsurprisingly, there was no privacy notice (or reference to a privacy notice) to be found – be it on the counter, the wall behind the counter, or anywhere else. I asked the security guard where I might be able to locate a privacy notice, but when he looked at me like I was speaking a foreign language, I knew better than to insist.

As I made my way to the elevator, I quickly did some math and concluded that the owner of (or company managing) this massive commercial building must surely be subject to CCPA. What’s more, I had just handed over the details of my California ID, yet had no idea whatsoever what company was collecting it, how my information would be secured, how long it would be retained, and to whom it might potentially be disclosed. Granted, the security guard had transcribed my ID details by hand, but how are those handwritten logs stored and where? And who’s to say this information isn’t then entered into a computer system at the end of each business day? I arrived for my meeting and quickly turned my attention elsewhere, but this occurrence made me realize just how personal information that is collected offline has been overlooked in all of the CCPA frenzy.

While the commercial real estate company (or property manager) at the center of my story may otherwise have taken steps to comply with CCPA, one glaring detail was clearly overlooked: transparency and notice at the time of collection of my personal information. To be clear, CCPA requires more than a well-drafted privacy policy: it requires a covered business to provide consumers with clear and effective notice of its privacy practices at or before the time of collection. Further, such notice must be visible or accessible where consumers will see it before any personal information is collected. Thus, at a minimum, the commercial real estate company – or the security company to which it may have outsourced on-premise security, as the case may be – must provide a notice of privacy practices, even for data collected offline, in some conspicuous manner at or before collection. This could potentially take the form of a conspicuous reference to an online privacy notice (e.g., on a sign in the lobby). In addition, the security attendant should be capable of answering relevant questions or directing visitors appropriately, if asked.

Security obligations should also be top of mind. The fact that the personal information in this instance was collected offline makes this requirement no less applicable. In fact, this is a key point because CCPA grants consumers a limited private right of action against the unauthorized access and exfiltration, theft, or disclosure of certain types of personal information, including the right to seek statutory damages. While this new cause of action represents a significant change in existing cybersecurity litigation, it is by no means an unlimited right (nor is it automatic). However, it will come into play in connection with certain breaches of more “sensitive” types of personal information that are not encrypted or redacted – and this includes government-issued IDs. In other words, in addition to providing notice to individuals at the time of collection about how the personal information will be used, the property management company should review its security policies and procedures to ensure that the personal information it collects when people stroll into the lobby is reasonably secured.

What are the risks of failing to do so? This depends, but they could include (a) worst case scenario, a security breach that “fits the bill” and ends in a class action lawsuit, or (b) best case scenario, a complaint to the California Attorney General. Truth be told, until enforcement of CCPA begins in July 2020, or until a major class action lawsuit is filed for a security breach under CCPA, nobody has much visibility on how things will play out. The California Attorney General does not have unlimited resources, so this will be interesting. In the meantime, for some high-level points on how to address the offline collection of personal information, I would again refer you back to our last blog post. Although geared toward the restaurant industry, it touches on some of the questions that come up for many brick and mortar companies dealing with data collected offline.

Leaving aside the offline collection of personal information at the point of entry, there are plenty of other forms of data collected “online” in the real estate space, and those too should be regularly re-evaluated. As many in the commercial real estate space know, we are increasingly moving toward “smart buildings”. A 2016 Deloitte report predicted that the new mantra in commercial real estate would be location, information, analytics. In a recent 2020 outlook report, Deloitte points out that the industry has evolved even further and that “the most successful commercial real estate companies could follow the mantra: location, experience, analytics.” Commercial (and some residential) buildings today are filled with IoT-connected devices, sensors, and myriad technology-assisted mechanisms or devices designed to measure and control entry/egress, monitor the use of certain spaces, or provide various usage metrics (just to name a few). As buildings get smarter, the data footprint increases. What’s more, many of these technologies also control other aspects such as HVAC, lighting, and WiFi, which can trigger security issues. As we know from Target’s massive data breach linked to a third-party HVAC vendor in 2013 (and many other high-profile data breaches since then), poor security practices and vendor management can lead to massive data security incidents.


Bottom line? If your organization collects or handles personal information (and/or relies on service providers to do so), privacy should be front and center. As technology merges with every aspect of operating a building, privacy and security practices should be regularly assessed to ensure compliance with applicable laws, beginning with CCPA. In fact, we can’t stress this enough for all types of companies: the more quickly they address privacy and security in their day-to-day operations, the better equipped they will be to face the oncoming onslaught of broader and stronger privacy and security laws.

How CCPA Affects Brick & Mortar Industries: Restaurants Mon, 24 Feb 2020 23:14:15 +0000

Not so long ago, technology and the restaurant industry were worlds apart. If you wanted a reservation, you’d leave a voicemail that would be transcribed only to be deleted shortly thereafter. Loyalty cards were punch cards with no name attached. And if the wait for brunch was too long, you’d add your first name to a scrappy list that was discarded at the end of the day, or be handed a small buzzing device to let you know when your table was ready. Those “carefree” (or data-free) days have been replaced with a multitude of interconnected applications that all require the collection of personal information in some way – and importantly, that hang on to this information for longer periods. Restaurants and restaurant groups that collect the personal information of California residents and meet any one of the CCPA thresholds (i.e., over $25 million in annual gross revenue, collection of personal information of 50,000 or more consumers annually, or 50% or more of annual revenue derived from selling personal information) must comply with California’s stringent new law. Because the definition of personal information under the CCPA is very broad and includes online identifiers, email addresses, and location data, as well as offline data (just to name a few), many successful restaurant groups are likely to fall within these thresholds and be subject to the CCPA.
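Because the three thresholds are alternatives – meeting any single one is enough – the applicability test can be sketched as a simple check. This is a rough illustration only (the statutory tests carry nuances this glosses over, and it is not legal advice); the function and parameter names are our own:

```python
def may_be_subject_to_ccpa(annual_gross_revenue_usd: float,
                           consumers_with_data_collected: int,
                           share_of_revenue_from_selling_pi: float) -> bool:
    """Return True if ANY one of the three CCPA thresholds is met."""
    return (annual_gross_revenue_usd > 25_000_000          # revenue test
            or consumers_with_data_collected >= 50_000     # volume test
            or share_of_revenue_from_selling_pi >= 0.5)    # selling test

# A restaurant group with $30M in revenue meets the revenue threshold alone.
assert may_be_subject_to_ccpa(30_000_000, 10_000, 0.0)
# A small single venue under all three thresholds does not.
assert not may_be_subject_to_ccpa(5_000_000, 20_000, 0.1)
```

The takeaway: a business cannot stop its analysis after clearing one prong – all three must be below the line before it can comfortably conclude the CCPA does not apply.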

Even if a restaurant group is not physically located in California, but is found to be doing business in California, such as by marketing to California residents, or having a website that collects data or allows purchases online from California residents, the CCPA may apply. Restaurants that are close to the California border or have a high number of patrons or employees from California also may be subject to the CCPA.

With respect to the CCPA’s provisions around affiliates, it is an open question how franchisees and franchisors will be affected. While the definition of a business under the CCPA includes another entity that has “the power to exercise a controlling influence over the management of a company” and shares common branding, we do not yet know how the California Attorney General will treat franchisees and franchisors. However, based on the current letter of the law, a franchisor that is itself subject to the CCPA, shares common branding with a franchisee, and has management control over that franchisee may render the franchisee subject to the CCPA as well. What’s more, if the franchisor and/or franchisees share personal information with each other, these transactions may be considered a sale under the CCPA, triggering additional obligations.

Companies that collect the personal information of California residents online have been required, under CalOPPA, to maintain an updated privacy notice since 2004, though many do not. But with CCPA, restaurants need to consider both online and offline (or in-person) collection of personal information. Email lists, loyalty programs, raffles with business cards in a fishbowl, payment card data and reservations that notify others of the reservation all involve the collection of personal information and could potentially trigger the requirement to provide a privacy notice.

Applications and software that are intended to provide reservation and delivery services to restaurants also collect and retain a lot of personal information. Restaurants must therefore carefully consider the handling of personal information received or collected by third-party online reservation or delivery services, such as OpenTable and DoorDash. These relationships will need to be addressed through service provider or third-party contracts – including CCPA-mandated or recommended language – between the restaurants that are subject to the CCPA and the third parties that provide these services.

Another important question raised by the CCPA is how restaurants must provide the required privacy notice. The latest CCPA draft regulations indicate that for in-person or offline collection of personal information, the privacy notice may be given to the consumer either manually via a paper notice or with prominent signage directing the consumer to the online privacy notice. Additionally, notice may be provided orally, although this raises the question of how the business would then prove that the required information was properly provided at the time of collection. Although we have yet to see restaurants providing privacy notices when we show up for a table, it is possible that we will start seeing privacy notices on the back of restaurant menus and/or signs in restaurants directing consumers to their privacy notices.

The CCPA’s financial incentive notice requirements will affect how restaurants provide special offers – such as requesting a consumer’s email in exchange for discounts or free meals. The coupon flyer accompanying your bill that offers a free dessert in exchange for a loyalty signup will (or at least should) include the required notice of financial incentive. That’s because if a consumer signs up, but later requests that his or her information be deleted, taking away the discount and removing him or her from the loyalty program is likely to be considered discriminatory and in violation of the CCPA. Does this spell the end of this type of loyalty program? Likely not, but these programs will need to be more carefully crafted.

Complicating matters a bit more, customer service personnel and those directly interacting with consumers must, at a minimum, have a general awareness of privacy, know where to direct consumers requesting to see a privacy notice, and in some cases be trained as required for CCPA if they handle consumer requests. Speaking of consumer requests, restaurants will need to consider how to handle those if a customer exercises his/her rights while in the establishment.

Lastly, ensuring that any personal data is securely transferred and/or stored should be a priority – especially where payment information or employee information is involved. CCPA carries a private right of action for certain security breaches, which, if exercised by way of a class action, could wipe out an otherwise thriving restaurant in no time.

As different industries merge with technology that collects and stores their customers’ personal information, stakeholders must consider the various applicable privacy laws and rules, and how to properly implement them. There is no question that the CCPA affects businesses in all sectors, even those that operate largely “offline”, and in fact, some of the language was drafted with those specifically in mind. In turn, consumers everywhere will begin to see more privacy notices, which may alter their experience and awareness of the personal information that is required, these days, to make an entire industry run.

Staying on Top of Security Practices Fri, 07 Feb 2020 02:03:35 +0000  

If it’s not already, security should be a top priority for all companies that collect and hold personal data. Companies subject to the California Consumer Privacy Act (CCPA), effective since January 1, should be even more concerned given the new consumer right of action in the event of certain security incidents, and the increase in class actions to which this will inevitably lead (more on that below).

And yet...

During a recent discussion with friends in the hospitality/travel industry, I was surprised to hear of shockingly poor security practices when they described how travelers’ information is shared and transmitted on a daily basis. I learned, for instance, that travelers’ information – especially when it comes to groups – is often sent in unprotected, unencrypted documents, such as Excel spreadsheets or PDFs, to equally insecure email addresses, with multiple recipients copied. These documents, which circulate freely among various players in the ecosystem, contain hyper-sensitive information, such as passport numbers, credit card information, location, and travel dates and addresses. We are not talking about a name and a device ID here, but troves of data that hackers would love to get their hands on.

To be fair, not all companies have such cringe-worthy practices. The industry comprises big players, who themselves are not immune to breaches (Marriott, for one) but have (sometimes forcibly) invested in privacy and security, and smaller players that do not always have the resources – or in some cases prefer to look the other way – to beef up their privacy and security. What’s more, this is a VAST industry: it also encompasses “ancillary” companies, such as those that plan events, group tours, and conferences, not to mention the myriad marketing and advertising providers who have maximized their reach online. Yet, as automation increases daily, travel still requires sensitive data such as government-issued IDs, payment information, location information, and other data that most travelers would hope is securely held and transmitted by those to whom they entrust it. Nonetheless, many companies are still operating the “old-fashioned” way, without any proper security practices, policies, or checks, and potentially exposing sensitive client data.

Coming back to CCPA, the private right of action is not an unlimited right (nor is it automatic). It requires:

the unauthorized access and exfiltration, theft, or disclosure ...


of a California resident’s first name or first initial and his or her last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted or redacted: (i) social security number; (ii) driver’s license number or California ID card number; (iii) account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account; (iv) medical information; or (v) health insurance information ...


as a result of a business’s violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information.  

In other words, the right stems from breaches of more sensitive types of personal information that are not encrypted or redacted, and it takes more than unauthorized access to a device ID and email address to trigger it. But some industries, such as the hospitality and travel industries – by the very nature of the services they provide – must collect the type of sensitive data that triggers this section of the CCPA if the other conditions are also met. Many of these companies are also probably themselves subject to the CCPA (based on one of the three thresholds) or may be servicing companies that are. As such, they should be even more “incentivized” to take all necessary precautionary measures when collecting, processing, transferring, and sharing personal information that can trigger a private right of action under the CCPA. With statutory damages of up to $750 per individual per incident, it is time to really invest in getting security right, beginning with STRONG internal practices, including encryption, and, most importantly, educating employees such that the mere thought of transmitting or receiving passport numbers, names, addresses, and payment information in an unencrypted, unredacted spreadsheet immediately raises a BIG red flag.
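Redaction – the statute’s alternative to encryption – can be as simple as masking the sensitive data elements before a document leaves the building. A minimal illustrative sketch in Python follows; the regular expressions are simplified assumptions for demonstration (real-world SSN and card formats require far more careful validation, and a production tool would handle many more identifier types):

```python
import re

# Simplified illustrative patterns; real-world formats vary widely.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b\d{4}(?:[ -]\d{4}){3}\b"),
}

def redact(text: str) -> str:
    """Replace matched sensitive elements with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label}]", text)
    return text

row = "Jane Doe, SSN 123-45-6789, card 4111 1111 1111 1111"
assert redact(row) == "Jane Doe, SSN [REDACTED SSN], card [REDACTED CARD]"
```

A spreadsheet run through even a basic filter like this before being emailed removes the unencrypted name-plus-data-element pairing on which the CCPA’s private right of action turns.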

The CCPA is the first law in the United States to introduce this type of private right of action, but all 50 states have data breach notification requirements, and some states, like New York, have recently strengthened their laws relating to data security. Meanwhile, consumers are becoming savvier about data privacy in general and warier of data breaches, meaning that, increasingly, the opportunity to file a class action will not fall on deaf ears. Having solid security-driven policies, systems, and software is one piece of the puzzle; implementing best security practices within an organization and ensuring through diligent oversight that those practices are followed by employees and contractors is crucial, now more than ever.

And leaving the potential fines and damages aside, one would be hard-pressed to find anyone, anywhere, who is comfortable with having their name, passport information, location information and financial information floating around unprotected in a world where hacking is a sport. This is where privacy truly meets ethics.


CCPA Is Here: What Does It Look Like So Far? Tue, 07 Jan 2020 06:09:34 +0000 The new decade started off with a flurry of emails informing us of updated privacy notices being posted on websites in response to the California Consumer Privacy Act (“CCPA”). While most people began their new year’s resolutions or happily watched football on January 1, 2020, some of us were busy poring through these updated privacy notices. What our review reveals is that companies are handling the CCPA in many different ways. Some take a strict approach to the letter of the law and proposed regulations, while others outright challenge the CCPA’s broad definitions and sweeping requirements by including language suggesting that their original privacy policy already disclosed everything it needed to, but, paraphrasing, “we now also have to disclose the same thing this way just because of the CCPA.”

From our perspective, some companies are getting it “right” and some are not, but generally speaking – and judging from the high number of comments to the proposed regulations – most are hoping for some finality from the Attorney General with a final set of regulations. While some businesses may not have paid proper attention to the CCPA proposed regulations, which introduced new and more detailed practical obligations, and some are simply adopting a wait-and-see approach, other large entities can take the risk of challenging the law in their disclosures and absorbing the large fines that are sure to come once the California Attorney General begins its enforcement efforts. And some already are. In fact, one very large social media giant that is no stranger to regulators in the U.S. and abroad takes the questionable position that it is not “selling” personal information despite the extremely broad definition in the CCPA. Indeed, several internet titans have taken this approach.

CCPA “Sale”

Under the CCPA, “sale” or “selling” is to be interpreted broadly. It is NOT limited to what we think of colloquially as a real purchase for money. In fact, a CCPA “sale” includes releasing, disclosing, disseminating, making available, transferring, selling, renting, or otherwise communicating (in any manner or allowing them to collect through cookies) personal information by the business to another business or a third party for monetary or other valuable consideration (including a promise or commitment). In other words, if a business discloses personal information to any other person or entity that may use it for its own commercial gain or purposes (e.g., internal analytics, or disclosure to other parties for commercial gain) and thus not solely in order to fulfill a business purpose as a true service provider, this would likely be considered a sale. There are only limited exceptions, and the proposed regulations have made this even clearer. The definition is key when evaluating a business’s vendor relationships – in particular adtech services.

As such, unless the California Attorney General adjusts the definition in its final regulations, if said social media company uses personal information collected, in particular from certain of its adtech products, for its own purposes and other than to provide ads on behalf of its customers, it cannot be considered a service provider and would be deemed to sell personal information. It will be interesting to see how this position will be justified.

GAFA aside, we have also seen a large Q&A site, as well as many other smaller companies, get it wrong on the issue of sale. On the one hand, these companies initially indicate that they are not “selling” personal information under the CCPA, yet they also include a seemingly conflicting “Shine the Light” disclosure under a separate – but related – California privacy law. The disclosure indicates that they are providing your personal information to third parties for the third parties to use for their own marketing purposes. Going back to the CCPA “sale” definition, these companies cannot possibly be providing your personal information to third parties solely as service providers if the third parties can use the personal information for their own purposes. This could raise a big red flag once enforcement comes around.

“Do Not Sell My Information”

Another observation is that the much reviled “Do Not Sell My Information” button – for which the Attorney General has yet to provide more information as promised in the proposed regulations – is notably absent from the homepages of many companies that do in fact sell personal information. Rather, the opt-out links are generally buried in the privacy notices, making them far less obvious than what the law had contemplated. One large media company got its “Do Not Sell My Information” right, with a link on the homepage that takes you directly to a banner that looks a lot like the cookie banners for the EU – only with an opt-out.


The CCPA also requires businesses to specifically list each defined category of personal information that is collected and disclosed and/or sold in the prior 12 months, as well as categories of recipients. While many companies are using the CCPA mandated terminology, others are simply referring back to their standard privacy disclosures without reference to the itemized categories set forth in the CCPA’s definition of personal information. It is questionable if this approach will suffice in the eyes of the Attorney General. Notably, many companies have included tables laying out the various categories of information that must be disclosed, presumably for easier reading, another requirement mapped out in the proposed regulations.

What Next?

With the Attorney General’s CCPA regulations still to be finalized, we expect that companies will continue to update their privacy notices throughout this year based on the market and what others are doing. Indeed, the CCPA mandates that such notices be updated annually so that they remain current and accurately reflect a company’s evolving privacy practices.

Importantly, although full enforcement is not expected to begin until July of this year, Attorney General Becerra announced in mid-December that even before July, his team would be monitoring for large-scale potential violations involving the “sensitive, critical data” of California residents, and would prosecute cases as necessary, particularly where children are concerned. It should also be noted that the Attorney General sought earlier this year to increase the protections of the CCPA. The bill ultimately failed, but it should come as no surprise that Becerra, considered by many to have a “pro-privacy” penchant, is more likely than not to zealously enforce the CCPA once the regulations are finalized. So while we have seen many posts across the internet encouraging companies to sit back and relax until July 2020, we believe that this is probably not the best advice. Much like EU regulators prior to GDPR’s May 2018 deadline, Becerra himself has stated that making a good faith effort to comply is one thing, but ignorance of the law is not an excuse.

For now, our attention turns to whether Californians actually exercise their rights en masse, and how… In the meantime, it’s clear that companies across the board are still figuring out the ins and outs of the CCPA while eagerly awaiting a final set of regulations. And hoping that they are not the first company to be dragged into the limelight with a highly publicized class action following a security incident.

Stay tuned for more...


CCPA Global Checklist Fri, 22 Nov 2019 19:43:13 +0000 The California Consumer Privacy Act (CCPA) goes live in six weeks. While many companies have been working on mapping their data for some time, others are just getting started. Some of the issues left open by the language of the CCPA and the proposed regulations have yet to be resolved, but there is no question that come January 1, 2020, many California residents will be looking to their favorite apps, sites and businesses to see what, if anything, they have done to comply with this new data protection law. If your business has not begun its work, we recently created a client alert with a high-level checklist to move toward CCPA compliance.

Privacy FAQ #2 – CCPA Tue, 22 Oct 2019 15:21:00 +0000 As part of our blog series, we share some of the most frequently asked questions that we receive from organizations across different industries regarding data privacy and security, and more specifically GDPR and CCPA. This is the second FAQ in our series.

Even though the California Consumer Privacy Act (“CCPA”) will not be effective until January 1, 2020, the time to plan for compliance is now. It may seem as though you have plenty of time to prepare, but it is a mistake to delay. Indeed, given the twelve-month lookback provisions, companies must have proper records of the personal information that they have collected as of January 1, 2019.

Under the CCPA, individuals have various new rights that must be detailed in a company’s just-in-time privacy notice (a new requirement under the Attorney General’s proposed regulations) and in its privacy policy, including the right to access their information, to request deletion of their information, to be informed of certain transfers of their information, to opt out (if over 16) of or opt in (if under 16) to sales of their information, and to receive equal service and pricing even if they exercise their rights.

There are many nuanced questions to consider that may not be apparent on a cursory read of the CCPA or the proposed Attorney General regulations. Some basic common questions arise when companies first hear about the CCPA, as follows.

Does the CCPA really apply to my small business?

The CCPA applies to for-profit businesses (and their parents and subsidiaries) that process the information of California residents and that meet at least one of three thresholds: annual gross revenue exceeding $25 million; deriving more than 50% of annual revenue from sales of personal information; or handling the personal information of 50,000 or more consumers, households, or devices per year. Setting aside the question of how to allocate and account for information of a single household with multiple individuals, the last threshold means the CCPA would apply to a business that collects information from only 137 unique users a day. A typical website alone will easily meet this prong, thereby becoming subject to the CCPA.
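That 137-users figure is just the 50,000-record threshold spread evenly over a year. As a quick back-of-the-envelope check (a sketch for illustration, not anything found in the statute):

```python
import math

# CCPA threshold: personal information of 50,000 or more
# consumers, households, or devices in a year.
annual_threshold = 50_000
daily_unique_users = annual_threshold / 365

print(round(daily_unique_users, 2))   # 136.99
print(math.ceil(daily_unique_users))  # 137 unique users/day crosses the threshold
```

In other words, fairly modest but sustained website traffic is enough to trip this prong.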

My business does not have an office in California, so am I still subject to the CCPA?

The CCPA applies to businesses that collect information from a “natural person who is a California resident,” meaning an individual in California other than for a temporary or transitory purpose (e.g., a tax-paying resident), as well as every individual domiciled in the State who is outside the State for a temporary or transitory purpose (e.g., a California resident on vacation in Hawaii). A business with no offices or other connections to California that does not collect information from any California resident may not be subject to the CCPA.

I don’t think we really collect personal information. Does the CCPA apply?

Keep in mind that the CCPA defines personal information extremely broadly. Under the CCPA, personal information is data that is capable of being associated with a particular consumer or household, including IP addresses; cookies, beacons and pixel tags that can be used to recognize a data subject; probabilistic identifiers; and even gait patterns. If you have a “Contact Us” form on your website, accept résumés for job postings, or use cookies to track visitors, you are collecting personal information.

We do not collect personal information online, only offline. Does the CCPA apply?

Yes, the CCPA applies to both online and offline personal information. If you operate a retail store and take payments or keep a ledger of purchases, you are collecting personal information. Indeed, the Attorney General’s proposed regulations clearly indicate that brick-and-mortar companies must offer privacy policies on site or refer customers to where they can be found.

But the information we collect is all public. How is that information addressed?

There is a very limited exception for publicly available information, which the CCPA defines as information available from government records. So even if an individual’s corporate email address can be found on another website, collecting that email address through your own website brings it within the scope of the CCPA.

My business is a non-profit. How does the CCPA affect me?

Even though a non-profit entity is generally not itself a “business” subject to the CCPA, a non-profit that is a subsidiary of a for-profit business may nonetheless have to comply with the CCPA. Additionally, your service providers are likely subject to the CCPA, and you must ensure that they comply with it.

Who are consumers under the CCPA? Are employees covered?

A “consumer” under the CCPA is defined broadly. A consumer is not only a customer or user of your services, products or websites – your employees are also consumers, for now. This is a shift from the norm of having a company policy stating that there is no expectation of privacy in the workplace. Companies need to prepare internal privacy policies for their employees and provide employees with their rights under the CCPA. Employers had hoped that certain amendments to the CCPA, notably AB 25, would completely remove employee data from its scope and would pass through committee without modification for final Senate approval. But in July 2019, the California Senate Judiciary Committee advanced AB 25 with changes, which means that employers will still have to grapple with their handling of employee data under the CCPA.

AB 25 provides a one-year hold, for 2020, on the application of many of the CCPA’s provisions to the personal information of employees, contractors, and job applicants. This hold is limited and applies only when the employer uses the data within the scope of the employment relationship, for employment purposes. Any use by an employer outside the scope of the strict employment relationship would remain covered under the CCPA. For example, if an employer allowed its insurance company to collect employee data in order to market other insurance services to those individuals, that collection would be subject to the CCPA.

Employers must still notify employees, contractors and job applicants, in a privacy policy, of the personal information that they collect and how they use it. Such employee data will also fall within the purview of the CCPA’s private right of action for data breaches resulting from the failure to implement reasonable security measures.

Why do I have to prepare now?

The CCPA’s recordkeeping provisions require companies to maintain detailed records, organized by the CCPA’s categories, of personal information dating back to January 1, 2019. Detailed records and data maps must be prepared now to meet these obligations, so companies should inventory the information that they have collected since January 1, 2019. Businesses must publish their new privacy notices and updated privacy policies, and include the “Do Not Sell My Personal Information” link (if required), by the time the CCPA takes effect. They must also have proper agreements with their service providers and have systems, policies, and procedures in place to manage consumer rights requests and to update their privacy policies annually.

From the Golden State to the Silver State – Privacy Law in Nevada Thu, 12 Sep 2019 22:01:36 +0000 Similar to the months before the GDPR went into effect at the end of May 2018, companies are now actively preparing for compliance with the California Consumer Privacy Act (CCPA).  As California leads the pack of states in terms of privacy and technology laws, other states have followed suit, including Nevada.

The Nevada statute (SB 220) is an amendment to Nevada’s existing law, which requires website operators to have a privacy policy with certain disclosures. Although recently passed, SB 220 quickly goes into effect on October 1, 2019, three months before the CCPA’s January 1, 2020 effective date. SB 220’s scope of coverage is much narrower in some respects yet much broader in others. The law applies to any website operator that collects information about Nevada consumers. In contrast, the CCPA applies only to businesses that meet certain thresholds of revenue, volume of personal information collected, or percentage of revenue derived from personal information sales.

While the Nevada statute applies broadly to various website operators, its requirements regarding opt-out rights for sales of personal information are narrower. The Nevada law defines “sale” as the sale or license of personal information for “monetary consideration” to a company or individual who will then “license or sell the covered information to additional persons.” Under the CCPA, by contrast, a sale is any transfer of personal information for any type of consideration, monetary or otherwise (e.g., a promise to do something). The second prong of Nevada’s sale definition is more akin to a transfer to an aggregator or reseller that will further sell the personal information, rather than a disclosure or sale to any third party, as under the CCPA.

So, even a company that merely meets the operator definition under the Nevada law must ensure that its privacy policy includes the appropriate disclosures. It also needs to consider whether it “sells” personal information as the Nevada law defines that term. If it does, the company will need to include the opt-out right. If it does not, it will need to decide whether to offer the opt-out anyway or to include alternative language.

Unlike the CCPA, the Nevada law does not include a private right of action (with its attendant risk of class actions). Rather, under the Nevada law, the Nevada Attorney General can seek an injunction or impose a penalty of up to $5,000 per violation. As with many of these new privacy laws, we do not yet have guidance or case law interpreting some of the language, so “per violation” could be construed as per individual violation, per affected individual, or per violation period (aggregated across all individuals affected). With the ever-changing privacy and data security landscape, companies need to pay attention to these new privacy laws and keep up to date on how to address them.

School and Student Privacy vs. Security – How to Balance Tue, 27 Aug 2019 16:00:47 +0000 With schools starting this fall, one invariably thinks about the safety of their children – both online and in the real world. There are now numerous security programs and apps that tout data security technology and online measures to keep students safer in the real-world classroom. The technology generally markets itself as having the ability to predict the propensity of students to commit acts of violence in schools. To do so, the software offered by these companies reads our kids’ emails and social media posts insofar as they are publicly available or sent through school networks. Certain key words and phrases trigger alerts, which are then sent to the provider’s customer, typically a school. It sounds promising, and the optimism is welcome given today’s climate. But are these tools really getting the full picture? If a message is sent privately between students on social media rather than over a school’s network email, the software would not have access to information indicating a kid’s nefarious plans or potentially harmful activities. It is also questionable whether the limited scope of the protection these companies offer is worth what we give up in terms of privacy.

In reviewing some of these companies and their technology online, I could not find a leading provider’s privacy policy that applies to its collection of data through its security service. Presumably (hopefully…), they provide their privacy policies to their actual customers (e.g., the schools), although one would think these policies should be available to anyone who might be affected by the technology and services provided. Additionally, it is unclear how these providers may use the data beyond providing the service to their customers – such as for separate commercial purposes or in connection with law enforcement requests or databases. On one provider’s website, I could not even locate a privacy policy. All in all, I find it difficult to trust these providers and their privacy practices. If a child is being monitored through such a service, the service is collecting the child’s username (and real name) and any information posted or transmitted, as well as various cookies, IP addresses, etc. These providers potentially collect a ton of data on our children without providing much information as to how it is all used, retained and secured. Care should be taken to ensure that these companies do not misuse the data and that they have robust security practices to guard against data breaches. Moreover, if this technology is targeted to use by children under age 13, the company needs to consider its obligations under the Children’s Online Privacy Protection Act (COPPA). Interestingly, under COPPA, schools can consent to data collection on behalf of parents and stand in the parents’ shoes when making decisions about their students’ privacy. However, even as an experienced privacy practitioner, I had a hard enough time deciphering what personal data is collected, how it is used and whether it is disclosed – so I am not 100% comfortable with a school administrator reviewing the privacy and security policies of these technologies for my own children.

If this technology does serve its purpose and make our kids safer, it is worth trying something new, but at what cost?

For more information regarding Hopkins & Carley’s Data Privacy & Security practice, please visit our site.

Privacy FAQ #1 Tue, 27 Aug 2019 16:00:04 +0000 As part of our blog, from time to time we will share some of the most frequently asked questions that we receive from organizations across different industries regarding data privacy and security, and more specifically GDPR and CCPA. This is the first FAQ in our series.

What’s the Deal with the Data Protection Officer?

Not to be confused with a CPO (Chief Privacy Officer) or an EU Representative, the role of data protection officer (DPO) has a specific legal meaning under the GDPR. The primary role of a DPO is to ensure that the organization to which he or she is appointed processes the personal data of its staff, customers and any other individuals (i.e., data subjects) in accordance with applicable data protection rules. Many, but not all, organizations subject to GDPR are required to appoint a DPO, and given the unique nature of the role, the why, when and how of this topic is definitely at the top of our US clients’ FAQs.

Do we really have to appoint a DPO?

Many organizations assume that the requirement applies only to data controllers. This is not true. The requirement applies to both controllers and processors, and an organization must appoint a DPO under certain circumstances set forth in Article 37.

Most relevant to our clients is the second trigger: organizations whose core activities consist of processing operations that require regular and systematic monitoring of individuals on a large scale (for example, online behavior tracking). Any organization that is subject to GDPR and processes personal data (not as a side hobby) should perform a comprehensive analysis and weigh the risks of noncompliance with this requirement. Specific guidance from the former Article 29 Working Party contains useful factors for assessing applicability, along with several examples, and it is quite clear that in this day and age many tech companies, no matter their size, fall squarely into this category.

Of course, as a friendly reminder, failure to appoint a DPO where one is required is subject to administrative fines of up to EUR 10 million or 2% of global revenue for the preceding financial year, whichever is higher. But, as I explain below, there are some real advantages to appointing someone whose top priorities are essentially to (a) educate on and implement privacy-friendly measures and (b) let your organization know when it is dropping the ball on data protection. We can safely say that proper planning and getting ahead of potential privacy issues is far better than dealing with privacy failures after the fact.
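To make the “whichever is higher” mechanics concrete, here is a minimal sketch (illustrative only – the function name and revenue figures are ours, not from the regulation):

```python
def dpo_fine_cap_eur(global_annual_revenue_eur: float) -> float:
    """Cap on administrative fines for this tier of violation:
    EUR 10M or 2% of the preceding year's global revenue,
    whichever is higher."""
    return max(10_000_000.0, 0.02 * global_annual_revenue_eur)

# For a company with EUR 2B in revenue, 2% (EUR 40M) exceeds the EUR 10M floor:
print(dpo_fine_cap_eur(2_000_000_000))  # 40000000.0
# For EUR 100M in revenue, the EUR 10M floor governs:
print(dpo_fine_cap_eur(100_000_000))    # 10000000.0
```

The point is that for large companies the revenue-based prong, not the fixed EUR 10M figure, sets the ceiling.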

Note that if your organization determines that it is not required to designate a DPO because it does not meet the above criteria, documenting the analysis will demonstrate compliance with the accountability principle of the GDPR. It is highly recommended.

What’s so special about the DPO anyway?

When a DPO is appointed (whether on a mandatory or voluntary basis), he or she becomes responsible for all of the processing activities carried out by your organization, meaning that your organization cannot limit the role of the appointed DPO to a few cherry-picked data processing activities. If appointed, a DPO must have an independent and fully supported role reporting to the highest level in the company (i.e., the Board) in order to ensure compliance with applicable data protection rules. Among other things, this includes:

  • Ensuring that all parties involved (i.e., controllers, data subjects, etc.) are informed of their rights, obligations and responsibilities;
  • Advising the organization on the interpretation or application of the data protection rules to ensure compliance and accountability;
  • Cooperating with the regulators in the event of any investigations, complaints, audits, etc.;
  • Notifying the organization where it fails to comply with the applicable data protection rules.

The GDPR also sets out very specific rules regarding the DPO’s autonomy and support from within the organization. One point that comes up regularly in client discussions is that a DPO may not be dismissed or penalized by the organization for performing his or her data protection tasks just because there is a disagreement or dissent on the part of the DPO with respect to internal data-related measures, activities or products. Simply put, this is not your typical corporate officer, because the DPO’s primary allegiance is to data protection as a whole, and not just the organization to which he or she is appointed.

Do we really have to?

Here in the U.S., the question of whether to appoint a DPO comes up regularly with companies that are subject to the GDPR. In my experience, it is a loathed topic that C-level execs prefer to sweep under the rug or put on the back burner, despite its importance. When speaking with U.S. companies, I often sense an unspoken determination to categorically overlook the DPO provisions altogether by simply putting the discussion off to a later date. I suspect that this is due to several factors, including (a) the perceived lack of clarity as to when a DPO must be appointed (despite the above-mentioned guidance on the topic), (b) the cost of appointing a DPO and difficulty of identifying a competent one, (c) the DPO’s unusual status and autonomy within the organization, and (d) the role of DPO being perceived as one of ubiquitous data “cop” at odds with business purposes and goals.

While there may be some truth to the latter argument depending on where you stand, in this age of increased scrutiny from regulators and data subjects, a good DPO can actually be a boon for any organization handling personal data, because a good DPO is a tremendously effective preventive tool when it comes to data protection. First, appointment may simply be mandatory, and complying with applicable law is always recommended. But more importantly, the EU notes that the DPO is a cornerstone of accountability and that appointing one can facilitate compliance or even become a competitive advantage for businesses – and it is fast becoming an industry standard. Today, providing privacy-friendly products, services and platforms is not only no longer optional given the growing framework of global data protection regulations; it is also what most individuals expect. Appointing a DPO shows a company’s willingness to move in this direction.

Finally, if you operate globally, the data protection laws of other jurisdictions may also require the appointment of an officer whose role and tasks are essentially equivalent or similar to the DPO’s (think South Korea or Brazil).

Instead of asking “do we really have to?”, companies should be asking “where do we find a good DPO?”