Cybersecurity does not just pose technical challenges; companies must always keep an eye on the human component of cyber risk.  For example, even the most damaging and sophisticated hacks – such as the recent Twitter hack – can result from spear-phishing. Imagine that: a multi-billion-dollar communications platform brought to its knees by charming fraudsters on the phone. But the pseudo-insider risk does not end with phishing schemes. Hackers and criminals of all stripes are probing for weaknesses that will give them leverage over companies.

On August 26, 2020, the United States Department of Justice charged a Russian national with offering a Tesla employee $1 million in exchange for infecting the company’s network with malware.  Egor Igorevich Kriuchkov allegedly met with the employee on multiple occasions as part of the recruitment effort.  The malware was designed to exfiltrate data from Tesla, and the criminal group behind the attack allegedly planned to demand $4 million in return for the information.

A ransomware operation, like the one detailed in the criminal complaint, encrypts all of a company’s data and demands a hefty payment in return for the decryption key. For many companies, it is less expensive to pay the criminal’s fee than to endure lengthy service outages. Ransomware typically spreads through phishing emails or malicious downloads; this case, however, describes using a corrupted insider employee as the agent of infection. The altered tactic shows just how determined criminal hackers can be.

Based on the allegations contained in the complaint, this was a long-term, concerted effort by the criminals. The recruiter traveled from Russia to Nevada multiple times and apparently spent many thousands of dollars wooing the employee.  While remarkable, it would be foolish for companies to think this approach was novel. The fact of the matter is that international criminal espionage is a real and persistent threat.

Determined adversaries – just as in the traditional espionage world – will seek out and develop human assets in their hunt for data.  Numerous legal consequences can flow from these types of attacks.  If the crime succeeds and the ransom is paid, companies can face years of litigation to make themselves whole again. That litigation could be with vendors whose services were interrupted, or with the company’s own insurers.

In January 2020, long-running litigation over the cyber coverage afforded by a business owner’s policy in a 2016 ransomware attack was resolved at summary judgment by a Maryland federal judge’s order.  In that case, insurance coverage was ultimately ordered, but only after years of litigation. If an insider were the source of the ransomware, the path to coverage would be even longer and more legally treacherous.

The question then becomes: what can be done?  First, companies must recognize just how enticing and valuable their digital assets have become. Like any other valuable asset, digital assets demand a 360-degree approach to security, and that approach should be regularly re-examined and scrutinized. Now that the human element has become an obvious and well-funded vector for criminal mischief, companies must redouble their internal training and education. Employees must be taught that their access will be targeted by criminal elements and how to respond. You do not want your employees to be surprised by novel and unexpected attention. It is best to let everyone know that they are not participants in a spy movie but could become participants in a prison movie, if they choose poorly.

Companies also have to begin planning for cyber litigation now, not later. Preparing for litigation will reinforce proper workflows and decision-making, even under pressure. Early litigation preparation will also strengthen later arguments that the cyber response process should be considered privileged, a burgeoning litigation fight that deserves a separate discussion.  In the meantime, to paraphrase digital godfather Benjamin Franklin, smart companies know that one byte of preparation equals one terabyte of cure.

Have you seen the latest headline about Twitter?  It may be time to double-check your corporate practices and check in with your employees.
The top new FTC privacy probe concerns Twitter, which the FTC has accused of breaching a 2011 consent decree by taking phone numbers and email addresses that users provided for security purposes (e.g., two-factor authentication) and using that information to target users with advertisements.  According to Twitter, the FTC has alleged that this conduct breached the 2011 consent decree (which resulted from a hacking incident) and which purportedly “bars Twitter for 20 years from misleading users about the extent that it protects their privacy.”  Twitter itself disclosed the misuse of users’ phone numbers and email addresses for advertising in an October blog post, noting that it did not know how many people were impacted and calling the misuse “inadvertent.”  Twitter said on Monday that it expects to pay between $150 million and $250 million to resolve the FTC investigation.

This story should have all corporations taking a look at their own practices and making sure that similar conduct is not happening behind their own closed doors.  All companies are “barred” from misleading users about the extent to which they protect user privacy by virtue of, inter alia, Section 5 of the FTC Act and state UDAP statutes.  (In Twitter’s case, because of its 2011 security incident, it was also barred via a consent decree.)  Also, with many employees working remotely these days, it may be harder for companies to oversee how different parts of the company are interacting.  Perhaps in Twitter’s case this alone led to the issue?  Who knows.

In any event, let Twitter’s big headline be a reminder to all companies to: (1) review your privacy policy and any other representations that you make to customers regarding the privacy and security of their data; (2) review your corporate procedures (not just policies, but check in with the boots on the ground) to ensure they are consistent with your privacy policy and other representations that you make to customers; and (3) make sure corporate training events regarding privacy and security are in place, so as to create a corporate culture of data protection and privacy by design.

Mistakes happen, but diligence can prevent them…and can help serve as a defense for when they do happen.

Last week, on July 16, 2020, Europe’s top court invalidated the EU-US data flow arrangement called Privacy Shield.  In a world of competing privacy regulations, thousands of global businesses relied heavily on Privacy Shield to conduct business across EU-US borders (there are 5,300+ current Privacy Shield participants, and the transatlantic economic relationship is valued at $7.1 trillion), so the decision sent shockwaves through the business and data privacy community.  Further, the decision has implications that extend beyond the Privacy Shield framework, and beyond EU-US data transfers.

The decision arose from a case brought by Mr. Maximillian Schrems, an Austrian lawyer, who requested in 2015 that the Irish Data Protection Commissioner (the “Irish DPA”) order the suspension or prohibition, in the future, of the transfer by Facebook Ireland of his personal data to Facebook, Inc. (the latter being located in the United States), a case commonly referred to in the privacy community as Schrems II.  (Notably, Schrems I was an earlier case brought by the same lawyer, challenging Facebook’s transfer of personal data to the U.S. under a prior EU-US data transfer framework that had been determined adequate, the US-EU Safe Harbor framework, which was struck down as a result of that case.  Facebook then turned to Standard Contractual Clauses (SCCs) as a basis for cross-border data transfers, prompting Mr. Schrems to file the Schrems II case.  Thereafter, the Privacy Shield framework was established and determined adequate, providing a second basis for Facebook’s cross-border data transfers.)

A copy of the Schrems II decision can be found here.  A copy of the European Court’s 3-page press release, summarizing the 31-page decision, can be found here.

In Schrems II, the grounds for review of Facebook’s cross-border data transfers were the United States’ digital surveillance policies and practices, including the Foreign Intelligence Surveillance Act (FISA) and Executive Order 12333 (which sanctions bulk data collection).  Mr. Schrems argued that these U.S. surveillance practices are inconsistent with European fundamental rights to privacy and data protection, as set out in the EU Charter of Fundamental Rights, the European Convention on Human Rights, and several pieces of EU legislation, including the General Data Protection Regulation (specifically, Mr. Schrems invoked Articles 7, 8 and 47 of the Charter).  In other words, transferred EU personal data may be at risk of being accessed and processed by the U.S. government (e.g., the CIA, FBI and/or NSA) in a manner incompatible with privacy rights guaranteed in the EU, and EU data subjects may have no right to an effective remedy if this happens.  [Notably, while the original request was focused solely on the SCCs, the Court found the validity of the Privacy Shield Decision relevant to assessing the sufficiency of SCCs, as well as any obligations of the supervisory authority to suspend or prohibit such a transfer.  See, e.g., Decision at 25.]

In reaching its decision to invalidate the Privacy Shield, the European Court pointed out issues with the U.S. surveillance framework, as it applies to EU-US data transfers, such as (1) that “E.O. 12333 allows the NSA to access data ‘in transit’ to the United States, by accessing underwater cables on the floor of the Atlantic, and to collect and retain such data before arriving in the United States and being subject there to FISA”; (2) “activities conducted pursuant to E.O. 12333 are not governed by statute”; and (3) “non-US persons are covered only by PPD-28, which merely states that intelligence activities should be ‘as tailored as feasible’.”  See Decision at 14-15.  The Court also pointed out, and focused quite heavily on, the lack of remedies for EU citizens whose rights have been violated as a result of US surveillance practices.  For example, it noted that the Fourth Amendment to the U.S. Constitution does not apply to EU citizens; that the NSA’s activities based on E.O. 12333 are not subject to judicial oversight and are non-justiciable; and that the Privacy Shield Ombudsperson is not a tribunal within the meaning of Article 47 of the Charter, and thus U.S. law does not afford EU citizens the protection required.  See Decision at 15, 29 (“the existence of such a lacuna in judicial protection in respect to interferences with intelligence programmes based on that presidential decree [E.O. 12333] makes it impossible to conclude, as the Commission did in the Privacy Shield Decision, that United States law ensures a level of protection essentially equivalent to that guaranteed by Article 47 of the Charter.”).

With respect to SCCs, the European Court held that “the competent supervisory authority is required to suspend or prohibit a transfer of data to a third country pursuant to standard data protection clauses adopted by the Commission, if, in the view of that supervisory authority and in the light of all the circumstances of that transfer, those clauses are not or cannot be complied with in that third country and the protection of the data transferred that is required by EU law, in particular by Articles 45 and 46 of the GDPR and by the Charter, cannot be ensured by other means, where the controller or a processor has not itself suspended or put an end to the transfer.”  See Decision at 22.

On the same day the European Court issued its decision, U.S. Secretary of Commerce Wilbur Ross issued the following statement regarding the ruling: “While the Department of Commerce is deeply disappointed that the court appears to have invalidated the European Commission’s adequacy decision underlying the EU-U.S. Privacy Shield, we are still studying the decision to fully understand its practical impacts.  We have been and will remain in close contact with the European Commission and European Data Protection Board on this matter and hope to be able to limit the negative consequences to the $7.1 trillion transatlantic economic relationship that is so vital to our respective citizens, companies, and governments. Data flows are essential not just to tech companies—but to businesses of all sizes in every sector. As our economies continue their post-COVID-19 recovery, it is critical that companies—including the 5,300+ current Privacy Shield participants—be able to transfer data without interruption, consistent with the strong protections offered by Privacy Shield.”  The full press release can be found here.

The United Kingdom’s Information Commissioner’s Office made a similar statement, implicitly acknowledging that the impact of the Schrems II decision extends to all EU cross border data transfers, not just EU-US transfers under the Privacy Shield.  A copy of the ICO’s statement can be found here.

Since the decision, the big headline has seemingly been two-fold: (1) SCCs survive, but (2) Privacy Shield has been invalidated.  Respectfully, the first half of this is a half-truth.  Companies proceeding with cross-border data transfers using SCCs or binding corporate rules should consult with counsel to assess the risk involved in their transfers and evaluate alternative transfer frameworks.  Unless and until the United States changes its surveillance practices – including not conducting surveillance of in-transit data (i.e., before it arrives in the U.S.) and providing EU data subjects with a right of redress regardless of the surveillance program their data is subject to – the Schrems II decision puts nearly all EU-US data transfers at risk in industries subject to government surveillance.  For companies that have received requests for information from U.S. law enforcement in the past, and who want to avoid risk, the safest way to proceed may be to (1) consider whether the data at issue is even needed in the first place, and (2) consider simply moving data processing for European data subjects to Europe.  Other bases for data transfers, such as binding corporate rules and SCCs, could also be considered, but back-up plans should be in place, as proceeding under these frameworks could be risky in view of the Schrems II decision.

Companies should also take a close look at their policies and practices for responding to requests for information from U.S. law enforcement, including:

  • the number of requests the company has received;
  • the number of user accounts the requests involved, and how many of those accounts belonged to EU data subjects;
  • the types of requests received (e.g., subpoenas or search warrants);
  • the records the company produced, and in which cases those records concerned EU data subjects;
  • the bases for the requests (e.g., were they pursuant to government surveillance programs that provide data subjects with a right to a remedy in the event their rights are violated, or subject to, e.g., E.O. 12333, which provides no such remedy); and
  • whether, to the company’s knowledge, EU data subjects whose information was shared ever contended that their rights had been violated.

The more information a company has to show that it has not provided information to U.S. law enforcement pursuant to surveillance programs that do not offer EU data subjects a remedy in the event their rights are violated, the safer footing the company should be on going forward with respect to its EU-US data transfer practices.

For companies that have not received requests for information from U.S. law enforcement pursuant to surveillance programs, the path forward has more options.  While Privacy Shield has been struck down for all companies, it is likely that a new or revised framework will be designed and an adequacy decision sought with respect to it (just as Privacy Shield followed when Safe Harbor was struck down).  In the interim, it is prudent for these companies to consider alternative data transfer frameworks (such as SCCs) and, going forward, to take a “belt and suspenders” approach so that their business does not “hang its hat” on a single framework for cross-border data transfers (a lesson Facebook and numerous other companies learned, causing them to rely on both Privacy Shield and SCCs).  Companies should also take a good look at the data they are processing, particularly with respect to EU data subjects, and ask whether it is even necessary, and whether processing in the U.S. is necessary.  In some cases the answer may be “yes,” but the more a company can practice data minimization – particularly when cross-border data transfers are at issue – the safer it may be.  Finally, just because your company has not yet received a request from U.S. law enforcement pursuant to a surveillance program does not mean it never will – particularly in certain industries, such as tech and telecommunications.  You should plan for such requests before they happen.

If you need any help evaluating your company’s risks in view of the Schrems II decision, or determining best practices for going forward, please contact us at

Rothwell Figg attorneys Martin M. Zoltick and Jenny L. Colgate published a chapter titled “Privacy, Data Protection, and Cybersecurity: A State-Law Analysis” in the International Comparative Legal Guide to: Data Protection 2020, published by Global Legal Group Ltd.

While some countries have enacted comprehensive privacy and data protection laws, like the EU’s General Data Protection Regulation (GDPR), the United States does not have a single, comprehensive federal law regulating the collection, use, and disclosure of personal information. Instead, U.S. privacy and data protection legislation is a patchwork of federal and state laws and regulations – hundreds of them! The chapter in the Guide aims to identify and provide guidance regarding state privacy and data protection laws, broken into three sections: 1) technology-specific laws; 2) industry-specific laws; and 3) generally applicable laws.

To read the article in its entirety, access the article on the ICLG website here:

As of July 1, 2020, the California Attorney General began enforcing the California Consumer Privacy Act (CCPA). Many details about CCPA enforcement remain uncertain, and many other states have enacted or will enact their own privacy laws. Businesses clearly must wrestle with this mosaic of new and emerging privacy restrictions.  Some industries have explored legal challenges to these statutes.  One of these legal challenges, against a new Maine privacy law, was just defeated.

In February 2020, lobby groups representing the broadband industry sued the state of Maine under the theory that the Maine privacy law violates their First Amendment protections for commercial speech.  The Maine law in question prohibits ISPs from using, disclosing, or selling browsing history and other personal information without customers’ opt-in consent.  The plaintiffs’ First Amendment argument rests on the theory that a company’s sale of private individuals’ browsing history constitutes a form of First Amendment-protected “commercial speech.”  The lawsuit’s second theory flows from the history of these types of browser privacy protections.  Similar federal protections were rolled back by the President and Congress in 2017.  After the federal roll-back, Maine legislators decided to revive these protections for their residents.  The lawsuit argues, however, that the federal deregulatory action had the effect of preempting and barring any supplementary state law protections.

In a July 7, 2020 order, Judge Lance Walker agreed in general with the characterization of the browser history sales as “commercial speech,” but held that “intermediate scrutiny” of the privacy law, not “strict scrutiny,” governed judicial review of this law.  Against that legal backdrop, Judge Walker held that the law is not unconstitutional on its face and declared the preemption arguments “dubious.” While Judge Walker allowed the case to continue, the decision dealt a major blow to the theories underpinning the case.

The legal fight against privacy regulations will continue.  The Maine law addresses only a discrete privacy issue; the CCPA sprawls by comparison and will face many legal challenges, especially now that enforcement activity has begun.  The CCPA and other similar privacy laws will likely face similar First Amendment and preemption claims, in addition to a few other interesting wrinkles.

Today the U.S. Supreme Court found in Barr v. American Association of Political Consultants, Inc. that the federal debt collection exemption to the Telephone Consumer Protection Act’s general prohibition on autodialed calls violates the First Amendment.  The Supreme Court held that the exemption was a content-based restriction on speech because it favors speech made for the purpose of collecting government debt over political and other speech.  Such content-based restrictions are subject to the “strict scrutiny” standard, which the government conceded it could not satisfy.

Rather than strike down the TCPA in its entirety, as some advocates have proposed, the Supreme Court held that the appropriate remedy was to sever the provision from the statute.  Justice Kavanaugh wrote the majority opinion, noting that “Americans passionately disagree about many things.  But they are largely united in their disdain for robocalls.”  The decision further notes that the Federal Government received a staggering 3.7 million complaints about robocalls in 2019 alone.

Due to the increasing prevalence and sophistication of robocallers, industry has proposed a number of creative technical solutions to combat robocallers.  For example, there are bots that answer calls and use artificial intelligence generated speech designed to waste telemarketers’ time by keeping them on the line, voice biometric technology that automatically identifies synthesized speech, and services that will automatically gather a robocaller’s details and generate a demand letter and court documents for filing.

With today’s Supreme Court decision, TCPA lawsuits will likely remain one of the most filed types of class actions in courts across the country, especially since private parties can sue to recover the greater of $500 per violation or their actual monetary losses, with treble damages (up to $1,500 per violation) available for willful or knowing violations.

The California Consumer Privacy Act (CCPA) went into effect on January 1, 2020, and enforcement begins tomorrow, July 1, 2020.  Is your privacy policy compliant?  Here are a few quick questions that may help you determine the answer –

  • Does your privacy policy have a “last updated” date that is less than a year old?
  • Do you fully identify the personal information that your company collects?
  • Do you fully explain the purposes for which you use the collected personal information?
  • Do you share or sell the information, and if so, do you explain to whom and why?
  • Do you identify the rights individuals have with respect to their personal data, including:
    • the right to erase or delete all or some of one’s personal data;
    • the right to a copy of one’s personal data, including in machine-readable form;
    • the right to change, update, or fix one’s personal data if it is inaccurate;
    • the right to stop using all or some of one’s personal data (where you have no legal right to keep using it) or to limit use of one’s personal data; and
    • the right to opt out of the sale of information.
  • Do you provide at least two forms of contact, for individuals to submit requests for information, including at least a toll-free number (or email address if your business is online-only)?
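As a rough illustration only (and not legal advice), the checklist above can be sketched as a simple self-audit script. Every field name below is hypothetical shorthand for the items in the checklist, and passing such a check is of course not the same as actual compliance:

```python
from datetime import date, timedelta

# Hypothetical self-audit sketch of the checklist above.
# Field names are illustrative placeholders, not statutory terms.
REQUIRED_DISCLOSURES = {
    "categories_of_pi_collected",   # what personal information is collected
    "purposes_of_use",              # why it is used
    "sharing_and_selling_details",  # whether/with whom it is shared or sold
    "consumer_rights",              # deletion, access, correction, opt-out of sale
    "contact_methods",              # channels for submitting requests
}

def audit_privacy_policy(policy: dict) -> list[str]:
    """Return a list of potential gaps in a privacy-policy metadata dict."""
    gaps = []
    last_updated = policy.get("last_updated")
    if last_updated is None or date.today() - last_updated > timedelta(days=365):
        gaps.append("'last updated' date missing or more than a year old")
    for field in sorted(REQUIRED_DISCLOSURES):
        if not policy.get(field):
            gaps.append(f"missing disclosure: {field}")
    # CCPA expects at least two contact methods (e.g., toll-free number
    # plus a web form, or an email address for online-only businesses).
    if len(policy.get("contact_methods", [])) < 2:
        gaps.append("fewer than two contact methods for consumer requests")
    return gaps
```

An empty return value means the sketch found no obvious gaps; a non-empty list flags items for counsel to review.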

For more information, see Rothwell Figg’s Privacy, Data Protection, and Cybersecurity Page, including our CCPA Compliance Guide, or contact us directly at

The question is – do wiretapping statutes apply in cases where there is no traditional third party interceptor?  And more practically speaking, how does an entity using plug-ins and cookies avoid liability under wiretapping statutes while there is so much uncertainty in the law?

We previously blogged about this issue in In re: Facebook, Inc. Internet Tracking Litigation (here).  We reported how: (i) the district court dismissed the plaintiffs’ action, which brought claims under, inter alia, the Electronic Communications Privacy Act (ECPA) and the California Invasion of Privacy Act (CIPA), pursuant to the “party” exception (i.e., there was no third party intermediary); and (ii) the Ninth Circuit reversed on grounds that the “party” exception is inapplicable where the sender is unaware of the transmission.  We also explained how this Ninth Circuit decision created a split among the circuit courts on the applicability of wiretapping statutes where a sender’s own computer transmits messages, as well as a split within the Ninth Circuit.  At the time we blogged, Facebook had filed a motion for rehearing before the Ninth Circuit.

Earlier this week, the Ninth Circuit denied Facebook’s motion for rehearing, thereby solidifying the aforementioned splits.  While, according to public sources, Facebook has thus far declined to comment on its next steps, it may well file a petition for a writ of certiorari in the Supreme Court.

Unless and until the Supreme Court clarifies the scope of the Wiretap Act, those using third party cookies or plug-ins to track users’ Internet activity would be wise to (1) review their disclosures and ensure that they provide detailed information about what third party plug-ins and cookies the site uses, and exactly when and how they work (while being careful to ensure that, at the same time, they do not reveal corporate trade secrets or other confidential information); (2) review their consent procedures to ensure that affirmative consent to the use of the disclosed plug-ins and cookies is sought; and (3) review any contracts and terms of service with the third party.

On the flip side, companies that offer plug-ins may want to contractually require website operators to provide detailed disclosures and seek affirmative consent from users before installing code, or alternatively, they may want to implement their own consent mechanisms for their plug-ins, such as a “2-click solution.”  A 2-click solution is where a user clicks on an image (such as a “Like” button), and then informed consent is obtained directly from the user before installing the plug-in (e.g., “By clicking ‘Like’ you install a plug-in from Company X, which will direct your browser to send a copy of the URL of the visited page to Company X”).
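The “2-click solution” described above is essentially a small state machine: the first click reveals the disclosure, and only the second (informed) click loads the third-party code. A minimal sketch, with the states and messages invented for illustration (a real implementation would live in front-end code and must defer any request to the plug-in provider until after consent):

```python
class TwoClickWidget:
    """Hypothetical sketch of a '2-click' consent flow for a social plug-in.

    State machine: 'placeholder' (inert image, no third-party request)
    -> 'notice_shown' (disclosure displayed after the first click)
    -> 'plugin_loaded' (third-party code fetched only after consent).
    """

    CONSENT_NOTICE = (
        "By clicking 'Like' you install a plug-in from Company X, which will "
        "direct your browser to send a copy of the URL of the visited page "
        "to Company X."
    )

    def __init__(self):
        self.state = "placeholder"  # no contact with Company X yet

    def click(self) -> str:
        if self.state == "placeholder":
            self.state = "notice_shown"   # first click: show the disclosure
            return self.CONSENT_NOTICE
        if self.state == "notice_shown":
            self.state = "plugin_loaded"  # second click: consent given;
            return "plug-in loaded"       # only now fetch Company X's script
        return "plug-in already active"
```

The key design point is that the browser makes no request to the plug-in provider in the first two states, so no data flows to the third party before informed consent.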

If you have any privacy questions related to your company’s use of plug-ins or cookies, please contact us at

Articles summarizing the CCPA often state that it applies to for-profit businesses that do business in California and satisfy certain criteria, but they frequently fail to mention that the CCPA also applies to some non-profits.

The CCPA defines “business” as “a sole proprietorship, partnership, limited liability company, corporation, association, or other legal entity that is organized or operated for the profit or financial benefit of its shareholders or other owners, that collects consumers’ personal information, or on the behalf of which such information is collected and that alone, or jointly with others, determines the purposes and means of the processing of consumers’ personal information, that does business in the State of California, and that satisfies one or more of the following thresholds…[annual gross revenue in excess of $25M; buys/receives/sells/shares personal information of 50K or more consumers; or derives 50 percent or more of its annual revenues from selling consumers’ personal information].”  See CA 1798.140(c)(1).

However, the statute does not stop there.  It goes on to explain instances when a non-profit could be subject to CCPA.  Specifically, if the non-profit entity controls or is controlled by an entity that qualifies as a “business” under CCPA (i.e., meets the above criteria), and shares common branding with that business, then the non-profit is subject to CCPA.

“Control” or “controlled” is defined broadly by CCPA as “ownership of, or the power to vote, more than 50 percent of the outstanding shares of any class of voting security of a business; control in any manner over the election of a majority of the directors, or of individuals exercising similar functions; or the power to exercise controlling influence over the management of a company” (emphasis added).  What “the power to exercise controlling influence over the management of a company” means is yet to be determined.

“Common branding” is defined by CCPA as “a shared name, servicemark, or trademark.”
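Putting the statutory definitions above together, the applicability test can be sketched roughly as follows. This is a simplification for illustration only (not legal advice); the threshold figures are the ones quoted above, and the function names are our own:

```python
def meets_business_thresholds(annual_gross_revenue: float,
                              consumers_pi_handled: int,
                              pct_revenue_from_selling_pi: float) -> bool:
    """Rough sketch of the CCPA 'business' thresholds quoted above:
    >$25M annual gross revenue; personal information of 50,000+ consumers;
    or 50%+ of annual revenue from selling personal information."""
    return (annual_gross_revenue > 25_000_000
            or consumers_pi_handled >= 50_000
            or pct_revenue_from_selling_pi >= 50.0)

def nonprofit_subject_to_ccpa(controls_or_controlled_by_business: bool,
                              shares_common_branding: bool) -> bool:
    """A non-profit can fall within the CCPA if it controls (or is
    controlled by) a qualifying 'business' AND shares common branding
    (a shared name, service mark, or trademark) with that business."""
    return controls_or_controlled_by_business and shares_common_branding
```

Note that both conditions must hold for a non-profit: control alone, or common branding alone, is not enough under the definition quoted above.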

If you need help determining whether your non-profit is exempt from CCPA, please contact us at