Is a U.S. federal privacy law on the horizon?

Tomorrow, September 23rd at 10:00 a.m., U.S. Senator Roger Wicker (R-Miss.), chairman of the Committee on Commerce, Science, and Transportation, will convene a hearing titled, “Revisiting the Need for Federal Data Privacy Legislation.”

The hearing will examine the current state of consumer data privacy and legislative efforts to provide baseline data protections for all Americans. It will also examine lessons learned from the implementation of state privacy laws in the U.S. and the E.U. General Data Protection Regulation, as well as how the COVID-19 pandemic has affected data privacy.

Witness testimony will be provided by several former FTC Commissioners as well as the California Attorney General.

Watch the live hearing here or stay tuned for a discussion of the hearing in a follow-up post.

Partners Martin Zoltick and Jenny Colgate, with associate Caitlin Wilmot, will present a webinar in conjunction with Lexology titled “Employee privacy and security considerations in the age of COVID-19” on Wednesday, September 23, 2020, from 11 am to 12 pm ET.

The world changed dramatically in early 2020 as COVID-19 forced companies worldwide to change their practices and policies seemingly overnight. Some employees began working from home as companies scrambled to accommodate them; others began working on staggered schedules or with a reduced workforce as companies implemented safety measures to minimize risk for both employees and customers. In some cases, companies began furloughing or laying off employees.

In this webinar, attorneys from Rothwell Figg will discuss the important privacy, data protection and cybersecurity considerations that employers need to evaluate in any of these uncharted situations. The webinar will cover:

  • employees working from home (e.g., technologies that may track employees, the use of videoconferencing tools and the pitfalls of home networks);
  • corporate policies and practices that ensure network security (e.g., multifactor authentication, the use of secure platforms for conducting all employment-related work, acceptable use and remote access, and network monitoring for suspicious activity);
  • entering into new contracts with software or other vendors, particularly in rushed situations;
  • legal and regulatory compliance issues when employees are working in alternative set-ups and when employers are sidetracked (e.g., California Consumer Privacy Act (CCPA) enforcement began on July 1, 2020);
  • employees having less direct oversight;
  • companies implementing back-to-work measures (e.g., temperature checks and symptom, travel and exposure questionnaires);
  • companies considering surveillance measures (e.g., physical surveillance, electronic surveillance and geolocation tracking);
  • data protection and trade secret issues associated with furloughed and laid-off workers; and
  • legislative efforts to create COVID-19-related privacy bills.

Registration is free and open to all. Please click here to register.

Partners Martin Zoltick and Jenny Colgate, along with associate Caitlin Wilmot, will present a webinar titled “Connected Healthcare – Navigating the Patchwork of US Privacy Laws and Developing a Platform that Promotes Trust” for the American Bar Association (ABA) on Monday, September 21, 2020, at 1 pm ET.

As the field of connected healthcare grows exponentially, so too do the fields of privacy and data protection law. The problem is that each is growing independently of the other. While connectivity can directly benefit both patients and healthcare providers, it also comes with risks. Legal non-compliance risks. Security risks. Trust risks. It is important that those in the field of connected healthcare stay informed of the ever-developing body of U.S. federal and state privacy and data protection law, as compliance with this huge patchwork of privacy laws is essential to avoiding fines, bad headlines, and becoming the subject of the next FTC or AG investigation or the next class action lawsuit.

We will discuss some of the areas of privacy and data protection law that those in the field of connected healthcare should be paying attention to, such as HIPAA, the CCPA, BIPA and other biometric laws, IoT security laws, COPPA, ECPA, and web scraping laws, including but not limited to the CFAA. We will also share advice on best practices, as compliance alone likely will not be enough for connected medical devices to be successful in the future. As consumers become more educated on privacy and data protection, they are looking for platforms that are built on the concepts of transparency and control. Transparency promotes trust.

The webinar is sponsored by the ABA Section of Science & Technology Law, and is open to ABA members and non-members. To learn more about the webinar, or to register, please click here.

Cybersecurity does not just pose technical challenges; companies must always keep an eye on the human component of cyber risk. For example, even the most damaging and sophisticated hacks – such as the recent Twitter hacks – can result from spear-phishing. Imagine that: multi-billion-dollar communications platforms brought to their knees by charming fraudsters on the phone. But the pseudo-insider risk does not end with phishing schemes. Hackers and criminals of all stripes are seeking weaknesses that will enable them to gain leverage over companies.

On August 26, 2020, the United States Department of Justice charged a Russian national, Egor Igorevich Kriuchkov, with offering a Tesla employee $1 million in return for infecting the company’s network with malware. Kriuchkov met with the employee on multiple occasions as part of the recruitment effort. The malware was designed to exfiltrate data from Tesla, and the criminal group behind the attack allegedly planned to demand $4 million in return for the information.

A ransomware operation, like the one detailed in the criminal complaint, encrypts all of a company’s data and demands a hefty payment in return for the decryption key. For many companies, it is less expensive to pay the criminals’ fee than to endure lengthy service outages. Ransomware typically spreads through phishing or other remote attack vectors; this case, however, describes using a corrupted insider employee as the agent of infection. The shift in tactics shows just how determined criminal hackers can be.

Based upon the allegations contained in the complaint, this was a long-term, concerted effort by the criminals. The recruiter traveled from Russia to Nevada multiple times and apparently spent many thousands of dollars wooing the employee. Remarkable as that is, it would be foolish for companies to think the approach is novel. The fact of the matter is that international criminal espionage is a real and persistent threat.

Determined adversaries – just as in the traditional espionage world – will search for and develop human assets in their hunt for data. Numerous legal consequences can flow from these types of attacks. If the crime is successful and the ransom is paid, companies can face years of litigation to make themselves whole again. That litigation could be with vendors whose services were interrupted, or with the company’s own insurers.

In January 2020, long-running litigation over the cyber coverage afforded by a business owner’s policy in a 2016 ransomware attack was resolved at summary judgment by a Maryland federal judge’s order. In that case, insurance coverage was ultimately ordered, but only after years of litigation. If an insider were the source of the ransomware, the path to coverage would be even longer and more legally treacherous.

The question then becomes, what can be done? First, companies must recognize just how enticing and valuable their digital assets have become. As with any other valuable asset, companies must adopt a 360-degree approach to security, and that approach should be regularly re-examined and scrutinized. Now that the human element has become an obvious and well-funded vector for criminal mischief, companies must redouble their internal training and education. Employees must be taught that their access will be targeted by criminal elements, and how to respond. You do not want an employee to be surprised by novel and unexpected attention. It is best to let everyone know that they are not participants in a spy movie, but could participate in a prison movie if they choose poorly.

Companies also have to begin planning for cyber litigation now, not later. Preparing with litigation in mind will reinforce proper workflows and decision-making, even under pressure. Early litigation preparation will also strengthen later arguments that the cyber response process should be considered privileged, which is a burgeoning litigation fight. Those privilege issues should be the subject of a separate discussion. However, to paraphrase digital godfather Benjamin Franklin, smart companies know that one byte of preparation equals one terabyte of cure.

Have you seen the latest Twitter headline? It may be time to double-check your corporate practices and check in with your employees.

The top new FTC privacy probe concerns Twitter, which the FTC has charged with breaching a 2011 consent decree by using phone numbers and email addresses that users provided for security purposes (e.g., two-factor authentication) to target those users with advertisements. According to Twitter, the FTC has alleged that this conduct breached the 2011 consent decree (which resulted from a hacking incident) and which purportedly “bars Twitter for 20 years from misleading users about the extent that it protects their privacy.” Twitter’s misuse of users’ phone numbers and email addresses for direct advertising was self-revealed by the company in an October blog post, which noted that the company did not know how many people were impacted. Twitter called the misuse “inadvertent.” Twitter said on Monday it expected to pay between $150 million and $250 million to resolve the FTC investigation.

This story should have all corporations taking a look at their own corporate practices and making sure that similar actions are not happening behind their own closed doors. All companies are “barred” from misleading users about the extent to which they protect user privacy by virtue of, inter alia, Section 5 of the FTC Act and state UDAP statutes. (In Twitter’s case, because of its 2011 security incident, it also was barred via a consent decree.) Also, with many employees working remotely these days, it may be harder for companies to oversee how different parts of the company are interacting. Perhaps in Twitter’s case this alone led to the issue? Who knows.

In any event, let Twitter’s big headline be a reminder to all companies to: (1) review your privacy policy and any other representations that you make to customers regarding the privacy and security of their data; (2) review your corporate procedures (not just policies, but check in with the boots on the ground) to ensure they are consistent with your privacy policy and other representations that you make to customers; and (3) make sure corporate training events regarding privacy and security are in place, so as to create a corporate culture of data protection and privacy by design.

Mistakes happen, but diligence can prevent them…and can help serve as a defense for when they do happen.

Last week, on July 16, 2020, Europe’s top court invalidated the EU-US data flow arrangement called Privacy Shield. In a world with competing privacy regulations, many thousands of global businesses relied heavily on Privacy Shield to conduct their business across EU-US borders (there are 5,300+ current Privacy Shield participants, and the transatlantic economic relationship is valued at $7.1 trillion), so the decision sent shockwaves through the business and data privacy community. Further, the decision has implications that extend beyond the Privacy Shield framework, and beyond EU-US data transfers.

The decision arose from a case brought by Mr. Maximillian Schrems, an Austrian lawyer, who requested in 2015 that the Irish Data Protection Commissioner (the “Irish DPA”) order the suspension or prohibition, in the future, of the transfer by Facebook Ireland of his personal data to Facebook, Inc. (the latter being located in the United States), a case commonly referred to in the privacy community as Schrems II. (Notably, Schrems I was an earlier case brought by the same lawyer challenging Facebook’s transfer of personal data to the U.S. under a prior EU-US data transfer framework that had been deemed adequate, the US-EU Safe Harbor framework, which was struck down as a result of that case. Following that decision, Facebook turned to Standard Contractual Clauses (SCCs) as a basis for cross-border data transfers, causing Schrems to file the Schrems II case. Thereafter, the Privacy Shield framework was established and deemed adequate, providing a second basis for Facebook’s cross-border data transfers.)

A copy of the Schrems II decision can be found here. A copy of the European Court’s 3-page press release, summarizing the 31-page decision, can be found here.

In Schrems II, the grounds for review of Facebook’s cross-border data transfers were the United States’ digital surveillance policies and practices, including the Foreign Intelligence Surveillance Act (FISA) and Executive Order 12333 (which sanctions bulk data collection). Schrems argued that these U.S. surveillance practices are inconsistent with the European fundamental rights to privacy and data protection, as set out in the EU Charter of Fundamental Rights, the European Convention on Human Rights, and several pieces of EU legislation, including the General Data Protection Regulation (specifically, Mr. Schrems called out Articles 7, 8 and 47 of the Charter). In other words, transferred EU personal data may be at risk of being accessed and processed by the U.S. government (e.g., the CIA, FBI and/or NSA) in a manner incompatible with privacy rights guaranteed in the EU, and EU data subjects may have no right to an effective remedy if this happens. [Notably, while the original request was focused solely on the SCCs, the Court found the validity of the Privacy Shield Decision relevant to assessing the sufficiency of the SCCs, and also any obligations to which the supervisory authority may be subject to suspend or prohibit such a transfer. See, e.g., Decision at 25.]

In reaching its decision to invalidate the Privacy Shield, the European Court pointed out issues with the U.S. surveillance framework, as it applies to EU-US data transfers, such as (1) that “E.O. 12333 allows the NSA to access data ‘in transit’ to the United States, by accessing underwater cables on the floor of the Atlantic, and to collect and retain such data before arriving in the United States and being subject there to FISA”; (2) “activities conducted pursuant to E.O. 12333 are not governed by statute”; and (3) “non-US persons are covered only by PPD-28, which merely states that intelligence activities should be ‘as tailored as feasible’.” See Decision at 14-15. The Court also pointed out, and focused quite heavily on, the lack of remedies for EU citizens whose rights have been violated as a result of US surveillance practices. For example, it noted that the Fourth Amendment to the U.S. Constitution does not apply to EU citizens; that the NSA’s activities based on E.O. 12333 are not subject to judicial oversight and are non-justiciable; and that the Privacy Shield Ombudsperson is not a tribunal within the meaning of Article 47 of the Charter, and thus, U.S. law does not afford EU citizens the protection required. See Decision at 15, 29 (“the existence of such a lacuna in judicial protection in respect to interferences with intelligence programmes based on that presidential decree [E.O. 12333] makes it impossible to conclude, as the Commission did in the Privacy Shield Decision, that United States law ensures a level of protection essentially equivalent to that guaranteed by Article 47 of the Charter.”).

With respect to SCCs, the European Court held that “the competent supervisory authority is required to suspend or prohibit a transfer of data to a third country pursuant to standard data protection clauses adopted by the Commission, if, in the view of that supervisory authority and in the light of all the circumstances of that transfer, those clauses are not or cannot be complied with in that third country and the protection of the data transferred that is required by EU law, in particular by Articles 45 and 46 of the GDPR and by the Charter, cannot be ensured by other means, where the controller or a processor has not itself suspended or put an end to the transfer.” See Decision at 22.

On the same day the European Court issued its decision, U.S. Secretary of Commerce Wilbur Ross issued the following statement regarding the ruling: “While the Department of Commerce is deeply disappointed that the court appears to have invalidated the European Commission’s adequacy decision underlying the EU-U.S. Privacy Shield, we are still studying the decision to fully understand its practical impacts. We have been and will remain in close contact with the European Commission and European Data Protection Board on this matter and hope to be able to limit the negative consequences to the $7.1 trillion transatlantic economic relationship that is so vital to our respective citizens, companies, and governments. Data flows are essential not just to tech companies—but to businesses of all sizes in every sector. As our economies continue their post-COVID-19 recovery, it is critical that companies—including the 5,300+ current Privacy Shield participants—be able to transfer data without interruption, consistent with the strong protections offered by Privacy Shield.” The full press release can be found here. [https://www.commerce.gov/news/press-releases/2020/07/us-secretary-commerce-wilbur-ross-statement-schrems-ii-ruling-and]

The United Kingdom’s Information Commissioner’s Office made a similar statement, implicitly acknowledging that the impact of the Schrems II decision extends to all EU cross border data transfers, not just EU-US transfers under the Privacy Shield.  A copy of the ICO’s statement can be found here.

Since the decision, the big headline has seemingly been two-fold: (1) SCCs survive, but (2) Privacy Shield has been invalidated. Respectfully, the first half of this is a half-truth. Companies proceeding with cross-border data transfers using SCCs or binding corporate rules should consult with counsel to assess the risk involved in their transfers and evaluate alternative transfer frameworks. Unless and until the United States changes its surveillance practices, including ceasing surveillance of in-transit data (i.e., data intercepted before it arrives in the U.S.) and providing EU data subjects with a right of redress regardless of the surveillance program their data is subject to, the Schrems II decision puts nearly all EU-US data transfers at risk in industries subject to government surveillance. For companies that have received requests for information from U.S. law enforcement in the past, and that want to avoid risk, the safest way to proceed may be to (1) consider whether the data at issue is even needed in the first place, and (2) consider simply moving data processing for European data subjects to Europe. Other bases for data transfers, such as binding corporate rules and SCCs, could also be considered, but back-up plans should be in place, as proceeding under these frameworks could be risky in view of the Schrems II decision.

Companies should also take a close look at their policies and practices for responding to requests for information from U.S. law enforcement, including:

  • the number of requests the company has received;
  • the number of user accounts those requests involved, and how many of those accounts belonged to EU data subjects;
  • the types of requests received (e.g., subpoenas or search warrants);
  • the records the company produced, and in which cases those records concerned EU data subjects;
  • the bases for the requests (e.g., whether they were made pursuant to surveillance programs that provide data subjects with a right to a remedy in the event their rights are violated, or pursuant to, e.g., E.O. 12333, which provides no such remedy); and
  • whether, to the company’s knowledge, any EU data subjects whose information was shared have contended that their rights were violated.

The more information a company has to show that it has not provided information to U.S. law enforcement pursuant to surveillance programs that do not offer EU data subjects a remedy in the event their rights are violated, the safer footing the company should be on going forward with respect to its EU-US data transfer practices.

For companies that have not received requests for information from U.S. law enforcement pursuant to surveillance programs, the path forward has more options. While Privacy Shield has been struck down for all companies, it is likely that a new or revised framework will be designed and an adequacy decision will be sought for it (just as happened with Privacy Shield after Safe Harbor was struck down). In the interim, it is prudent for these companies to consider alternative data transfer frameworks (such as SCCs) and, in the future, to take a “belt and suspenders” approach so that their business does not “hang its hat” on a single framework for cross-border data transfers (this is the lesson Facebook and numerous other companies learned, which caused them to rely on both Privacy Shield and SCCs). Companies should also take a good look at the data they are processing, particularly with respect to EU data subjects, and ask whether it is even necessary, and whether processing in the U.S. is necessary. In some cases the answer may be “yes,” but the more a company can practice data minimization – particularly when cross-border data transfers are at issue – the safer it may be. Finally, just because your company has never received a request from U.S. law enforcement pursuant to a surveillance program does not mean it never will—particularly in certain industries, such as tech and telecommunications. You should plan for such requests before they happen.

If you need any help evaluating your company’s risks in view of the Schrems II decision, or determining best practices for going forward, please contact us at privacy@rothwellfigg.com.

Rothwell Figg attorneys Martin M. Zoltick and Jenny L. Colgate published a chapter titled “Privacy, Data Protection, and Cybersecurity: A State-Law Analysis” in the International Comparative Legal Guide to: Data Protection 2020, published by Global Legal Group Ltd.

While some countries have enacted comprehensive privacy and data protection laws, like the EU’s General Data Protection Regulation (GDPR), the United States does not have a single, comprehensive federal law regulating the collection, use, and disclosure of personal information. Instead, U.S. privacy and data protection legislation consists of a patchwork of federal and state laws and regulations – hundreds of them! The chapter in the Guide aims to identify and provide some guidance regarding the state privacy and data protection laws, broken into three sections: (1) technology-specific laws; (2) industry-specific laws; and (3) generally applicable laws.

To read the article in its entirety, access the article on the ICLG website here: https://iclg.com/practice-areas/data-protection-laws-and-regulations/2-privacy-data-protection-and-cybersecurity-a-state-law-analysis.

As of July 1, 2020, the California Attorney General began enforcing the California Consumer Privacy Act (CCPA). While many details about CCPA enforcement remain uncertain, many states have enacted or will enact their own privacy laws. Businesses clearly must wrestle with this mosaic of new and emerging privacy restrictions. Some industries have explored legal challenges to these statutes, and one of those challenges, against a new Maine privacy law, was just dealt a major setback.

In February 2020, lobby groups representing the broadband industry sued the state of Maine under the theory that the Maine privacy law violates their First Amendment protections for commercial speech. The Maine privacy law in question prohibits ISPs from using, disclosing, or selling browsing history and other personal information without customers’ opt-in consent. The plaintiffs’ First Amendment argument rests on the theory that a company’s sale of private individuals’ browsing histories is a constitutionally protected form of “commercial speech.” The lawsuit’s second theory flows from the history of these types of browser privacy protections. Similar federal protections were rolled back by the President and Congress in 2017, after which Maine legislators decided to revive those protections for their residents. The lawsuit argues, however, that the federal deregulatory action had the effect of preempting and barring any supplementary state law protections.

In a July 7, 2020 order, Judge Lance Walker agreed in general with the characterization of browser history sales as “commercial speech,” but held that “intermediate scrutiny,” not “strict scrutiny,” governed judicial review of the privacy law. Against that legal backdrop, Judge Walker held that the law is not unconstitutional on its face and declared the preemption arguments “dubious.” While Judge Walker allowed the case to continue, the decision dealt a major blow to the theories underpinning it.

The legal fight against privacy regulations is far from over. The Maine law addresses only a discrete privacy issue; the CCPA sprawls by comparison and will face many legal challenges, especially now that enforcement activity has begun. The CCPA and other similar privacy laws will likely face similar First Amendment and preemption claims, in addition to a few other interesting wrinkles. The fight will continue.

Today, the U.S. Supreme Court found in Barr v. American Association of Political Consultants, Inc. that the government-debt collection exemption to the Telephone Consumer Protection Act’s (TCPA) general prohibition on autodialed calls violates the First Amendment. The Supreme Court held that the exemption was a content-based restriction on speech because it favors speech made for the purpose of collecting government debt over political and other speech. Such content-based restrictions are subject to the “strict scrutiny” standard, which the government conceded it could not satisfy.

Rather than strike down the TCPA in its entirety, as some advocates have proposed, the Supreme Court held that the appropriate remedy was to sever the provision from the statute.  Justice Kavanaugh wrote the majority opinion, noting that “Americans passionately disagree about many things.  But they are largely united in their disdain for robocalls.”  The decision further notes that the Federal Government received a staggering 3.7 million complaints about robocalls in 2019 alone.

Due to the increasing prevalence and sophistication of robocallers, industry has proposed a number of creative technical countermeasures. For example, there are bots that answer calls and use AI-generated speech designed to waste telemarketers’ time by keeping them on the line, voice biometric technology that automatically identifies synthesized speech, and services that will automatically gather a robocaller’s details and generate a demand letter and court documents for filing.

With today’s Supreme Court decision, TCPA lawsuits will likely remain one of the most frequently filed types of class actions in courts across the country, especially since private parties can sue to recover up to $1,500 per violation or three times their actual monetary losses.