On Wednesday, September 23, 2020, Rothwell Figg attorneys Martin Zoltick, Jenny Colgate, and Caitlin Wilmot presented a Lexology webinar titled “Employee privacy and security considerations in the age of COVID-19”.

To view the recording of the webinar, please click here. To request the slides from the webinar, please send an email to RFPrivacy@rfem.com.

The world changed dramatically in early 2020 as COVID-19 forced companies worldwide to change their practices and policies seemingly overnight. Some employees began working from home as companies scrambled to accommodate them; others began working on staggered schedules or with a reduced workforce as companies implemented safety measures to minimize risk for both employees and customers. In some cases, companies began furloughing or laying off employees.

In this webinar, attorneys from Rothwell Figg discuss the important privacy, data protection, and cybersecurity considerations that employers need to evaluate in any of these uncharted situations. The webinar covers:

  • employees working from home (eg, technologies that may track employees, the use of videoconferencing tools and the pitfalls of home networks);
  • corporate policies and practices that ensure network security (eg, multifactor authentication, the use of secure platforms for conducting all employment-related work, acceptable use and remote access and network monitoring for suspicious activity);
  • entering into new contracts with software or other vendors, particularly when in a rushed situation;
  • legal and regulatory compliance issues when employees are working in alternative set-ups and when employers are sidetracked (eg, California Consumer Privacy Act (CCPA) enforcement began on 1 July 2020);
  • employees having less direct oversight;
  • companies implementing back-to-work measures (eg, temperature checks and symptom, travel and exposure questionnaires);
  • companies considering surveillance measures (eg, physical surveillance, electronic surveillance and geolocation tracking);
  • data protection and trade secret issues associated with furloughed and laid-off workers; and
  • legislative efforts to create COVID-19-related privacy bills.

On Wednesday, the Senate Committee on Commerce, Science, and Transportation conducted a hearing to revisit the potential for a national data privacy standard. While the Committee had met last December to discuss what Congress should consider when drafting a federal privacy bill, the game has since changed. Given that COVID-19 has drastically altered life as we knew it, with working from home, remote learning, and the whole country trying to curtail the spread of and recover from the pandemic, what was considered “merely urgent” 10 months ago is now “absolutely critical.” On the table was the Committee’s Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act, introduced on September 17, 2020, which served as a backdrop for the discussion.

From the start of the hearing, it was evident that the witnesses, four former Commissioners of the FTC and the California Attorney General, as well as the other hearing participants, unanimously agreed that now is the time for comprehensive U.S. privacy legislation, for several reasons:

COVID-19. A majority of workers are still working remotely, and children are starting school this fall in online classrooms. As we rely – now more than ever – on social media, videoconferencing, chat rooms, and our smartphones to stay connected, and as we spend most days inside, surrounded by our Alexas, Nest thermostats, and other IoT/connected devices, we’re starting to realize just how much data is being collected from us and how little protection and power we have. Individuals would also be more likely to provide health and location data and use “contact tracing” apps to help track the coronavirus if that data were protected.

Current Data Privacy Laws Have Gaps. COPPA only protects the data of children under the age of 13. But as the Committee members pointed out, teenagers and young adults need protection too, especially in light of the shift to remote learning and the use of social media platforms such as TikTok. Likewise, HIPAA only applies to certain covered healthcare entities, leaving the data provided by consumers on health and fitness apps/devices unprotected. As technologies continue to advance and we enter the world of 5G, the law lags behind. And opt-in consent for everything (like the cookie banners and privacy policies we encounter on nearly every website) won’t cut it.

The “Patchwork” of State, Federal, and Industry-Specific Laws Isn’t Working. Consumers travel across the U.S. and, as of now, have different privacy rights in every state, leading to confusion for consumers, businesses, and law enforcement alike. Furthermore, the internet and data know no (state) bounds. Internet service providers cannot be expected to create different systems for each geographic area.  

The U.S. Risks Losing Its Competitive Edge. Without a federal privacy law, the U.S. will take a backseat on the global stage, and the GDPR will become the global privacy standard, with no input from thought leaders in the United States. And with the EU-US Privacy Shield now invalidated, we need to address alternative means for international data transfer for U.S. businesses that operate overseas, and reduce skepticism and concern from Europeans (and the rest of the world) about our own privacy regime.

As to what the framework of the U.S. privacy law should look like, the Committee members generally agreed that it should:

  • Give consumers more control over their personal data, with the rights to access, modify, delete, and opt out of the sale of their personal information (provided that consumers are given meaningful choices and are not discriminated against for exercising their privacy rights);
  • Use clear and plain language so that consumers can understand their rights;
  • Be drafted to allow for flexibility with regard to advancing technologies and innovative data collection (such that the law is adaptable to future technologies); and
  • Expand the enforcement and rule-making authority of the FTC, along with increased funding and a larger staff. While some Committee members floated the idea of an independent Data Protection Authority, the general consensus was that we should build on the experience of the FTC. The panel also recognized that the FTC’s other functions, antitrust and consumer protection, have a strong nexus with privacy.

Despite the consensus on the need for federal privacy regulation and the overall objectives of the legislation, there were still points of contention that must be resolved in order for a federal bill to pass:

Should citizens be granted a private right of action for violations of the federal law? 

  • YES: As one Congressman stated throughout the hearing, “a right without a remedy is no right at all.” Those in favor of a private right of action stated that it was critical for individuals to have the power to enforce their own rights. And as California Attorney General Xavier Becerra stated, attempting to enforce the rights of every private citizen is a massive undertaking, and state AGs just don’t have the capacity to do this.
  • NO: Those against a private right of action cited concerns such as an increase in frivolous lawsuits (especially if the consumer is not required to show harm resulting from a privacy violation), class action lawsuits that benefit only the lawyers and give little to actual victims, and the stifling of small businesses, which could not afford expensive defense litigation. The naysayers further noted that consumers would already be protected by the expanded enforcement authority of the FTC and by administrative remedies within the company.

Should the federal law preempt state privacy laws?

  • YES: “Preempting state laws should not mean weakening protections for consumers.” Those in favor of preemption argued that the federal law should, and will, be strong and robust enough to protect consumers without significant gap-filling by the states (pointing to similarities with HIPAA and COPPA). A federal law that does not preempt state laws creates the risk of some states going above and beyond the federal law, requiring all companies that operate in those states to comply. The result, again, would be a patchwork of strong state laws against the backdrop of a weak federal law, with different expectations, rights, and compliance efforts across regions.
  • NO: Those against preemption expressed concerns that the recent efforts by states to protect their consumers’ privacy rights (e.g., the CCPA in California, Illinois’s BIPA, and the laws of Maine, Nevada, and others) would be erased. As California Attorney General Xavier Becerra argued, the federal law should serve as a “floor” rather than a “ceiling,” creating a privacy baseline on which states can build more stringent data protection.

Should the federal law apply equally to all businesses that collect consumer data?

  • YES: Several of the speakers agreed that the federal legislation should be “technology-neutral,” and apply equally to any company collecting personal data.
  • NO: Those against a uniform application of the law argued that compliance is easier for larger or international companies, especially those which have already implemented steps and procedures to comply with the GDPR and CCPA. Small businesses and startups, on the other hand, may struggle with such implementation and may not be able to survive a potential violation and subsequent lawsuit. There should be distinctions for compliance based on company size, how much personal data the company collects and uses, and whether that data is particularly sensitive or risky.

Other notable arguments made at the hearing that may impact the U.S.’s federal privacy response:

  • As Senator Blumenthal (D-CT) noted, the late Supreme Court Justice Ruth Bader Ginsburg was a leader in the protection of privacy rights (citing her dissent in Spokeo v. Robins). Several Committee members agreed with him that the new Supreme Court nominee should also be an advocate for increased consumer privacy.
  • As a country, we need to address systemic inaccuracy and racial bias issues before making laws that allow for the use of biometric technologies in law enforcement (citing a NIST study finding that facial recognition tools misidentified Black, Brown, and Asian individuals at rates up to 100 times higher than white men).
  • We also need to address how to protect consumers from being manipulated by algorithms used by online platforms, and data filtering, which some argued is contributing to a growing polarization in the country.

The hearing left much to be discussed, but as the Committee Chairman, U.S. Senator Roger Wicker (R-Miss.), stated – we’re moving in the right direction. Members of the panel also noted that the SAFE DATA Act and the other proposed privacy bills share a lot of common ground, offering hope that a federal privacy law will be here sooner rather than later.

Is a U.S. federal privacy law on the horizon?

Tomorrow, September 23rd at 10:00 a.m., U.S. Senator Roger Wicker (R-Miss.), chairman of the Committee on Commerce, Science, and Transportation, will convene a hearing titled, “Revisiting the Need for Federal Data Privacy Legislation.”

The hearing will examine the current state of consumer data privacy and legislative efforts to provide baseline data protections for all Americans. It will also examine lessons learned from the implementation of state privacy laws in the U.S. and the E.U. General Data Protection Regulation, as well as how the COVID-19 pandemic has affected data privacy.

Witness testimony will be provided by several former Commissioners of the FTC as well as the California Attorney General.

Watch the live hearing here or stay tuned for a discussion of the hearing in a follow-up post.

Partners Martin Zoltick and Jenny Colgate with associate Caitlin Wilmot will present a webinar in conjunction with Lexology titled “Employee privacy and security considerations in the age of COVID-19” on Wednesday, September 23, 2020, from 11 am – 12 pm ET.

The world changed dramatically in early 2020 as COVID-19 forced companies worldwide to change their practices and policies seemingly overnight. Some employees began working from home as companies scrambled to accommodate them; others began working on staggered schedules or with a reduced workforce as companies implemented safety measures to minimise risk for both employees and customers. In some cases, companies began furloughing or laying off employees.

In this webinar, attorneys from Rothwell Figg will discuss the important privacy, data protection and cybersecurity considerations that employers need to evaluate in any of these uncharted situations. The webinar will cover:

  • employees working from home (eg, technologies that may track employees, the use of videoconferencing tools and the pitfalls of home networks);
  • corporate policies and practices that ensure network security (eg, multifactor authentication, the use of secure platforms for conducting all employment-related work, acceptable use and remote access and network monitoring for suspicious activity);
  • entering into new contracts with software or other vendors, particularly when in a rushed situation;
  • legal and regulatory compliance issues when employees are working in alternative set-ups and when employers are sidetracked (eg, California Consumer Privacy Act (CCPA) enforcement began on 1 July 2020);
  • employees having less direct oversight;
  • companies implementing back-to-work measures (eg, temperature checks and symptom, travel and exposure questionnaires);
  • companies considering surveillance measures (eg, physical surveillance, electronic surveillance and geolocation tracking);
  • data protection and trade secret issues associated with furloughed and laid-off workers; and
  • legislative efforts to create COVID-19-related privacy bills.

Registration is free and open to all. Please click here to register.

Partners Martin Zoltick and Jenny Colgate, along with associate Caitlin Wilmot, will present a webinar titled “Connected Healthcare – Navigating the Patchwork of US Privacy Laws and Developing a Platform that Promotes Trust” for the American Bar Association (ABA) on Monday, September 21, 2020, at 1 pm ET.

As the field of connected healthcare grows exponentially, so too do the fields of privacy and data protection law. The problem is that each grows independently. While connectivity can directly benefit both patients and healthcare providers, it also comes with risks. Legal non-compliance risks. Security risks. Trust risks. It is important that those in the field of connected healthcare stay informed of the ever-developing body of U.S. federal and state privacy and data protection law, as compliance with this huge patchwork of privacy laws is essential for avoiding fines, bad headlines, and becoming the subject of the next FTC or AG investigation or the next class action lawsuit.

We will discuss some of the areas of privacy and data protection law that those in the field of connected healthcare should be paying attention to, such as HIPAA, the CCPA, BIPA and other biometric laws, IoT security laws, COPPA, the ECPA, and web scraping laws, including but not limited to the CFAA. We will also share some advice on best practices, as compliance alone may not be enough for connected medical devices to succeed in the future. As consumers become more educated about privacy and data protection, they are looking for platforms built on the concepts of transparency and control. Transparency promotes trust.

The webinar is sponsored by the ABA Section of Science & Technology Law, and is open to ABA members and non-members. To learn more about the webinar, or to register, please click here.

Cybersecurity does not just pose technical challenges; companies must always keep an eye on the human component of cyber risk. For example, even the most damaging and sophisticated hacks – such as the recent Twitter hacks – can result from spear-phishing. Imagine that: multi-billion-dollar communication platforms brought to their knees by charming fraudsters on the phone. But the pseudo-insider risk does not end with phishing schemes. Hackers and criminals of all stripes are seeking weaknesses that will enable them to gain leverage over companies.

On August 26, 2020, the United States Department of Justice charged a Russian national with offering a Tesla employee $1 million to infect the company’s network with malware. Egor Igorevich Kriuchkov met with the employee on multiple occasions as part of the recruitment effort. The malware was designed to exfiltrate data from Tesla, and the criminal group behind the attack allegedly planned to demand $4 million in return for the information.

A ransomware operation, like the one detailed in the criminal complaint, encrypts all of a company’s data and demands a hefty payment in return for the decryption key. For many companies, it is less expensive to pay the criminal’s fee than to endure lengthy service outages. Ransomware is typically delivered through phishing emails or other remote intrusions; this case, however, describes using a corrupted insider employee as the agent of infection. The altered tactic shows just how determined criminal hackers can be.

Based upon the allegations contained in the complaint, this constituted a long-term, concerted effort by the criminals. The criminal recruiter traveled from Russia to Nevada multiple times and apparently spent many thousands of dollars wooing the individual.  While remarkable, it would be foolish for companies to think this approach was novel. The fact of the matter is that international criminal espionage is a real and persistent threat.

Determined adversaries – just as in the traditional espionage world – will search for and develop human assets in their hunt for data. Numerous legal consequences can flow from these types of attacks. If the crime is successful and the ransom is paid, companies can face years of litigation to make themselves whole again. That litigation could be with their vendors, who had their services interrupted, or with the company’s own insurers.

In January 2020, long-running litigation over the cyber coverage afforded by a business owner’s policy in a 2016 ransomware attack was resolved at summary judgment by a Maryland federal judge’s order. In that case, insurance coverage was finally ordered, but only after years of litigation. If an insider were the source of the ransomware, the path to coverage would be even longer and more legally treacherous.

The question then becomes: what can be done? First, companies must recognize just how enticing and valuable their digital assets have become. As with any other valuable asset, companies must adopt a 360-degree approach to security, and that approach should be regularly re-examined and scrutinized. Now that the human element has become an obvious and well-funded vector for criminal mischief, companies must redouble their internal training and education. Employees must be taught that their access will be targeted by criminal elements and how to respond. You do not want your employees to be surprised by novel and unexpected attention. It is best to let everyone know that they are not participants in a spy movie but could become participants in a prison movie, if they choose poorly.

Companies also have to begin planning for cyber litigation now, not later. Preparation from a litigation standpoint will reinforce proper workflows and decision-making, even under pressure. Early litigation preparation will also strengthen later arguments that the cyber response process should be considered privileged, a burgeoning litigation fight that deserves a separate discussion. To paraphrase digital godfather Benjamin Franklin, smart companies know that one byte of preparation equals one terabyte of cure.

Have you seen the new headline about Twitter in the news? It may be time to double-check your corporate practices and check in with your employees.

The top new FTC privacy probe concerns Twitter, which the FTC has charged with breaching a 2011 consent decree by taking phone numbers and email addresses that users provided for security purposes (e.g., two-factor authentication) and using that information to target users with advertisements. According to Twitter, the FTC has alleged that this conduct breached the 2011 consent decree (resulting from a hacker incident), which purportedly “bars Twitter for 20 years from misleading users about the extent that it protects their privacy.” Twitter’s misuse of users’ phone numbers and email addresses for direct advertising was self-revealed by the company in an October blog post, which noted that it did not know how many people were impacted. Twitter called the misuse “inadvertent.” Twitter said on Monday it expected to pay between $150 million and $250 million to resolve the FTC investigation.

This story should have all corporations taking a look at their own practices and making sure that similar conduct is not happening behind their own closed doors. All companies are “barred” from misleading users about the extent to which they protect user privacy by virtue of, inter alia, Section 5 of the FTC Act and state UDAP statutes. (In Twitter’s case, because of its 2011 security incident, it also was barred via a consent decree.) Also, with many employees working remotely these days, it may be harder for companies to oversee how different sections of the company are interacting. Perhaps in Twitter’s case this alone led to the issue? Who knows.

In any event, let Twitter’s big headline be a reminder to all companies to: (1) review your privacy policy and any other representations that you make to customers regarding the privacy and security of their data; (2) review your corporate procedures (not just policies, but check in with the boots on the ground) to ensure they are consistent with your privacy policy and other representations that you make to customers; and (3) make sure corporate training events regarding privacy and security are in place, so as to create a corporate culture of data protection and privacy by design.

Mistakes happen, but diligence can prevent them…and can help serve as a defense for when they do happen.
 

Last week, on July 16, 2020, Europe’s top court invalidated the EU-US data flow arrangement called Privacy Shield. In a world with competing privacy regulations, many thousands of global businesses relied heavily on Privacy Shield to conduct their business across EU-US borders (there are 5,300+ current Privacy Shield participants, and the transatlantic economic relationship is valued at $7.1 trillion), so the decision sent shockwaves through the business and data privacy community. Further, the decision has implications that extend beyond the Privacy Shield framework, and beyond EU-US data transfers.

The decision arose from a case brought by Mr. Maximillian Schrems, an Austrian lawyer, who requested in 2015 that the Irish Data Protection Commissioner (the “Irish DPA”) order the suspension or prohibition, in the future, of the transfer by Facebook Ireland of his personal data to Facebook, Inc. (the latter being located in the United States), a case commonly referred to in the privacy community as Schrems II. (Notably, Schrems I was an earlier case brought by the same lawyer challenging Facebook’s transfer of personal data to the U.S. under a prior EU-US data transfer framework that had been determined adequate, the US-EU Safe Harbor framework, which was struck down as a result of that case. Following that, Facebook turned to Standard Contractual Clauses (SCCs) as a basis for cross-border data transfers, causing Schrems to file the Schrems II case. Thereafter, the Privacy Shield framework was established and determined adequate, providing a second basis for Facebook’s cross-border data transfers.)

A copy of the Schrems II decision can be found here.  A copy of the European Court’s 3-page press release, summarizing the 31 page decision, can be found here.

In Schrems II, the grounds for review of Facebook’s cross-border data transfers were the United States’ digital surveillance policies and practices, including the Foreign Intelligence Surveillance Act (FISA) and Executive Order 12333 (which sanctions bulk data collection). Schrems argued that these U.S. surveillance practices are inconsistent with European fundamental rights giving citizens the rights to privacy and data protection, as set out in the EU Charter of Fundamental Rights, the European Convention on Human Rights, and several pieces of EU legislation, including the General Data Protection Regulation (specifically, Mr. Schrems invoked Articles 7, 8, and 47 of the Charter). In other words, transferred EU personal data may be at risk of being accessed and processed by the U.S. government (e.g., the CIA, FBI, and/or NSA) in a manner incompatible with privacy rights guaranteed in the EU, and EU data subjects may have no right to an effective remedy if this happens. [Notably, while the original request was focused solely on the SCCs, the Court found the validity of the Privacy Shield Decision relevant to assessing the sufficiency of SCCs, as well as any obligations the supervisory authority may have to suspend or prohibit such a transfer. See, e.g., Decision at 25.]

In reaching its decision to invalidate the Privacy Shield, the European Court pointed out issues with the U.S. surveillance framework, as it applies to EU-US data transfers, such as (1) that “E.O. 12333 allows the NSA to access data ‘in transit’ to the United States, by accessing underwater cables on the floor of the Atlantic, and to collect and retain such data before arriving in the United States and being subject there to FISA”; (2) that “activities conducted pursuant to E.O. 12333 are not governed by statute”; and (3) that “non-US persons are covered only by PPD-28, which merely states that intelligence activities should be ‘as tailored as feasible’.” See Decision at 14-15. The Court also pointed out, and focused quite heavily on, the lack of remedies for EU citizens whose rights have been violated as a result of US surveillance practices. For example, it noted that the Fourth Amendment to the U.S. Constitution does not apply to EU citizens; that the NSA’s activities based on E.O. 12333 are not subject to judicial oversight and are non-justiciable; and that the Privacy Shield Ombudsperson is not a tribunal within the meaning of Article 47 of the Charter, and thus U.S. law does not afford EU citizens the protection required. See Decision at 15, 29 (“the existence of such a lacuna in judicial protection in respect of interferences with intelligence programmes based on that presidential decree [E.O. 12333] makes it impossible to conclude, as the Commission did in the Privacy Shield Decision, that United States law ensures a level of protection essentially equivalent to that guaranteed by Article 47 of the Charter.”).

With respect to SCCs, the European Court held that “the competent supervisory authority is required to suspend or prohibit a transfer of data to a third country pursuant to standard data protection clauses adopted by the Commission, if, in the view of that supervisory authority and in the light of all the circumstances of that transfer, those clauses are not or cannot be complied with in that third country and the protection of the data transferred that is required by EU law, in particular by Articles 45 and 46 of the GDPR and by the Charter, cannot be ensured by other means, where the controller or a processor has not itself suspended or put an end to the transfer.” See Decision at 22.

On the same day as the European Court issued its decision, the U.S. Secretary of Commerce Wilbur Ross issued the following statement regarding the ruling: “While the Department of Commerce is deeply disappointed that the court appears to have invalidated the European Commission’s adequacy decision underlying the EU-U.S. Privacy Shield, we are still studying the decision to fully understand its practical impacts,” said Secretary Wilbur Ross. “We have been and will remain in close contact with the European Commission and European Data Protection Board on this matter and hope to be able to limit the negative consequences to the $7.1 trillion transatlantic economic relationship that is so vital to our respective citizens, companies, and governments. Data flows are essential not just to tech companies—but to businesses of all sizes in every sector. As our economies continue their post-COVID-19 recovery, it is critical that companies—including the 5,300+ current Privacy Shield participants—be able to transfer data without interruption, consistent with the strong protections offered by Privacy Shield.”  The full press release can be found here.  [https://www.commerce.gov/news/press-releases/2020/07/us-secretary-commerce-wilbur-ross-statement-schrems-ii-ruling-and]

The United Kingdom’s Information Commissioner’s Office made a similar statement, implicitly acknowledging that the impact of the Schrems II decision extends to all EU cross border data transfers, not just EU-US transfers under the Privacy Shield.  A copy of the ICO’s statement can be found here.

Since the decision, the big headline has seemingly been two-fold: (1) SCCs survive, but (2) Privacy Shield has been invalidated. Respectfully, the first half of this is a half-truth. Companies proceeding with cross-border data transfers using SCCs and binding corporate rules should consult with counsel to assess the risk involved in their transfers and evaluate alternative transfer frameworks. Unless and until the United States changes its surveillance practices, including ceasing surveillance of in-transit data (i.e., data intercepted before it arrives in the U.S.) and providing EU data subjects with a right of redress regardless of the surveillance program their data is subject to, the Schrems II decision puts nearly all EU-US data transfers at risk in industries subject to government surveillance. For companies that have received requests for information from U.S. law enforcement in the past, and that want to avoid risk, the safest way to proceed may be to (1) consider whether the data at issue is even needed in the first place, and (2) consider simply moving data processing for European data subjects to Europe. Other bases for data transfers, such as binding corporate rules and SCCs, could also be considered, but back-up plans should be in place, as proceeding under these frameworks could be risky in view of the Schrems II decision.

Companies should also take a close look at their policies and practices for responding to requests for information from U.S. law enforcement, including: the number of requests the company has received; the number of user accounts the requests involved and how many of those accounts belonged to EU data subjects; the types of requests the company received (e.g., subpoenas or search warrants); the records the company produced, and in which cases those records were for EU data subjects; the bases for the requests (e.g., were they made pursuant to government surveillance programs that provide data subjects with a right to a remedy in the event their rights are violated, or pursuant to, e.g., E.O. 12333, which provides no such remedy); and whether, to the company’s knowledge, EU data subjects whose information was shared ever contended that their rights had been violated. The more information a company has to show that it has not provided information to U.S. law enforcement pursuant to surveillance programs that do not offer EU data subjects a remedy in the event their rights are violated, the safer footing the company should be on going forward with respect to its EU-US data transfer practices.

For companies that have not received requests for information from U.S. law enforcement pursuant to surveillance programs, the path forward has more options. While Privacy Shield has been struck down for all companies, it is likely that a new or revised framework will be designed and an adequacy decision sought for it (just as Privacy Shield followed when Safe Harbor was struck down). In the interim, it is prudent for these companies to consider alternative data transfer frameworks (such as SCCs) and, going forward, to take a “belt and suspenders” approach so that their business does not hang its hat on a single framework for cross-border data transfers (this is the lesson Facebook and numerous other companies learned, causing them to rely on both Privacy Shield and SCCs). Companies should also take a good look at the data they are processing, particularly with respect to EU data subjects, and ask whether it is even necessary, and whether processing it in the U.S. is necessary. In some cases the answer may be “yes,” but the more a company can practice data minimization (particularly when cross-border data transfers are at issue), the safer it may be. Finally, just because your company has never received a request from U.S. law enforcement pursuant to a surveillance program does not mean it never will, particularly in certain industries, such as tech and telecommunications. You should plan for such requests before they happen.

If you need any help evaluating your company’s risks in view of the Schrems II decision, or determining best practices for going forward, please contact us at privacy@rothwellfigg.com.

Rothwell Figg attorneys Martin M. Zoltick and Jenny L. Colgate published a chapter titled “Privacy, Data Protection, and Cybersecurity: A State-Law Analysis” in the International Comparative Legal Guide to: Data Protection 2020, published by Global Legal Group Ltd.

While some countries have enacted comprehensive privacy and data protection laws, like the EU’s General Data Protection Regulation (GDPR), the United States does not have a single, comprehensive federal law regulating the collection, use, and disclosure of personal information. Instead, U.S. privacy and data protection legislation consists of a patchwork of federal and state laws and regulations – hundreds of them! The chapter in the Guide aims to identify and provide guidance regarding these state privacy and data protection laws, broken into three sections: 1) technology-specific laws; 2) industry-specific laws; and 3) generally applicable laws.

To read the article in its entirety, access the article on the ICLG website here: https://iclg.com/practice-areas/data-protection-laws-and-regulations/2-privacy-data-protection-and-cybersecurity-a-state-law-analysis.