Rothwell Figg attorney Christopher Ott published a chapter titled “Phantom Responsibility: How Data Security and Privacy Lapses Lead to Personal Liability for Officers and Directors” in the International Comparative Legal Guide to: Cybersecurity 2021, published by Global Legal Group, Ltd.

Boards of directors that ignore data security and privacy risks do so at the peril of their companies and, increasingly, at the risk of their own personal liability. A business has its operations halted by ransomware approximately every 10 seconds. Billions of records are exposed every fiscal quarter. The global costs of these breaches and online crime reach the trillions of dollars every year. These potential costs have elevated data security and privacy from mere “IT issues”, or compliance minutiae, to the centerpiece of strategic risk management. The law has grown to match this reality. As a result, boards face expanding personal legal liability for their companies’ data security and privacy failures.

In 2014, Securities and Exchange Commission Commissioner Luis Aguilar stated that “boards that choose to ignore, or minimize, the importance of cybersecurity oversight responsibility do so at their own peril”. Those perils are changing in real time, just as cybersecurity and privacy threats are changing. However, it is possible to identify certain concrete areas of established liability and strategically identify the emergent risks. In the chapter, Mr. Ott explores the current trends and tackles a few harder-to-classify risks related to United States national security oversight of cyber readiness.

To read the article in its entirety, access the article on the ICLG website here: https://iclg.com/practice-areas/cybersecurity-laws-and-regulations/3-phantom-responsibility-how-data-security-and-privacy-lapses-lead-to-personal-liability-for-officers-and-directors.

Although a political campaign is no ordinary business, recent news shows that politicians have many of the same worries as typical businesses. On Thursday, October 29, 2020, the Wisconsin Republican Party reported that it had been victimized by a Business Email Compromise (BEC). There are many ways a criminal may conduct a BEC scam, but one of the most common occurs when hackers compromise a vendor’s email accounts in order to hijack vendor payments. With this access, the hacker prepares elaborately faked invoices (or other supporting documents) mirroring the appearance, content, amount, and timing of typical documents from the vendor. The hacker then submits a request to change the usual payment procedures. The hackers’ new payment plan always involves a well-known U.S. bank. When the victim business makes the next vendor payment, the money moves quickly through the U.S.-based bank and out of the country.
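The classic countermeasure is out-of-band verification of any change in vendor payment details. Purely as an illustration (the data model, names, and numbers below are hypothetical, not drawn from any real payments system), a payments workflow might hold any invoice whose destination account differs from the vendor record on file:

```python
from dataclasses import dataclass

@dataclass
class VendorRecord:
    name: str
    bank_account: str  # account verified out of band when the vendor was onboarded

@dataclass
class Invoice:
    vendor_name: str
    amount_usd: float
    bank_account: str  # account this invoice asks to be paid into

def requires_manual_verification(invoice: Invoice, on_file: VendorRecord) -> bool:
    """Hold any invoice whose payment destination differs from the account
    on file -- the payment-rerouting move at the heart of a BEC scam."""
    return invoice.bank_account != on_file.bank_account

# A hijacked invoice mirrors the real vendor but reroutes the money.
on_file = VendorRecord(name="Acme Printing", bank_account="021000021-1234567")
invoice = Invoice(vendor_name="Acme Printing", amount_usd=48_500.00,
                  bank_account="026009593-9999999")
if requires_manual_verification(invoice, on_file):
    print("Hold payment: confirm the new bank details with the vendor by phone.")
```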

That scheme is exactly what appears to have happened here: hackers stole $2.3 million from the Wisconsin Republican Party that was intended for use in the president’s re-election campaign. The theft was accomplished by tampering with invoices submitted to the party by four vendors. After a successful phishing campaign, the modified invoices directed the state GOP to send money to accounts controlled by the hackers. (Phishing should be the subject of a separate, longer discussion. For today’s purposes, it is enough to know that “phishing” involves using emails to trick the recipient into handing over network control or credentials, and/or installing malware that gives the hackers remote access to those systems.)

BEC cybercrime is big business.

  • While splashy malware attacks receive the media attention, BEC fraud quietly costs businesses billions (with a “B”!) of dollars in reported losses every year.
  • Email remains a top attack vector for BEC attackers because, compared to hacking a company’s network infrastructure, it provides an easier, demonstrably profitable path for criminals.
  • The email accounts involved are often single-use; hackers establish or hijack tens of thousands of them every year.

Similarly, election infrastructure is grappling with ransomware attacks. Ransomware is a type of malware that threatens to publish the victim’s data or perpetually block access to it unless a ransom is paid. These attacks can be very disruptive. Imagine that you are running a hospital – the subject of another recent hacking campaign – and your health data is inaccessible: people could actually die. Ransomware costs are climbing rapidly. This is complicated by the fact that a company can also face fines for paying the ransom.

We at Rothwell Figg often litigate and investigate the fallout from these BEC and ransomware events for clients. Although the consequences can be dire, there are real advantages to be had from smart and active cybersecurity legal response. Every organization could benefit from some work on their cyber hygiene and no organization is immune to these risks. The Wisconsin Republican Party and various election officials are learning their lessons in the public eye.

Nearly a year after the California Consumer Privacy Act (CCPA) went into effect, Californians now have a chance to weigh in on the California Privacy Rights Act of 2020 (CPRA), which is on the November 3, 2020, ballot as Proposition 24.  The CPRA is designed to strengthen consumer privacy protections by amending the CCPA to close certain loopholes, heighten enforcement through the creation of a California Privacy Protection Agency, and prevent the California legislature from weakening the law.  While many applaud the CPRA’s efforts, several groups, including the American Civil Liberties Union, have raised concerns that the CPRA is confusingly drafted, could worsen some of the CCPA’s loopholes, and may have the opposite effect of creating a ceiling on privacy legislation.

Below are a few highlights from Prop 24.

Data Retention: Section 1798.100(a)(3) specifies that a business that controls the collection of a consumer’s personal information shall, at or before the point of collection, inform consumers as to “the length of time the business intends to retain each category of personal information” or, if that is not possible, “the criteria used to determine such period.”  “[A] business shall not retain a consumer’s personal information or sensitive personal information for each disclosed purpose for which the personal information was collected for longer than is reasonably necessary for that disclosed purpose.”  Data retention was not fully addressed in the original CCPA, and this is a good, common-sense measure from both a privacy and a data security perspective.
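Operationally, honoring a disclosed retention period means building it into data-handling systems. Here is a minimal sketch, assuming hypothetical categories and periods (nothing below comes from the statute itself), of how a purge check might be keyed to the disclosures made at collection:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention schedule, one entry per disclosed category.
RETENTION_PERIODS = {
    "marketing_preferences": timedelta(days=365),
    "support_tickets": timedelta(days=730),
    "precise_geolocation": timedelta(days=30),
}

def is_expired(category: str, collected_at: datetime,
               now: Optional[datetime] = None) -> bool:
    """True once a record has outlived the retention period disclosed
    for its category at or before the point of collection."""
    now = now or datetime.now(timezone.utc)
    return now - collected_at > RETENTION_PERIODS[category]

# A precise-geolocation record collected 45 days ago is past its 30-day period.
collected = datetime.now(timezone.utc) - timedelta(days=45)
print(is_expired("precise_geolocation", collected))  # True
```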

Sharing of Data: In response to covered entities narrowly interpreting the “sale” of personal information under the CCPA, the CPRA amends Section 1798.115 to provide that a consumer shall have the right to know what personal information is sold or shared, and the categories of parties to whom the personal information was sold or shared.  Section 1798.120 is also amended to reflect that consumers have a right to opt-out of the sale or sharing of personal information. The business must also provide a clear and conspicuous link titled “Do Not Sell or Share My Personal Information.”  See Section 1798.134(a)(1).

Global Opt-Out of Sale or Sharing of Personal Data:  The CPRA calls for regulations to define a universal “opt-out preference signal sent by a platform, technology, or mechanism, to indicate a consumer’s intent to opt-out of the sale or sharing of the consumer’s personal information and to limit the use or disclosure of the consumer’s sensitive personal information.”  The global opt-out mechanism is a response to the tedious, and sometimes opaque, task of having to opt-out of the sale of personal information individually with each covered entity.  (n.b. The California Attorney General’s final CCPA rules require companies to honor a global “Do Not Sell” user-enabled privacy control, such as a browser plug-in or privacy setting, device setting, or other mechanism.)
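The most prominent candidate for such a signal is the Global Privacy Control (GPC), which participating browsers and extensions transmit as an HTTP request header, Sec-GPC: 1. As a rough sketch of how a business might honor the signal (assuming a Flask server; this is illustrative only, not a compliance recipe):

```python
from flask import Flask, request

app = Flask(__name__)

def opted_out(req) -> bool:
    """Treat a Global Privacy Control signal (Sec-GPC: 1) as the visitor's
    request to opt out of the sale or sharing of personal information."""
    return req.headers.get("Sec-GPC", "").strip() == "1"

@app.route("/")
def index():
    if opted_out(request):
        # Suppress any sale/sharing pipelines for this visitor.
        return "Opt-out signal honored: personal information will not be sold or shared."
    return "No opt-out preference signal received."
```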

Sensitive Information: Sensitive personal information is expressly defined, and includes information such as Social Security, driver’s license, state identification, and passport numbers; a consumer’s account log-in, financial account, debit card or credit card number in combination with any required security or access code or other credentials; precise geolocation; racial or ethnic origin; religious or philosophical beliefs; union membership; the contents of mail, email and text messages; genetic data; biometric information; health information; and sex life or sexual orientation, among others.  Section 1798.140(ae).  Section 1798.121 is added to provide that a consumer has the right to limit use and disclosure of sensitive personal information.  Businesses must also provide a clear and conspicuous link on the business’s internet homepage titled “Limit the Use of My Sensitive Personal Information.”  Section 1798.135(a)(2).

Consent: Defined as “any freely given, specific, informed and unambiguous indication of the consumer’s wishes by which he or she . . . such as by a statement or by a clear affirmative action, signifies agreement to the processing of personal information relating to him or her for a narrowly defined particular purpose.”  Section 1798.140(h).

Enforcement: The CPRA establishes a California Privacy Protection Agency with full power, authority, and jurisdiction to implement and enforce the CCPA, and removes the “right to cure” language in the attorney general enforcement section.

Floor: The CPRA can be amended by the legislature only if the amendment is consistent with and furthers the initiative’s purposes and intent to “further protect consumers’ rights, including the constitutional right to privacy.”

With the uncertain prospect of a federal privacy bill, and with California and the CCPA setting the de facto data privacy standard in the US, perhaps other states are waiting to see what happens with Prop 24 before rolling out their own initiatives.  Some recent polling suggests that voters overwhelmingly support Prop 24, with 77% of likely voters saying they will vote YES on the ballot measure.

With all that has happened this year, most of us can’t wait until 2020 is in the rearview mirror.  The end of 2020, however, also marks the end of the post-Brexit transition period, which was intended to give UK businesses and organizations that rely on international data flows, target European customers, or operate inside the EEA time to negotiate a new data protection relationship with the EU.

According to the UK Information Commissioner’s Office (ICO), after the transition period, the “UK GDPR” will take effect, with the same key principles, rights and obligations as the GDPR.  However, there are implications for the rules on transfers of personal data between the UK and the EEA.

Understanding your international flows and cross-border processing of personal data, and specifically transfers from the EEA to the UK, is critical to assessing what steps may need to be taken, presumably prior to the end of the transition period, to continue to lawfully receive those transfers.  A determination will need to be made whether a new data protection relationship needs to be established, considering the use of standard contractual clauses, binding corporate rules, and the guidance provided by the UK ICO, the European Data Protection Board, and the European Commission.

The current guidance, while certainly well-intentioned, is murky at best, leaving many open questions about what may be required.  So, as the anxiously awaited end of 2020 approaches, keep an eye on those relationships and how to make them work.

The Telephone Consumer Protection Act (TCPA) was passed in 1991 and is known by many as the law that created the “do-not-call” rules.  The statute includes a number of restrictions related to telephone, text, and fax solicitations, including a prohibition against what is colloquially known as “autodialing” and “robocalls,” and it creates a private right of action in the event of a violation, providing for the recovery of $500 for each violation or actual monetary loss (whichever is greater), an injunction, or both.

The question before the Supreme Court now, in Facebook, Inc. v. Noah Duguid, et al. (No. 19-511), is: what exactly is autodialing?  Or rather, what is an “automatic telephone dialing system” (ATDS), a term the TCPA defines?

Does autodialing encompass any device that can store and automatically dial phone numbers?  Or does autodialing require the use of a random or sequential number generator?

The statute defines ATDS as “equipment which has the capacity—(A) to store or produce telephone numbers to be called, using a random or sequential number generator; and (B) to dial such numbers.”  See 47 U.S.C. § 227(a)(1).  The TCPA provides the following prohibitions with respect to the use of an ATDS:

It shall be unlawful for any person within the United States, or any person outside the United States if the recipient is within the United States—(A) to make any call (other than a call made for emergency purposes or made with the prior express consent of the called party) using any automatic telephone dialing system or an artificial or prerecorded voice—(i) to any emergency telephone line (including any “911” line and any emergency line of a hospital, medical physician or service office, health care facility, poison control center, or fire protection or law enforcement agency); (ii) to the telephone number assigned to a paging service, cellular telephone service, specialized mobile radio service, or other radio common carrier service, or any service for which the called party is charged for the call, unless such call is made solely to collect a debt owed to or guaranteed by the United States.

[Note: Just a few months ago, on July 6, 2020, the Supreme Court issued a decision in a case that challenged the constitutionality of the TCPA on First Amendment grounds.  The basis of the challenge was a 2015 amendment to the TCPA that permitted calls relating to the collection of debts guaranteed by the U.S. government.  The majority found that the restriction was subject to strict scrutiny and was an unconstitutional content-based restriction on speech.  But rather than find the entire statute unconstitutional, the Court severed the government debt collection exception, leaving the rest of the statute fully operative.]

So, getting back to the case at hand, the question is: does “using a random or sequential number generator” modify (1) both “store” and “produce” or (2) just “produce”?  Circuit courts across the country have split over this question.  Most importantly to the case at hand, the Ninth Circuit found that “using a random or sequential number generator” modifies only “produce.”  Thus, according to the Ninth Circuit, equipment can qualify as an autodialer under the TCPA without using a random or sequential number generator, so long as it stores numbers and dials them automatically.
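The grammatical dispute is easier to see in code. In this purely illustrative sketch (hypothetical functions, not anyone’s actual dialing equipment), the first device qualifies as an ATDS under the Ninth Circuit’s reading; under Facebook’s reading, only the second does:

```python
import random

def dial(number: str) -> None:
    print(f"dialing {number}")

# Reading (1): equipment that merely STORES numbers and dials them
# automatically. An ATDS under the Ninth Circuit's interpretation,
# even though no number generator is involved.
def autodial_stored(stored_numbers: list[str]) -> None:
    for number in stored_numbers:
        dial(number)

# Reading (2): equipment that PRODUCES the numbers it calls using a
# random or sequential number generator. On Facebook's reading, only
# this kind of device is an ATDS.
def autodial_generated(area_code: str, count: int) -> None:
    for _ in range(count):
        dial(f"{area_code}-{random.randrange(10**7):07d}")

autodial_stored(["202-555-0101", "202-555-0102"])
autodial_generated("202", 3)
```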

Facebook filed its brief last month, arguing that to qualify as an ATDS the equipment must use a random or sequential number generator.  A number of organizations, and the U.S. government, filed amicus briefs supporting Facebook’s position.  Duguid filed his brief earlier this month, urging the Court to affirm the Ninth Circuit’s judgment that any device that stores numbers and dials them automatically constitutes an ATDS.

The Supreme Court is slated to hear oral arguments on December 8, 2020.

A lawsuit recently filed against Amazon.com for a violation of the Illinois Biometric Information Privacy Act (“BIPA”) should serve as a reminder to all companies engaged in COVID-19-related employee and/or customer scanning that it is important to determine what privacy and cybersecurity laws apply to your screening measures, and to confirm that you are engaging in all required practices under those laws.

The recently-filed suit is a class action complaint brought by Michael Jerinic, a former Amazon employee, in the Circuit Court of Cook County, Illinois.  Jerinic alleges that in June 2020, in response to growing safety concerns related to the COVID-19 pandemic, Amazon began requiring its workers to provide a scan of their facial geometry, and possibly other biometric information, as part of a wellness check prior to being allowed access to the warehouse facility each day.  The complaint further alleges that “Defendant’s facial recognition devices and associated software collect and capture biometric identifiers such as scans of an individual’s facial geometry, retinas, and irises.  Defendant also scans and records the workers’ temperatures.”

Jerinic alleges that Amazon’s practices violated BIPA because Amazon did not, inter alia, provide written notice regarding the biometric information being collected or stored.

On Wednesday, September 23, 2020, Rothwell Figg attorneys Martin Zoltick, Jenny Colgate, and Caitlin Wilmot presented a Lexology webinar titled “Employee privacy and security considerations in the age of COVID-19”.

To view the recording of the webinar, please click here. To request the slides from the webinar, please send an email to RFPrivacy@rfem.com.

The world changed dramatically in early 2020 as COVID-19 forced companies worldwide to change their practices and policies seemingly overnight. Some employees began working from home as companies scrambled to accommodate them; others began working on staggered schedules or with a reduced workforce as companies implemented safety measures to minimize risk for both employees and customers. In some cases, companies began furloughing or laying off employees.

In this webinar, attorneys from Rothwell Figg discuss the important privacy, data protection, and cybersecurity considerations that employers need to evaluate in any of these uncharted situations. The webinar covers:

  • employees working from home (eg, technologies that may track employees, the use of videoconferencing tools and the pitfalls of home networks);
  • corporate policies and practices that ensure network security (eg, multifactor authentication, the use of secure platforms for conducting all employment-related work, acceptable use and remote access and network monitoring for suspicious activity);
  • entering into new contracts with software or other vendors, particularly when in a rushed situation;
  • legal and regulatory compliance issues when employees are working in alternative set-ups and when employers are sidetracked (eg, California Consumer Privacy Act (CCPA) enforcement began on 1 July 2020);
  • employees having less direct oversight;
  • companies implementing back-to-work measures (eg, temperature checks and symptom, travel and exposure questionnaires);
  • companies considering surveillance measures (eg, physical surveillance, electronic surveillance and geolocation tracking);
  • data protection and trade secret issues associated with furloughed and laid-off workers; and
  • legislative efforts to create COVID-19-related privacy bills.

On Wednesday, the Senate Committee on Commerce, Science, and Transportation conducted a hearing to revisit the potential for a national data privacy standard. While the Committee had met last December to discuss what Congress should consider when drafting a federal privacy bill, the game has since changed. Given that COVID-19 has drastically altered life as we knew it, with working from home, remote learning, and the whole country trying to curtail the spread of and recover from the pandemic, what was considered “merely urgent” 10 months ago is now “absolutely critical.” On the table was the Committee’s Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act, introduced on September 17, 2020, which served as a backdrop for the discussion.

From the start of the hearing, it was evident that the witnesses, four former Commissioners of the FTC and the California Attorney General, as well as the other hearing participants, unanimously agreed that now is the time for comprehensive U.S. privacy legislation, for several reasons:

COVID-19.  A majority of workers are still working remotely, and children are starting school this Fall in online classrooms. As we rely – now more than ever – on social media, videoconferencing, chat rooms, and our smart phones to stay connected, and as we spend most days inside, surrounded by our Alexas, Nest thermostats and other IoT/connected devices, we’re starting to realize just how much data is being collected from us and what little protection and power we have. Individuals would also be more likely to provide health and location data and use “contact tracing” apps to help track the coronavirus, if that data were protected.

Current Data Privacy Laws Have Gaps. COPPA only protects the data of children under the age of 13. But as the Committee members pointed out, teenagers and young adults need protection too, especially in light of the shift to remote learning and the use of social media platforms such as TikTok. Likewise, HIPAA only applies to certain covered healthcare entities, leaving the data provided by consumers on health and fitness apps/devices unprotected. As technologies continue to advance and we enter the world of 5G, the law lags behind. And opt-in consent for everything (like the cookie banners and privacy policies we encounter on nearly every website) won’t cut it.

The “Patchwork” of State, Federal, and Industry-Specific Laws Isn’t Working. Consumers travel across the U.S. and, as of now, have different privacy rights in every state, leading to confusion for consumers, businesses, and law enforcement alike. Furthermore, the internet and data know no (state) bounds. Internet service providers cannot be expected to create different systems for each geographic area.  

The U.S. Risks Losing Its Competitive Edge. Without a federal privacy law, the U.S. will take a backseat on the global stage, and the GDPR will become the global privacy standard, with no input from thought leaders in the United States. And with the EU-US Privacy Shield now invalidated, we need to address alternative means for international data transfer for U.S. businesses that operate overseas, and reduce skepticism and concern from Europeans (and the rest of the world) about our own privacy regime.

As to what the framework of the U.S. privacy law should look like, the Committee members generally agreed that it should:

  • Give consumers more control over their personal data, with the rights to access, modify, delete, and opt-out of the sale of their personal information (provided that consumers are given meaningful choices and are not discriminated against for exercising their privacy rights);
  • Use clear and plain language so that consumers can understand their rights;
  • Be drafted to allow for flexibility with regard to advancing technologies and innovative data collection (such that the law is adaptable to future technologies); and
  • Expand the enforcement and rule-making authority of the FTC, along with increased funding and a larger staff. While some Committee members floated the idea of an independent Data Protection Authority, the general consensus was that we should build on the experience of the FTC. The panel also recognized that the FTC’s other functions, antitrust and consumer protection, have a strong nexus with privacy.

Despite the consensus on the need for federal privacy regulation and the overall objectives of the legislation, there were still points of contention that must be resolved in order for a federal bill to pass:

Should citizens be granted a private right of action for violations of the federal law? 

  • YES: As one member of Congress repeated throughout the hearing, “a right without a remedy is no right at all.” Those in favor of a private right of action stated that it is critical for individuals to have the power to enforce their own rights. And as California Attorney General Xavier Becerra stated, attempting to enforce the rights of every private citizen is a massive undertaking, and state AGs just don’t have the capacity to do it.
  • NO: Those against a private right of action cited concerns such as an increase in frivolous lawsuits (especially if the consumer is not required to show harm resulting from a privacy violation), class action lawsuits that benefit only lawyers and give little to actual victims, and the stifling of small businesses, which would not be able to afford expensive defense litigation. The naysayers further noted that consumers would already be protected by the expanded enforcement authority of the FTC and administrative remedies within the company.

Should the federal law preempt state privacy laws?

  • YES: “Preempting state laws should not mean weakening protections for consumers.” Those in favor of preemption argued that the federal law should, and will, be strong and robust enough to protect consumers without significant gap-filling by the states (pointing to similarities with HIPAA and COPPA). A federal law that does not preempt state laws creates the risk of some states going above and beyond it, requiring all companies that operate in those states to comply: once again, a patchwork of strong state laws against the backdrop of a weak federal law, with different expectations, rights, and compliance efforts across regions.
  • NO: Those against preemption expressed concerns that all of the recent efforts by states to protect their consumers’ privacy rights (e.g., California’s CCPA, Illinois’s BIPA, and laws in Maine and Nevada) would be erased. As California Attorney General Xavier Becerra argued, the federal law should serve as a “floor” rather than a “ceiling,” creating a privacy baseline on which states can build more stringent data protection.

Should the federal law apply equally to all businesses that collect consumer data?

  • YES: Several of the speakers agreed that the federal legislation should be “technology-neutral,” and apply equally to any company collecting personal data.
  • NO: Those against a uniform application of the law argued that compliance is easier for larger or international companies, especially those which have already implemented steps and procedures to comply with the GDPR and CCPA. Small businesses and startups, on the other hand, may struggle with such implementation and may not be able to survive a potential violation and subsequent lawsuit. There should be distinctions for compliance based on company size, how much personal data the company collects and uses, and whether that data is particularly sensitive or risky.

Other notable arguments made at the hearing that may impact the U.S.’s federal privacy response:

  • As Senator Blumenthal (D-CT) noted, the late Supreme Court Justice Ruth Bader Ginsburg was a leader in the protection of privacy rights (citing her dissent in Spokeo, Inc. v. Robins). Several Committee members agreed with him that the new Supreme Court nominee should also be an advocate for increased consumer privacy.
  • As a country, we need to address systemic inaccuracy and racial bias issues before making laws that allow for the use of biometric technologies in law enforcement (citing a NIST study finding that facial recognition tools misidentified Black, Brown, and Asian individuals up to 100 times more often than white men).
  • We also need to address how to protect consumers from being manipulated by algorithms used by online platforms, and data filtering, which some argued is contributing to a growing polarization in the country.

The hearing left much to be discussed, but as the Committee Chairman, U.S. Senator Roger Wicker (R-Miss.), stated, we’re moving in the right direction. Members of the panel also noted that the SAFE DATA Act and the other proposed privacy bills share a lot of common ground, offering hope that a federal privacy law will be here sooner rather than later.

Is a U.S. federal privacy law on the horizon?

Tomorrow, September 23rd at 10:00 a.m., U.S. Senator Roger Wicker (R-Miss.), chairman of the Committee on Commerce, Science, and Transportation, will convene a hearing titled, “Revisiting the Need for Federal Data Privacy Legislation.”

The hearing will examine the current state of consumer data privacy and legislative efforts to provide baseline data protections for all Americans. It will also examine lessons learned from the implementation of state privacy laws in the U.S. and the E.U. General Data Protection Regulation, as well as how the COVID-19 pandemic has affected data privacy.

Witness testimony will be provided by several Former Commissioners of the FTC as well as the California Attorney General.

Watch the live hearing here or stay tuned for a discussion of the hearing in a follow-up post.

Partners Martin Zoltick and Jenny Colgate with associate Caitlin Wilmot will present a webinar in conjunction with Lexology titled “Employee privacy and security considerations in the age of COVID-19” on Wednesday, September 23, 2020, from 11 am – 12 pm ET.

The world changed dramatically in early 2020 as COVID-19 forced companies worldwide to change their practices and policies seemingly overnight. Some employees began working from home as companies scrambled to accommodate them; others began working on staggered schedules or with a reduced workforce as companies implemented safety measures to minimise risk for both employees and customers. In some cases, companies began furloughing or laying off employees.

In this webinar, attorneys from Rothwell Figg will discuss the important privacy, data protection and cybersecurity considerations that employers need to evaluate in any of these uncharted situations. The webinar will cover:

  • employees working from home (eg, technologies that may track employees, the use of videoconferencing tools and the pitfalls of home networks);
  • corporate policies and practices that ensure network security (eg, multifactor authentication, the use of secure platforms for conducting all employment-related work, acceptable use and remote access and network monitoring for suspicious activity);
  • entering into new contracts with software or other vendors, particularly when in a rushed situation;
  • legal and regulatory compliance issues when employees are working in alternative set-ups and when employers are sidetracked (eg, California Consumer Privacy Act (CCPA) enforcement began on 1 July 2020);
  • employees having less direct oversight;
  • companies implementing back-to-work measures (eg, temperature checks and symptom, travel and exposure questionnaires);
  • companies considering surveillance measures (eg, physical surveillance, electronic surveillance and geolocation tracking);
  • data protection and trade secret issues associated with furloughed and laid-off workers; and
  • legislative efforts to create COVID-19-related privacy bills.

Registration is free and open to all. Please click here to register.