The Telephone Consumer Protection Act (TCPA) was passed in 1991 and is known by many as the law that created the “do-not-call” rules.  The statute includes a number of restrictions related to telephone, text, and fax solicitations, including a prohibition against what is colloquially known as “autodialing” and “robocalls,” and it creates a private right of action in the event of a violation, providing for the recovery of $500 for each violation or actual monetary loss (whichever is greater), an injunction, or both.

The question before the Supreme Court now, in Facebook, Inc. v. Noah Duguid, et al. (No. 19-511), is: What exactly is autodialing?  Or rather, what is an “automatic telephone dialing system” (ATDS), a term the TCPA defines?

Does autodialing encompass any device that can store and automatically dial phone numbers?  Or does autodialing require the use of a random or sequential number generator?

The statute defines ATDS as “equipment which has the capacity—(A) to store or produce telephone numbers to be called, using a random or sequential number generator; and (B) to dial such numbers.”  See 47 U.S.C. § 227(a)(1).  The TCPA provides the following prohibitions with respect to the use of an ATDS:

It shall be unlawful for any person within the United States, or any person outside the United States if the recipient is within the United States—(A) to make any call (other than a call made for emergency purposes or made with the prior express consent of the called party) using any automatic telephone dialing system or an artificial or prerecorded voice—(i) to any emergency telephone line (including any “911” line and any emergency line of a hospital, medical physician or service office, health care facility, poison control center, or fire protection or law enforcement agency); (ii) to the telephone number assigned to a paging service, cellular telephone service, specialized mobile radio service, or other radio common carrier service, or any service for which the called party is charged for the call, unless such call is made solely to collect a debt owed to or guaranteed by the United States.

[Note: Just a few months ago, on July 6, 2020, the Supreme Court issued a decision in a case (Barr v. American Association of Political Consultants) challenging the constitutionality of the TCPA on First Amendment grounds.  The basis of the challenge was a 2015 amendment to the TCPA that permitted calls relating to the collection of debts owed to or guaranteed by the U.S. government.  The majority found that the exception was subject to strict scrutiny and was an unconstitutional content-based restriction on speech.  But rather than find the entire statute unconstitutional, the Court severed the government debt collection exception, leaving the rest of the statute fully operative.]

So getting back to the case at hand, the question is: does “using a random or sequential number generator” modify (1) both “store” and “produce” or (2) just “produce”?  Circuit courts across the country have split over this question.  Most importantly to the case at hand, the Ninth Circuit found that “using a random or sequential number generator” modifies only “produce.”  Thus, according to the Ninth Circuit, a device can qualify as an ATDS even if it does not use a random or sequential number generator, so long as it stores numbers and dials them automatically.
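
To make the grammatical dispute concrete, below is a minimal, purely illustrative Python sketch (not drawn from the case record or any party’s filings; the function names and numbers are hypothetical) contrasting the two kinds of equipment at issue: a device that produces the numbers to be called using a random or sequential generator, and a device that simply dials numbers from a stored list, such as a customer contact database.

```python
# Illustrative sketch only; function names and phone numbers are hypothetical.
import random

def produce_numbers(count: int, start: int = 2_025_550_000) -> list[int]:
    """Equipment under Facebook's reading: the device itself produces the
    telephone numbers to be called, using a sequential or random generator."""
    sequential = [start + i for i in range(count)]        # sequential generation
    randomized = [random.randint(2_025_550_000, 2_025_559_999)
                  for _ in range(count)]                  # random generation
    return sequential + randomized

def dial_stored_numbers(stored: list[int]) -> None:
    """Equipment under the Ninth Circuit's reading: the device merely stores
    a pre-existing list of numbers (e.g., customer contacts) and dials them
    automatically, with no number generator involved."""
    for number in stored:
        print(f"dialing {number} ...")  # stand-in for actually placing a call
```

Under the Ninth Circuit’s reading, equipment that does only what the second function does could qualify as an ATDS; under Facebook’s reading, something like the first function’s number generation is essential.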

Facebook filed its brief last month arguing that, to qualify as an ATDS, the equipment must use a random or sequential number generator.  A number of organizations, as well as the U.S. government, filed amicus briefs supporting Facebook’s position.  Duguid filed his brief earlier this month, urging the Court to affirm the Ninth Circuit’s judgment that any device that automatically dials stored numbers constitutes an ATDS.

The Supreme Court is slated to hear oral arguments on December 8, 2020.

A lawsuit recently filed against Amazon.com for violation of the Illinois Biometric Information Privacy Act (“BIPA”) should serve as a reminder to all companies engaged in COVID-19-related employee and/or customer scanning that it is important to determine what privacy and cybersecurity laws apply to your screening measures and to confirm that you are engaging in all required practices under those laws.

The recently filed suit is a class action complaint brought by Michael Jerinic, a former Amazon employee, in the Circuit Court of Cook County, Illinois.  Jerinic alleges that in June 2020, in response to growing safety concerns related to the COVID-19 pandemic, Amazon began requiring its workers to provide a scan of their facial geometry, and possibly other biometric information, as part of a wellness check prior to being allowed access to the warehouse facility each day.  The complaint further alleges that “Defendant’s facial recognition devices and associated software collect and capture biometric identifiers such as scans of an individual’s facial geometry, retinas, and irises.  Defendant also scans and records the workers’ temperatures.”

Jerinic alleges that Amazon’s practices violated BIPA because Amazon did not, inter alia, provide written notice regarding the biometric information being collected or stored.

On Wednesday, September 23, 2020, Rothwell Figg attorneys Martin Zoltick, Jenny Colgate, and Caitlin Wilmot presented a Lexology webinar titled “Employee privacy and security considerations in the age of COVID-19”.

To view the recording of the webinar, please click here. To request the slides from the webinar, please send an email to RFPrivacy@rfem.com.

The world changed dramatically in early 2020 as COVID-19 forced companies worldwide to change their practices and policies seemingly overnight. Some employees began working from home as companies scrambled to accommodate them; others began working on staggered schedules or with a reduced workforce as companies implemented safety measures to minimize risk for both employees and customers. In some cases, companies began furloughing or laying off employees.

In this webinar, attorneys from Rothwell Figg discuss the important privacy, data protection, and cybersecurity considerations that employers need to evaluate in any of these uncharted situations. The webinar covers:

  • employees working from home (eg, technologies that may track employees, the use of videoconferencing tools and the pitfalls of home networks);
  • corporate policies and practices that ensure network security (eg, multifactor authentication, the use of secure platforms for conducting all employment-related work, acceptable use and remote access and network monitoring for suspicious activity);
  • entering into new contracts with software or other vendors, particularly when in a rushed situation;
  • legal and regulatory compliance issues when employees are working in alternative set-ups and when employers are sidetracked (eg, California Consumer Privacy Act (CCPA) enforcement began on 1 July 2020);
  • employees having less direct oversight;
  • companies implementing back-to-work measures (eg, temperature checks and symptom, travel and exposure questionnaires);
  • companies considering surveillance measures (eg, physical surveillance, electronic surveillance and geolocation tracking);
  • data protection and trade secret issues associated with furloughed and laid-off workers; and
  • legislative efforts to create COVID-19-related privacy bills.

On Wednesday, the Senate Committee on Commerce, Science and Transportation conducted a hearing to revisit the potential for a national data privacy standard. While the Committee had met last December to discuss what Congress should consider when drafting a federal privacy bill, the game has since changed. Given that COVID-19 has drastically altered life as we knew it, with working from home, remote learning, and the whole country trying to curtail the spread of and recover from the pandemic, what was considered “merely urgent” 10 months ago is now “absolutely critical.” On the table was the Committee’s Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act, introduced on September 17, 2020, which served as a backdrop for the discussion.

From the start of the hearing, it was evident that the witnesses, consisting of four former Commissioners of the FTC and the California Attorney General, as well as the other hearing participants, unanimously agreed that now is the time for comprehensive U.S. privacy legislation, for several reasons:

COVID-19.  A majority of workers are still working remotely, and children are starting school this Fall in online classrooms. As we rely – now more than ever – on social media, videoconferencing, chat rooms, and our smartphones to stay connected, and as we spend most days inside, surrounded by our Alexas, Nest thermostats, and other IoT/connected devices, we are starting to realize just how much data is being collected from us and how little protection and power we have. Individuals would also be more likely to provide health and location data and use “contact tracing” apps to help track the coronavirus if that data were protected.

Current Data Privacy Laws Have Gaps. COPPA only protects the data of children under the age of 13. But as the Committee members pointed out, teenagers and young adults need protection too, especially in light of the shift to remote learning and the use of social media platforms such as TikTok. Likewise, HIPAA only applies to certain covered healthcare entities, leaving the data provided by consumers on health and fitness apps/devices unprotected. As technologies continue to advance and we enter the world of 5G, the law lags behind. And opt-in consent for everything (like the cookie banners and privacy policies we encounter on nearly every website) won’t cut it.

The “Patchwork” of State, Federal, and Industry-Specific Laws Isn’t Working. Consumers travel across the U.S. and, as of now, have different privacy rights in every state, leading to confusion for consumers, businesses, and law enforcement alike. Furthermore, the internet and data know no (state) bounds. Internet service providers cannot be expected to create different systems for each geographic area.  

The U.S. Risks Losing Its Competitive Edge. Without a federal privacy law, the U.S. will take a backseat on the global stage, and the GDPR will become the global privacy standard, with no input from thought leaders in the United States. And with the EU-US Privacy Shield now invalidated, we need to address alternative means for international data transfer for U.S. businesses that operate overseas, and reduce skepticism and concern from Europeans (and the rest of the world) about our own privacy regime.

As to what the framework of the U.S. privacy law should look like, the Committee members generally agreed that it should:

  • Give consumers more control over their personal data, with the rights to access, modify, delete, and opt out of the sale of their personal information (provided that consumers are given meaningful choices and are not discriminated against for exercising their privacy rights);
  • Use clear and plain language so that consumers can understand their rights;
  • Be drafted to allow for flexibility with regard to advancing technologies and innovative data collection (such that the law is adaptable to future technologies); and
  • Expand the enforcement and rule-making authority of the FTC, along with increased funding and a larger staff. While some Committee members floated the idea of an independent Data Protection Authority, the general consensus was that we should build on the experience of the FTC. The panel also recognized that the FTC’s other functions, antitrust and consumer protection, have a strong nexus with privacy.

Despite the consensus on the need for federal privacy regulation and the overall objectives of the legislation, there were still points of contention that must be resolved in order for a federal bill to pass:

Should citizens be granted a private right of action for violations of the federal law? 

  • YES: As one Congressman stated throughout the hearing, “a right without a remedy is no right at all.” Those in favor of a private right of action stated that it was critical for individuals to have the power to enforce their own rights. And as California Attorney General Xavier Becerra stated, attempting to enforce the rights of every private citizen is a massive undertaking, and state AGs just don’t have the capacity to do this.
  • NO: Those against a private right of action cited concerns such as an increase in frivolous lawsuits (especially if the consumer is not required to show harm resulting from a privacy violation), class action lawsuits that benefit only lawyers and give little to actual victims, and the stifling of small businesses, which would not be able to afford expensive defense litigation. The naysayers further noted that consumers would already be protected by the expanded enforcement authority of the FTC and administrative remedies within the company.

Should the federal law preempt state privacy laws?

  • YES: “Preempting state laws should not mean weakening protections for consumers.” Those in favor of preemption argued that the federal law should, and will, be strong and robust enough to protect consumers without significant gap-filling by the states (pointing to similarities with HIPAA and COPPA). Having a federal law that does not preempt state laws creates the risk of some states going above and beyond the federal law, requiring all companies that operate in those states to comply. Again, that is just another patchwork of strong state laws against the backdrop of a weak federal law, with different expectations, rights, and compliance efforts across regions.
  • NO: Those against preemption expressed concerns that all of the recent efforts by states to protect their consumers’ privacy rights (e.g., the CCPA in California, Illinois’s BIPA, and laws in Maine and Nevada) would be erased. As California Attorney General Xavier Becerra argued, the federal law should serve as a “floor” rather than a “ceiling,” creating a privacy baseline on which states can provide more stringent data protection.

Should the federal law apply equally to all businesses that collect consumer data?

  • YES: Several of the speakers agreed that the federal legislation should be “technology-neutral,” and apply equally to any company collecting personal data.
  • NO: Those against a uniform application of the law argued that compliance is easier for larger or international companies, especially those which have already implemented steps and procedures to comply with the GDPR and CCPA. Small businesses and startups, on the other hand, may struggle with such implementation and may not be able to survive a potential violation and subsequent lawsuit. There should be distinctions for compliance based on company size, how much personal data the company collects and uses, and whether that data is particularly sensitive or risky.

Other notable arguments made at the hearing that may impact the U.S.’s federal privacy response:

  • As Senator Blumenthal (D-CT) noted, the late Supreme Court Justice Ruth Bader Ginsburg was a leader in the protection of privacy rights (citing her dissent in Spokeo, Inc. v. Robins). Several Committee members agreed with him that the new Supreme Court nominee should also be an advocate for increased consumer privacy.
  • As a country, we need to address systemic inaccuracy and racial bias issues before making laws that allow for the use of biometric technologies in law enforcement (citing a NIST study finding that facial recognition tools misidentified Black, Brown, and Asian individuals at rates up to 100 times higher than white men).
  • We also need to address how to protect consumers from being manipulated by the algorithms and data filtering used by online platforms, which some argued are contributing to growing polarization in the country.

The hearing left much to be discussed, but as the Committee Chairman, U.S. Senator Roger Wicker (R-Miss.), stated – we’re moving in the right direction. Members of the panel also noted that the SAFE DATA Act and the other proposed privacy bills share a lot of common ground, offering hope that a federal privacy law will be here sooner rather than later.

Is a U.S. federal privacy law on the horizon?

Tomorrow, September 23rd at 10:00 a.m., U.S. Senator Roger Wicker (R-Miss.), chairman of the Committee on Commerce, Science, and Transportation, will convene a hearing titled, “Revisiting the Need for Federal Data Privacy Legislation.”

The hearing will examine the current state of consumer data privacy and legislative efforts to provide baseline data protections for all Americans. It will also examine lessons learned from the implementation of state privacy laws in the U.S. and the E.U. General Data Protection Regulation, as well as how the COVID-19 pandemic has affected data privacy.

Witness testimony will be provided by several former Commissioners of the FTC as well as the California Attorney General.

Watch the live hearing here or stay tuned for a discussion of the hearing in a follow-up post.

Partners Martin Zoltick and Jenny Colgate, along with associate Caitlin Wilmot, will present a webinar in conjunction with Lexology titled “Employee privacy and security considerations in the age of COVID-19” on Wednesday, September 23, 2020, from 11 am to 12 pm ET.

The world changed dramatically in early 2020 as COVID-19 forced companies worldwide to change their practices and policies seemingly overnight. Some employees began working from home as companies scrambled to accommodate them; others began working on staggered schedules or with a reduced workforce as companies implemented safety measures to minimise risk for both employees and customers. In some cases, companies began furloughing or laying off employees.

In this webinar, attorneys from Rothwell Figg will discuss the important privacy, data protection and cybersecurity considerations that employers need to evaluate in any of these uncharted situations. The webinar will cover:

  • employees working from home (eg, technologies that may track employees, the use of videoconferencing tools and the pitfalls of home networks);
  • corporate policies and practices that ensure network security (eg, multifactor authentication, the use of secure platforms for conducting all employment-related work, acceptable use and remote access and network monitoring for suspicious activity);
  • entering into new contracts with software or other vendors, particularly when in a rushed situation;
  • legal and regulatory compliance issues when employees are working in alternative set-ups and when employers are sidetracked (eg, California Consumer Privacy Act (CCPA) enforcement began on 1 July 2020);
  • employees having less direct oversight;
  • companies implementing back-to-work measures (eg, temperature checks and symptom, travel and exposure questionnaires);
  • companies considering surveillance measures (eg, physical surveillance, electronic surveillance and geolocation tracking);
  • data protection and trade secret issues associated with furloughed and laid-off workers; and
  • legislative efforts to create COVID-19-related privacy bills.

Registration is free and open to all. Please click here to register.

Partners Martin Zoltick and Jenny Colgate, along with associate Caitlin Wilmot, will present a webinar titled “Connected Healthcare – Navigating the Patchwork of US Privacy Laws and Developing a Platform that Promotes Trust” for the American Bar Association (ABA) on Monday, September 21, 2020, at 1 pm ET.

As the field of connected healthcare grows exponentially, so too do the fields of privacy and data protection law. The problem is that the growth of each is independent. While connectivity can directly benefit both patients and healthcare providers, it also comes with risks. Legal non-compliance risks. Security risks. Trust risks. It is important that those in the field of connected healthcare stay informed of the ever-developing body of U.S. federal and state privacy and data protection law, as compliance with the huge patchwork of privacy laws is essential for avoiding fines, bad headlines, and becoming the subject of the next FTC or AG investigation or the next class action lawsuit.

We will discuss some of the areas of privacy and data protection law that those in the field of connected healthcare should be paying attention to, such as HIPAA, the CCPA, BIPA and other biometric laws, IoT security laws, COPPA, the ECPA, and web scraping laws, including but not limited to the CFAA. We will also share some advice on best practices, as compliance alone may not be enough for connected medical devices to be successful in the future. As consumers become more educated on privacy and data protection, they are looking for platforms built on the concepts of transparency and control. Transparency promotes trust.

The webinar is sponsored by the ABA Section of Science & Technology Law, and is open to ABA members and non-members. To learn more about the webinar, or to register, please click here.

Cybersecurity does not just pose technical challenges; companies must always keep their eye on the human component of cyber risk.  For example, even the most damaging and sophisticated attacks – such as the recent Twitter hack – can result from spear-phishing. Imagine that: multi-billion-dollar new-technology communication apparatuses brought to their knees by charming fraudsters on the phone. But the pseudo-insider risk does not end with phishing schemes. Instead, hackers and criminals of all stripes are seeking weaknesses that will enable them to gain leverage over companies.

On August 26, 2020, the United States Department of Justice charged a Russian national with offering $1 million to a Tesla employee in return for infecting the company’s network with malware.  Egor Igorevich Kriuchkov met with the employee on multiple occasions as part of the recruitment effort.  The malware was designed to exfiltrate data from Tesla, and the criminal group behind the attack allegedly planned to demand $4 million in return for the information.

A ransomware operation, like the one detailed in the criminal complaint, encrypts all of a company’s data and demands a hefty payment in return for the decryption key. For many companies, it is less expensive to pay the criminals’ fee than to undergo lengthy service outages. Ransomware is usually delivered remotely, for example through phishing emails or other malware. This case, however, describes using a corrupted insider employee as the agent of infection. The altered tactic shows just how determined criminal hackers can be.

Based upon the allegations contained in the complaint, this was a long-term, concerted effort by the criminals. The criminal recruiter traveled from Russia to Nevada multiple times and apparently spent many thousands of dollars wooing the individual.  While the effort was remarkable, it would be foolish for companies to think this approach is novel. The fact of the matter is that international criminal espionage is a real and persistent threat.

Determined adversaries – just as in the traditional espionage world – will search for and develop human assets in their search for data.  Numerous legal consequences can flow from these types of attacks.  If the crime is successful, and the ransom is paid, companies can face years of litigation to make themselves whole again. This litigation could be with their vendors, who had their services interrupted, or with the company’s own insurers.

In January 2020, long-running litigation over the cyber coverage afforded by a business owner’s policy in a 2016 ransomware attack was resolved at summary judgment by a Maryland federal judge’s order.  In that case, insurance coverage was ultimately ordered, but only after years of litigation. If an insider were the source of the ransomware, the path to coverage would be even longer and more legally treacherous.

The question then becomes: what can be done?  First, companies must recognize just how enticing and valuable their digital assets have become. Just like any other valuable asset, digital assets demand a 360-degree approach to security, one that is regularly re-examined and scrutinized. Now that the human element has become an obvious and well-funded vector for criminal mischief, companies must redouble their internal training and education. Employees must be taught that their access will be targeted by criminal elements, and how to respond. You do not want your employees to be surprised by novel and unexpected attention. It is best to let everyone know that they are not participants in a spy movie but could be participants in a prison movie, if they choose poorly.

Companies also have to begin planning for cyber litigation now, not later. Preparing on a litigation footing will reinforce proper workflows and decision-making, even under pressure. Early litigation preparation will also strengthen later arguments that the cyber response process should be considered privileged, which is a burgeoning litigation fight.  Those privilege issues deserve a separate discussion.  However, to paraphrase digital godfather Benjamin Franklin, smart companies know that one byte of preparation equals one terabyte of cure.

Have you seen the new headline about Twitter in the news?  It may be time to double-check your corporate practices and check in with your employees.

The top new FTC privacy probe concerns Twitter, which the FTC has accused of breaching a 2011 consent decree by taking phone numbers and email addresses that users provided for security purposes (e.g., two-factor authentication) and using that information to target users with advertisements.  According to Twitter, the FTC has alleged that this conduct breached the 2011 consent decree (the result of a hacking incident), which purportedly “bars Twitter for 20 years from misleading users about the extent that it protects their privacy.”  Twitter’s misuse of users’ phone numbers and email addresses for direct advertising was self-revealed by the company in an October blog post, which noted that it did not know how many people were impacted.  Twitter called the misuse “inadvertent.”  Twitter said on Monday it expected to pay between $150 million and $250 million to resolve the FTC investigation.

This story should have all corporations taking a look at their own corporate practices and making sure that similar actions are not happening behind their own closed doors.  All companies are “barred” from misleading users about the extent to which they protect user privacy by virtue of, inter alia, Section 5 of the FTC Act and state UDAP statutes.  (In Twitter’s case, because of its 2011 security incident, it also was barred via a consent decree.)  Also, with many employees working remotely these days, it may be harder for companies to oversee how different sections of the company are interacting.  Perhaps in Twitter’s case this alone led to the issue?  Who knows.

In any event, let Twitter’s big headline be a reminder to all companies to: (1) review your privacy policy and any other representations that you make to customers regarding the privacy and security of their data; (2) review your corporate procedures (not just policies, but check in with the boots on the ground) to ensure they are consistent with your privacy policy and other representations that you make to customers; and (3) make sure corporate training events regarding privacy and security are in place, so as to create a corporate culture of data protection and privacy by design.

Mistakes happen, but diligence can prevent them…and can help serve as a defense for when they do happen.