While the nation was preoccupied with the presidential race, California voted “YES” on the California Privacy Rights Act (“CPRA”), which amends and expands on the CCPA. Our previous post sets forth the highlights of the balloted Proposition 24, but in a nutshell, the CPRA was designed to close a number of loopholes in the CCPA, strengthen consumer privacy protections, and establish the California Privacy Protection Agency as the primary enforcement authority.

Some dates to keep in mind – The CPRA amendments become effective on January 1, 2023, and will apply to personal information (“PI”) collected by covered businesses on or after January 1, 2022. The CCPA’s existing exemption for PI collected for employment purposes or in connection with business-to-business (B2B) communications was extended until January 1, 2023 (previously January 1, 2022). The California Privacy Protection Agency must be established by July 1, 2021 and must adopt final regulations by July 1, 2022. Enforcement of the CPRA amendments by the Agency will not begin until July 1, 2023.

While the key dates seem far off, companies should start thinking, sooner rather than later, about what steps they will need to take to implement the new privacy law. And the good news is that you don’t have to start from scratch – compliance with the CPRA should simply build on current efforts toward compliance with the CCPA (which remains enforceable as currently implemented).

So what steps should your business take now?

  • Determine if your business is covered by the CPRA. The CPRA changes the thresholds for businesses that are subject to California’s privacy law. For instance, the CPRA defines covered businesses as those engaged in the buying, selling, or sharing of the PI of more than 100,000 California consumers/households. This is an increase from 50,000 under the CCPA, meaning that more small businesses are now excluded from the regulation. However, the specific addition of businesses that “share” PI clarifies and expands the scope of the law to now cover businesses that provide PI to others, whether or not for monetary or other valuable consideration (in an effort to further regulate the use of PI for behavioral or targeted advertising purposes).
  • Revamp your data subject request systems.  The CPRA creates new rights for California consumers, including the right to correct PI, the right to limit the use of sensitive PI, and the right to opt out of the “sharing” of PI. Your business should thus implement changes on the back end of its systems to accept and act on such requests by consumers. Companies will also need to consider how to distinguish and separate out “sensitive personal information,” which includes SSN, driver’s license number, passport number, credit card information in combination with log-in credentials for a financial account, geolocation data, health and biometric data, and information about race, ethnicity, religion, and sexual orientation (a minimal data-tagging sketch follows this list).
  • Review and revise business agreements. The CPRA places new contractual obligations on service providers, contractors, and third parties. Specifically, it requires that businesses sending PI to third parties enter into an agreement binding the recipient to the same level of privacy protections provided by the CPRA, granting the business rights to take reasonable steps to remediate unauthorized use, and requiring the recipient to notify the business if it can no longer comply.
  • Enhance data security and implement risk assessments. The CPRA requires businesses to take reasonable precautions to protect consumers’ PI from a security breach. In addition, under the California Privacy Protection Agency’s rulemaking, businesses whose processing of PI presents a significant risk to consumers’ privacy or security must (i) perform an annual cybersecurity audit, and (ii) submit to the Agency, on a regular basis, a risk assessment of the potential risks related to their processing of PI.
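
To make the “sensitive personal information” point concrete, below is a minimal, hypothetical Python sketch of how a back-end might tag data fields as sensitive so that “limit use” requests can be acted on. The field names and category labels are illustrative assumptions only and do not track the statutory definition word for word.

```python
# Minimal sketch: tagging hypothetical data fields as CPRA-style "sensitive
# personal information" so a back-end can act on "limit use" requests.
# Field names and categories are illustrative, not the statutory text.

SENSITIVE_FIELDS = {
    "ssn": "government identifier",
    "drivers_license": "government identifier",
    "passport_number": "government identifier",
    "precise_geolocation": "geolocation",
    "health_conditions": "health data",
    "fingerprint_template": "biometric data",
    "religion": "beliefs/affiliation",
}

def split_record(record: dict) -> tuple[dict, dict]:
    """Separate a consumer record into sensitive and non-sensitive portions."""
    sensitive = {k: v for k, v in record.items() if k in SENSITIVE_FIELDS}
    ordinary = {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}
    return sensitive, ordinary

if __name__ == "__main__":
    record = {
        "email": "jane@example.com",
        "ssn": "000-00-0000",
        "precise_geolocation": "37.77,-122.41",
    }
    sensitive, ordinary = split_record(record)
    print("Sensitive:", sensitive)
    print("Ordinary:", ordinary)
```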

If you need any help navigating or implementing California’s evolving privacy law, please contact us at privacy@rothwellfigg.com.

Four Rothwell Figg attorneys, Jenny Colgate, Caitlin Wilmot, Martin Zoltick, and Jennifer Maisel, authored the U.S. section of Mondaq’s Data Privacy Comparative Guide.

Mondaq’s Comparative Guides provide an overview of key points of law and practice, and allow users to compare regulatory environments and laws across multiple jurisdictions on a global scale. Users can select a topic, choose regions, and refine subjects to view detailed analysis provided by carefully selected, internationally recognized experts.

To read the U.S. section of the Data Privacy Comparative Guide, please click here.

If your company has been scratching its head and racking its brain since the Schrems II decision issued on July 16, 2020, invalidating Privacy Shield and calling into question all data transfers between the EU and third countries on surveillance-related grounds, your wish for more guidance has finally come true.

This week, the European Data Protection Board (EDPB) adopted recommendations on the European Essential Guarantees (EEG) for surveillance measures, as well as recommendations on measures that supplement transfer tools.  Additionally, the European Commission published new draft standard contractual clauses (SCCs) and its draft implementing decision.  A discussion of each is below.

Accompanying the recent flurry of activity, the EDPB issued a press release acknowledging the importance of the issued guidance to companies that have been struggling to know how to conduct cross-border data transfers following the July 2020 Schrems II ruling.  In the press release, the EDPB Chair, Andrea Jelinek, said:

“The EDPB is acutely aware of the impact of the Schrems II ruling on thousands of EU businesses and the important responsibility it places on data exporters. The EDPB hopes that these recommendations can help data exporters with identifying and implementing effective supplementary measures where they are needed. Our goal is to enable lawful transfers of personal data to third countries while guaranteeing that the data transferred is afforded a level of protection essentially equivalent to that guaranteed within the EEA.”

She added:

“The implications of the Schrems II judgment extend to all transfers to third countries. Therefore, there are no quick fixes, nor a one-size-fits-all solution for all transfers, as this would be ignoring the wide diversity of situations data exporters face. Data exporters will need to evaluate their data processing operations and transfers and take effective measures bearing in mind the legal order of the third countries to which they transfer or intend to transfer data.”

Recommendations Regarding Surveillance Measures: European Essential Guarantees (EEG)

On November 10, 2020, the EDPB adopted recommendations on the European Essential Guarantees for surveillance measures.  These recommendations provide data exporters with a framework for determining if the surveillance practices in a third country with respect to public authorities’ access to data can be regarded as justifiable interference with the rights to privacy and the protection of personal data, such that they do not impinge on the commitments of the Article 46 GDPR transfer tool that the data exporter and importer rely on.

The publication starts with an introduction setting forth the historical framework for the issuance of the recommendations, including the Schrems I judgment, the Schrems II judgment, and the fact that the invalidation of Privacy Shield had consequences for other transfer tools as well (i.e., any tools referred to in Article 46 GDPR).  The introduction explains that the Schrems II judgment determined that US surveillance measures interfered with what are considered “fundamental rights” under EU law, i.e., the rights to respect for private and family life, including communications, and to the protection of personal data.  These rights are laid down in Articles 7 and 8 of the Charter of Fundamental Rights of the EU.

The introduction further explains that the rights in Articles 7 and 8 of the Charter are not absolute, but must be considered in relation to their function in society, and it points to Article 52(1) of the Charter, which specifies the scope of possible limitations to Articles 7 and 8, including: “Subject to the principles of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognized by the Union or the need to protect the rights and freedoms of others.”  It further states that legislation involving interference with the fundamental rights guaranteed by Articles 7 and 8 “must lay down clear and precise rules governing the scope and application of the measure and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse,” in particular where personal data is subjected to automatic processing and “where there is a significant risk of unlawful access to that data.”

Finally, the introduction explains that the “four European Essential Guarantees” (set forth in the publication) are intended to specify “how to assess the level of interference with the fundamental rights to privacy and data protection in the context of surveillance measures by public authorities in a third country, when transferring personal data, and what legal requirements must consequently apply in order to evaluate whether such interferences would be acceptable under the Charter.”

The four European Essential Guarantees are as follows:

  1. Processing should be based on clear, precise and accessible rules;
  2. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated;
  3. An independent oversight mechanism should exist; and
  4. Effective remedies need to be available to the individual.

The paper then goes into detail explaining and developing each of these guarantees, and emphasizes, with respect to the fourth guarantee (effective remedies), the language from the Schrems II judgment explaining that “data subjects must have the possibility of bringing legal action before an independent and impartial court in order to have access to their personal data, or to obtain the rectification or erasure of such data,” as well as the court’s point that effective judicial protection against interferences with personal data can be ensured not only by a court, but also by a body which offers guarantees essentially equivalent to those required by Article 47 of the Charter.  (Note: Article 47 of the Charter sets forth the right to an effective remedy and a fair trial.)

Finally, in the “Final Remarks,” the paper acknowledges that the four guarantees require “a certain degree of interpretation, especially since the third country legislation does not have to be identical to the EU legal framework.”  Notwithstanding, it concludes that the assessment of third country surveillance measures against the EEG (European Essential Guarantees) may lead to one of two conclusions: (1) the third country legislation at issue does not ensure the EEG requirements, or (2) the third country legislation satisfies the EEG.  Thus, it appears that determinations should be made on a country-by-country basis.

Recommendations on Measures that Supplement Transfer Tools

The same day, November 10, 2020, the European Data Protection Board (EDPB) also adopted “recommended measures” for complying with the GDPR requirements for EU-third country data transfers, including example “supplementary measures” to supplement a transfer tool where an exporter determines that a third country’s laws are or may be insufficient or not comparable to those required by the EU.  The publication provides detailed guidance, i.e., a “roadmap,” on how companies can determine whether a particular EU-third country data transfer may occur and what steps are necessary.  These recommendations are open for public comment until November 30, 2020.

The Executive Summary of the recommended measures, setting forth the background and purpose of the recommendations, explains: “Standard contractual clauses and other transfer tools mentioned under Article 46 GDPR do not operate in a vacuum.  The Court [in Schrems II] states that controllers or processors, acting as exporters, are responsible for verifying, on a case-by-case basis and, where appropriate, in collaboration with the importer in the third country, if the law or practice of the third country impinges on the effectiveness of the appropriate safeguards contained in the Article 46 GDPR transfer tools.  In those cases, the Court still leaves open the possibility for exporters to implement supplementary measures to fill these gaps in the protection and bring it up to the level required by the EU law.  The Court does not specify which measures these could be.  However, the Court underlines that exporters will need to identify them on a case-by-case basis.  This is in line with the principle of accountability of Article 5.2 GDPR, which requires controllers to be responsible for, and be able to demonstrate compliance with the GDPR principles relating to processing of personal data.”  (emphasis added).

The recommended measures consist of a series of six steps for exporters to follow, and include potential sources of information and examples of some supplementary measures that could be put in place.  Below is an overview of the six steps.

  1. Exporters should know their transfers.  It is strongly advised that exporters map all of their transfers of data to third countries, even though it is a difficult exercise, because it will allow them to determine (a) whether there is a sufficient level of protection, and (b) whether the data transferred is adequate, relevant, and limited to what is necessary.
  2. Exporters should verify the transfer tool being used (e.g., an adequacy decision or some other transfer tool listed under Article 46 GDPR).  There is a reminder that the derogations provided in Article 49 GDPR may only be used for occasional and non-repetitive transfers, if conditions are met.
  3. Exporters should assess if there is anything in the law or practice of the third country that may impinge on the effectiveness of the appropriate safeguards of the transfer tools being relied on, in the context of the specific transfer.  While this step provides that primary focus  should be on the third country legislation that is applicable, and the EEG recommendations (see above) should be used to assess it, this step also provides that “In the absence of legislation governing the circumstances in which public authorities may access personal data, if you still wish to proceed with the transfer, you should look into other relevant and objective factors, and not rely on the subjective factors such as the likelihood of public authorities’ access to your data in a manner not in line with EU standards.”  It further provides that such an assessment should be conducted with due diligence and documented thoroughly, “as you will be held accountable to the decision you may take on that basis.”
  4. Exporters should identify and adopt supplementary measures that are necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, if their assessment reveals that the third country legislation impinges on the effectiveness of the Article 46 GDPR transfer tool they are relying on or intend to rely on.  Some examples of effective supplementary measures are set forth in Annex 2, but it is noted that some of these measures may be effective in some countries but not others, depending on the country’s laws.  It is further provided that: “You may ultimately find that no supplementary measure can ensure an essentially equivalent level of protection for your specific transfer.  In those cases where no supplementary measure is suitable, you must avoid, suspend or terminate the transfer to avoid compromising the level of protection of the personal data.  You should also conduct this assessment of supplementary measures with due diligence and document it.”  (emphasis added).
  5. Exporters should take any formal procedural steps required for the adoption of supplementary measures, and some such formalities are set out in the recommendations.  For example, if you intend to modify the SCCs, or where the supplementary measures added contradict the SCCs, you are no longer deemed to be relying on the SCCs and must seek authorization from the competent supervisory authority in accordance with Article 46(3)(a) GDPR.
  6. Exporters should re-evaluate at appropriate intervals the level of protection accorded to the data transferred to third countries and monitor if there have been or will be any developments that will affect those transfers/that data, as the principle of accountability requires continuous vigilance of the level of protection of personal data.

New Draft SCCs and Implementing Decision on SCCs

Yesterday, November 12, 2020, the European Commission published a draft set of new SCCs, as well as a draft implementing decision.  The drafts are open for public comment until December 10, 2020.  They are expected to be adopted in late 2020 or early 2021.

Of particular importance in addressing Schrems II, clause 2(a) of the draft SCCs requires the parties to warrant that they “have no reason to believe that the laws in the third country …, including any requirements to disclose personal data or measures authorizing access by public authorities, prevent the data importer from fulfilling its obligations under these Clauses.  This is based on the understanding that the laws that respect the essence of the fundamental rights and freedoms and do not exceed what is necessary and proportionate in a democratic society to safeguard one of the objectives listed in Article 23(1) GDPR, are not in contradiction with the Clauses.”

Clause 2 further requires:

  • the parties to declare that they have taken proper due diligence measures to assess, inter alia, the specific circumstances of the transfer, the laws of the third country of destination, and any safeguards in addition to those under those clauses;
  • the data importer to warrant that it has made best efforts to provide the data exporter with relevant information and agrees it will continue to cooperate with the data exporter in ensuring compliance;
  • the parties to document their assessment process and make it available to the competent supervisory authority upon request;
  • the data importer to notify the data exporter if it has any reason to believe that it is or has become subject to laws not in line with the requirements of Clause 2; and
  • the data exporter, upon learning or having reason to believe that the data importer cannot fulfill its obligations, to identify appropriate measures to be adopted by the data exporter and/or data importer to address the situation, if appropriate in consultation with the competent supervisory authority.


Earlier this week, the Federal Trade Commission (FTC) announced a settlement with Zoom that will require the company to enhance its data security practices to address allegations that the videoconferencing provider engaged in a series of deceptive and unfair practices that duped users into a false sense of security. Zoom, which has become a household name this year – with the FTC reporting a surge in users from 10 million in December 2019 to 300 million in April 2020 during the pandemic – has agreed to implement a comprehensive information security program and to stop making misrepresentations about its privacy and data security practices.

In its complaint, the FTC asserts that, since at least 2016, Zoom misled users by advertising “end-to-end, 256-bit encryption,” which, in theory, would secure communications so that only the sender and recipient(s), and no other person, not even the platform provider, can access the content. In reality, Zoom’s practices fell far short, with the platform maintaining cryptographic keys that could access user content, storing unencrypted recordings for up to 60 days, and offering a lower level of encryption than promised. Subpar security controls also allowed for unwanted intrusions, or “Zoombombing.” The complaint further alleges that Zoom engaged in deceptive and misleading practices with regard to secretly installed software, called the ZoomOpener web server, which allowed the program to bypass computer security safeguards and remained on users’ computers even after the Zoom app had been deleted. The Commission found that such deployment of the ZoomOpener, without adequate notice or user consent, was unfair and violated the FTC Act.
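
For readers less familiar with the encryption terminology, the sketch below (not Zoom’s actual implementation) uses the third-party Python cryptography package to illustrate the point underlying the FTC’s allegation: “256-bit” describes the key length, but whether encryption is truly end-to-end turns on who holds the key; any party holding it, including a platform provider, can decrypt the content.

```python
# Illustrative only: AES-256-GCM with the third-party "cryptography" package.
# The key takeaway is key custody, not key length: whoever holds the 256-bit
# key can decrypt, so encryption is "end-to-end" only if the provider never has it.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key; ideally held only by the endpoints
aesgcm = AESGCM(key)
nonce = os.urandom(12)

ciphertext = aesgcm.encrypt(nonce, b"meeting audio/video frame", None)

# Any party with the key (e.g., a provider that retains it server-side)
# can recover the plaintext.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
print(plaintext)
```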

Under the settlement agreement, Zoom is required to annually assess and document any potential security risks and develop ways to protect against these vulnerabilities; to refrain from making misrepresentations about how it collects and uses personal information or the level of security offered to users; and to obtain biennial assessments of its security program by an independent third party for the next 20 years.

The Commission voted 3-2 on the proposed consent agreement with Zoom, with the two dissenters arguing that the agreement amounted to nothing more than a slap on the wrist for the videoconferencing giant, whereas the security failures warranted more serious action. They also noted that the proposed settlement agreement provided no remedy for affected users and no other meaningful accountability.

The FTC will soon publish a description of the consent agreement in the Federal Register, subject to public comment for 30 days. The FTC will then decide whether to make the proposed consent order final. Each violation of such an order may result in a civil penalty of upwards of $40,000.

Rothwell Figg attorney Christopher Ott published a chapter titled “Phantom Responsibility: How Data Security and Privacy Lapses Lead to Personal Liability for Officers and Directors” in the International Comparative Legal Guide to: Cybersecurity 2021, published by Global Legal Group, Ltd.

Boards of directors ignore data security and privacy risks at the peril of their companies and – increasingly – their own personal liability. A business has its operations halted by ransomware approximately every 10 seconds. Billions of records are exposed every fiscal quarter. The global costs of these breaches and online crime reach the trillions every year. These potential costs have elevated data security and privacy issues from mere “IT issues,” or compliance minutiae, to the centerpiece of strategic risk management. The law has grown to match this reality. As a result, boards face expanding personal legal liability for the company’s data security and privacy failures.

In 2014, Securities and Exchange Commission Commissioner Luis Aguilar stated that “boards that choose to ignore, or minimize, the importance of cybersecurity oversight responsibility do so at their own peril”. Those perils are changing in real time, just as cybersecurity and privacy threats are changing. However, it is possible to identify certain concrete areas of established liability and to strategically identify the emergent risks. In the chapter, Mr. Ott explores the current trends and tackles a few harder-to-classify risks related to United States national security oversight of cyber readiness.

To read the article in its entirety, access the article on the ICLG website here: https://iclg.com/practice-areas/cybersecurity-laws-and-regulations/3-phantom-responsibility-how-data-security-and-privacy-lapses-lead-to-personal-liability-for-officers-and-directors.

Although this is no ordinary campaign, recent news shows how politicians have many of the same worries as typical businesses.  On Thursday, October 29, 2020, the Wisconsin Republican Party reported that it had been victimized by a Business Email Compromise (BEC). There are many ways in which a criminal may conduct a BEC scam, but one of the most common occurs when hackers compromise a vendor’s email accounts to hijack vendor payments. With this access, the hacker prepares elaborate fake invoices (or other supporting documents) mirroring the appearance, content, amount, and timing of typical documents from the vendor. The hacker then submits a request to change the usual payment procedures.  The hackers’ new payment plan always involves a well-known U.S. bank. When the victim business makes the next vendor payment, the money quickly moves out of the U.S.-based bank and out of the country.

That is exactly what appears to have happened here: after a successful phishing campaign, hackers stole $2.3 million from the Wisconsin Republican Party that was intended for use in the president’s re-election campaign. The theft was accomplished by tampering with invoices submitted to the party by four vendors; the modified invoices directed the state GOP to send money to accounts controlled by the hackers. (Phishing should be the subject of a separate, longer discussion. For today’s purposes, it is enough to know that “phishing” involves using emails to trick the recipient into handing over network control or credentials, and/or installing malware that gives the hackers remote access to those systems.)

BEC cybercrime is big business.

  • While splashy malware attacks receive media attention, BEC fraud quietly costs businesses billions (with a “B”!) of dollars in reported losses every year.
  • Email remains a top attack vector for BEC attackers because, compared to hacking a company’s network infrastructure, it provides an easier, demonstrably profitable path for criminals.
  • The email accounts used in these attacks are often single-use, and hackers establish or hijack tens of thousands of them every year.

Similarly, the election infrastructure is also grappling with ransomware attacks. Ransomware is a type of malware that threatens to publish the victim’s data or perpetually block access to it unless a ransom is paid. These attacks can be very disruptive. Imagine that you are running a hospital – which is the subject of another recent hacking campaign – and your health data is inaccessible: people could actually die.  Ransomware costs are climbing rapidly.  This is complicated by the fact that a company can also face fines for paying that ransom.

We at Rothwell Figg often litigate and investigate the fallout from these BEC and ransomware events for clients. Although the consequences can be dire, there are real advantages to be had from a smart and active cybersecurity legal response. Every organization could benefit from some work on its cyber hygiene, and no organization is immune to these risks. The Wisconsin Republican Party and various election officials are learning their lessons in the public eye.

Nearly a year after the California Consumer Privacy Act (CCPA) went into effect, Californians now have a chance to weigh in on the California Privacy Rights Act of 2020 (CPRA), which is on the November 3, 2020, ballot as Proposition 24.  The CPRA is designed to strengthen consumer privacy protections by amending the CCPA to close certain loopholes, heighten enforcement through the creation of a California Privacy Protection Agency, and prevent the California legislature from weakening the law.  While many applaud the CPRA’s efforts, several groups—including the American Civil Liberties Union—have raised concerns that the CPRA is confusingly drafted, could worsen some of the CCPA’s loopholes, and may have the opposite effect of creating a ceiling on privacy legislation.

Below are a few highlights from Prop 24.

Data Retention: Section 1798.100(a)(3) specifies that a business that controls the collection of a consumer’s personal information shall, at or before the point of collection, inform consumers as to “the length of time the business intends to retain each category of personal information” or, if that is not possible, “the criteria used to determine such period.”  “[A] business shall not retain a consumer’s personal information or sensitive personal information for each disclosed purpose for which the personal information was collected for longer than is reasonably necessary for that disclosed purpose.”  Data retention was not fully addressed in the original CCPA, and this is a good, common-sense measure from both a privacy and a data security perspective.

Sharing of Data: In response to covered entities narrowly interpreting the “sale” of personal information under the CCPA, the CPRA amends Section 1798.115 to provide that a consumer shall have the right to know what personal information is sold or shared, and the categories of parties to whom the personal information was sold or shared.  Section 1798.120 is also amended to reflect that consumers have a right to opt out of the sale or sharing of personal information. The business must also provide a clear and conspicuous link titled “Do Not Sell or Share My Personal Information.”  See Section 1798.134(a)(1).

Global Opt-Out of Sale or Sharing of Personal Data:  The CPRA calls for regulations to define a universal “opt-out preference signal sent by a platform, technology, or mechanism, to indicate a consumer’s intent to opt-out of the sale or sharing of the consumer’s personal information and to limit the use or disclosure of the consumer’s sensitive personal information.”  The global opt-out mechanism is a response to the tedious, and sometimes opaque, task of having to opt out of the sale of personal information individually with each covered entity.  (N.b., the California Attorney General’s final CCPA rules already require companies to honor a global “Do Not Sell” user-enabled privacy control, such as a browser plug-in or privacy setting, device setting, or other mechanism.)
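
As an illustration of how such a signal might work in practice, one widely discussed candidate is the Global Privacy Control, which participating browsers transmit as an HTTP header (“Sec-GPC: 1”). The CPRA leaves the precise signal to future regulations, so the following Python sketch of a server detecting that header is an assumption about one possible implementation, not a statement of what the regulations will require.

```python
# Illustrative sketch only: detect a Global Privacy Control ("Sec-GPC: 1")
# opt-out signal on incoming requests. The CPRA's required signal will be
# defined by regulation; this assumes GPC as one possible implementation.
from http.server import BaseHTTPRequestHandler, HTTPServer

class OptOutAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Treat the request as an opt-out of sale/sharing if the signal is present.
        opted_out = self.headers.get("Sec-GPC") == "1"
        body = b"opt-out signal received" if opted_out else b"no opt-out signal"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), OptOutAwareHandler).serve_forever()
```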

Sensitive Information: Sensitive personal information is expressly defined, and includes information such as social security, driver’s license, state identification, and passport numbers; a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code or other credentials; precise geolocation; racial or ethnic origin; religious or philosophical beliefs; union membership; the contents of mail, email, and text messages; genetic data; biometric information; health information; and sex life or sexual orientation, among others.  Section 1798.140(ae).  Section 1798.121 is added to provide that a consumer has the right to limit the use and disclosure of sensitive personal information.  Businesses must also provide a clear and conspicuous link on the business’s internet homepage titled “Limit the Use of My Sensitive Personal Information.”  Section 1798.135(a)(2).

Consent: Defined as “any freely given, specific, informed and unambiguous indication of the consumer’s wishes by which he or she . . . such as by a statement or by a clear affirmative action, signifies agreement to the processing of personal information relating to him or her for a narrowly defined particular purpose.”  Section 1798.140(h).

Enforcement: The CPRA establishes a California Privacy Protection Agency with full power, authority, and jurisdiction to implement and enforce the CCPA, and removes the “right to cure” language in the attorney general enforcement section.

Floor: The CPRA can be amended by the legislature only if the amendment is consistent with and furthers the initiative’s purposes and intent to “further protect consumers’ rights, including the constitutional right to privacy.”

With the uncertain prospect of a Federal privacy bill, and California and the CCPA setting the de facto data privacy standard in the US, perhaps other states are waiting to see what happens with Prop 24 before rolling out their own initiatives.  Some recent polling suggests that voters overwhelmingly support Prop 24, with 77% of likely voters saying they will vote YES on the ballot measure.

With all that has happened this year, most of us can’t wait until 2020 is in the rearview mirror.  The end of 2020, however, also marks the end of the post-Brexit transition period, which allowed time for UK businesses and organizations that rely on international data flows, target European customers, or operate inside the EEA to negotiate a new data protection relationship with the EU.

According to the UK Information Commissioner’s Office (ICO), after the transition period, the “UK GDPR” will take effect, with the same key principles, rights and obligations as the GDPR.  However, there are implications for the rules on transfers of personal data between the UK and the EEA.

Understanding your international flows and cross-border processing of personal data, and specifically, transfers from the EEA to the UK, is critical to assessing what steps may need to be taken, presumably prior to the end of the transition period, to continue to lawfully receive these transfers.  A determination will need to be made whether a new data protection relationship needs to be established, considering the use of standard contractual clauses, binding corporate rules, and the guidance provided by the UK ICO, European Data Protection Board, and the European Commission.

The current guidance, while certainly well-intentioned, is murky at best, leaving many open questions about what may be required.  So, as the anxiously awaited end of 2020 approaches, keep an eye on those relationships and how to make them work.

The Telephone Consumer Protection Act (TCPA) was passed in 1991 and is known by many as the law that created the “do-not-call” rules.  The statute includes a number of restrictions related to telephone, text, and fax solicitations, including a prohibition against what is colloquially known as “autodialing” and “robocalls,” and it creates a private right of action in the event of a violation, providing for the recovery of $500 for each violation or actual monetary loss (whichever is greater), an injunction, or both.

The question before the Supreme Court now, in Facebook, Inc. v. Noah Duguid, et al. (No. 19-511), is: What exactly is autodialing?  Or rather, what is an “automatic telephone dialing system” (ATDS), a term defined in the TCPA?

Does autodialing encompass any device that can store and automatically dial phone numbers?  Or does autodialing require the use of a random or sequential number generator?

The statute defines an ATDS as “equipment which has the capacity—(A) to store or produce telephone numbers to be called, using a random or sequential number generator; and (B) to dial such numbers.”  See 47 U.S.C. § 227(a)(1).  The TCPA provides the following prohibitions with respect to the use of an ATDS:

It shall be unlawful for any person within the United States, or any person outside the United States if the recipient is within the United States—(A) to make any call (other than a call made for emergency purposes or made with the prior express consent of the called party) using any automatic telephone dialing system or an artificial or prerecorded voice—(i) to any emergency telephone line (including any “911” line and any emergency line of a hospital, medical physician or service office, health care facility, poison control center, or fire protection or law enforcement agency); (ii) to the telephone number assigned to a paging service, cellular telephone service, specialized mobile radio service, or other radio common carrier service, or any service for which the called party is charged for the call, unless such call is made solely to collect a debt owed to or guaranteed by the United States.

[Note: Just a few months ago, on July 6, 2020, the Supreme Court issued a decision in a case that challenged the constitutionality of the TCPA on First Amendment grounds.  The basis of the challenge was a 2015 amendment to the TCPA, which permitted calls relating to the collection of debts guaranteed by the U.S. government.  The majority found that the restriction was subject to strict scrutiny and was an unconstitutional content-based restriction on speech.  But rather than find the entire statute unconstitutional, the Court severed the government debt collection exception, leaving the rest of the statute fully operative.]

So, getting back to the case at hand, the question is: does “using a random or sequential number generator” modify (1) both “store” and “produce,” or (2) just “produce”?  Circuit courts across the country have split over this question.  Most importantly for the case at hand, the Ninth Circuit found that “using a random or sequential number generator” modifies only “produce.”  Thus, according to the Ninth Circuit, the TCPA’s prohibition on using autodialers does not require use of a random or sequential number generator.
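
To make the interpretive difference concrete, the hypothetical Python sketch below contrasts equipment that produces numbers with a sequential number generator against equipment that simply dials numbers from a stored list. Under Facebook’s reading, only the first would be an ATDS; under the Ninth Circuit’s reading, both would be. It is purely illustrative and does not describe any real dialing equipment.

```python
# Hypothetical illustration of the two readings of the ATDS definition;
# not a description of any real dialing system.

def dial(number: str) -> None:
    print(f"dialing {number}")

def sequential_generator_dialer(area_code: str, start: int, count: int) -> None:
    """Produces numbers with a sequential number generator, then dials them."""
    for n in range(start, start + count):
        dial(f"{area_code}-{n:07d}")

def stored_list_dialer(stored_numbers: list[str]) -> None:
    """Automatically dials numbers from a stored list; no number generator."""
    for number in stored_numbers:
        dial(number)

if __name__ == "__main__":
    sequential_generator_dialer("555", 5550000, 3)
    stored_list_dialer(["555-0100", "555-0101"])
```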

Facebook filed its brief last month, arguing that to qualify as an ATDS the equipment must use a random or sequential number generator.  A number of organizations, as well as the US government, filed amicus briefs supporting Facebook’s position.  Duguid filed his brief earlier this month, urging the Court to affirm the Ninth Circuit’s judgment that any automated dialing of stored numbers constitutes use of an ATDS.

The Supreme Court is slated to hear oral arguments on December 8, 2020.

A lawsuit recently filed against Amazon.com for a violation of the Illinois Biometric Information Privacy Act (“BIPA”) should serve as a reminder to all companies engaged in COVID-19-related employee and/or customer screening that it is important to determine what privacy and cybersecurity laws apply to your screening measures, and to confirm that you are engaging in all required practices under those laws.

The recently-filed suit is a class action complaint brought by Michael Jerinic, a former Amazon employee, in the Circuit Court of Cook County, Illinois.  Jerinic alleges that in June 2020, in response to growing safety concerns related to the COVID-19 pandemic, Amazon began requiring its workers to provide a scan of their facial geometry, and possibly other biometric information, as part of a wellness check prior to being allowed access to the warehouse facility each day.  The complaint further alleges that “Defendant’s facial recognition devices and associated software collect and capture biometric identifiers such as scans of an individual’s facial geometry, retinas, and irises.  Defendant also scans and records the workers’ temperatures.”

Jerinic alleges that Amazon’s practices violated BIPA because Amazon did not, inter alia, provide written notice regarding the biometric information being collected or stored.