On Monday, the Federal Trade Commission (FTC) announced a settlement with Everalbum, Inc., the California-based developer of a photo app called “Ever,” with regard to allegations that the company deceived consumers about the use of its facial recognition technology and its data retention practices.

The Ever app allowed users to store and organize photos and videos uploaded from their mobile devices, computers, and social media accounts. In February 2017, the app launched a new “Friends” feature that used facial recognition technology, allowing users to “tag” individuals appearing in photos and group together similarly tagged photos. Although Everalbum represented that it would not apply this facial recognition technology without users’ express consent, the FTC found that the feature was automatically activated for most Ever app users and could not be turned off. Only users residing in Illinois, Texas, Washington, and the EU (currently, the only jurisdictions with laws related to biometric identifiers) could choose whether to turn on the facial recognition feature.

Moreover, unbeknownst to users, the facial images extracted from the app were combined with facial images obtained from publicly available sources to create datasets used to develop facial recognition services sold to Paravision, a company specializing in security and AI technology. The FTC also alleged that although Everalbum had promised to delete the stored photos and videos of users who had deactivated their accounts, it failed to do so and instead retained the content indefinitely.

The settlement requires Everalbum (which shut down the Ever app in August 2020) to: (i) delete the photos and videos of deactivated accounts; (ii) destroy any facial recognition data, models, or algorithms derived from the users’ photos or videos; and (iii) obtain affirmative consent for any use of the biometric data collected from its facial recognition technology. Everalbum is also prohibited from misrepresenting how it collects, uses, discloses, maintains, or deletes personal information. In response to the settlement, Paravision has stated that it is committed to utilizing its facial recognition services in an ethical manner and that its most recent model does not use any of the Ever app’s user data.

Commissioner Rohit Chopra issued a statement along with the settlement, reiterating his view that facial recognition technology is “fundamentally flawed and reinforces harmful biases” and highlighting the importance of policing its use. He also called the FTC’s inability to seek a monetary penalty for an egregious first offense “unfortunate” and urged the FTC, the states, and regulators around the globe to enact laws restricting the use of facial recognition and biometric identifier technologies, and to vigorously pursue enforcement actions against transgressors.

Facebook, the parent company of WhatsApp, is reporting near-record-low revenue growth. Presumably in an effort to monetize WhatsApp more heavily, WhatsApp recently announced changes to its privacy policy: as of February 8, 2021, all WhatsApp users (except those who live in Europe) must agree to share their data with Facebook. If users do not agree, WhatsApp will delete their accounts.

Since the announcement of the changes to WhatsApp’s privacy policy, there has been much public criticism. The public has also been speaking through its actions, with massive numbers of users switching messaging platforms over the last several days.

For example, after Turkish President Recep Tayyip Erdogan’s media office and the country’s defense ministry told journalists that they were quitting WhatsApp and moving to an encrypted messaging app called BiP, businesswire.com reported that 4.6 million new users had signed up for BiP. BiP is a Turkish platform, and all data is stored with strong encryption in Turkcell’s data centers in Turkey. Atac Tansug, Turkcell’s Executive Vice President of Digital Services and Solutions, has promised users that they will not be obligated to share their data with third parties.

Elon Musk has also issued a call for users to switch from WhatsApp. Specifically, Musk has urged people via Twitter to switch to Signal. Signal is an encrypted app that lets users send messages and make calls via the Internet. Signal’s marketing message focuses on privacy. According to gadgets.ndtv.com, Signal is open source and its code is peer-reviewed, which means its privacy and security are regularly checked by independent experts.

Meanwhile, Microsoft is urging WhatsApp users to change to Microsoft’s messaging platform, Skype. The Skype Twitter account tweeted: “Skype respects your privacy. We are committed to keeping your personal data private and do not sell to 3rd parties.” The tweet also included a URL linking to Microsoft’s privacy policy statement.

The changes to WhatsApp’s privacy policy are a good example of:

(1) how companies try to monetize data available to them;

(2) how changes to a company’s privacy policy can drive users away;

(3) how Europe’s data privacy laws resulted in different treatment for those living in Europe; and

(4) how companies can compete (and will likely increasingly compete) based on their privacy policies.

Just before the Christmas holidays, the Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN) issued proposed rulemaking entitled “Requirements for Certain Transactions Involving Convertible Virtual Currency or Digital Assets.”  The proposed regulations seek to “require banks and money service businesses (“MSBs”) to submit reports, keep records, and verify the identity of customers in relation to transactions involving convertible virtual currency (“CVC”) or digital assets with legal tender status (“legal tender digital assets” or “LTDA”) held in unhosted wallets…”  The proposed rulemaking is set to be adopted under the Bank Secrecy Act (BSA).

FinCEN justified its proposal on national security grounds – i.e., the national security threat posed by bad actors using CVCs to, inter alia, “facilitate international terrorist financing, weapons proliferation, sanctions evasion, and transactional money laundering.”  Thus, the question arising out of the proposal is the same one that often arises; indeed, it is the same question that came out of the Schrems II decision that led to the invalidation of Privacy Shield last year: What is the proper balance of national security vs. personal privacy?

Specifically, in the case of FinCEN’s recent proposal, banks and MSBs would be required to:

  1. file a report with FinCEN containing certain information related to a customer’s CVC or LTDA transaction and counterparty (including name and physical address), and to verify the identity of their customer, if a counterparty to the transaction is using an unhosted or otherwise covered wallet and the transaction is greater than $10,000 (or the transaction is one of multiple CVC transactions involving such counterparty wallets and the customer flowing through the bank or MSB within a 24-hour period that aggregate to value in or value out of greater than $10,000); and
  2. keep records of a customer’s CVC or LTDA transaction and counterparty, including verifying the identity of their customer, if a counterparty is using an unhosted or otherwise covered wallet and the transaction is greater than $3,000 (a simplified sketch of these thresholds appears below).
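
To make these proposed thresholds concrete, the sketch below walks through the reporting and recordkeeping triggers in Python. It is our own simplified illustration, not FinCEN’s: the function and variable names are hypothetical, and the proposal aggregates “value in” and “value out” separately over the 24-hour window, which the sketch collapses into a single total.

```python
from datetime import datetime, timedelta

REPORTING_THRESHOLD = 10_000    # proposed reporting/identity-verification trigger (USD)
RECORDKEEPING_THRESHOLD = 3_000  # proposed recordkeeping/identity-verification trigger (USD)

def proposed_obligations(transactions, now):
    """transactions: list of (timestamp, usd_value) pairs between one customer
    and unhosted (or otherwise covered) wallet counterparties.
    Returns the obligations the proposal would appear to trigger.
    Simplification: value in and value out are totaled together here, whereas
    the proposal aggregates each direction separately."""
    window_start = now - timedelta(hours=24)
    in_window = [value for ts, value in transactions if window_start <= ts <= now]
    duties = []
    if sum(in_window) > REPORTING_THRESHOLD:
        duties.append("file report with FinCEN and verify customer identity")
    if any(value > RECORDKEEPING_THRESHOLD for value in in_window):
        duties.append("keep records of the transaction and verify customer identity")
    return duties

# Example: two same-day transfers of $6,000 each to unhosted wallets would
# aggregate above $10,000 within 24 hours and trigger the reporting obligation.
now = datetime(2021, 1, 4, 12, 0)
txns = [(now - timedelta(hours=3), 6_000), (now - timedelta(hours=1), 6_000)]
print(proposed_obligations(txns, now))
```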

The proposed rulemaking was open for public comments for only 15 days (the standard public comment period for these types of policies is 60 days), until January 4, 2021.  (Note: the Electronic Frontier Foundation (EFF) and Coinbase both criticized the limited timeframe for public comments, given that the holidays occurred during the 15-day period.)

Several of the public comments on the proposal were focused on privacy-related concerns.  Even though the proposal sought to make the know your customer (KYC) rules for traditional banking institutions equally applicable to cryptocurrency, commenters argued that the promises of cryptocurrency (e.g., privacy and self-sovereignty) and the technological nature of cryptocurrency (e.g., the public ledger for blockchain-based currencies like Bitcoin) introduced new concerns.

For example, EFF noted that some cryptocurrencies like Bitcoin keep a public record of all transactions.  Thus, if the name of a user connected with a particular Bitcoin address is known, “the government may have access to a massive amount of data beyond just what the regulation purports to cover.”

Jack Dorsey, the CEO of Twitter and Square, also submitted comments.  Dorsey’s major complaint was that the proposed rules would create “unnecessary friction” between cryptocurrency users and financial institutions, which could lead to “perverse incentives.”  “To put it plainly – were the [regulations] to be implemented as written, Square would be required to collect unreliable data about people who have not opted into our service or signed up as our customers.”

Of course, the proposed rulemaking will not be the end of action in cryptocurrency regulation.  The recently passed “National Defense Authorization Act for Fiscal Year 2021” (H.R.6395) contains additional anti-money laundering tools that may further complicate cryptocurrency procedures in the coming months.

Rothwell Figg partner Christopher Ott was interviewed by Frazer Rice on his “Wealth Actually” podcast, discussing cybersecurity for high-net-worth individuals. The discussion covers how to assess your own digital risks, how to protect yourself, and what to do when you have been compromised.

You can listen to the interview here: https://frazerrice.com/blog/ep-72-chris-ott/.

Pursuant to Section 6(b) of the FTC Act and a December 11, 2020, resolution of the Federal Trade Commission (FTC) entitled “Resolution Directing use of Compulsory Process to Collect Information Regarding Social Media and Video Streaming Service Providers’ Privacy Practices,” the FTC has now issued orders requiring Facebook, WhatsApp, Snap, Twitter, YouTube, ByteDance, Twitch, Reddit, and Discord to file information concerning the method and manner in which they collect, use, store, and disclose information about users and their devices.  The companies have 45 days to respond, and, at the moment, the study is not part of any specific law enforcement proceeding.

The FTC voted 4-1 in favor of beginning the study.  A Joint Statement of three of the supporting FTC Commissioners (Chopra, Slaughter, and Wilson) highlights that several “social media and video streaming companies have been able to exploit their user-surveillance capabilities to achieve such significant financial gains that they are now among the most profitable companies in the world.”  The Joint Statement further explains:

“One key aspect of the inquiry is ascertaining the full scale and scope of social media and video streaming companies’ data collection. The FTC wants to know how many users these companies have, how active the users are, what the companies know about them, how they got that information, and what steps the companies take to continue to engage users. The inquiry also asks how social media and video streaming companies process the data they collect and what kinds of inferences they are able to make about user attributes, interest, and interactions. The FTC wants to understand how business models influence what Americans hear and see, with whom they talk, and what information they share. The questions push to uncover how children and families are targeted and categorized. These questions also address whether we are being subjected to social engineering experiments. And the FTC wants to better understand the financial incentives of social media and video streaming services.”

Commissioner Phillips submitted a Dissenting Statement, primarily criticizing the breadth of the inquiry.

Although it is very early in this inquiry, we can glean certain discrete details about what this order does and does not mean. First, this order has little direct connection with the public debates about free speech and Section 230.  Rather, it signals that the primary consumer protection and privacy watchdog is going to dive into the so-called attention economy.  The attention economy drives the online advertising gold rush and many of the “free” online services that we enjoy. However, the effects that this still-new economic model may have on consumers remain poorly understood.

The FTC study may shine a light on the practice of offering free services in exchange for the monetization of personal data.  The investigation may also lead to further activity at the federal level in areas such as child exploitation, misinformation, and violence involving social media and video streaming services.  Sen. Edward Markey (D-Mass.) and Sen. Richard Blumenthal (D-Conn.) introduced a bill earlier this year that addresses the “non-transparent ways” digital media apps and platforms market to and collect information from users ages 16 and under.

IBM Security (the company’s cybersecurity division) has recently discovered a global phishing campaign targeting organizations associated with the critical coronavirus vaccine distribution chain. The division’s “X-Force,” created at the onset of the pandemic to monitor cyber threats against vaccine developers and distributors, released a report on Thursday with their analysis and recommendations.

Starting in September, and spanning at least six countries and multiple organizations, the malicious cyber operation appeared to specifically target the Cold Chain Equipment Optimization Platform (CCEOP), a $400M, five-year project launched in 2015 by Gavi, the Vaccine Alliance, UNICEF, and other partners. CCEOP was initiated to upgrade existing cold chain equipment in 56 countries by 2021, and has been specifically utilized this year to support vaccine response efforts for COVID-19. Because vaccines are both light- and heat-sensitive, the newly developed technology has been vital in preserving vaccine quality and potency during storage and transportation.

The cyberattacks involved impersonation of a business executive from Haier Biomedical, a legitimate CCEOP supplier and purportedly the world’s only complete cold chain provider. Disguised as this business leader, the cyber-attacker sent phishing emails to organizations that support the CCEOP project efforts, including those in the energy sector (e.g., companies that develop solar-powered vaccine refrigerators and dry ice through petroleum production) and the IT sector (e.g., organizations involved in safeguarding pharmaceutical manufacturing and the biotechnical and electrical components of container transportation). The emails posed as requests for quotations to participate in a vaccine program, but included malicious HTML attachments that prompted the recipient to enter their personal credentials. IBM believes that the purpose of harvesting the credentials is possibly to steal the cold chain technology or, more likely, to gain unauthorized access to critical and confidential information pertaining to COVID-19 vaccine distribution, including timelines, lists of recipients, and shipping routes, for a more nefarious objective. IBM has not yet identified the cyber-attacker(s), but given the sophistication and precision targeting of the attacks, they are believed to be the work of a nation-state rather than a rogue criminal operation.
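
As a purely illustrative aside (not drawn from IBM’s report or its indicators of compromise), one crude signal of the credential-harvesting technique described above is an HTML attachment containing a password form that posts to an untrusted host. A minimal Python sketch, with hypothetical names and a placeholder trusted-domain list, might look like this:

```python
# Illustrative sketch only: flag HTML email attachments that contain a
# credential-entry form posting somewhere other than a trusted domain.
from html.parser import HTMLParser
from urllib.parse import urlparse

class FormFinder(HTMLParser):
    """Collects form 'action' URLs and notes whether a password field exists."""
    def __init__(self):
        super().__init__()
        self.actions = []
        self.has_password_field = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form" and attrs.get("action"):
            self.actions.append(attrs["action"])
        if tag == "input" and (attrs.get("type") or "").lower() == "password":
            self.has_password_field = True

def looks_like_credential_harvester(html_text, trusted_domains=("example-supplier.com",)):
    # 'trusted_domains' is a placeholder; a real deployment would use the
    # organization's own allow-list and published indicators of compromise.
    parser = FormFinder()
    parser.feed(html_text)
    external_actions = [
        action for action in parser.actions
        if urlparse(action).hostname
        and not any(urlparse(action).hostname.endswith(d) for d in trusted_domains)
    ]
    return parser.has_password_field and bool(external_actions)

sample = ('<form action="https://credential-drop.example/collect">'
          '<input type="text" name="user"><input type="password" name="pass"></form>')
print(looks_like_credential_harvester(sample))  # True
```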

While it is currently unclear whether the phishing attacks were successful, governments are already sounding the alarm and alerting organizations involved in developing, manufacturing, and distributing the COVID-19 vaccine to the threat of cyberattacks. The thought of intentional interference with the vaccine’s cold chain distribution is chilling, and companies involved are strongly encouraged to review the recommendations and indicators of compromise set forth in IBM’s report, as well as the U.S. Cybersecurity & Infrastructure Security Agency’s (CISA) tips for avoiding phishing attacks in its CISA Insights: Enhance Email & Web Security.

While the nation was preoccupied with the presidential race, California voted “YES” on the California Privacy Rights Act (“CPRA”), which amends and expands on the CCPA. Our previous post sets forth the highlights of the balloted Proposition 24, but in a nutshell, the CPRA was designed to close a number of loopholes in the CCPA, strengthen consumer privacy protections, and establish the California Privacy Protection Agency as the primary enforcement authority.

Some dates to keep in mind: The CPRA amendments become effective on January 1, 2023, and will apply to personal information (“PI”) collected by covered businesses on or after January 1, 2022. The CCPA’s existing exemption for PI collected for employment purposes or in connection with business-to-business (B2B) communications was extended until January 1, 2023 (previously January 1, 2022). The California Privacy Protection Agency must be established by July 1, 2021 and must adopt final regulations by July 1, 2022. Enforcement of the CPRA amendments by the Agency will not begin until July 1, 2023.

While the key dates seem far off, companies should start thinking, sooner rather than later, about what steps they will need to take to implement the new privacy law. And the good news is that you don’t have to start from scratch: compliance with the CPRA should simply build on current efforts toward compliance with the CCPA (which remains enforceable as currently implemented).

So what steps should your business take now?

  • Determine if your business is covered by the CPRA. The CPRA changes the thresholds for businesses that are subject to California’s privacy law. For instance, the CPRA defines covered businesses as those engaged in the buying, selling, or sharing of the PI of more than 100,000 California consumers/households. This is an increase from 50,000 under the CCPA, meaning that more small businesses are now excluded from the regulation. However, the specific addition of businesses that “share” PI clarifies and expands the scope of the law to cover businesses that provide PI to others, whether or not for monetary or other valuable consideration (in an effort to further regulate the use of PI for behavioral or targeted advertising purposes).
  • Revamp your data subject request systems.  The CPRA creates new rights for California consumers, including the right to correct PI, the right to limit the use of sensitive PI, and the right to opt out of the “sharing” of PI. Your business should thus implement changes on the system back-end to accept and act on such requests by consumers. Companies will also need to consider how to distinguish and separate out “sensitive personal information,” which includes SSN, driver’s license number, passport number, credit card info in combination with log-in credentials for a bank account, geolocation data, health and biometric data, and information about race, ethnicity, religion, and sexual orientation.
  • Review and revise business agreements. The CPRA places new contractual obligations on service providers, contractors, and third parties. Specifically, it requires that businesses sending PI to third parties enter into an agreement binding the recipient to the same level of privacy protections provided by the CPRA, granting the business rights to take reasonable steps to remediate unauthorized use, and requiring the recipient to notify the business if it can no longer comply.
  • Enhance data security and implement risk assessments. The CPRA requires businesses to take reasonable precautions to protect consumers’ PI from a security breach. In addition, under the California Privacy Protection Agency’s rulemaking, businesses that process PI that presents a significant risk to consumers’ privacy or security must (i) perform an annual cybersecurity audit, and (ii) submit to the Agency on a regular basis a risk assessment with respect to the potential risks related to their processing of PI.

If you need any help navigating or implementing California’s evolving privacy law, please contact us at privacy@rothwellfigg.com.

Four Rothwell Figg attorneys, Jenny Colgate, Caitlin Wilmot, Martin Zoltick, and Jennifer Maisel, authored the U.S. section of Mondaq’s Data Privacy Comparative Guide.

Mondaq’s Comparative Guides provide an overview of key points of law and practice, and allow users to compare regulatory environments and laws across multiple jurisdictions on a global scale. Users can select a topic, choose regions, and refine subjects to view detailed analysis provided by carefully selected, internationally recognized experts.

To read the U.S. section of the Data Privacy Comparative Guide, please click here.

If you’re a company that has been scratching your head and racking your brain since the Schrems II decision was issued on July 16, 2020, invalidating Privacy Shield and calling into question all data transfers between the EU and third countries on surveillance-related grounds, your wish for more guidance has finally come true.

This week, the European Data Protection Board (EDPB) adopted recommendations on the European Essential Guarantees (EEG) for surveillance measures, as well as recommendations on measures that supplement transfer tools.  Additionally, the European Commission published new draft standard contractual clauses (SCCs) and its draft implementing decision.  A discussion of each is below.

Accompanying the recent flurry of activity, the EDPB issued a press release acknowledging the importance of the issued guidance to companies that have been struggling with how to conduct cross-border data transfers following the July 2020 Schrems II ruling.  In the press release, the EDPB Chair, Andrea Jelinek, said:

“The EDPB is acutely aware of the impact of the Schrems II ruling on thousands of EU businesses and the important responsibility it places on data exporters. The EDPB hopes that these recommendations can help data exporters with identifying and implementing effective supplementary measures where they are needed. Our goal is to enable lawful transfers of personal data to third countries while guaranteeing that the data transferred is afforded a level of protection essentially equivalent to that guaranteed within the EEA.”

She added:

“The implications of the Schrems II judgment extend to all transfers to third countries. Therefore, there are no quick fixes, nor a one-size-fits-all solution for all transfers, as this would be ignoring the wide diversity of situations data exporters face. Data exporters will need to evaluate their data processing operations and transfers and take effective measures bearing in mind the legal order of the third countries to which they transfer or intend to transfer data.”

Recommendations Regarding Surveillance Measures: European Essential Guarantees (EEG)

On November 10, 2020, the EDPB adopted recommendations on the European Essential Guarantees for surveillance measures.  These recommendations provide data exporters with a framework for determining if the surveillance practices in a third country with respect to public authorities’ access to data can be regarded as justifiable interference with the rights to privacy and the protection of personal data, such that they do not impinge on the commitments of the Article 46 GDPR transfer tool that the data exporter and importer rely on.

The publication starts with an introduction setting forth the historical framework for the issuance of the recommendations, including the Schrems I judgment, the Schrems II judgment, and the fact that the invalidation of Privacy Shield had consequences for other transfer tools as well (i.e., any tools referred to in Article 46 GDPR).  The introduction explains that the Schrems II judgment determined that US surveillance measures interfered with what are considered “fundamental rights” under EU law, i.e., the rights to respect for private and family life, including communications, and to the protection of personal data.  These rights are laid down in Articles 7 and 8 of the Charter of Fundamental Rights of the EU.

The introduction further explains that Articles 7 and 8 of the Charter are not absolute rights, but must be considered in relation to their function in society, and it points to Article 52(1) of the Charter, which specifies the scope of possible limitations to Articles 7 and 8, including: “Subject to the principles of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognized by the Union or the need to protect the rights and freedoms of others.”  It further states that legislation involving interference with the fundamental rights guaranteed by Articles 7 and 8 “must lay down clear and precise rules governing the scope and application of the measure and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse,” in particular where personal data is subjected to automatic processing and “where there is a significant risk of unlawful access to that data.”

Finally, the introduction explains that the “four European Essential Guarantees” (set forth in the publication) are intended to specify “how to assess the level of interference with the fundamental rights to privacy and data protection in the context of surveillance measures by public authorities in a third country, when transferring personal data, and what legal requirements must consequently apply in order to evaluate whether such interferences would be acceptable under the Charter.”

The four European Essential Guarantees are as follows:

  1. Processing should be based on clear, precise and accessible rules;
  2. Necessity and proportionality with regard to the legitimate objectives pursued need to be demonstrated;
  3. An independent oversight mechanism should exist; and
  4. Effective remedies need to be available to the individual.

The paper then goes into detail explaining and developing each of these guarantees. On the fourth guarantee (effective remedies), it emphasizes the language from the Schrems II judgment explaining that “data subjects must have the possibility of bringing legal action before an independent and impartial court in order to have access to their personal data, or to obtain the rectification or erasure of such data,” as well as the court’s point that effective judicial protection against interferences with personal data can be ensured not only by a court, but also by a body which offers guarantees essentially equivalent to those required by Article 47 of the Charter.  (Note: Article 47 of the Charter sets forth the right to an effective remedy and a fair trial.)

Finally, in the “Final Remarks,” the paper acknowledges that the four guarantees require “a certain degree of interpretation, especially since the third country legislation does not have to be identical to the EU legal framework.” Notwithstanding, it concludes that the assessment of third country surveillance measures against the EEG (European Essential Guarantees) may lead to one of two conclusions: (1) the third country legislation at issue does not ensure the EEG requirements, or (2) the third country legislation satisfies the EEG.  Thus, it appears that determinations should be made on a country-by-country basis.

Recommendations on Measures that Supplement Transfer Tools

The same day, November 10, 2020, the European Data Protection Board (EDPB) also adopted “recommended measures” for complying with the GDPR requirements for transfers from the EU to third countries, including example “supplementary measures” to supplement a third country’s laws if an exporter determines that those laws are or may be insufficient or not comparable to what the EU requires.  The publication provides detailed guidance, i.e., a “roadmap,” on how companies can determine whether a particular EU-to-third-country data transfer may occur and what steps are necessary.  These recommendations are open for public comment until November 30, 2020.

The Executive Summary of the recommended measures, setting forth the background and purpose of the recommendations, explains: “Standard contractual clauses and other transfer tools mentioned under Article 46 GDPR do not operate in a vacuum.  The Court [in Schrems II] states that controllers or processors, acting as exporters, are responsible for verifying, on a case-by-case basis and, where appropriate, in collaboration with the importer in the third country, if the law or practice of the third country impinges on the effectiveness of the appropriate safeguards contained in the Article 46 GDPR transfer tools.  In those cases, the Court still leaves open the possibility for exporters to implement supplementary measures to fill these gaps in the protection and bring it up to the level required by the EU law.  The Court does not specify which measures these could be.  However, the Court underlines that exporters will need to identify them on a case-by-case basis.  This is in line with the principle of accountability of Article 5.2 GDPR, which requires controllers to be responsible for, and be able to demonstrate compliance with the GDPR principles relating to processing of personal data.”  (emphasis added).

The recommended measures consist of a series of six steps for exporters to follow, and include potential sources of information and examples of some supplementary measures that could be put in place.  Below is an overview of the six steps.

  1. Exporters should know their transfers.  It is strongly advised that exporters map all of their transfers of data to third countries, even though it is a difficult exercise, because mapping will allow them to determine (a) whether there is a sufficient level of protection, and (b) whether the data transferred is adequate, relevant, and limited to what is necessary.
  2. Exporters should verify the transfer tool being used (e.g., an adequacy decision or some other transfer tool listed under Article 46 GDPR).  There is a reminder that the derogations provided in Article 49 GDPR may only be used for occasional and non-repetitive transfers, if conditions are met.
  3. Exporters should assess if there is anything in the law or practice of the third country that may impinge on the effectiveness of the appropriate safeguards of the transfer tools being relied on, in the context of the specific transfer.  While this step provides that the primary focus should be on the applicable third country legislation, and that the EEG recommendations (see above) should be used to assess it, it also provides that “In the absence of legislation governing the circumstances in which public authorities may access personal data, if you still wish to proceed with the transfer, you should look into other relevant and objective factors, and not rely on the subjective factors such as the likelihood of public authorities’ access to your data in a manner not in line with EU standards.”  It further provides that such an assessment should be conducted with due diligence and documented thoroughly, “as you will be held accountable to the decision you may take on that basis.”
  4. Exporters should identify and adopt supplementary measures that are necessary to bring the level of protection of the data transferred up to the EU standard of essential equivalence, if one’s assessment reveals that the third country legislation impinges on the effectiveness of the Article 46 GDPR transfer tool being relied on (or intended to be relied on).  Some examples of effective supplementary measures are set forth in Annex 2, but it is noted that some of these measures may be effective in some countries but not others, depending on the country’s laws.  It is further provided that: “You may ultimately find that no supplementary measure can ensure an essentially equivalent level of protection for your specific transfer.  In those cases where no supplementary measure is suitable, you must avoid, suspend or terminate the transfer to avoid compromising the level of protection of the personal data.  You should also conduct this assessment of supplementary measures with due diligence and document it.”  (emphasis added).
  5. Exporters should take any formal procedural steps required for the adoption of supplementary measures, and some such formalities are set out in the recommendations.  For example, if you intend to modify the SCCs, or where the supplementary measures added contradict the SCCs, you are no longer deemed to be relying on the SCCs and must seek authorization from the competent supervisory authority in accordance with Article 46(3)(a) GDPR.
  6. Exporters should re-evaluate at appropriate intervals the level of protection accorded to the data transferred to third countries and monitor if there have been or will be any developments that will affect those transfers/that data, as the principle of accountability requires continuous vigilance of the level of protection of personal data.

New Draft SCCs and Implementing Decision on SCCs

Yesterday, November 12, 2020, the European Commission published a draft set of new SCCs, as well as a draft implementing decision.  The drafts are open for public comment until December 10, 2020.  They are expected to be adopted in late 2020 or early 2021.

Importantly for addressing Schrems II, clause 2(a) of the draft SCCs requires the parties to warrant that they “have no reason to believe that the laws in the third country …, including any requirements to disclose personal data or measures authorizing access by public authorities, prevent the data importer from fulfilling its obligations under these Clauses.  This is based on the understanding that the laws that respect the essence of the fundamental rights and freedoms and do not exceed what is necessary and proportionate in a democratic society to safeguard one of the objectives listed in Article 23(1) GDPR, are not in contradiction with the Clauses.”

Clause 2 further requires:

  • the parties to declare that they have taken proper due diligence measures to assess, inter alia, the specific circumstances of the transfer, the laws of the third country of destination, and any safeguards in addition to those under those clauses;
  • the data importer to warrant that it has made best efforts to provide the data exporter with relevant information and agrees it will continue to cooperate with the data exporter in ensuring compliance;
  • the parties to document their assessment process and make it available to the competent supervisory authority upon request;
  • the data importer to notify the data exporter if it has any reason to believe that it is or has become subject to laws not in line with the requirements of Clause 2; and
  • the data exporter, upon learning or having reason to believe that the data importer cannot fulfill its obligations, to identify appropriate measures to be adopted by the data exporter and/or data importer to address the situation, if appropriate in consultation with the competent supervisory authority.

Earlier this week, the Federal Trade Commission (FTC) announced a settlement with Zoom that will require the company to enhance its data security practices to address allegations that the videoconferencing provider engaged in a series of deceptive and unfair practices that duped users into a false sense of security. Zoom, which has become a household name this year – with the FTC reporting a surge in users from 10 million in December 2019 to 300 million in April 2020 during the pandemic – has agreed to implement a comprehensive information security program and to stop making misrepresentations about its privacy and data security practices.

In its complaint, the FTC asserts that, since at least 2016, Zoom misled users by advertising “end-to-end, 256-bit encryption,” which, in theory, would secure communications so that only the sender and recipient(s), and no other person (not even the platform provider), could access the content. In reality, Zoom’s practices fell far short: the platform maintained cryptographic keys that could access user content, stored unencrypted recordings for up to 60 days, and offered a lower level of encryption than promised. Subpar security controls also allowed for unwanted intrusions, or “Zoombombing.” The complaint further alleges that Zoom engaged in deceptive and misleading practices with regard to its secretly installed software, called a ZoomOpener web server, which allowed the program to bypass computer security safeguards and would remain on users’ computers even after the Zoom app had been deleted. The Commission found that such deployment of the ZoomOpener, without adequate notice or user consent, was unfair and violated the FTC Act.
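
To make the technical distinction concrete, the sketch below (ours, not a description of Zoom’s implementation) uses Python’s cryptography package to show that AES with a 256-bit key produces the same ciphertext regardless of who holds the key; whether encryption is truly “end-to-end” turns on key custody, not key length.

```python
# Illustrative only: 256-bit AES-GCM encryption of a message. Whether this is
# "end-to-end" depends on who holds the key, not on the cipher or key length.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit symmetric key
nonce = os.urandom(12)                     # unique per message
ciphertext = AESGCM(key).encrypt(nonce, b"meeting content", None)

# End-to-end encryption: only the participants' devices ever hold `key`,
# so no one else, including the service provider, can run the line below.
# Provider-held keys: if the platform also retains `key`, it can decrypt
# the very same ciphertext server-side despite the "256-bit" label.
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
print(plaintext)
```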

Under the settlement agreement, Zoom is required to annually assess and document any potential security risks and develop ways to protect against these vulnerabilities; to refrain from making misrepresentations about how it collects and uses personal information or the level of security offered to users; and to obtain biennial assessments of its security program by an independent third party for the next 20 years.

The Commission voted 3-2 on the proposed consent agreement with Zoom, with the two dissenters arguing that the agreement amounted to nothing more than a slap on the wrist for the videoconferencing giant, whereas the security failures warranted serious action. They also noted that the proposed settlement agreement provided no remedy for affected users and no other meaningful accountability.

The FTC will soon publish a description of the consent agreement in the Federal Register, subject to public comment for 30 days. The FTC will then decide whether to make the proposed consent order final. Each violation of such an order may result in a civil penalty of upwards of $40,000.