As Biometrics Technologies Evolve, Consumer Risks Follow, Warns FTC

July 28, 2023

Reading Time: 4 min

In a policy statement released on May 18, 2023, the Federal Trade Commission (FTC) warned of several consumer data privacy risks related to the increasing commercial use of biometrics technologies.1 The Commission voted unanimously, 3-0, to adopt the policy statement, which builds on more than a decade of Commission guidance on biometrics, including its 2012 report on best practices for facial recognition technology.

Currently, there is no federal privacy law governing the collection and use of individuals’ biometric information, and only a few states and cities (Illinois, Texas, Washington, Portland and New York City) have enacted such legislation.2 However, the policy statement comes in a year when bills addressing biometric privacy issues have been introduced in at least 13 state legislatures. Biometric information has also appeared under the definition of “sensitive” information in several state comprehensive privacy laws, including the California Consumer Privacy Act (CCPA) and Tennessee’s recently enacted privacy law, a classification that mandates additional or heightened protections and consumer rights for this type of information.3 Against this backdrop of state action, the FTC acknowledges the commercial benefits of biometric technologies, but cautions that businesses using these tools in ways that harm consumers may face enforcement actions under Section 5 of the Federal Trade Commission Act (“FTC Act”), along with other laws.

Notably, the FTC defines “biometric information technologies” as “technologies that use or purport to use biometric information.”4 “Biometric information” is defined broadly as “data that depict or describe physical, biological, or behavioral traits, characteristics, or measurements of or relating to an identified or identifiable person’s body.” The FTC then specifies that biometric information “includes, but is not limited to, depictions, images, descriptions, or recordings of an individual’s facial features, iris or retina, finger or handprints, voice, genetics, or characteristic movements or gestures (e.g., gait or typing pattern)” and “also includes data derived from such depictions, images, descriptions, or recordings, to the extent that it would be reasonably possible to identify the person from whose information the data had been derived.”5  

Section 5 of the FTC Act prohibits “unfair or deceptive acts or practices in or affecting commerce.”6 As the FTC explains, the evolution and proliferation of biometric information technologies inevitably create new and increased risks to consumers. For example, biometric technologies may not only be abused for fraudulent purposes, but also “may perform differently across different demographic groups in ways that facilitate or produce discriminatory outcomes.”7

Under this framework, the policy statement includes a non-exhaustive list of examples of practices the FTC may consider “unfair” or “deceptive,” warning businesses that these practices may lead to enforcement action and encouraging businesses to regularly assess their practices against the evolving legal and technological landscape.

Deception

The Commission advises that the following practices may constitute deceptive trade practices that violate the FTC Act:

  • False or unsubstantiated marketing claims relating to the validity, reliability, accuracy, performance, fairness or efficacy of technologies using biometric information.
  • Deceptive statements about the collection and use of biometric information.

Unfairness

The FTC also describes several unfair practices related to the collection and use of biometric information that could violate the FTC Act. Further, it notes that a business’s failure to clearly and conspicuously disclose the collection and use of such information may deprive consumers of the ability to avoid harm and may therefore meet the definition of an unfair trade practice.

Assessment

Finally, the policy statement provides the following non-exhaustive list of factors the FTC may consider when assessing a company’s practices related to biometric information:

  • Failing to assess foreseeable harms to consumers before collecting biometric information.
  • Failing to promptly address known or foreseeable risks.
  • Engaging in surreptitious and unexpected collection or use of biometric information.
  • Failing to evaluate the practices and capabilities of third parties.
  • Failing to provide appropriate training for employees and contractors.
  • Failing to conduct ongoing monitoring of technologies that the business develops, offers for sale or uses in connection with biometric information.

To underscore its commitment to preventing deceptive and unfair practices in connection with the collection and use of biometric information, the FTC cites complaints from numerous data privacy-related enforcement actions. The message is clear: businesses must consider, and mitigate, the risk of harm to consumers if they wish to reap the benefits of biometric information technology.

Please contact a member of Akin’s cybersecurity, privacy and data protection team to learn more about how your company can optimize biometric technologies while avoiding FTC enforcement risk.


1 FTC Policy Statement, https://www.ftc.gov/system/files/ftc_gov/pdf/p225402biometricpolicystatement.pdf; see also https://www.ftc.gov/news-events/news/press-releases/2023/05/ftc-warns-about-misuses-biometric-information-harm-consumers.

2 See Biometric Information Privacy Act (BIPA), 740 ILCS 14; Texas Capture or Use of Biometric Identifier Act (CUBI), Tex. Bus. & Com. Code Ann. § 503.001; Washington Biometric Law, RCW § 19.375.010; NYC Admin. Code §§ 22-1201 – 1205; NYC Admin. Code §§ 26-3001 – 3007; Portland City Code Chapter 34.10.

3 California, Colorado, Connecticut, Indiana, Iowa, Montana, Tennessee, Utah and Virginia consider biometric information that is processed for the purpose of uniquely identifying an individual as “sensitive data” or “sensitive personal information.” Note that California and Tennessee also list biometric information generally as a type of personal information.

4 Policy Statement at 1.

5 Id.

6 15 U.S.C. § 45(a).

7 Policy Statement at 4.

© 2024 Akin Gump Strauss Hauer & Feld LLP. All rights reserved. Attorney advertising. This document is distributed for informational use only; it does not constitute legal advice and should not be used as such. Prior results do not guarantee a similar outcome. Akin is the practicing name of Akin Gump LLP, a New York limited liability partnership authorized and regulated by the Solicitors Regulation Authority under number 267321. A list of the partners is available for inspection at Eighth Floor, Ten Bishops Square, London E1 6EG. For more information about Akin Gump LLP, Akin Gump Strauss Hauer & Feld LLP and other associated entities under which the Akin Gump network operates worldwide, please see our Legal Notices page.