Growing regulatory action to combat so-called “dark patterns” used in web design to influence consumer choice has resulted in hundreds of millions of dollars in fines, and promises to remain an area of enforcement in 2023. Federal enforcement actions, state laws and agency guidance have cast dark patterns as a grave concern that regulators are looking to root out from company practice. But what exactly are dark patterns, and which practices do they encompass? Below, we discuss practices that risk being classified as dark patterns and how regulators are policing this new data privacy trap.
By: Natasha G. Kohne
First coined by web designer Harry Brignull in 2010, the term “dark patterns” generally refers to design practices in online user interfaces that influence users into making choices they would not otherwise have made and that may be against their interests.1 Federal Trade Commission (FTC) Commissioner Rohit Chopra similarly defined the term as “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent.”2 Agreement obtained through dark patterns, whether used to secure initial consent or to confuse or fatigue a user into staying in an agreement, may not constitute valid consent because the consumer has not been fully or meaningfully informed of their choices. Some examples might include:
- Ambiguously worded buttons that could trick people into making a different choice than they intended.
- An unnecessarily lengthy click-through process before customers can cancel a subscription.
- Requiring scrolling through long documents in order to opt out of data sharing.
- System defaults that collect more information than a consumer would expect.
The design choices that regulators could consider dark patterns are potentially limitless. Companies should therefore consider whether their designs might interfere with a consumer’s choice by, for instance, inducing a false belief about the choice being made, hiding unauthorized charges or creating unnecessary obstacles to opting out of an agreement. The sketch below illustrates the default-settings example in code.
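To make the default-settings example concrete, the following sketch contrasts two ways a site might configure data collection. It is purely illustrative; the interface and field names are hypothetical and are not drawn from any statute, regulation or enforcement action. A configuration that silently switches on optional collection risks being characterized as a dark pattern, whereas one that leaves optional collection off until the consumer affirmatively opts in keeps the choice with the consumer.

```typescript
// Hypothetical data-collection settings for a web application.
interface DataSharingPrefs {
  essentialCookies: boolean;     // required for the service to function
  analyticsTracking: boolean;    // optional: usage analytics
  thirdPartyAdSharing: boolean;  // optional: sharing data with ad partners
}

// Risky default: optional categories are silently switched on,
// collecting more information than a consumer would likely expect.
const presumptuousDefaults: DataSharingPrefs = {
  essentialCookies: true,
  analyticsTracking: true,
  thirdPartyAdSharing: true,
};

// More defensible default: only what is necessary is on; everything
// else stays off until the consumer affirmatively opts in.
const conservativeDefaults: DataSharingPrefs = {
  essentialCookies: true,
  analyticsTracking: false,
  thirdPartyAdSharing: false,
};

// The consumer's explicit choice, not a pre-set toggle, drives any change.
function applyConsumerChoice(
  current: DataSharingPrefs,
  optIns: Partial<DataSharingPrefs>
): DataSharingPrefs {
  return { ...current, ...optIns };
}

// Example: the consumer affirmatively enables analytics only.
const prefs = applyConsumerChoice(conservativeDefaults, { analyticsTracking: true });
console.log("Risky defaults:", presumptuousDefaults);
console.log("After explicit opt-in:", prefs);
```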
Dark Patterns in State Privacy Law
Dark patterns were thrust into the regulatory spotlight with the 2020 passage of the California Privacy Rights Act (CPRA), followed by the Colorado Privacy Act (CPA) in 2021 and the Connecticut Data Privacy Act (CTDPA) in 2022. Both the CPRA and CPA define dark patterns as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice,”3 while the CTDPA adds that the definition also includes “any practice the Federal Trade Commission refers to as a ‘dark pattern.’”4
These laws provide that a consumer’s agreement obtained by designs considered dark patterns does not constitute consent (at least for purposes of personal data processing). When presenting consumers with opt-out rights, such as the right to opt out of targeted advertising or use of sensitive personal information, companies must make sure that required consumer consents are freely given, specific, informed and unambiguous.
In its recently revised draft regulations, the California Privacy Protection Agency (CPPA) provides a list of design choices companies can make to avoid using dark patterns (a brief illustrative sketch follows the list):
- Using language that is easy for consumers to read and understand.
- Ensuring the path for a consumer to exercise a privacy-protective option is not longer, more difficult or more time-consuming than the less privacy-protective option.
- Offering consumers distinct choices to opt in to the sale or use of personal information such as “Accept” and “Decline,” instead of using options like “Yes” and “Ask me later” or “Accept All” and “More Information.”
- Avoiding use of toggles or buttons that feature unintuitive placement or confusing double negatives.
- Avoiding consent structures that force consumers to click through disruptive screens or consent to options that are incompatible with the expected service.5
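As a rough illustration of the symmetry the draft regulations describe, the browser-side sketch below presents an “Accept” and a “Decline” option with equal prominence, uses plain parallel wording rather than pairs like “Yes” and “Ask me later,” and lets either choice be completed in a single click. The wording, function name and markup are hypothetical and are offered only as one way such a prompt might be structured, not as a compliance template.

```typescript
// Minimal browser-side sketch of a consent prompt with symmetric choices.
// Both options are rendered the same way, and each resolves the choice in a
// single click; neither path requires extra screens or scrolling.
function renderConsentPrompt(onChoice: (optedIn: boolean) => void): void {
  const container = document.createElement("div");
  container.setAttribute("role", "dialog");

  const message = document.createElement("p");
  message.textContent =
    "May we share your information with advertising partners?";
  container.appendChild(message);

  // Plainly worded, parallel options -- not "Yes" vs. "Ask me later".
  const accept = document.createElement("button");
  accept.textContent = "Accept";
  accept.onclick = () => { onChoice(true); container.remove(); };

  const decline = document.createElement("button");
  decline.textContent = "Decline";
  decline.onclick = () => { onChoice(false); container.remove(); };

  container.appendChild(accept);
  container.appendChild(decline);
  document.body.appendChild(container);
}

// Usage: record the preference only when the consumer actively makes a choice.
renderConsentPrompt((optedIn) => {
  console.log(optedIn ? "Consumer opted in." : "Consumer declined.");
});
```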
Federal Enforcement and Guidance on Dark Patterns
The FTC issued a staff report in September 2022 titled Bringing Dark Patterns to Light, breaking down cases the agency has brought against companies for allegedly engaging in dark patterns. Stemming from an April 2021 workshop on dark patterns, the report provides insight into federal enforcement priorities and serves as a source of useful guidance for companies looking to navigate design best practices.
Practices like disguising ads to look like independent content, imposing difficult steps for consumers to cancel subscriptions, or burying key terms to trick consumers into sharing data are some of the dark pattern tactics discussed in the report.6 Examples of alleged dark patterns and the corresponding agency enforcement actions include:
- Design elements that induce false beliefs – These can involve a company making false or inflated advertising claims or using advertising designs that mislead the consumer into making a certain decision. In its case against Credit Karma, the FTC alleged that the company misled consumers into clicking links to apply for credit cards by telling consumers they had been “pre-approved” regardless of their actual pre-approval status. Credit Karma allegedly had chosen to employ the tactic after testing revealed it yielded higher click rates, an act that the FTC called a dark pattern to “trick consumers into taking actions in a company’s interest.”7
- Design elements that hide or delay disclosure of material information – These are designs that bury fees or important product information within a lengthy Terms of Service document. In its complaint against the LendingClub Corporation, the FTC alleged that the company used prominent visuals to deceive consumers into believing they would pay no hidden fees. LendingClub allegedly hid details about fees behind tooltip buttons consumers were unlikely to click on, while also burying mentions of fees between bolded, more prominent paragraphs later in the application process. The agency also referenced a purported dark pattern called “drip pricing,” in which companies advertise only part of a product’s total price and obscure other mandatory charges until later in the buying process.8
- Design elements that lead to unauthorized charges – The FTC also singles out charges for products and services that consumers do not intend to purchase or continue purchasing. In its case against children’s online learning site operator ABCmouse, the FTC alleged that consumers were hindered from cancelling free trials and subscription plans. According to the complaint, ABCmouse advertised easy cancellation while requiring consumers to click through lengthy pages with links that directed them out of the cancellation process.9 Similarly, in its case against Epic Games, the FTC alleged that the company designed the Fortnite interface in ways that led to unauthorized charges, such as saving credit card information for in-game currency purchases with no purchase confirmation required, placing preview buttons close to purchase buttons and switching the buttons for some items, and setting up hindrances to reversing unauthorized charges (such as locking accounts and providing a difficult-to-navigate refund request path). In addition to the $245 million fine, the proposed order requires Epic Games to restructure its billing and dispute practices and bars the use of dark patterns to gain consumer consent.10
- Design elements that obscure or subvert privacy choices – Dark patterns can also steer consumers away from their data privacy preferences. Lead generator Sunkey Publishing allegedly used websites designed to appear as official Army recruitment websites to manipulate those interested in enlisting into submitting their personal information. According to the FTC, Sunkey falsely promised to use the information only for military recruitment purposes while actually selling it for marketing leads. The FTC argued that lead generation like this manipulates consumers, and that companies should make sure the third-party lead generators they work with are not collecting consumer information for one purpose while sharing it for a different one without consumer consent.11
Both the report and the agency’s recent actions make it clear that the FTC will continue to investigate alleged uses of dark patterns designed to “get consumers to part with their money or data”12 in 2023. The FTC is also weighing comments from the comment period (closed in November) for its Advance Notice of Proposed Rulemaking on data privacy. This rulemaking could have a significant impact on company data practices, with the agency having sought public commentary on matters such as data collection, notice and choice, data monetization, data security and dark patterns.
Takeaways
Given the light that the FTC and other regulators are shedding on dark patterns, companies must be mindful of how they market to consumers online. Design choices concerning notice location, banner visibility, language, check-out procedures, free trial terms and countdown timers, among many others, can draw unwanted attention from regulators on the lookout for dark patterns that they claim manipulate consumer choice. With regulatory focus increasing over the past year, designing user interfaces that prioritize user choice and avoid practices considered dark patterns will be an important part of any risk mitigation strategy.
Companies should analyze their user interfaces from the perspective of the consumer and have them examined by lawyers familiar with the online regulatory space. Other essential considerations, such as clear disclosures to customers, understandable language and validly obtained consent, should also be prioritized as part of a risk mitigation program. Companies that anticipate scrutiny of their notice and consent procedures will have a better chance of avoiding regulators’ dark-pattern crosshairs.
Please contact a member of Akin Gump’s cybersecurity, privacy and data protection team if you have any questions about dark patterns or how they may affect your company.
1 Bringing Dark Patterns to Light, Federal Trade Comm’n, Staff Report (September 15, 2022), hereinafter “FTC Staff Report,” available at https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf.
2 Regarding Dark Patterns in the Matter of Age of Learning, Inc., Federal Trade Comm’n, Statement of Commissioner Rohit Chopra (September 2, 2020), available at https://www.ftc.gov/system/files/documents/public_statements/1579927/172_3086_abcmouse_-_rchopra_statement.pdf.
3 Cal. Civ. Code § 1798.140(l); S.B. 21-190 § 6-1-1303(9).
4 P.A. 22-15 § 1(11).
5 California Consumer Privacy Act Regulations, Modified Text of Proposed Regulations § 7004(a).
6 “FTC Staff Report,” available at https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf.
7 Credit Karma, LLC, In the Matter of, 2023138 (September 1, 2022), available at https://www.ftc.gov/system/files/ftc_gov/pdf/CK%20Complaint%209-1-22%20%28Redacted%29.pdf.
8 Federal Trade Comm’n v. LendingClub Corporation, 3:18-cv-02454-JSC (November 29, 2018), available at https://www.ftc.gov/system/files/documents/cases/lendingclub_corporation_first_amended_complaint.pdf.
9 Federal Trade Comm’n v. Age of Learning, Inc., a corporation, also d/b/a ABCmouse and ABCmouse.com, 2:20-cv-7996 (September 2, 2020), available at https://www.ftc.gov/system/files/documents/cases/1723086abcmousecomplaint.pdf.
10 Epic Games, In the Matter of, 1923203 (December 19, 2022), available at https://www.ftc.gov/system/files/ftc_gov/pdf/1923203EpicGamesACCO.pdf. See also Federal Trade Comm’n, FTC Action Against Vonage Results in $100 Million to Customers Trapped by Illegal Dark Patterns and Junk Fees When Trying to Cancel Service, Press Release (November 3, 2022), available at https://www.ftc.gov/news-events/news/press-releases/2022/11/ftc-action-against-vonage-results-100-million-customers-trapped-illegal-dark-patterns-junk-fees-when-trying-cancel-service (imposing a $100 million fine against Vonage, alleging that the company was imposing junk fees on consumers and then using “illegal dark patterns” to make it difficult for them to cancel service or stop recurring charges).
11 FTC Staff Report at 19.
12 Id. at 1.