Class Action Lawsuits Allege Privacy Violations by Smart Speakers – Key Trends and Takeaways

Feb 3, 2020

Reading Time: 7 min

By: Jessica H. Ro

News reports that technology companies actively listen to user voice recordings have fueled conversation regarding smart speaker privacy. For example, in July 2019, it was reported that Apple contractors listened to a subset of Siri recordings to assess the adequacy of Siri’s responses. According to reports, these recordings included instances in which the speaker was accidentally activated—in other words, when the user did not use the designated “wake phrase” (e.g., “Hey Siri”). Consequently, the content of these recordings reportedly includes highly personal information, such as conversations between users and their doctors.

Additionally, this past summer, Germany’s Hamburg data protection authority ordered Google to stop listening to recordings collected from its voice-activated Google Assistant product to improve the company’s speech recognition technology. The three-month moratorium came after Google confirmed that it had listened in on Google Assistant users, a potential violation of the EU General Data Protection Regulation (GDPR). Google explained that it listens to a small subset of voice clips as part of its efforts to improve how its voice recognition systems interpret different accents and dialects across languages.

Since last June, a number of class action complaints have been filed in federal district courts against top smart speaker and virtual assistant providers—specifically, Amazon, Google and Apple. In each case, the plaintiffs contend they did not consent to their smart speakers recording their conversations and that such recording constitutes a violation of privacy.

Here are a few key trends that have emerged in these class actions:

Most of the classes include minors, and one encompasses non-consumer “bystanders.”

A majority of these putative class actions seek to represent minors. One such case, Adamsky et al. v. Amazon.com, No. 2:19-cv-01214 (W.D. Wash. filed Aug. 2, 2019), defines the proposed class as all children under the age of 13 residing in the United States who used Alexa on the Amazon Echo Dot Kids Edition. Another case, Hall-O’Neil v. Amazon.com, No. 2:19-cv-00910 (W.D. Wash. filed June 11, 2019), is even more specific—it proposes classes composed of minors who used an Alexa-enabled device but who had not downloaded and installed the Alexa App themselves. Three other cases include subclasses for minors who were recorded by a smart speaker; these cases argue that businesses cannot obtain consent from minors who did not purchase the smart speaker device or possess user accounts of their own. Lopez v. Apple, Inc., No. 4:19-cv-04577-NC (N.D. Cal. filed Aug. 7, 2019); Wilcosky v. Amazon.com, No. 1:19-cv-05061 (N.D. Ill. filed July 26, 2019); In re Google Assistant Privacy Litigation, No. 5:19-cv-04286 (N.D. Cal. filed July 25, 2019).

On a similar note, one case, Wilcosky v. Amazon.com, specifically includes a proposed “bystander” subclass: individuals who do not have registered Alexa accounts but who spoke in the vicinity of an Alexa device, with the result that Amazon created and stored their voice recordings.

The plaintiffs rely (at least in part) on state law claims, including claims that the defendants violated state “two-party consent” laws.

The complaints generally allege the same wrongful conduct—the recording of confidential communications and collection of personal information without the user’s (or the parent’s) consent in violation of state law. At the heart of these allegations is the plaintiffs’ expectation that their interactions with the smart speakers would remain private.

For example, despite targeting different companies, the plaintiffs in both Lopez v. Apple, Inc. and In re Google Assistant Privacy Litigation filed their complaints in California and asserted violations of two California statutes—the California Invasion of Privacy Act (CIPA) and the Unfair Competition Law (UCL)—as well as the California Constitution. CIPA prohibits the recording of confidential communications without the consent of all parties involved. Under CIPA, the plaintiffs argue that the companies used the smart speakers to record their conversations without the consent of all parties, at times when plaintiffs reasonably expected that they were not being recorded unless they said the wake phrase. Plaintiffs also argue that these practices constitute deceptive acts and unfair competition under the UCL and an invasion of privacy in violation of the California Constitution.

Similarly, in Hall-O’Neil v. Amazon.com, the plaintiffs allege that Amazon invaded their right to privacy by intentionally intercepting and using confidential communications, without the consent of all parties involved, in violation of various state wiretap statutes that require two-party consent for the recording of oral communications.

In Wilcosky v. Amazon.com, filed in the Northern District of Illinois, the plaintiffs claim a violation of the Illinois Biometric Information Privacy Act (BIPA). BIPA governs the collection, retention, capture and purchase of biometric identifiers—including “voiceprints.” Among other things, BIPA requires companies that handle biometric information to: (1) obtain valid consent from individuals if the company intends to collect or disclose their personal biometric identifiers; (2) destroy biometric identifiers in a timely manner; and (3) securely store biometric information. The plaintiffs in this case contend that the defendant company disregarded these obligations by failing to inform users that a biometric identifier was being collected and stored and by failing to secure proper consent.

The plaintiffs also rely on common law claims related to privacy.

Aside from state law, a couple of the cases also assert common law claims of intrusion upon seclusion, invasion of privacy, breach of contract and unjust enrichment. As to unjust enrichment, the plaintiffs contend that they have a property interest in their voice recordings and that the defendant is profiting from its unlawful collection and use of this data.

A few of the class action lawsuits invoke federal privacy law, but most focus on state and common law claims.

Not surprisingly, federal privacy law is not the main focus in most of these complaints. In just three cases, Adamsky et al. v. Amazon.com, In re Google Assistant Privacy Litigation and Lopez v. Apple, Inc., the plaintiffs invoke the Federal Wiretap Act. The plaintiffs in In re Google Assistant Privacy Litigation and Lopez v. Apple, Inc. also bring claims under the Stored Communications Act, which governs the disclosure of electronic communications stored with technology providers. Additionally, the In re Google Assistant Privacy Litigation plaintiffs rely on the Magnuson-Moss Warranty Act, which governs warranties on consumer products. Outside of these three cases, however, federal law rarely makes an appearance in these complaints.

In Adamsky et al. v. Amazon.com, the plaintiffs reference the Children’s Online Privacy Protection Act (together with its implementing regulations, COPPA), a federal statute that prohibits the collection and recording of children’s personal information absent express parental consent. Although COPPA cannot be enforced through private lawsuits, the Federal Trade Commission (FTC) has the authority to ensure that smart speaker manufacturers comply with COPPA’s requirements. In May 2019, children’s privacy advocates filed a complaint requesting that the FTC investigate whether Amazon’s Echo Dot Kids Edition collects children’s personal information in violation of COPPA. The advocacy groups were led by the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, whose 2018 FTC complaint against YouTube led to a $170 million settlement announced last September.

At least one defendant has moved to compel arbitration.

In three cases, Hall-O’Neil v. Amazon.com, Wilcosky v. Amazon.com and Tice v. Amazon.com, No. 5:19-cv-01311 (C.D. Cal. filed July 17, 2019), Amazon has filed motions to compel arbitration and dismiss plaintiffs’ claims. Amazon alternatively requests that the court stay plaintiffs’ claims pending arbitration. Amazon argues that the plaintiffs agreed to arbitration when they purchased or registered Alexa-enabled products and accepted Amazon’s Conditions of Use, which include a binding arbitration clause. The plaintiffs have largely responded by asserting that their claims are not governed by the arbitration agreements. The courts have yet to rule on Amazon’s motions.

Defenses will depend on the specific claims in each case, but will likely include arguments regarding plaintiffs’ reasonable expectations of privacy.

In defending against these class actions, defendants will likely argue that plaintiffs did not have a reasonable expectation of privacy in their conversations because the terms of use and privacy policies associated with the speakers described what information is collected through the speaker’s microphone and how that information is used. Defendants may also assert that, by agreeing to the terms of use and privacy policies, plaintiffs consented to the recording and use of their conversations as described therein.

Depending on the specific claims asserted in each case, defendants may also assert that the plaintiffs have not suffered an injury-in-fact, and thus do not have standing, or that the applicable statute of limitations has passed. For example, the statute of limitations for bringing a claim under CIPA is one year from when the plaintiffs discovered, or should have discovered, the defendant’s unlawful activity. See, e.g., Montalti v. Catanzariti, 236 Cal. Rptr. 231, 232–33 (Cal. Ct. App. 1987); NEI Contracting & Engineering, Inc. v. Hanson Aggregates Pac. Sw., Inc., No. 3:12-CV-01685-BAS, 2015 WL 1346110, at *4 (S.D. Cal. Mar. 24, 2015).

Lastly, defendants may challenge the plaintiff classes for lack of ascertainability, which requires that classes be defined clearly and based on objective criteria. Defendants could argue that class members who allegedly spoke in the vicinity of a smart speaker, but did not purchase or register the speaker, such as minors and bystanders, cannot be adequately ascertained.

Takeaways

This flurry of lawsuits serves as a reminder that technology companies manufacturing smart speakers or similar products should ensure that they obtain appropriate user consent, as necessary, to collect and use voice recordings, especially when children may be involved. These cases also reveal that the practice of recording and listening to users, even if solely to improve product functionality, leaves companies potentially vulnerable to allegations of privacy violations. To decrease the risk of complaints—both in and out of court—businesses must take a thoughtful approach to user privacy, including by communicating with users in a manner that is straightforward, direct and easy to understand.

