News reports that technology companies actively listen to user voice recordings have fueled public debate over the privacy of voice assistant users. For example, in July 2019, it was reported that Apple contractors listened to a subset of Siri recordings to assess the adequacy of Siri's responses. According to reports, these recordings included instances in which Siri was activated accidentally, that is, without the user saying the designated "wake phrase" (e.g., "Hey Siri"). Consequently, the content of these recordings reportedly includes highly personal information, such as conversations between users and their doctors.
Additionally, this past summer, Germany's data protection authority ordered Google to stop having human reviewers listen to recordings collected from its voice-activated Google Assistant product. The three-month moratorium came after Google confirmed that it had listened in on Google Assistant users, a potential violation of the EU General Data Protection Regulation (GDPR). Google explained that it listens to a small subset of voice clips as part of its efforts to improve how its speech recognition systems interpret different accents and dialects across languages.
Since last June, a number of class action complaints have been filed in federal district courts against top smart speaker and virtual assistant providers—specifically, Amazon, Google and Apple. In each case, the plaintiffs contend they did not consent to their smart speakers recording their conversations and that such recording constitutes a violation of privacy.
Here are a few key trends that have emerged in these class actions:
Most of the classes include minors, and one encompasses non-consumer “bystanders.”
A majority of these putative class actions seek to represent minors. One such case, Adamsky et al. v. Amazon.com, No. 2:19-cv-01214 (W.D. Wash. filed Aug. 2, 2019), defines the proposed class as all children under the age of 13 residing in the United States who used Alexa on the Amazon Echo Dot Kids Edition. Another case, Hall-O'Neil v. Amazon.com, No. 2:19-cv-00910 (W.D. Wash. filed June 11, 2019), is even more specific: it proposes classes composed of minors who used an Alexa-enabled device but had not downloaded and installed the Alexa App themselves. Three other cases include subclasses for minors who were recorded by a smart speaker, arguing that businesses cannot obtain consent from minors who neither purchased the smart speaker device nor possess user accounts of their own. Lopez v. Apple, Inc., No. 4:19-cv-04577-NC (N.D. Cal. filed Aug. 7, 2019); Wilcosky v. Amazon.com, No. 1:19-cv-05061 (N.D. Ill. filed July 26, 2019); In re Google Assistant Privacy Litigation, No. 5:19-cv-04286 (N.D. Cal. filed July 25, 2019).
On a similar note, one case, Wilcosky v. Amazon.com, specifically includes a proposed "bystander" subclass: individuals who do not have registered Alexa accounts but spoke in the vicinity of an Alexa device, and whose voice recordings Amazon therefore created and stored.
The plaintiffs rely (at least in part) on state law claims, including claims that the defendants violated state “two-party consent” laws.
The complaints generally allege the same wrongful conduct—the recording of confidential communications and collection of personal information without the user’s (or the parent’s) consent in violation of state law. At the heart of these allegations is the plaintiffs’ expectation that their interactions with the smart speakers would remain private.
For example, despite targeting different companies, the plaintiffs in both Lopez v. Apple, Inc. and In re Google Assistant Privacy Litigation filed their complaints in California and asserted violations of two California statutes, the California Invasion of Privacy Act (CIPA) and the Unfair Competition Law (UCL), as well as the California Constitution. CIPA prohibits the recording of confidential communications without the consent of all parties involved. Under CIPA, the plaintiffs argue that the companies used the smart speakers to record their conversations without the consent of all parties and despite plaintiffs' reasonable expectation that they were not being recorded unless they said the wake phrase. Plaintiffs also argue that these practices constitute deceptive acts and unfair competition under the UCL and an invasion of privacy in violation of the California Constitution.
Similarly, in Hall-O’Neil v. Amazon.com, the plaintiffs allege that Amazon invaded their right to privacy by intentionally intercepting and using confidential communications, without the consent of all parties involved, in violation of various state wiretap statutes that require two-party consent for the recording of oral communications.
In Wilcosky v. Amazon.com, filed in the Northern District of Illinois, the plaintiffs claim a violation of the Illinois Biometric Information Privacy Act (BIPA). BIPA governs the collection, retention, capture and purchase of biometric identifiers, including "voiceprints." Among other things, BIPA requires companies that handle biometric information to: (1) obtain valid consent from individuals if the company intends to collect or disclose their personal biometric identifiers; (2) destroy biometric identifiers in a timely manner; and (3) securely store biometric information. The plaintiffs in this case contend that the defendant disregarded these obligations by failing to inform users that a biometric identifier was being collected and stored and by failing to secure proper consent.
The plaintiffs also rely on common law claims related to privacy.
Beyond these state statutory claims, a couple of the cases also assert common law claims of intrusion upon seclusion, invasion of privacy, breach of contract and unjust enrichment. As to unjust enrichment, the plaintiffs contend that they have a property interest in their voice recordings and that the defendants are profiting from the unlawful collection and use of this data.
A few of the class action lawsuits invoke federal privacy law, but most focus on state and common law claims.
Not surprisingly, federal privacy law is not the main focus of most of these complaints. In just three cases, Adamsky et al. v. Amazon.com, In re Google Assistant Privacy Litigation and Lopez v. Apple, Inc., the plaintiffs invoke the Federal Wiretap Act. The plaintiffs in In re Google Assistant Privacy Litigation and Lopez v. Apple, Inc. also bring claims under the Stored Communications Act, which governs the disclosure of electronic communications stored with technology providers. Additionally, the In re Google Assistant Privacy Litigation plaintiffs rely on the Magnuson-Moss Warranty Act, which governs warranties on consumer products. Outside of these three cases, however, federal law rarely makes an appearance.
In Adamsky et al. v. Amazon.com, the plaintiffs reference the Children's Online Privacy Protection Act (together with its implementing regulations, COPPA), a federal statute that prohibits the collection and recording of children's personal information absent express parental consent. Although COPPA cannot be enforced through private lawsuits, the Federal Trade Commission (FTC) has the authority to ensure that smart speaker manufacturers comply with COPPA's requirements. In May 2019, children's privacy advocates filed a complaint requesting that the FTC investigate whether Amazon's Echo Dot Kids Edition collects children's personal information in violation of COPPA. The advocacy groups were led by the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, whose 2018 FTC complaint against YouTube led to a $170 million settlement announced last September.
At least one defendant has moved to compel arbitration.
In three cases, Hall-O'Neil v. Amazon.com, Wilcosky v. Amazon.com and Tice v. Amazon.com, No. 5:19-cv-01311 (C.D. Cal. filed July 17, 2019), Amazon has filed motions to compel arbitration and dismiss the plaintiffs' claims or, in the alternative, to stay the claims pending arbitration. Amazon argues that the plaintiffs agreed to arbitration when they purchased or registered Alexa-enabled products and accepted Amazon's Conditions of Use, which include a binding arbitration clause. The plaintiffs have largely responded by asserting that their claims are not governed by the arbitration agreements. The courts have yet to rule on Amazon's motions.
Defenses will depend on the specific claims in each case, but will likely include arguments regarding plaintiffs’ reasonable expectations of privacy.
In defending against these class actions, defendants will likely argue that plaintiffs did not have a reasonable expectation of privacy in their conversations because the terms of use and privacy policies associated with the speakers described what information is collected through the speaker’s microphone and how that information is used. Defendants may also assert that, by agreeing to the terms of use and privacy policies, plaintiffs consented to the recording and use of their conversations as described therein.
Depending on the specific claims asserted in each case, defendants may also assert that the plaintiffs have not suffered an injury-in-fact, and thus do not have standing, or that the applicable statute of limitations has passed. For example, the statute of limitations for bringing a claim under CIPA is one year from when the plaintiffs discovered, or should have discovered, the defendant’s unlawful activity. See, e.g., Montalti v. Catanzariti, 236 Cal. Rptr. 231, 232–33 (Cal. Ct. App. 1987); NEI Contracting & Engineering, Inc. v. Hanson Aggregates Pac. Sw., Inc., No. 3:12-CV-01685-BAS, 2015 WL 1346110, at *4 (S.D. Cal. Mar. 24, 2015).
Lastly, defendants may challenge the plaintiff classes for lack of ascertainability, which requires that classes be defined clearly and based on objective criteria. Defendants could argue that class members who allegedly spoke in the vicinity of a smart speaker, but did not purchase or register the speaker, such as minors and bystanders, cannot be adequately ascertained.
Takeaways
This flurry of lawsuits serves as a reminder that technology companies manufacturing smart speakers or similar products should ensure that they obtain appropriate user consent, as necessary, to collect and use voice recordings, especially when children may be involved. These cases also show that recording and listening to users, even solely to improve product functionality, leaves companies potentially vulnerable to allegations of privacy violations. To decrease the risk of complaints, both in and out of court, businesses must take a thoughtful approach to user privacy, including by communicating with users in a manner that is straightforward, direct and easy to understand.