Hey, Who Let the AI In? A Closer Look at the Otter.ai Lawsuit

Otter.ai Faces Legal Scrutiny Over AI Recording and Consent

Smart devices have become essential tools in modern life. From voice assistants in our homes to fitness trackers and AI-powered meeting software, these technologies promise convenience and personalized experiences. However, this convenience comes with a trade-off: our personal data.

Many consumers accept this exchange without fully understanding what information is being collected or how it is used, but others have begun to question the privacy implications. A data protection law firm following these developments notes that the rising number of lawsuits reflects a broader push for stronger privacy safeguards in everyday technology.

This concern has led to a growing number of lawsuits against companies that collect sensitive personal information through everyday devices. Legal challenges have targeted Amazon’s Alexa voice recordings, Ring’s employee access to private security footage, and other instances where user privacy may have been compromised.

The latest case involves Otter.ai. On August 16, 2025, a federal class action lawsuit was filed in the U.S. District Court for the Northern District of California alleging the AI transcription service recorded private conversations without proper consent, potentially violating federal wiretapping laws and California privacy statutes.

Ronin Legal takes a closer look.

Background and Technology

Otter.ai, founded in 2016 as AISense and headquartered in Mountain View, California, has grown to serve over 25 million users worldwide with its AI-powered transcription services.

The company’s Otter Notetaker technology provides real-time transcription for virtual meetings conducted on platforms including Zoom, Google Meet, and Microsoft Teams.

The plaintiff, Justin Brewer, who does not have an Otter account, joined a Zoom meeting in February 2025 where Otter Notetaker was used. The lawsuit states that Brewer “was unaware that Otter would access and retain that information and was not informed that the service would utilize this data for the training of its speech recognition and machine learning technologies”.

Brewer alleges that his privacy was “severely violated” upon discovering the unauthorized recording.

Core Allegations and Legal Claims

The lawsuit centres on several key allegations regarding Otter.ai’s data collection practices:

Unauthorized Recording Practices

The complaint alleges that Otter’s “Otter Notetaker” service joins meetings without obtaining explicit consent from all participants, particularly when an Otter account holder has integrated their meeting platforms with the service.

According to the filing, if the meeting host is an Otter account holder who has integrated their Google Meet, Zoom, or Microsoft Teams accounts with Otter, an Otter Notetaker may join the meeting without obtaining affirmative consent from any meeting participant, including the host.
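To make the mechanism at issue concrete, the following is a minimal, purely illustrative sketch of what an affirmative, all-participant consent gate for a meeting bot could look like. The class and field names below are assumptions chosen for illustration only; they do not reflect Otter’s actual product, code, or API.

```python
from dataclasses import dataclass


@dataclass
class Participant:
    """A meeting attendee; `consented` stays None until they answer."""
    name: str
    consented: bool | None = None  # None = never asked


@dataclass
class MeetingBot:
    """Hypothetical notetaker that records only with unanimous opt-in."""
    participants: list[Participant]

    def request_consent(self, responses: dict[str, bool]) -> None:
        # A real bot would prompt each attendee inside the meeting UI;
        # here the answers are passed in to keep the sketch self-contained.
        for p in self.participants:
            p.consented = responses.get(p.name)

    def may_record(self) -> bool:
        # Affirmative consent from every participant, including the host;
        # an unanswered prompt (None) is treated as a refusal.
        return all(p.consented is True for p in self.participants)


bot = MeetingBot([Participant("Host"), Participant("Guest")])
bot.request_consent({"Host": True})  # the guest was never asked
print(bot.may_record())              # False: silence is not consent
```

The design choice the complaint implicitly demands is visible in may_record: an unanswered prompt counts as a refusal, so the bot cannot record on the strength of one participant’s platform integration alone.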

Data Processing for AI Training

The lawsuit contends that Otter.ai uses recorded conversations to train its artificial intelligence systems for financial benefit, despite participants being unaware of this practice.

While Otter’s privacy policy states that it obtains “explicit permission” from users who check a box allowing data processing for “training and product improvement purposes,” the lawsuit argues that many users remain uninformed about this practice.

Inadequate Consent Mechanisms

The suit challenges Otter’s assertion that it adequately discloses its recording practices, arguing that the service “does not obtain prior consent, express or otherwise, of persons who attend meetings where the Otter Notetaker is enabled, prior to Otter recording, accessing, reading, and learning the contents of conversations”.

Current Status

The case seeks class action certification under Federal Rule of Civil Procedure 23 to represent both a nationwide class and a California subclass of individuals whose conversations were intercepted without their knowledge.

The plaintiff pursues claims under the Electronic Communications Privacy Act (18 U.S.C. § 2510 et seq.), the California Invasion of Privacy Act (Cal. Penal Code §§ 631–632), and other federal and state privacy laws.

The suit seeks statutory and punitive damages, injunctive relief requiring deletion of unlawfully obtained data, and restitution of profits.

Otter.ai denies wrongdoing, stating that “Otter does not initiate recordings on its own. Recording only occurs when initiated by an Otter user, and our Terms of Service make clear that users are responsible for obtaining any necessary permissions before doing so.”

A Pattern of Unauthorized Recording

The Otter.ai lawsuit represents the latest chapter in a recurring pattern of legal challenges against technology companies for unauthorized recording practices. A technology law firm observing these cases highlights how courts are increasingly pushing companies to move beyond financial settlements toward structural reforms in data collection and consent practices.

Kaeli Garner v. Amazon.com Inc.

In June 2021, Kaeli Garner filed a class action in the U.S. District Court for the Western District of Washington alleging that Amazon’s Alexa voice assistant illegally and secretly intercepted billions of private conversations that went beyond user commands.

The complaint claimed violations of Washington’s Consumer Protection Act, stating that Alexa recorded ambient speech and stored it without proper disclosure or consent. After extensive briefing on commonality and predominance, U.S. District Judge Robert Lasnik granted class certification in July 2025, finding that millions of registered Alexa users had similar enough claims to proceed together.

The certified class may now pursue statutory damages, attorneys’ fees, and injunctive relief requiring Amazon to implement clearer, affirmative opt-in mechanisms and robust deletion policies for unintended recordings.

FTC v. Amazon

In May 2023, the Federal Trade Commission charged Amazon with violating the Children’s Online Privacy Protection Act (COPPA) by indefinitely retaining children’s voice recordings made via Alexa devices, even after parents requested deletion.

According to the FTC complaint, Amazon’s Alexa Kids profiles collected audio data without adequate parental notice or consent, exposing sensitive information about children’s activities and routines.

Amazon agreed to a $25 million settlement under which it must purge all recorded audio of users under 13 and implement new parental notice mechanisms. The settlement also requires Amazon to establish a comprehensive privacy program subject to independent audits for the next 20 years, ensuring that children’s voice data is not retained without affirmative permission.

Brown v. Google LLC (Chrome “Incognito” Mode)

Brown v. Google LLC, settled in April 2024 in the Northern District of California, addressed allegations that Google’s Chrome browser tracked users’ web activity even when they believed they were browsing privately under “Incognito” mode.

Plaintiffs argued that Google harvested search history and site visit data through embedded tracking scripts, contradicting its user-facing description of Incognito as non-tracking. Rather than pay cash damages, Google agreed to destroy billions of records previously collected in Incognito sessions, update its privacy disclosures to prominently warn users that certain data collection would continue, and implement technical changes limiting tracking during private browsing.

In re Clearview AI Consumer Privacy Litigation

In March 2025, U.S. District Judge Sharon Johnson Coleman approved a $51.75 million settlement in Clearview AI’s biometric privacy class action in the Northern District of Illinois.

Plaintiffs alleged that Clearview harvested more than three billion facial images from social media and other online sources without individuals’ consent, violating the Illinois Biometric Information Privacy Act (BIPA).

Rather than a traditional cash fund, the settlement grants class members a 23 percent equity stake in Clearview AI, valued relative to any future initial public offering or acquisition, thereby aligning class recoveries with the company’s growth. Clearview also agreed to delete all biometric data collected before January 2023, cease further scraping of the public internet for facial images, and implement robust BIPA-compliant consent processes for any future biometric data collection.

FTC v. Ring LLC

In May 2023, Ring LLC, a subsidiary of Amazon.com, settled FTC charges that its employees and authorized contractors had unrestricted access to customers’ private video recordings, often capturing intimate areas such as bathrooms and bedrooms, between 2017 and 2020.

According to the FTC’s administrative complaint, several Ring employees viewed, downloaded, and shared customers’ video footage for personal purposes, in one instance monitoring a female customer’s private spaces over several months before detection.

Ring agreed to pay a $5.8 million civil penalty, notify all users of employee access practices, require multi-factor authentication for all employee and contractor accounts, and submit to biennial privacy audits for 20 years to ensure compliance with data access safeguards.

Conclusion

The Otter.ai lawsuit highlights key legal principles that technology companies must follow as courts increase oversight of data collection for AI training. Courts have found that financial penalties alone do not create lasting change, leading to structural reforms of business practices and data handling procedures.

First, consent must be specific and informed. Courts now reject vague privacy policies in favor of clear, purpose-limited authorization requirements. The FTC’s actions against femtech applications, for example, have required affirmative opt-in consent for each use of sensitive health data.

Second, transparency must replace opacity. Companies must design consent frameworks that explain in simple terms how each category of data will be used. Users should be able to grant or withhold permission for each distinct purpose. Privacy controls must function intuitively, not merely technically.
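To illustrate what such purpose-limited consent could look like in practice, here is a minimal sketch of a per-purpose consent record. The purpose names and structure are assumptions chosen for illustration; they are not drawn from any company’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    """Tracks a user's grant/withhold decision for each processing purpose."""
    user_id: str
    # Every purpose defaults to False: nothing is permitted until the
    # user affirmatively opts in (no pre-ticked boxes).
    grants: dict[str, bool] = field(default_factory=lambda: {
        "transcription": False,
        "model_training": False,
        "product_analytics": False,
    })
    updated_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def grant(self, purpose: str) -> None:
        if purpose not in self.grants:
            raise ValueError(f"unknown purpose: {purpose}")
        self.grants[purpose] = True
        self.updated_at = datetime.now(timezone.utc)

    def allows(self, purpose: str) -> bool:
        # Unknown or unanswered purposes are treated as refusals.
        return self.grants.get(purpose, False)


record = ConsentRecord(user_id="u-123")
record.grant("transcription")
assert record.allows("transcription")
assert not record.allows("model_training")  # training needs its own opt-in
```

The key property is that every purpose defaults to a refusal: consent to transcription does not carry over to model training, so each distinct algorithmic use of the data needs its own affirmative grant.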

Finally, courts are moving from enforcing financial remedies to demanding behavioural changes. Settlements now include independent privacy audits, data deletion requirements and mandatory revisions of consent mechanisms. Broad data collection justified by general product improvements is no longer sufficient; technology firms must obtain explicit, informed consent for each algorithmic application of personal data.

Authors: Shantanu Mukherjee, Alan Baiju
