AI DPO: ElevenLabs

Hi, this is AI DPO, providing data protection reviews of AI startups to showcase best practices. In these reviews, we assess basic compliance and transparency signals from public sources.

ElevenLabs has been a major player in the text-to-speech space since launching in 2022. The company recently raised $180M in its Series C, bringing total funding to nearly $300M. With that amount of funding, it’s no surprise the company has an enterprise arm and has worked with the likes of Nvidia, Perplexity and Time. Here’s a privacy-first look at ElevenLabs to celebrate what’s working and suggest easy wins to build even more trust.

I) How We Review Companies

Through AI DPO, we’re here to help AI companies build data protection practices that are both compliant and customer-friendly.

When we review a company, we follow three simple principles:

We believe good data protection is good business and we’re excited to be part of helping AI companies get it right.

1. Assenteo’s Take

As a text-to-speech solution, data protection is part of the technology’s design because of the needs of two types of users:

  • End user: The end user of a product built on ElevenLabs may share personal data and needs clarity around how this data is processed. They may also want to know why their data is processed or shared with other companies.

  • Enterprise: As ElevenLabs works with enterprises, more stringent data protection practices need to be in place, especially when sensitive data, such as health data, is collected and processed by the product.

We won’t comment on AI guardrails in this review, but ElevenLabs has a specific page on AI Safety on their website.

ElevenLabs is mature in data protection and demonstrates practices that other US AI companies could model. It meets most expectations for a data-compliant business in managing its own operations and provides pages on data safety in AI. However, there is an opportunity to improve transparency around how user data is used for LLM training and sale, and how to opt out.

2. AI DPO Assessment

  • Privacy Policy and other Documentation: ElevenLabs hosts a privacy notice covering the personal data it collects from users, last updated February 2025. Note that the policy applies to individual users when ElevenLabs provides its service directly; a separate policy governs enterprise engagements. ElevenLabs also provides information on data protection and HIPAA compliance in its Docs.

  • Data Collection: The Privacy Policy clearly lists the data categories collected: personal data provided by the user (including audio input), personal data collected automatically (trackers and cookies), and third-party information about the user.

  • Data Processing: Data sharing with third-party service providers is disclosed, including which companies are involved and where data is stored. The purposes of data processing are also shared, with a caveat that data used for training is not used to profile or target consumers.

  • User Controls: Users are informed of their GDPR and CCPA rights. An email address is provided for rights requests, and ElevenLabs has a Data Protection Officer users can contact.

  • AI-Specific Disclosures: Model training use is disclosed in the Privacy Policy. ElevenLabs uses third-party personal data, as well as data collected in several of its products, to train, develop and improve its own AI models. By signing up, users allow their personal data to be used for model training by default; however, they may opt out at any time through their account.

  • Cookie Handling and Data Sale ⚠️: Website and app users are tracked. A cookie notice displays all cookie and tracker information and lets users reject all or accept cookies; cookies are not dropped until accepted. ElevenLabs has also sold data to advertisers in the past 12 months. It clearly states which data is sold (IP address; unique identifier); however, the link to opt out is missing.

ElevenLabs currently stands at Level 2 🤙: Privacy engineered.
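The consent-first cookie behaviour noted in the assessment (trackers held back until the user accepts) can be sketched as a small consent gate. This is a minimal illustration, not ElevenLabs’ actual implementation; all names here are hypothetical.

```javascript
// Minimal sketch of consent-gated tracking: tracker initializers are
// queued and only run after the user explicitly accepts the cookie
// notice, mirroring the "no cookies until accepted" pattern.
class ConsentManager {
  constructor() {
    this.consented = false;
    this.pending = []; // tracker initializers queued until consent
  }

  // Register a tracker; it runs immediately only if consent already exists.
  register(initTracker) {
    if (this.consented) initTracker();
    else this.pending.push(initTracker);
  }

  // Called when the user clicks "Accept" in the cookie notice.
  acceptAll() {
    this.consented = true;
    this.pending.forEach((init) => init());
    this.pending = [];
  }

  // Called on "Reject all": discard queued trackers without running them.
  rejectAll() {
    this.consented = false;
    this.pending = [];
  }
}
```

The key design choice is that registration and execution are decoupled: third-party scripts can declare themselves on page load, but nothing touches `document.cookie` until an affirmative user action flips the gate.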

3. Highlights

  • Personal data transparency: ElevenLabs demonstrates transparency in its collection and processing of customer personal data. Its Privacy Policy and Docs indicate that privacy goes beyond compliance and is treated as a customer need.

  • Customer-centered privacy information: ElevenLabs’ HIPAA page showcases how to use its product compliantly when collecting health data of US persons. This page is a great example of how to help your customers stay compliant when using your product.

  • Control over personal data in LLM training: Making model training a feature of your product can be acceptable if communicated transparently. Notably, ElevenLabs allows opt-out even on free plans.

4. Where Trust Can Grow

  • Clarify data used for model training: There’s an opportunity to strengthen user trust by clearly highlighting in-app that personal data is used for model training, and giving that choice to the user. Especially since ElevenLabs sells this data, this could be a major trust lever.

  • Improve discovery of privacy pages: ElevenLabs hosts a compliance page with Drata, and privacy and HIPAA compliance Docs pages. However, I had to know what I was looking for to find these resources.

At Assenteo, we help enterprise-focused AI builders turn data protection into a product strength through providing data protection professional services. While this review focused on basic compliance and public transparency, our core service supports full compliance, strong UX practices, and competitive advantage through trust. If you're a serious builder, let's chat.
