The Question Nobody Thinks to Ask
Setting up an AI notetaker takes five minutes. Most people accept the terms, paste a meeting link, and move on. The question they skip — does this tool use my conversations to train its AI? — is one worth answering before another meeting gets recorded.
AI companies need data to improve their models. Real meeting transcripts, with their natural language, jargon, and context, are enormously valuable training material. Some tools are upfront about this. Others bury it.
How to Check Your Current Tool
Step 1: Search the privacy policy for "train"
Open the actual privacy policy — not the marketing page — and search for the terms "train", "improve", "machine learning", "model", "anonymised", and "de-identified". Look for phrases like:
- "We may use your content to improve our services"
- "Aggregated or de-identified data may be used to develop our AI"
- "By using our service you grant us a licence to process your content"
Any of these can mean your conversations are feeding their training pipeline.
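The search in Step 1 can even be scripted. The sketch below is a toy illustration, not a legal checklist: the term list and the sample policy text are invented for the example, and a naive sentence split stands in for real parsing.

```python
import re

# Illustrative red-flag terms to search for in a privacy policy.
# This list is an example, not an exhaustive or authoritative checklist.
RED_FLAG_TERMS = [
    "train",
    "improve",
    "machine learning",
    "model",
    "anonymised",
    "de-identified",
]

def flag_policy_text(policy_text: str) -> dict[str, list[str]]:
    """Map each red-flag term to the sentences of the policy that contain it."""
    # Naive sentence split: break after '.', '!' or '?' followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    hits: dict[str, list[str]] = {}
    for term in RED_FLAG_TERMS:
        matches = [s.strip() for s in sentences if term.lower() in s.lower()]
        if matches:
            hits[term] = matches
    return hits

# Invented sample text echoing the phrasing patterns above.
sample = (
    "We value your privacy. Aggregated or de-identified data may be "
    "used to develop our AI. We may use your content to improve our services."
)
for term, found in flag_policy_text(sample).items():
    print(f"{term!r} appears in: {found}")
```

Anything this surfaces is a starting point for reading the surrounding clause in full, not a verdict on its own.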
Step 2: Check whether opting out is possible — and at what cost
Some tools offer an opt-out, but only on enterprise plans; free and standard users may have no choice at all. Find out whether an opt-out exists, where it lives in the settings, and whether it applies retroactively to past recordings.
Step 3: Ask the vendor directly
Send support a simple question: "Are my meeting transcripts ever used — directly or in aggregated form — to train or fine-tune your AI models?" A clear "no, never" is the answer you want. Vague responses about "service improvement" are a red flag.
What "De-Identified" Actually Means
Many policies use the term "de-identified data" to imply your conversations are anonymised before use. In practice, de-identification is harder than it sounds. Conversations contain references to named people, companies, and projects. Combine that with meeting metadata and re-identification becomes possible in ways that aren't obvious.
"De-identified" means your name was removed. It does not mean your content is private.
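As a toy illustration of that point (every name below is invented), stripping the speaker label from a transcript is exactly the kind of "de-identification" many policies describe, yet the remaining content still points straight at the participants:

```python
import re

# Hypothetical transcript; all people, companies, and projects are invented.
transcript = (
    "Alice Smith: The Orion acquisition closes Friday. "
    "Alice Smith: Loop in our contact at Acme Corp before the board call."
)

# "De-identification" as name removal: replace the speaker label only.
deidentified = re.sub(r"Alice Smith: ", "Speaker 1: ", transcript)

print(deidentified)
# The name is gone, but "Orion", "Acme Corp", and the board call remain.
# Combined with meeting metadata (date, attendee email domains), the
# conversation can plausibly be traced back to its participants.
```

The label was removed; the substance that identifies the deal, the company, and likely the speaker was not.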
Red Flags to Watch For
- No mention of training data at all in the privacy policy
- Training opt-out gated behind enterprise pricing
- Vague language about "improving services" without specifics
- No answer — or a deflective answer — from support
- The tool stores audio files (voice data is especially sensitive training material)
What a Clean Policy Looks Like
A trustworthy tool states it plainly: your meeting content is never used to train AI models, whether by the vendor or by any third party. It will also tell you exactly where your data is stored, for how long, and what happens when you delete your account.
Beaver's policy is straightforward: your meeting data is never used to train any AI model. No audio files are created or stored. Transcripts live in your own account and are deleted permanently when you request it. Try Beaver free — and read our privacy policy in under two minutes.
Five Questions to Ask Before Your Next Recording
- Does this tool store audio, or text only?
- Is my content used to train or improve AI models?
- Can I opt out — and is that option on my current plan?
- If I delete my account, is my data actually deleted?
- Who at the vendor company can access my transcripts?
You don't need to be paranoid. AI meeting tools genuinely save time. But you do deserve clear answers before handing over your most sensitive business conversations.