Privacy and Data Protection.
Every prompt you send is data. Know where it goes, who sees it, and what's safe to share.
After this lesson you'll know:
- What happens to the data you send to AI models
- The 5 things you should NEVER paste into an AI prompt
- How to use AI safely with sensitive information
- Business vs personal account privacy differences
Your prompts aren't private by default.
When you type something into an AI chat, you're sending data to a server. Depending on the provider, your plan, and the settings — that data might be stored, reviewed by staff, or used to train future models.
Most major AI providers (including Anthropic, OpenAI, and Google) have different policies for free vs paid accounts, and for consumer vs business plans. The differences matter enormously:
Free and consumer plans:
- May use your data for training
- Conversations may be reviewed
- Less control over data retention
- Fewer compliance guarantees

Business and enterprise plans:
- Typically no training on your data
- Stricter access controls
- Data retention policies you can configure
- Compliance certifications (SOC 2, etc.)
Rule of thumb: If you're using a free or consumer plan, treat every prompt as if it could be seen by someone else.
5 things to NEVER paste into an AI prompt.
How to use AI safely with sensitive work.
You don't have to avoid AI for sensitive topics; you just need to be smart about how you use it.
GDPR, CCPA, and what they mean for your AI use.
You don't need to be a lawyer to understand the key data protection laws that affect AI use. Here's what matters for everyday users and professionals.
GDPR (EU):
- Applies if you process data of EU residents, regardless of where you are
- Right to erasure: people can demand their data be deleted
- Right to explanation: people can ask how automated decisions were made
- Data minimization: only collect what you actually need
- Fines up to 4% of annual global revenue
CCPA (California):
- Applies to businesses handling data of California residents
- Right to know: consumers can ask what data is collected about them
- Right to delete: consumers can request data deletion
- Right to opt out of data sales
- Fines up to $7,500 per intentional violation
Why this matters for AI: When you paste someone's personal data into an AI tool, you may be transferring it to a third party (the AI provider). Under GDPR, that transfer likely requires a lawful basis, such as the data subject's consent. Under CCPA, it could qualify as a "sale" of personal information if the provider uses it for training.
Data minimization: share only what the task requires.
Data minimization is one of the most practical privacy principles for AI users. The idea is simple: give AI only the information it needs to complete the task, and nothing more.

Compare two versions of the same request (names and numbers invented for illustration):

Raw: "Draft a reply to Jane Doe (jane.doe@example.com, account 4485992011, balance $1,240.50) about the late fee on her account."

Minimized: "Draft a polite, empathetic reply to a customer complaining about a late fee on their account."

The minimized version gives AI everything it needs to write a great response without exposing any personal information. The name, email, account number, and specific balance are irrelevant to the task of drafting the response.
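When the sensitive text already exists, say in a pile of support tickets, minimization can be partly automated before anything reaches a prompt. Here is a minimal sketch: the `minimize` helper and its regex patterns are invented for this lesson and catch only the most obvious PII formats; real redaction needs a dedicated tool and human review.

```python
import re

# Illustrative patterns only -- NOT exhaustive PII detection.
# ACCOUNT runs before PHONE so a long digit run isn't mistaken for a phone number.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ACCOUNT": re.compile(r"\b\d{8,16}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def minimize(text: str) -> str:
    """Replace likely PII with placeholder tokens before prompting an AI."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

raw = "Customer jane.doe@example.com (acct 4485992011) called from 555-867-5309."
print(minimize(raw))
# Customer [EMAIL] (acct [ACCOUNT]) called from [PHONE].
```

The redacted text keeps the structure the AI needs ("a customer emailed, then called") while the identifying details never leave your machine.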
Understanding consent in the age of AI.
Consent is foundational to data privacy, but AI complicates it in new ways. When someone gives you their email address, they consented to you having it — not to you pasting it into an AI model that might store it indefinitely or use it for training.
Anonymize sensitive data before sending it to AI.
Use this prompt to get AI's help analyzing sensitive work without exposing private information. Notice how you describe the situation instead of pasting raw data.
I need to analyze [type of data, e.g. "customer support tickets"] but I can't share the raw data because it contains personal information.
Here's what I can tell you:
- The dataset has [number] records from [time period]
- Common themes I'm seeing: [list 3-5 patterns in general terms]
- The business question I need answered: [your question]
Based on this description, give me:
1. An analysis framework I can apply to the data myself
2. The specific metrics I should track
3. Questions I should ask the data to find actionable insights
Do NOT ask me to paste the raw data. Help me analyze it without exposing it.
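If you genuinely need to share structured examples rather than a description, another option is pseudonymization: swap each real value for a stable placeholder, keep the mapping locally, and translate the AI's answer back yourself. A minimal sketch (the `Pseudonymizer` class and its placeholder scheme are invented for this lesson):

```python
import itertools

class Pseudonymizer:
    """Swap real values for stable placeholders; the mapping never leaves your machine."""

    def __init__(self):
        self.mapping = {}               # real value -> placeholder
        self.counter = itertools.count(1)

    def mask(self, value: str, kind: str) -> str:
        # Same value always gets the same placeholder, so references stay consistent.
        if value not in self.mapping:
            self.mapping[value] = f"{kind}_{next(self.counter)}"
        return self.mapping[value]

    def unmask(self, text: str) -> str:
        # Restore real values in the AI's output. Longest placeholders first,
        # so CUSTOMER_12 is never clobbered by a CUSTOMER_1 replacement.
        for real, placeholder in sorted(self.mapping.items(), key=lambda kv: -len(kv[1])):
            text = text.replace(placeholder, real)
        return text

p = Pseudonymizer()
masked = f"{p.mask('Jane Doe', 'CUSTOMER')} reported an issue with order {p.mask('ORD-8841', 'ORDER')}."
print(masked)                           # CUSTOMER_1 reported an issue with order ORDER_2.
ai_reply = "Apologies, CUSTOMER_1: ORDER_2 has been refunded."
print(p.unmask(ai_reply))               # Apologies, Jane Doe: ORD-8841 has been refunded.
```

The AI sees only placeholders, yet its answer is fully usable once you unmask it locally, which is the same trade the prompt template above makes with plain descriptions.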