OpenAI CEO Sam Altman recently issued a serious warning that should make all ChatGPT users think twice before treating the AI like a personal therapist or life coach.
In a candid conversation on the podcast This Past Weekend w/ Theo Von, Altman admitted that the legal system hasn’t caught up with AI — especially when it comes to protecting user privacy during deeply personal conversations.
“People talk about the most personal sh** in their lives to ChatGPT,” Altman said. “Young people especially — relationship advice, mental health struggles, you name it. But unlike a doctor, therapist, or lawyer, your conversation with an AI has no legal privilege right now.”
In simple terms: what you say to ChatGPT could be used in court — because OpenAI could be legally required to hand over those chats.
This raises a red flag for users who rely on AI for emotional or mental health support, assuming their conversations are private. They’re not — not legally, at least.
Altman acknowledged that this lack of legal protection could slow AI adoption, especially for personal use cases. He emphasized that OpenAI believes AI chats should eventually have the same privacy protections as talking to a therapist or doctor — but that legal framework simply doesn’t exist yet.
“No one had to think about this even a year ago,” he said, pointing to how fast AI is moving ahead of policy.
This privacy concern isn’t just theoretical. OpenAI is already fighting a court order in the lawsuit brought against it by The New York Times. The order would require the company to preserve conversations from hundreds of millions of users — a demand OpenAI has called an “overreach.”
If the order stands, it could set a dangerous precedent: even casual, private chats with AI could be stored and handed over for legal proceedings.
The timing couldn’t be more critical. In a post-Roe v. Wade America, users are already shifting toward encrypted apps for sensitive data like health records and period tracking. Altman’s concern highlights that AI chat data could be next in line for subpoenas — raising alarm bells about digital privacy at large.
AI isn’t your therapist — at least not legally. Until lawmakers build a proper confidentiality framework around AI tools like ChatGPT, your most sensitive conversations are still fair game in a courtroom.
If you’re sharing personal or vulnerable information, be cautious. Privacy with AI is still a work in progress — and even OpenAI’s CEO agrees.
🧠 No Therapist Privilege with AI: Sam Altman’s Stark Warning About ChatGPT Privacy was originally published in Coinmonks on Medium.