No Therapist Privilege with AI: Sam Altman’s Stark Warning About ChatGPT Privacy

28-Jul-2025 Medium » Coinmonks

OpenAI CEO Sam Altman recently issued a serious warning that should make all ChatGPT users think twice before treating the AI like a personal therapist or life coach.

In a candid conversation on the podcast This Past Weekend w/ Theo Von, Altman admitted that the legal system hasn’t caught up with AI — especially when it comes to protecting user privacy during deeply personal conversations.

❌ No Legal Confidentiality with AI — Yet

“People talk about the most personal sh** in their lives to ChatGPT,” Altman said. “Young people especially — relationship advice, mental health struggles, you name it. But unlike a doctor, therapist, or lawyer, your conversation with an AI has no legal privilege right now.”

In simple terms: what you say to ChatGPT could be used in court — because OpenAI could be legally required to hand over those chats.

This raises a red flag for users who rely on AI for emotional or mental health support, assuming their conversations are private. They’re not — not legally, at least.

🧑‍⚖️ A Legal Gap No One Saw Coming

Altman acknowledged that this lack of legal protection could slow AI adoption, especially for personal use cases. He emphasized that OpenAI believes AI chats should eventually have the same privacy protections as talking to a therapist or doctor — but that legal framework simply doesn’t exist yet.

“No one had to think about this even a year ago,” he said, pointing to how fast AI is moving ahead of policy.

⚖️ OpenAI vs The New York Times — Data at the Center

This privacy concern isn’t just theoretical. OpenAI is already fighting a court order in the lawsuit brought against it by The New York Times. The order requires the company to preserve conversations from hundreds of millions of users, a demand OpenAI has called an “overreach.”

If the order stands, it could set a dangerous precedent: even casual, private chats with an AI could be stored and accessed for legal purposes.

🛑 The Bigger Picture: Privacy in a Post-Roe World

The timing couldn’t be more critical. In a post-Roe v. Wade America, users are already shifting toward encrypted apps for sensitive data like health records and period tracking. Altman’s concern highlights that AI chat data could be next in line for subpoenas — raising alarm bells about digital privacy at large.

🔐 Bottom Line

AI isn’t your therapist — at least not legally. Until lawmakers build a proper confidentiality framework around AI tools like ChatGPT, your most sensitive conversations are still fair game in a courtroom.

If you’re sharing personal or vulnerable information, be cautious. Privacy with AI is still a work in progress — and even OpenAI’s CEO agrees.

