
ChatGPT Feels Like a Therapist, but Has Zero Legal Protection, Sam Altman Warns

by ytools

AI assistants like ChatGPT have become digital confidants for millions, often serving as life coaches, problem solvers, and, increasingly, impromptu therapists.
But according to OpenAI CEO Sam Altman, that trust may be dangerously misplaced.

In a recent interview on This Past Weekend w/ Theo Von, Altman delivered a sobering reality check: sharing your personal struggles with AI may feel private, but legally, it’s anything but. While therapists, lawyers, and doctors are bound by confidentiality laws, AI chatbots aren’t.

“People talk about the most personal sh** in their lives to ChatGPT,” Altman said. “Young people especially use it as a therapist and life coach. They ask, ‘What should I do?’ But when you talk to ChatGPT, there’s no legal privilege. We haven’t figured that out yet.”

This lack of protection means that if your conversations with ChatGPT ever end up in court – say, during a criminal investigation or lawsuit – OpenAI could be compelled to hand over your chat logs. And with no confidentiality laws to back you up, your digital diary could become a legal liability.

The warning is particularly relevant as AI systems become more emotionally adept and persuasive, giving users a false sense of intimacy and security. Altman emphasized that until proper regulations are in place, people should refrain from treating AI like a therapist or a confidant.

While some argue that using private mode, VPNs, or AI models hosted overseas (such as China’s DeepSeek or Qwen) adds a layer of safety, none of that is a guaranteed shield. Others suggest sticking to temporary or anonymous sessions, or even framing conversations as fictional scenarios to reduce risk.

Ultimately, the message is clear: AI might feel like a friend, but in the eyes of the law, it’s not. Until legislation catches up, users need to think twice before spilling their guts to a machine that remembers everything – and protects nothing.
