OpenAI CEO Sam Altman has issued a clear warning to users: ChatGPT is not a therapist—and anything you share with it isn’t protected by doctor-patient confidentiality. As generative AI becomes more embedded in people’s daily lives, Altman’s reminder serves as a reality check for users who might be oversharing with the chatbot.
In recent months, ChatGPT has seen a surge in users treating it as a digital confidant, venting about personal issues ranging from heartbreak to mental health struggles. But Altman emphasized that the tool was never intended to replace professional mental health support.

“ChatGPT is not a therapist,” Altman said during a recent public talk. “It can be empathetic, it can listen, but it doesn’t understand context like a human—and your data is not protected by any kind of legal privilege.”
Unlike conversations with licensed therapists, which are protected by strict confidentiality laws such as HIPAA in the U.S., conversations with ChatGPT carry no legal privilege. Although OpenAI has implemented privacy practices and anonymization protocols, there is no guarantee of secrecy.
This comes amid growing concerns about AI usage in sensitive conversations. Experts have pointed out that many users might not be aware of the limitations of AI privacy and could be sharing deeply personal or incriminating information, believing it to be safe.
OpenAI’s official documentation has consistently advised users not to input private, sensitive, or confidential data. Altman’s latest remarks reinforce this stance, especially as ChatGPT continues to evolve with more humanlike capabilities.
Mental health professionals are also urging caution. “AI can feel like a safe space, especially late at night when no one’s around. But that illusion of privacy can be dangerous,” says Dr. Meera Joshi, a clinical psychologist in Mumbai. “It’s important to talk to real people—whether a friend or a therapist—when you’re truly in need.”
Altman’s message is simple but urgent: Use ChatGPT as a helpful tool, not as a substitute for real emotional support. And above all, think twice before sharing your deepest secrets with a machine.
If you or someone you know is struggling emotionally, contact a licensed therapist or reach out to a mental health helpline in your area.