
OpenAI CEO Sam Altman, speaking on Theo Von’s podcast on July 23, 2025, cautioned users against treating ChatGPT as a therapist or lawyer, as its conversations lack the legal privacy protections of doctor-patient or attorney-client privilege, per bitcoinethereumnews.com. Users, especially younger ones, often share deeply personal issues, from relationship troubles to mental health concerns, assuming confidentiality, Altman noted. However, ChatGPT chats can be subpoenaed in legal cases, exposing sensitive details, per techcrunch.com. Altman called this “very screwed up,” advocating for AI privacy laws akin to those for therapists, per businessinsider.com. X posts from @TheChiefNerd amplify the warning, urging users to seek “privacy clarity” before sharing sensitive information.
Unlike therapy or legal consultations, ChatGPT conversations carry no statutory confidentiality protections, per ainvest.com. In a high-profile lawsuit brought by The New York Times, a court order requires OpenAI to retain logs of hundreds of millions of users' chats, including deleted ones, with enterprise accounts excepted, per arstechnica.com. OpenAI appealed, calling the order an "overreach" that could set a precedent for law enforcement or civil discovery demands, per fastcompany.com. Even deleted chats may be stored for 30 days or longer for legal reasons, per businessinsider.nl. X posts from @avstmd highlight the risks for clinicians using public LLMs, noting potential lawsuits if patient data is breached.
This gap creates risks, especially for vulnerable users discussing mental health or legal issues, per theneuron.ai. Post-Roe v. Wade, users shifted to encrypted apps like Apple Health to avoid data exposure, per techcrunch.com.
The lack of privacy protections may hinder ChatGPT's adoption for sensitive use cases, per bitcoinethereumnews.com. OpenAI's appeal against the NYT order reflects broader concerns about user trust and data security, per arstechnica.com. Until stronger protections exist, users should avoid sharing sensitive personal, medical, or legal details in ChatGPT conversations.
ChatGPT's privacy risks underscore the need for new AI-specific laws. Until then, treat it as a public tool, not a private confidant.
