
ChatGPT doesn't have the same legal privacy protections as therapy or legal advice.

[Image: Sam Altman against a digital grid background, symbolizing ChatGPT legal privacy concerns]

OpenAI CEO Sam Altman, speaking on Theo Von’s podcast on July 23, 2025, cautioned users against treating ChatGPT as a therapist or lawyer, as its conversations lack the legal privacy protections of doctor-patient or attorney-client privilege, per bitcoinethereumnews.com. Users, especially younger ones, often share deeply personal issues, from relationship troubles to mental health concerns, assuming confidentiality, Altman noted. However, ChatGPT chats can be subpoenaed in legal cases, exposing sensitive details, per techcrunch.com. Altman called this “very screwed up,” advocating for AI privacy laws akin to those for therapists, per businessinsider.com. X posts from @TheChiefNerd amplify the warning, urging users to seek “privacy clarity” before sharing sensitive information.

Legal and Litigation Challenges

Unlike therapy or legal consultations, ChatGPT conversations have no statutory confidentiality protections, per ainvest.com. In the high-profile copyright lawsuit brought by The New York Times, a court order requires OpenAI to retain logs of hundreds of millions of users’ chats, including deleted ones, with enterprise accounts exempted, per arstechnica.com. OpenAI appealed, labeling the order an “overreach” that could set a precedent for law enforcement or civil discovery demands, per fastcompany.com. Even deleted chats are normally retained for up to 30 days, and longer when legally required, per businessinsider.nl. X posts from @avstmd highlight risks for clinicians using public LLMs, noting potential lawsuits if patient data is breached.

Comparison to Traditional Protections

  • Therapy: In the U.S., therapist-patient privilege and HIPAA privacy rules protect mental health disclosures, except in cases of imminent harm or court orders, per psychotherapy.net. Therapists face strict ethical and legal duties to maintain confidentiality.

  • Legal Advice: Attorney-client privilege safeguards communications made to obtain legal advice, with narrow exceptions such as crime-fraud, per spellbook.legal. Lawyers must use secure tools to avoid waiving privilege.

  • ChatGPT: No equivalent privilege exists. OpenAI’s terms allow chat reviews for model training, and data may be shared in legal proceedings, per paragonlegal.com. Users are advised not to input sensitive data, per newsbytesapp.com.

This gap creates risks, especially for vulnerable users discussing mental health or legal issues, per theneuron.ai. After Roe v. Wade was overturned, many users shifted to encrypted options such as Apple Health to avoid data exposure, per techcrunch.com.

Implications and User Guidance

The lack of privacy protections may hinder ChatGPT’s adoption for sensitive use cases, per bitcoinethereumnews.com. OpenAI’s appeal against the NYT order reflects broader concerns about user trust and data security, per arstechnica.com. Users should:

  • Avoid sharing personal or sensitive data with ChatGPT, as advised by Altman and OpenAI’s FAQ, per curiumlegal.com.au.

  • Use secure alternatives like encrypted health apps or licensed professionals for therapy/legal advice, per fastcompany.com.

  • Monitor OpenAI’s X account (@OpenAI) for updates on privacy-policy changes and new legislation.

  • For financial privacy, consider regulated crypto assets such as BTC ($123,091) or ETH ($3,811), prices per CoinMarketCap, but keep confidential financial matters off AI platforms.

ChatGPT’s privacy risks underscore the need for new AI-specific laws—until then, treat it as a public tool, not a private confidant.

