Scams Radar

ChatGPT Conversations Lack Legal Privacy Protections, Warns OpenAI CEO


OpenAI CEO Sam Altman, in a July 23, 2025, interview on This Past Weekend with Theo Von, warned that ChatGPT conversations lack the legal confidentiality protections afforded to interactions with therapists, lawyers, or doctors, per indianexpress.com. Unlike doctor-patient or attorney-client privilege, which safeguards sensitive disclosures under frameworks like HIPAA, ChatGPT chats can be subpoenaed in legal proceedings, per techcrunch.com. Altman noted that users, especially younger ones, treat ChatGPT as a therapist or life coach, sharing “the most personal sh**,” such as relationship or mental health issues, unaware that these chats could be used as evidence in court, per timesofindia.indiatimes.com. He called this “very screwed up” and urged the creation of an “AI privilege” framework to align AI privacy with that of traditional professions, per diasporaglitzmagazine.com. X posts from @FirstSquawk and @Currentreport1 amplified the concern, warning that sensitive chats are not legally protected.

Lack of Legal Framework and Ongoing Litigation

The absence of a legal privacy framework for AI interactions leaves OpenAI vulnerable to court orders, such as in its ongoing lawsuit with The New York Times, where a May 13, 2025, ruling by Magistrate Judge Ona T. Wang, upheld on June 26 by Judge Sidney Stein, mandates that OpenAI retain all ChatGPT user data (except enterprise and educational accounts) indefinitely, per nymynet.com. OpenAI’s appeal calls the order an “overreach,” arguing it could set a precedent for law enforcement or discovery demands, per mashable.com. Normally, deleted chats are removed within 30 days, but legal or security needs may extend retention, per lindaikejisblog.com. OpenAI’s privacy policy allows data sharing with authorities or third parties to comply with legal obligations, per financialexpress.com. This contrasts with encrypted platforms like WhatsApp or Apple Health, which strengthened user data protections in the wake of the Roe v. Wade reversal, per indiatoday.in.

Broader Surveillance and Ethical Concerns

Altman expressed alarm over AI-driven surveillance, noting that governments may demand access to chats to prevent misuse, such as terrorism, but warned that “history shows governments take that way too far,” per thecable.ng. He suggested a balanced approach, saying he would “compromise some privacy for collective safety,” while highlighting the risks of overreach, per tradingview.com. The growing use of ChatGPT for therapy, medical, or financial advice, evidenced by a Common Sense Media survey finding that 52% of teens use AI companions monthly, amplifies these risks, per timesofindia.indiatimes.com. Privacy researcher William Agnew of Carnegie Mellon stressed that “almost everything” shared with chatbots is not private and could be accessed by insurers or others, per cnet.com. X posts from @samsolid57 underscore the loss of data ownership, urging caution.


Investor and User Guidance

Users should avoid sharing sensitive personal, medical, or legal data with ChatGPT, as advised by Altman and privacy experts, per vanguardngr.com. For confidential matters, consult licensed professionals bound by HIPAA or attorney-client privilege, per indiatoday.in. Monitor OpenAI’s X account (@OpenAI) for privacy updates and regulatory developments, such as potential AI privilege laws, per diasporaglitzmagazine.com. Investors eyeing AI-driven crypto projects like Fetch.ai (FET) ($1.77, up 4%) or Render (RNDR) ($10.49, up 2%) should diversify into BTC ($123,091) or ETH ($3,755) to hedge volatility, per CoinMarketCap. Risks include:

  • Legal Exposure: Court-ordered data disclosures could erode user trust, per nymynet.com.

  • Regulatory Shifts: Trump’s AI Action Plan favors deregulation, but privacy laws may tighten, per cnet.com.

  • Data Misuse: OpenAI staff can access chats for model training, unlike encrypted platforms, per lindaikejisblog.com.

Treat ChatGPT as an unsecured tool until legal protections emerge—verify privacy policies and consult professionals for sensitive matters.

