ChatGPT users might want to think twice before turning to their AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn't yet figured out how to protect user privacy when it comes to these more sensitive conversations, because there's no doctor-patient confidentiality when your doc is an AI.
The exec made these comments on a recent episode of Theo Von's podcast, This Past Weekend w/ Theo Von.
In response to a question about how AI works with today's legal system, Altman said one of the problems of not yet having a legal or policy framework for AI is that there's no legal confidentiality for users' conversations.
"People talk about the most personal sh** in their lives to ChatGPT," Altman said. "People use it, young people especially, as a therapist, a life coach; having these relationship problems and [asking] 'what should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT."
This could create a privacy concern for users in the case of a lawsuit, Altman added, because OpenAI would be legally required to produce those conversations today.
"I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever, and no one had to think about that even a year ago," Altman said.
The company understands that the lack of privacy could be a blocker to broader user adoption. In addition to AI's demand for so much online data during the training period, it's being asked to produce data from users' chats in some legal contexts. Already, OpenAI has been fighting a court order in its lawsuit with The New York Times, which would require it to save the chats of hundreds of millions of ChatGPT users globally, excluding those from ChatGPT Enterprise customers.
In a statement on its website, OpenAI said it's appealing this order, which it called "an overreach." If the court could override OpenAI's own decisions around data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Today's tech companies are regularly subpoenaed for user data in order to aid in criminal prosecutions. But in more recent years, there have been additional concerns about digital data as laws began limiting access to previously established freedoms, like a woman's right to choose.
When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps or to Apple Health, which encrypted their records.
Altman asked the podcast host about his own ChatGPT usage as well, given that Von said he didn't talk to the AI chatbot much because of his own privacy concerns.
"I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot, like the legal clarity," Altman said.