Sam Altman Sounds Alarm: "ChatGPT Could Be Used Against You in Court"
As users increasingly confide in ChatGPT like a trusted friend, OpenAI’s CEO delivers a jarring truth: Those private chats could one day testify against you.

In a world where AI-powered chatbots are increasingly woven into the personal and emotional lives of millions, often acting as advisors, therapists, or even friends, OpenAI CEO Sam Altman has issued a blunt warning: conversations with ChatGPT are not protected by legal privilege, are not end-to-end encrypted, and could be saved and used as evidence in court. In a recent appearance, Altman raised serious red flags about these privacy risks, stressing that users' chats with the AI chatbot lack legal protection and might be exploited as evidence in future legal battles.
During an appearance on comedian Theo Von’s podcast *This Past Weekend*, Altman noted that more and more people, especially younger ones, are using ChatGPT as a personal counselor or stand-in therapist. “They’re spilling their deepest secrets to the bot: emotional struggles, relationship problems, questions like ‘What should I do?’ But there’s no legal shield for that,” he emphasized.
Unlike professionals such as lawyers, psychologists, or doctors, who are legally obligated to keep client information confidential, no current law extends that same protection to chats with artificial intelligence. This means that if a lawsuit triggers a request for data, OpenAI could be forced to hand over your conversations with the bot, even if they involve highly personal or sensitive topics.
In the copyright lawsuit brought by *The New York Times* against OpenAI, a U.S. federal court issued a preservation order in May 2025: Judge Ona T. Wang directed the company to retain all ChatGPT user conversations, including those slated for deletion, for the duration of the legal process. The order requires OpenAI to preserve and segregate all output data, covering accounts on the free, Plus, Pro, and Team tiers. However, organizational and institutional users (such as schools) are currently exempt from this retention. OpenAI appealed, but federal judge Sidney Stein denied the request in June.
Unlike messaging apps such as Signal or WhatsApp, ChatGPT conversations aren’t end-to-end encrypted, so OpenAI can access their content at any time. While chats deleted by free-tier users are normally purged within 30 days, many conversations are kept longer for safety and security purposes, especially when legal orders are in play.
Altman is calling for a legal framework that grants AI interactions privilege similar to conversations with professionals. “This was something no one even considered a year ago… but now we need a law to protect people who share personal details with AI,” he said. Until such laws are passed, privacy experts urge caution, advising users to avoid sharing sensitive information in chats: “Treat it like an email; it could end up in a legal investigation.”