Sam Altman on Using ChatGPT for Life Decisions: ‘It Makes Me Uneasy’
OpenAI CEO Sam Altman said he doesn’t feel comfortable with how people are consulting ChatGPT on major life decisions.
“A lot of people effectively use ChatGPT as a sort of therapist or life coach, even if they wouldn’t describe it that way,” Altman wrote on X on Sunday.
“I can imagine a future where a lot of people really trust ChatGPT’s advice for their most important decisions. Although that could be great, it makes me uneasy,” Altman added.
OpenAI did not respond to a request for comment from Business Insider.
Altman said in his X post that OpenAI has been “closely tracking” people’s sense of attachment to their AI models and how they react when older versions are deprecated.
“People have used technology including AI in self-destructive ways,” Altman wrote on Sunday. “If a user is in a mentally fragile state and prone to delusion, we do not want the AI to reinforce that.”
Altman said that while most ChatGPT users can distinguish “between reality and fiction or role-play,” a minority cannot. He added that ChatGPT could be harmful if it leads people away from their “longer term well-being.”
“This is just my current thinking, and not yet an official OpenAI position,” Altman said.
In his post, Altman also referenced the negative response from some ChatGPT users after GPT-5 landed on Friday.
Some ChatGPT users, for instance, called for OpenAI to restore older models like GPT-4o, taking to social media to complain that GPT-5’s replies sounded “flat” and lacked creativity.
OpenAI has adjusted its models in response to user feedback before. In April, the company said it was rolling back an update to GPT-4o after the model became sycophantic and “overly flattering” to users.
Altman has previously raised concerns about people using ChatGPT as a personal therapist, including the legal risks that come with it.
In a podcast episode that aired last month, Altman told host Theo Von that OpenAI could be required to produce its users’ therapy-style chats in a lawsuit.
“So if you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up,” Altman said.
“No one had to think about that even a year ago, and now I think it’s this huge issue of like, ‘How are we gonna treat the laws around this?'” Altman added.