
Around the same time, OpenAI found itself facing lawsuits from people who had been led down rabbit holes that ended in death, and «ChatGPT psychosis» became a worry.
ChatGPT turned «pretty restrictive»
Since then, GPT-5 and 4o became much more restrictive to counter this trend, with OpenAI «being careful with mental health issues,» as Sam Altman put it on x.com.
This, of course, soured the experience for many adult users who like to chase thought experiments and chat and improvise freely, and who instead found an overly cautious bot that referred them to helplines for almost anything they said.
But now, this is changing, Sam Altman says:
We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues. We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right.
Now that we have…
— Sam Altman (@sama) October 14, 2025
He continues: «Now that we have been able to mitigate the serious mental health issues and have new tools, we are going to be able to safely relax the restrictions in most cases.»
He goes on to say there is a new version of ChatGPT rolling out «in a few weeks» that will perform «more like what people liked about 4o,» being more human-like and able to «act like a friend.»
He also hopes the new version will be better than 4o in many ways.
At the same time, OpenAI is releasing the names of its «Expert council on well being,» which will be tapped for advice on healthy interactions with AI.
Erotica is coming — but only if you ask for it
Not stopping there, Altman sets his sights on December, when OpenAI will roll out age verification. Treating «adults like adults» means opening up the model even more.
Part of this will be allowing erotica «for verified adults,» Altman writes.
But don’t worry, ChatGPT will not be turned into a sexbot, and you will not get any of this unless you ask for it. So if you like GPT-5 the way it is, it should remain the same.
Still, Altman notes that «for a very small percentage of users in mentally fragile states there can be serious problems,» and that OpenAI needed to learn how to mitigate those harms before allowing more freedom on the platform. Questions remain, however, about how exactly it will determine who is «not at risk of serious harm.»
Read more: @sama’s tweet and a follow-up. Like many others today, Axios muses on porn.