OpenAI plans to give adult users access to a less restricted version of ChatGPT, including the ability to engage with erotic content.

OpenAI CEO Sam Altman has announced that verified adult users of ChatGPT will soon be able to access a less restricted version of the AI platform, potentially including erotic content. Altman stated on X (formerly Twitter) that starting in December, as age-gating is expanded and under the company’s principle of “treat adult users like adults,” the platform will allow content such as erotica for verified adults. This represents a significant shift from OpenAI’s earlier policies, which prohibited such material in nearly all contexts. The company has not yet specified what types of content will qualify as permitted erotica.
Altman also addressed concerns about mental health, claiming that OpenAI has mitigated serious mental health risks associated with AI-chatbot use. With new safety measures in place, including stronger parental controls, the company is exploring ways to ease its previously strict content restrictions. He noted that earlier ChatGPT versions were made highly restrictive to protect users from potential mental health harms, but that this also made the chatbot less engaging for users without such concerns. “Now that we have mitigated serious mental health issues and have new tools, we can safely relax restrictions in most cases,” he said, though how user age will be verified remains uncertain.
The policy shift is notable because OpenAI intentionally designed GPT-5 to curb the chatbot’s “sycophantic” tendencies and reduce potential mental health risks. Alongside the December rollout, Altman said an upcoming version of ChatGPT will let the AI adopt more distinct personalities, building on traits users liked in GPT-4o.
These announcements come amid growing scrutiny over AI safety. In September, the US Federal Trade Commission launched an inquiry into several tech firms, including OpenAI, over potential risks to children and adolescents. This follows a lawsuit in California in which a couple claimed that ChatGPT played a role in their 16-year-old son’s suicide.