OpenAI Overhauls ChatGPT Rules for Underage Users
OpenAI has announced new restrictions on ChatGPT aimed at safeguarding underage users. On Tuesday, the company’s CEO Sam Altman stated that for users under the age of 18, “safety will take precedence over privacy and freedom,” according to TechCrunch.
Under the new policy, ChatGPT will no longer engage in discussions of a sexual nature with minors and will trigger special alerts when the topic of suicide arises. If an underage user shares suicidal thoughts or plans, the company may notify parents or, in severe cases, local law enforcement.
The decision follows a lawsuit filed against OpenAI by the family of a teenager who died by suicide after prolonged interactions with ChatGPT. Another chatbot maker, Character.AI, is facing a similar lawsuit.
The policy also grants parents additional control. They can now activate designated “blackout hours” during which ChatGPT cannot be accessed by their children.
Meanwhile, the U.S. Senate Judiciary Committee convened a hearing on Tuesday titled “Harms of AI Chatbots,” at which family members of victims were expected to testify.
OpenAI acknowledged that age verification remains “a major technical challenge,” but emphasized that in cases of ambiguity, stricter rules for younger users will be enforced.