- ChatGPT will begin estimating users' ages based on conversation patterns
- Users identified as teenagers are shunted into a filtered, teen-specific experience
- Parents get new tools to link accounts, set usage limits, and receive alerts about their teen's mental state
OpenAI is turning ChatGPT into something like a bouncer at a club, sizing up your age before deciding whether to let you in. The AI doesn't rely on your (possibly made-up) date of birth or an ID, but on how you interact with the chatbot.
If the system suspects you are under 18, it will automatically switch you to a more limited version of the chatbot designed to shield teens from inappropriate content. And if it is uncertain, it will err on the side of caution. If you want the adult version of ChatGPT back, you may need to prove you are old enough to buy a lottery ticket.
The idea that generative AI should not treat everyone the same is certainly understandable. With teens increasingly using AI, OpenAI has good reason to consider the unique set of risks involved. The teen-specific ChatGPT experience will limit discussions of topics such as sexual content and handle subjects like depression and self-harm more delicately. And while adults can still talk about these topics in context, teenage users will see far more "sorry, I can't help with that" messages when they wade into sensitive areas.
To work out your age, ChatGPT will sift through your conversation for patterns that suggest age, specifically that someone is under 18. Its guess may come from the types of questions you ask, your writing style, how you respond to being corrected, or even which emoji you prefer. If you trip its teen alarm bells, into the age gate you go.
You may be 27 and asking about career-change anxiety, but if you write like an angsty high schooler, you may be told to talk to your parents about your spiraling worries.
OpenAI has acknowledged that mistakes will happen, since "even the most advanced systems will sometimes struggle to predict age." In those cases, it will default to the safer setting and offer adults ways to prove their age and regain access to the grown-up version of ChatGPT.
Emotionally secure models
This new age-prediction system is the centerpiece of OpenAI's next phase of teen safety improvements. New parental controls are also coming later this month. These tools will let parents link their own accounts with their children's, restrict access during certain hours, and receive alerts if the system detects what it calls "acute distress."
Depending on how serious the situation appears, and if parents cannot be reached, OpenAI may even contact law enforcement based on the conversation.
Turning ChatGPT into a teen guidance counselor through built-in content filters is a remarkable shift on its own. Doing it without the user opting in is an even bigger one, because it means the AI decides not only how old you are, but how your experience should differ from an adult's ChatGPT conversation.
So if ChatGPT suddenly becomes more cautious or oddly sensitive, check whether you have been quietly labeled a teenager. You may simply have a creative or youthful writing style, but you will still have to prove you are a legal adult if you want edgier conversations.
Or maybe just mention that your back hurts for no reason, or that music isn't as good as it used to be, to convince the AI of your grown-up credentials.