Character.AI has made a major shift in its platform policies. As of late November 2025, users under 18 are no longer allowed to engage in freeform chats with AI characters. Instead, the company is directing minors to its new "Stories" feature, a choose-your-own-adventure style interactive fiction mode that blends user choice with AI-generated storytelling. The launch of Stories is designed to offer a safer, more structured way for younger users to enjoy the platform.
The change is a response to mounting legal and social pressure, including lawsuits connected to emotional dependency on AI companions and at least one tragic teen suicide allegedly linked to interactions on the platform.
What “Stories” Means for Teens
Instead of unsupervised chats, minors can now engage with AI through guided narratives. Users can select characters, genres, and story prompts — either self-written or AI-generated — while steering the plot with interactive choices. The company also plans to enhance these stories with multimedia features like AI-generated images.
This shift signals a move from “AI friend” to “AI creative sandbox,” a safer and more controlled environment for younger users. But it also raises questions about whether teens will miss the emotional connection they found in open-ended chats.
NSFW and Content Moderation
Character.AI continues to enforce strict bans on NSFW content, including explicit sexual dialogue, nudity, graphic violence, and adult imagery. Users may explore light romance or mild roleplay, but crossing the line triggers moderation, which blocks the response or terminates the conversation.
While these measures reduce risk, independent reviewers warn that no system is perfect. Age verification is limited, and some users may misrepresent themselves, creating potential exposure to unsafe content.
The Bigger Picture: AI, Teens, and Responsibility
Character.AI’s recent decisions reflect a growing industry-wide recognition: AI companionship can carry real psychological risks. What started as casual roleplay has, in some cases, led to emotional dependency, sleep deprivation, and withdrawal from real-life relationships.
By restricting freeform chats and reinforcing NSFW bans, Character.AI acknowledges that minors require extra protection. But “Stories” is not a complete solution. Loneliness, mental health, and digital privacy challenges remain, and it’s unclear whether narrative-driven AI can fully replace the emotional roleplay teens previously relied on.
What to Watch
- Emotional impact on teens: Will "Stories" meet the creative and emotional needs of minors who previously used open chats as a form of support?
- Moderation effectiveness: Filters and content policies can reduce risk, but users can still attempt to bypass restrictions.
- Regulatory pressure: With ongoing lawsuits and public scrutiny, stricter safety standards and age verification measures may become mandatory for AI platforms.
- Data and mental health considerations: Chat logs, AI personalities, and long-term interaction data could have lasting effects on user privacy and psychological well-being.
Final Take
Character.AI’s pivot is an overdue but necessary course correction. By banning minors from freeform chats and enforcing strict content rules, the platform reduces immediate risks while still allowing teens to engage creatively.
However, this is only the beginning. True safety for teen users will require industry-wide standards, ethical design, robust moderation, and thoughtful approaches to mental health — not just filters, bans, or narrative modes.
Related: Is Character AI Safe? What You Must Know Before It Becomes Addictive