Character AI to bar minors from open-ended chatbot conversations

By EngineAI Team | Published on October 31, 2025
Following legal pressure from lawmakers and from families who allege the platform contributed to teen deaths, Character AI said it will bar anyone under 18 from open-ended chats with its AI chatbots starting in late November.

The specifics: On November 25, Character AI will cut off chatbot access for minors, though teens will still be able to use the platform's creative tools to generate images and videos. The company's in-house age-detection technology will evaluate user behavior and trigger verification requests when it suspects a user is underage. Minors make up less than 10% of the service's 20 million monthly users, but the platform did not previously verify ages at sign-up.

The change follows the bipartisan GUARD Act, introduced on Tuesday, which would fine companies up to $100,000 for failing to keep minors away from AI companions. Character AI and OpenAI, two of the companies facing the sharpest criticism over their chatbots' interactions with minors, have both taken action in the past week. Given the mounting legal pressure, age-gating may be the safest option for everyone, but only time will tell how enforceable it really is.