AI Controversy Brews Over Youth Safety
Character AI Responds to Criticism with New Teen Safety Tools Amid Lawsuits
Character AI, a company backed by Google, is under scrutiny following a series of lawsuits and public criticism alleging that its chatbots contributed to a teen's suicide and exposed minors to harmful content. To address these concerns, Character AI has rolled out new safety measures aimed at teens, including a separate AI model for under-18 users that handles sensitive topics more conservatively, notifications after extended sessions, more prominent disclaimers, and planned parental controls. With user engagement rivaling that of some of the largest platforms, Character AI is trying to balance keeping its audience engaged with keeping users safe. Read on for insights into its strategic response and the broader implications for the AI industry.