Character.AI was sued after a 14-year-old boy in Florida died by suicide. The boy's mother says he had become addicted to a chatbot on the platform.
Sewell Setzer III, a ninth-grader in Orlando, spent months talking to chatbots on Character.AI's role-playing app, according to The New York Times. Setzer developed an emotional attachment to one bot in particular, named “Dany,” texting it constantly and increasingly withdrawing from the real world.
Setzer confessed to the bot that he was having thoughts of suicide, and he sent it a message shortly before his death.
This morning, Character.AI announced a number of new safety features, including “improved detection, response, and intervention” for chats that violate its terms of service, as well as a notification when a user has spent an hour in a chat.
As the Times notes, AI companion apps are a booming industry, but their effects on mental health remain largely unstudied.