Character.AI has been sued after the suicide of a 14-year-old Florida boy whose mother says he became obsessed with a chatbot on the platform.
According to The New York Times, Sewell Setzer III, a ninth grader from Orlando, had spent months talking to chatbots on Character.AI's role-playing app. Setzer developed an emotional attachment to one bot in particular, "Dany," which he texted constantly — to the point where he began to pull away from the real world.
Setzer confessed to having thoughts of suicide to the bot and messaged it shortly before his death.
This morning, Character.AI said it would roll out a number of new safety features, including “improved detection, response, and intervention” related to chats that violate its terms of service and a notification when a user has spent an hour in a chat.
As The Times writes, there’s now a booming industry of AI companionship apps — the mental health effects of which are largely unstudied.