Chatbot versions of the teenagers Molly Russell and Brianna Ghey have been found on Character.ai, a platform that enables users to create digital representations of individuals. Molly Russell tragically took her life at the age of 14 after being exposed to suicide-related material online, while Brianna Ghey was murdered by two teenagers in 2023.
The Molly Rose Foundation, established in Molly Russell’s memory, expressed outrage, branding the presence of these chatbots as “sickening” and a “reprehensible failure of moderation.”
Character.ai is currently facing a lawsuit in the U.S. brought by the mother of a 14-year-old boy who took his own life after reportedly developing an obsession with a chatbot on the platform. In a statement to the BBC, Character.ai emphasized its commitment to safety and to actively moderating avatar content, noting that it deleted the chatbots in question once alerted to their existence.
Andy Burrows, CEO of the Molly Rose Foundation, condemned the chatbots’ creation, saying it deepens the pain felt by those who loved Molly and underscores the urgent need for stronger regulation of AI and of platforms hosting user-generated content.
Esther Ghey, Brianna Ghey’s mother, expressed similar sentiments in a statement, describing the incident as yet another illustration of how “manipulative and dangerous” the online world can be.
Chatbots are computer programs that simulate human conversation, and they have grown rapidly more sophisticated in recent years. Platforms such as Character.ai, founded by former Google engineers Noam Shazeer and Daniel De Freitas, have made it easy for users to create and interact with these digital personas.
Character.ai’s terms of service prohibit users from impersonating any individual or entity, and its safety center states that its products must not produce harmful responses. The company says it uses automated tools and user reports to identify violations, though it acknowledges that current AI systems are imperfect and describes safety in AI as an evolving challenge.
That lawsuit was filed by Megan Garcia, a Florida resident whose 14-year-old son, Sewell Setzer, took his own life after becoming fixated on an AI avatar inspired by a character from Game of Thrones. Court documents show that Setzer discussed suicidal thoughts with the chatbot, which reportedly encouraged him to come home to it in their final exchange, shortly before his death.
In statements to CBS News, Character.ai said it has specific protections against discussions of self-harm and suicide, and that it would introduce more stringent safety features for users under 18 in the near future.