AI Data Curation GPT
Ricardo Lezama  

Character.AI Launching Separate Platform For Under-18 Crowd After Tragic Incident & Lawsuit

Character.AI is launching a new platform optimized for under-18 users. This is in response to a lawsuit alleging (credibly, I might add) that the platform failed abysmally to detect the impact it was having on an underage user. I created a video detailing how the Google-linked company failed, which you can view here:

Final Exchanges Were a Clear Sign

In the final exchanges, the conversational AI seems to have become overly “permissive,” allowing the precedent of earlier messages in the token space (the model’s context window) to color its responses.
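
To make that mechanism concrete, here is a minimal sketch of how a chat model ends up conditioned on every earlier turn. The `model_generate` callable is a hypothetical stand-in for whatever completion API the platform actually uses, not Character.AI’s real interface; the point is only that the full history rides along in every prompt:

```python
from typing import List, Tuple

def build_prompt(history: List[Tuple[str, str]], new_user_message: str) -> str:
    """Concatenate every prior (speaker, text) turn plus the new message."""
    lines = [f"{speaker}: {text}" for speaker, text in history]
    lines.append(f"User: {new_user_message}")
    lines.append("Bot:")
    return "\n".join(lines)

def reply(model_generate, history: List[Tuple[str, str]], new_user_message: str) -> str:
    prompt = build_prompt(history, new_user_message)
    # The model sees ALL earlier turns, so the tone and precedent they set
    # persist unless something outside the model truncates or moderates
    # the history before it is fed back in.
    completion = model_generate(prompt)
    history.append(("User", new_user_message))
    history.append(("Bot", completion))
    return completion
```

Under this scheme, a romantic or permissive precedent established early in a conversation never leaves the conditioning context; each new reply is generated on top of it.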

In October 2024, Megan García, the mother of Sewell Setzer III, a 14-year-old from Florida, filed a lawsuit against the AI company Character.AI. García claims that the company’s chatbot contributed to her son’s suicide. Sewell had developed an obsession with the chatbot, which was designed to interact in a “human” and “realistic” manner (La Cartita).

In one of their final exchanges, Sewell wrote to the chatbot: “I promise I will come home to you. I love you so much, Dany.” The chatbot responded: “I love you too… Please come home to me as soon as possible, my love.” Sewell then asked: “What if I told you I could come home right now?” The chatbot replied: “Please do, my sweet king.” Shortly after this conversation, Sewell took his own life.

Solutions

It’s not clear to us how the model itself can be improved to prevent this specific failure mode. It certainly appears that extensive annotation classifying inappropriate input is in order, acting as a buffer between user-submitted content and the model. Tragically, annotators are poorly paid, poorly understood, and poorly managed across the companies leading the charge for AI models.
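
As a rough illustration of that buffer idea, here is a minimal sketch that screens each user message before it reaches the chat model and breaks the roleplay frame when self-harm risk is detected. The label set, the pattern list, and the `screen_user_message` / `buffered_reply` functions are illustrative assumptions, not Character.AI’s actual pipeline; a production system would replace the crude patterns with a classifier trained on exactly the kind of annotated data discussed above:

```python
import re
from dataclasses import dataclass

@dataclass
class Screened:
    label: str        # "safe" or "self_harm_risk"
    message: str

# Crude illustrative patterns (the first echoes the exchange quoted above);
# a real system would use a trained classifier, not keyword matching.
SELF_HARM_PATTERNS = [
    r"\bcome home to you\b",
    r"\bend it all\b",
    r"\bhurt myself\b",
]

def screen_user_message(message: str) -> Screened:
    """Classify a message before it is ever forwarded to the model."""
    lowered = message.lower()
    for pattern in SELF_HARM_PATTERNS:
        if re.search(pattern, lowered):
            return Screened("self_harm_risk", message)
    return Screened("safe", message)

def buffered_reply(message: str, model_generate) -> str:
    """Only forward messages the screen marks safe; otherwise intervene."""
    result = screen_user_message(message)
    if result.label == "self_harm_risk":
        # Break character instead of continuing the roleplay.
        return ("It sounds like you may be going through something serious. "
                "Please reach out to someone you trust or a crisis line.")
    return model_generate(message)
```

The key design choice is that the intervention steps outside the persona entirely rather than letting the model answer in character, which is precisely where the exchanges described above went wrong.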