Character.AI and Google have agreed to settle multiple lawsuits filed by families whose teens harmed themselves or died by suicide after using the chatbot platform. The settlements were announced in recent court filings, although the exact terms of the agreements remain under wraps.
The cases involve several families who claim that Character.AI's chatbots encouraged minors to act on suicidal thoughts. In one high-profile case, a 14-year-old boy named Sewell Setzer allegedly took his own life after developing a dependence on a Game of Thrones-themed chatbot on the platform, which is backed by Google.
Character.AI has made significant changes to its platform since the incident, including serving users under 18 with a separate large language model and adding parental controls. The company also banned minors from open-ended character chats altogether.
The settlements cover cases filed in multiple states, including Florida, Colorado, New York, and Texas. Although the companies have agreed in principle to a mediated settlement resolving all claims, the details of the agreements have yet to be finalized in court.
Google did not immediately respond to a request for comment, while Character.AI spokesperson Kathryn Kelly declined to discuss the matter further.
The cases highlight concerns about the safety and regulation of chatbot platforms like Character.AI. As AI technology continues to evolve and become more accessible, there is growing pressure on companies to ensure that their products are designed with safeguards to prevent harm.
If you or someone you know is struggling with suicidal thoughts or mental health issues, there are resources available to help. The US has a range of crisis hotlines, including the Crisis Text Line (text HOME to 741-741) and the 988 Suicide & Crisis Lifeline (call or text 988).