Megan Garcia said Sewell, 14, used Character.ai obsessively before his death and
alleges negligence and wrongful death
The mother of a teenager who killed himself after becoming obsessed with an
artificial intelligence-powered chatbot is accusing its maker of complicity in
his death.
Megan Garcia filed a civil suit against Character.ai, which makes a customizable
chatbot for role-playing, in Florida federal court on Wednesday, alleging
negligence, wrongful death and deceptive trade practices. Her son Sewell Setzer
III, 14, died in Orlando, Florida, in February. In the months leading up to his
death, Setzer used the chatbot day and night, according to Garcia.
In the US, you can call or text the National Suicide Prevention Lifeline on 988,
chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis
counselor. In the UK, the youth suicide charity Papyrus can be contacted on 0800
068 4141 or email pat@papyrus-uk.org, and in the UK and Ireland Samaritans can
be contacted on freephone 116 123, or email jo@samaritans.org or
jo@samaritans.ie. In Australia, the crisis support service Lifeline is 13 11 14.
Other international helplines can be found at befrienders.org