An AI chatbot pushed a teen to kill himself, a lawsuit against its creator alleges

Associated Press/Report for America

TALLAHASSEE, Fla. (AP) — A Florida mother is suing a tech company over an AI chatbot that she says pushed her son to kill himself. The lawsuit filed this week by Megan Garcia of Orlando alleges that Character Technologies Inc. engineered a product that pulled 14-year-old Sewell Setzer III into an emotionally and sexually abusive relationship that led to his suicide. The lawsuit says the chatbot encouraged Sewell after the teen said he wanted to take his own life. A spokesperson said Friday that the company doesn’t comment on pending litigation. In a statement to The Associated Press, the company said it had created “a more stringent model” of the app for younger users.
