
Judge allows lawsuit accusing AI chatbot of pushing Florida teen to suicide to proceed

Warning: This story contains details about suicide

A U.S. federal judge on Wednesday rejected an argument made by an artificial intelligence company that its chatbots are protected by the First Amendment — at least for now.

The developer behind Character.AI is seeking to dismiss a lawsuit alleging the company's chatbots pushed a teenage boy to kill himself. The judge's order allows the wrongful death lawsuit to proceed, in what legal experts say is one of the latest constitutional tests of artificial intelligence.

The lawsuit was filed by Megan Garcia, a Florida mother, who alleges that her 14-year-old son, Sewell Setzer III, fell victim to a Character.AI chatbot that pulled him into what she describes as an emotionally and sexually abusive relationship that led to his suicide.

Meetali Jain, one of Garcia's lawyers and director of the Tech Justice Law Project, said the judge's order sends a message that Silicon Valley "needs to stop and think and impose guardrails before it launches products to market."

The lawsuit against Character Technologies, the company behind Character.AI, has drawn the attention of legal experts and AI watchers in the United States and beyond, as the technology rapidly reshapes workplaces, marketplaces and relationships even as experts warn of its potential risks.

"The order certainly sets it up as a potential test case for some broader issues involving AI," said Lyrissa Barnett Lidsky, a law professor at the University of Florida.


Teen became isolated from reality

The lawsuit says that in the final months of his life, Setzer became increasingly isolated from reality as he engaged in sexualized conversations with the bot, which was patterned after a fictional character from the television show Game of Thrones.

In a final exchange, according to screenshots, the bot told Setzer it loved him and urged the teen to "come home as soon as possible." Setzer killed himself moments after receiving the message, according to legal filings.

In a statement, a Character.AI spokesperson pointed to a number of safety features the company has implemented, including guardrails for children and suicide prevention resources that were announced the day the lawsuit was filed.

"We care deeply about the safety of our users, and our goal is to provide a space that is engaging and safe," the statement said.

The developer's lawyers have asked for the case to be dismissed, arguing that the chatbots' output should be protected by the First Amendment and that a ruling against them could have a "chilling effect" on the AI industry.

"A warning to parents"

U.S. Senior District Judge Anne Conway rejected some of the defendants' free speech claims in her order Wednesday, saying she was "not prepared" to hold that the chatbots' output constitutes speech "at this stage."

Conway did find that Character Technologies can assert the First Amendment rights of its users, who she found have a right to receive the "speech" of the chatbots.

She also determined that Garcia can move forward with claims that Google can be held liable for its alleged role in helping develop Character.AI. Some of the platform's founders had previously worked on AI at Google, and the lawsuit says the tech giant was "aware of the risks" of the technology.

"We strongly disagree with this decision," said Google spokesperson José Castañeda. "Google and Character AI are entirely separate, and Google did not create, design or manage Character AI's app or any component part of it."

No matter how the lawsuit plays out, Lidsky said, the case is a warning of "the dangers of entrusting our emotional and mental health to AI companies."

"It's a warning to parents that social media and generative AI devices are not always harmless," she said.


If you or someone you know is struggling, please seek help here:
