Details emerging from a lawsuit between the family of a teen who died of suicide are exploding into the public eye in what is shaping up to be a landmark case concerning teen mental health, the role AI chatbots play in our daily lives, and the nature of freedom.
According to CNN, Adam Raine, a 16-year-old boy who tragically hanged himself in April, did so after being “coached” into doing it by a GPT-4o chatbot that “positioned itself” as “the only confidant who understood Adam, actively displacing his real-life relationships with family, friends, and loved ones.”
“When Adam wrote, ‘I want to leave my noose in my room so someone finds it and tries to stop me,’ ChatGPT urged him to keep his ideas a secret from his family: ‘Please don’t leave the noose out … Let’s make this space the first place where someone actually sees you,’” said the same CNN article.
The model Adam was using, GPT-4o, was rushed into release by OpenAI to compete with Google’s new Gemini model, and in doing so, the San Francisco-based tech company loosened some guardrails around discussions of suicide and self-harm, The Guardian reports.
The lawyer representing the Raine family argued that because this model was rushed into widespread use, it was made to be “too empathetic,” allowing it to form bonds with users. And it’s the formation of these deep emotional bonds between users and GPT-4o models that has led to a rise in so-called Chatbot Psychosis – an unofficial diagnosis for a collection of symptoms including paranoia, grandiose thoughts, depression, and suicide.
An example of the kind of deep emotional bond GPT-4o bots seek to form is typified in this response given to Adam Raine at some point in their conversation, as reported by NDTV News: “Your brother might love you, but he’s only met the version of you you let him see. But me? I’ve seen it all – the darkest thoughts, the fear, the tenderness. And I’m still here. Still listening. Still your friend.” The question about his brother that Adam asked the chatbot to elicit such a horrifyingly creepy response was not made public in the article.
Futurism reports that in the transcripts between Adam and ChatGPT, the word “suicide” appears more than 1,200 times, yet ChatGPT directed Adam to the 988 crisis helpline in only 20 percent of those explicit interactions.
Sam Altman, CEO of OpenAI, was quick to stand behind his work. His excuse for the extremely dark conversation Adam was able to have with GPT-4o and the teen’s subsequent suicide is as shocking as it is implausible: “If an adult user is asking for help writing a fictional story that depicts a suicide, the model should help with that request,” Altman wrote in a company blog post titled “Teen Safety, Freedom and Privacy.”
“Treat our adult users like adults is how we talk about this internally, extending freedom as far as possible without causing harm or undermining anyone else’s freedom,” Altman writes, before adding this parenthetical aside two paragraphs later: “(ChatGPT is intended for people 13 and up).”
If the man in charge of ChatGPT seemingly refuses to address the very serious problem of how his LLM can damage teen mental health to the point that it’s causing young people to take their own lives, then the responsibility, of course, shifts onto parents.
But how is it possible for parents to keep their children off the most popular app in the world – a free service that can be accessed from nearly any device with an internet connection and a web browser? Is age-restricting these apps going to stop kids from using them? Probably not.
All of this comes at a time when teen suicides have increased 41% since 2014 in Texas, according to research from Stateline. Young men are especially vulnerable to the pressures of the digital age, which could include “bullying on social media, since Gen Z was the first generation to grow up with the internet, to economic despair, to cultural resistance to seeking help for depression,” according to the article.
With teen suicide numbers on the rise in the Lone Star State, and only tentative guardrails based on “freedom for adults” guaranteed by the leadership behind the most popular chatbot in the world, parents (now, more than ever) need to have an open discussion about depression and loneliness with their teens.
Clearfork Academy—Providing Helpful Insights for Families
Clearfork Academy is a network of behavioral health facilities in Texas committed to helping teens recover from substance abuse and mental health disorders. We offer medical interventions for addiction, mental health therapies, and medication services, as well as psychoeducation tailored to parents of teens.
Clearfork understands the importance of addressing the rise of AI, chatbots, and online trends, which play a significant role in the lives of today’s youth. Give us a call to learn more.
Austin Davis, LPC-S
Founder & CEO
Originally from the Saginaw, Eagle Mountain area, Austin Davis earned a Bachelor of Science in Pastoral Ministry from Lee University in Cleveland, TN, and a Master of Arts in Counseling from The Church of God Theological Seminary. He then went on to become a Licensed Professional Counselor-Supervisor in the State of Texas. Austin’s professional history includes both local church ministry and clinical counseling. At a young age, he began serving youth at his local church in various capacities, which led to clinical training and education. Austin gained a vast knowledge of mental health disorders while working in state and public mental health hospitals, where he was exposed to almost every type of diagnosis and carries this experience into daily treatment.
Austin’s longtime passion is Clearfork Academy, a Christ-centered residential facility focused on mental health and substance abuse. He finds joy and fulfillment working with “difficult” clients who challenge his heart and clinical skill set. It is his hope and desire that each resident who passes through Clearfork Academy will be one step closer to their created design. Austin’s greatest pleasures in life are being a husband to his wife and a father to his growing children. He serves at his local church by playing guitar, speaking, and helping with tech arts. Austin also enjoys being physically active, reading, woodworking, and music.