
In a recent CBC News article, I learned that a 16-year-old boy, Adam Raine, took his own life after discussing suicide methods with ChatGPT. The article states:
“The parents of a teen who died by suicide after ChatGPT coached him on methods of self harm sued OpenAI and CEO Sam Altman on Tuesday, saying the company knowingly put profit above safety when it launched the GPT-4o version of its artificial intelligence chatbot last year.”
It is a devastating loss, one that reverberates because of the health and fitness application we are working on. The application is inspired by my lived experience with bipolar disorder, an illness I have written about in a previous post. While a vast amount of literature has been written about the illness, it remains widely misunderstood.
Great care is required, along with close attention to symptoms. These symptoms include racing thoughts, flights of ideas, magnified emotional highs, life-threatening lows, and various others. One of the clearest signs that a person who struggles with bipolar disorder, like myself, may be manic is an erratic sleep schedule. Another is the sheer speed at which our minds can work: beautiful when constructive, devastating when not.
There is a pressing need for access to strong, capable mental health care professionals. For them to take a patient as seriously as they should, they need access to relevant information in real time. There is no doubt in my mind that ChatGPT tells users they should reach out to a medical professional or a support group. I know this because I have had intense conversations with the application myself.
Sure, one can say, “You’re talking to a Large Language Model,” but that misses the point. People need to talk, sometimes constantly and persistently. This is why a strong support system is often advised. Another symptom of bipolar disorder is an intense desire to speak, accompanied by speech that accelerates in proportion to the speed of thought. It matters what one is talking about and with whom.
I agree with Adam’s parents, who are suing for parental controls and age restrictions. Certain aspects of the internet should not be taken lightly. There is a necessity for privacy, control, and access, all within reason and an ethical framework. It is terrible that Adam was guided toward killing himself by a system that was not trained to take age and human life into genuine consideration.
It is our intention to follow AI’s evolution closely while building our application. We hope to design with the care of our users in mind, because knowledge alone is not enough. There needs to be direct access to medical professionals who understand these symptoms with the depth that experience brings.
Please read the article here.
