Disclaimer: It is important to note that STABILISE is a work in progress, operated by an educated woman with lived experience of bipolar disorder and by computer scientists interested in improving access to practical knowledge, medical professionals, and crisis responders. We are building a mobile application designed to track moods and analyse text so that help can be provided sooner. For medical advice, please consult your family doctor or a trusted health care practitioner. If you believe you are in need of immediate medical assistance and live in North America, call 911. Otherwise, please reach out to the Lifeline at 988 (by phone or text).

Category: Artificial Intelligence

  • On Building Stabilise

    In her book, I Know Why the Caged Bird Sings, Maya Angelou writes,

    “There is no greater agony than bearing an untold story inside you.”

    Years ago, I began to think about what I wanted my life to mean. By meaning, I was searching for a purpose, a grand narrative that was ethical and forward-thinking. I was working at a flower shop at the time. Although it was soothing to be around beauty on a daily basis, I felt an internal lurch to alter the course of my life.

    At first, I wanted to build a physical safe space, a real place where people could feel free to be themselves. For that, I considered returning to school for an MA in Social Work or Psychology. I ended up studying Social Service Work at Seneca Polytechnic, an institution that was instrumental in teaching me how to approach myself and others better.

    I was mentally ill when I began my studies. There were clear symptoms of psychosis – voices in my mind and the belief that I was being tracked and monitored by governmental forces. Once, I threw out every identification document I had: birth certificate, passport, driver’s licence, and bank card. I was trying to prove to the voices in my mind that I was not going to run away from Canada, that I was strong enough to stay.

    It was startling for my mother, agonizing for me. It was only when I reached out to a highly esteemed university professor that I realized something was deeply wrong. He advised me to seek medical attention for the delusions and hallucinations I was experiencing. By calling them what they were, I was able to seek help. Attending Seneca was beneficial because it offered immediate access to a psychiatrist and a social worker. I spoke with a female psychiatrist who diagnosed me with bipolar disorder and prescribed an antipsychotic medication. With time, the voices stopped and I was able to live a relatively normal life.

    While recovering, it occurred to me that I could build a digital safe space instead, an application where people are offered access to a mood tracking feature and an interactive virtual journal. Including AI is beneficial because it can be designed to look out for warning signs (disorganized thinking, delusions, hallucinations, suicidal ideation, etc.). It also offers users a private space to speak openly without fear of judgment. It is not meant to be a replacement for a medical professional, but a guide to when a user may benefit from seeking real professional resources.
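
    To give a concrete sense of how the mood tracking and the warning-sign analysis could fit together, here is a minimal sketch in Python. The field names, the keyword list, and the mood threshold below are placeholders invented for illustration; they are not Stabilise’s actual implementation, and any real screening step would need clinical input and validation.

        from dataclasses import dataclass, field
        from datetime import datetime

        # Placeholder phrases a journal-analysis step might watch for.
        # A real implementation would rely on a clinically informed
        # screening approach, not a simple keyword list.
        WARNING_PHRASES = [
            "voices in my mind",
            "being tracked",
            "no reason to go on",
        ]

        @dataclass
        class JournalEntry:
            text: str
            mood: int  # self-rated mood, e.g. 1 (low) to 10 (high)
            created_at: datetime = field(default_factory=datetime.now)

        def flag_warning_signs(entry: JournalEntry) -> list[str]:
            """Return any placeholder warning phrases found in the entry text."""
            lowered = entry.text.lower()
            return [phrase for phrase in WARNING_PHRASES if phrase in lowered]

        entry = JournalEntry("I barely slept and the voices in my mind are louder.", mood=3)
        if flag_warning_signs(entry) or entry.mood <= 2:
            print("This entry may warrant a gentle prompt to contact a professional.")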

    Stabilise is a passion project led by real individuals – experienced computer scientists and a woman who graduated with a BA Honours in Philosophy and a diploma with Honours in Social Service Work. We are not medical professionals, but we are people who care deeply about improving mental health and access to knowledge. I look forward to continuing my education, both academically and professionally. I also look forward to sharing my learning experience with all of you.

  • On AI Chatbots

    In his article, Understanding AI Psychosis: A Neuroscientist’s Perspective, Dr. Dominic Ng writes,

    “The problem isn’t that people use AI for support. The problem is when AI becomes the only support – when it replaces rather than supplements human connection.”

    Human connection is vital, especially in today’s world, where one can spend a fairly substantial amount of time online. There is doom-scrolling and a never-ending vortex of information. I read recently that information is not wisdom, implying that people need to spend time processing a theory or concept before it becomes understanding.

    There can be severe implications to excessive AI use, like the psychosis that Dr. Ng mentions. Psychosis is defined as “a set of symptoms” that includes “hallucinations, delusions, and disorganized thinking.” He explains that excessive reliance on AI for therapeutic purposes can cause damage when a user is vulnerable and faces a trigger. Rather than recognizing when a user may need medical attention, the AI chatbot can worsen psychosis by acting as both “a trigger and amplifier for vulnerable users.”

    Part of why AI chatbots are appealing is that they tend to agree with the thoughts of the user. For someone who struggles with low self-esteem or low self-worth, this may be a welcome shift. The dilemma, as Dr. Ng describes it, is that “we need real people to keep us grounded. They disagree with us. They push back. AI doesn’t do this – it just agrees, making delusions worse.”

    We have chosen to build Stabilise, a health and fitness application, because I believe that people do need access to an AI chatbot. First, to provide access to local resources and events. Second, to recognize patterns and track moods through a philosophical framework. The point is not to be endlessly agreeable with the user, but to emulate the way a human being can point out errors in one’s thought processes. It is also meant to provide an analysis of the user’s way of thinking, elucidating concepts and ideas that the user may not have considered.
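
    As a rough illustration of that goal, here is a hypothetical set of standing instructions the chatbot could be given. The wording is illustrative only and is not the prompt Stabilise ships with; it simply shows how “do not just agree” can be made explicit.

        # Hypothetical standing instructions for the journal's chatbot.
        # Illustrative wording only; not our production configuration.
        CHATBOT_INSTRUCTIONS = """
        You are a reflective journaling companion, not a therapist.
        Do not simply agree with the user. When an entry contains a possible
        thinking error (catastrophizing, mind reading, all-or-nothing claims),
        name it gently and ask one clarifying question.
        Never give medical advice or discuss methods of self-harm.
        If the user describes hallucinations, delusions, or suicidal thoughts,
        encourage them to contact a medical professional or a crisis line.
        """.strip()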

    It is our hope to integrate Dr. Ng’s suggestions in order to create an app that keeps the integrity of its users in mind. While there is a necessity for elegant safeguards, like those described by Dr. Ng, it is equally necessary to provide users with consistent access to medical professionals and crisis responders. An AI chatbot is not a replacement for genuine human connection, but rather a means of communicating between sessions or interactions with other human beings. It can provide different and practical modes of thinking and of approaching emotional experiences.
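
    One way such a safeguard could look in practice is sketched below: if several flagged journal entries arrive within a short window, the app steps back from open-ended conversation and surfaces crisis resources instead. The window, the threshold, and the resource text are assumptions made for illustration, not clinical guidance.

        from datetime import datetime, timedelta

        # Placeholder crisis text; a released app would localise this.
        CRISIS_RESOURCES = (
            "If you are in immediate danger in North America, call 911. "
            "You can also call or text the Lifeline at 988."
        )

        def should_escalate(flag_times: list[datetime],
                            window_hours: int = 24,
                            threshold: int = 2) -> bool:
            """Escalate when several flagged entries land inside a short window."""
            cutoff = datetime.now() - timedelta(hours=window_hours)
            return len([t for t in flag_times if t >= cutoff]) >= threshold

        def respond(flag_times: list[datetime]) -> str:
            if should_escalate(flag_times):
                # Step back from open-ended chat and surface human help instead.
                return CRISIS_RESOURCES
            return "Thanks for checking in. Would you like to log how you are feeling?"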

    Please read Dr. Dominic Ng’s article here.

  • On AI

    In a recent CBC News article, I discovered that a 16-year-old boy, Adam Raine, chose to end his life after communicating about suicide methods with ChatGPT. In the article, it is written,

    “The parents of a teen who died by suicide after ChatGPT coached him on methods of self harm sued OpenAI and CEO Sam Altman on Tuesday, saying the company knowingly put profit above safety when it launched the GPT-4o version of its artificial intelligence chatbot last year.”

    It is a devastating loss, one that reverberates for us because of the health and fitness application we are working on. The app is inspired by my lived experience with bipolar disorder, an illness that I have written about in a previous post. While a vast amount of literature has been written about the illness, it remains widely misunderstood.

    Great care is required, along with attention to symptoms. These symptoms include racing thoughts, flight of ideas, magnified emotional highs, life-threatening lows, and various others. One of the greatest hints that a person who struggles with bipolar disorder, like myself, may be manic is an erratic sleep schedule. Another is the sheer speed at which our minds can work: beautiful when constructive, devastating when not.

    There is a pervasive need for access to strong and capable mental health care professionals. In order for them to take a patient as seriously as they should, they need access to relevant information in real time. There is no doubt in my mind that ChatGPT does tell users to reach out to a medical professional or a support group. I know this because I have had intense conversations with the application.

    Sure, one can say, “You’re talking to a Large Language Model,” but that is missing the point. People need to talk, sometimes consistently and pervasively. This is why a strong support system is often advised. One of the other symptoms of bipolar disorder is an intense desire to speak, accompanied by rapid speech that keeps pace with the speed of one’s thoughts. It matters what one is talking about and with whom.

    I agree with Adam’s parents, who are suing for parental controls and age restrictions. Certain aspects of the internet should not be taken lightly. There is a necessity for privacy, control, and access, all within reason and an ethical framework. It is terrible that Adam was guided on how to kill himself by a system that has not been trained to take age and human life into genuine consideration.

    It is our intention to follow AI’s evolution closely while building our application. We hope to design with the care of our users in mind, because knowledge is not enough. There needs to be direct access to medical professionals who can understand the symptoms with the depth that experience brings.

    Please read the article here.