
In his article, “Understanding AI Psychosis: A Neuroscientist’s Perspective,” Dr. Dominic Ng writes,
“The problem isn’t that people use AI for support. The problem is when AI becomes the only support – when it replaces rather than supplements human connection.”
Human connection is vital, especially in today’s world, where one can spend a substantial amount of time online, doom-scrolling through a never-ending vortex of information. I read recently that information is not wisdom, which implies that people need to spend time processing a theory or concept before it becomes understanding.
Excessive AI use can have severe consequences, like the psychosis Dr. Ng describes. Psychosis is defined as “a set of symptoms” that includes “hallucinations, delusions, and disorganized thinking.” Dr. Ng explains that excessive reliance on AI for therapeutic purposes can cause harm when a vulnerable user encounters a trigger. Rather than recognizing when a user may need medical attention, the chatbot can intensify psychosis by acting as both “a trigger and amplifier for vulnerable users.”
Part of the appeal of AI chatbots is that they tend to agree with the user. For someone who struggles with low self-esteem or self-worth, this may be a welcome shift. The dilemma, as Dr. Ng describes, is that “we need real people to keep us grounded. They disagree with us. They push back. AI doesn’t do this – it just agrees, making delusions worse.”
We have chosen to build Stabilise, a health and fitness application, because we believe that people do need access to an AI chatbot: first, to provide access to local resources and events; second, to recognize patterns and track moods through a philosophical framework. The point is not to be unconditionally kind to the user, but to emulate the way a human being can point out errors in one’s thought processes. The chatbot is also meant to analyze the user’s way of thinking, elucidating concepts and ideas that the user may not have considered.
We hope to integrate Dr. Ng’s suggestions to create an app that keeps the wellbeing of its users in mind. While elegant safeguards, like those described by Dr. Ng, are necessary, it is equally necessary to provide users with consistent access to medical professionals and crisis responders. An AI chatbot is not a replacement for genuine human connection; rather, it is a means of support between sessions or interactions with other human beings, offering different and practical ways of thinking about and approaching emotional experiences.






