
How much responsibility can be allocated to an AI chatbot for monitoring someone’s mental health?
That’s a hard question, because the answer depends on several competing factors.
Let’s say someone is wondering whether they are exhibiting signs of depression or mania. They could ask someone in their life to pay attention to their moods and behaviors, they could consult a medical professional, or they could monitor their own moods and behaviors.
Self-monitoring is a crucial skill to learn.
The first step is awareness.
Do you know where you are?
This is your breath, the part of you that anchors you to the earth right now. Not the past, not the future: this moment, the one with features that can be measured.
One reason writing is considered so therapeutic is that it is a grounding exercise.
It roots the person in the now, a blank page offering the space to express whatever needs expressing.
The benefit of an AI chatbot, especially one that is well-designed, is that it can serve as a sounding board for ideas, thoughts, and concepts. It can also pinpoint language that indicates professional help may be beneficial.
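How might a chatbot “pinpoint” such language? As a toy illustration only, the crudest version is phrase matching over a journal entry. The `CONCERN_PHRASES` list and the `flag_concerning_language` helper below are invented for this sketch; a real system would rely on validated screening language and clinical review, not a hand-picked regex list.

```python
import re

# Hypothetical phrase list for the sketch; not clinically validated.
CONCERN_PHRASES = [
    r"\bhopeless\b",
    r"\bcan'?t get out of bed\b",
    r"\bno point\b",
    r"\bworthless\b",
]

def flag_concerning_language(entry: str) -> list[str]:
    """Return the patterns from CONCERN_PHRASES that match a journal entry."""
    return [
        pattern
        for pattern in CONCERN_PHRASES
        if re.search(pattern, entry, flags=re.IGNORECASE)
    ]

entry = "Lately everything feels hopeless, and some days I can't get out of bed."
hits = flag_concerning_language(entry)
if hits:
    print(f"Flagged {len(hits)} phrase(s); surface professional resources.")
```

Real language is messier than this (negation, sarcasm, context), but the shape of the check really can be this small.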
Self-monitoring is a crucial skill to learn because the self-observation process ideally helps build recognition of recurring moods and patterns. It also encourages the person to adopt a wide variety of strategies for improving their mental health. The trick is learning to use each of them at optimal times.
I suspect that what “optimal times” means is different for everyone. But on a surface level, it seems as though it would be helpful for people to have an alarm system of sorts. It’s one thing to write that you are feeling depressed, another to have an objective party state that you have expressed feelings of depression for the past three weeks, your step count has decreased, your heart rate has not shown its usual activity for days, and you exhibit signs of social isolation.
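What might that alarm look like in practice? A minimal sketch, assuming daily self-reported mood scores, step counts, and logged social interactions are already available; the `DayRecord` and `should_alert` names, the three-week window, and every threshold here are invented for illustration, not clinical cutoffs.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class DayRecord:
    mood: int           # self-reported, 1 (low) to 10 (high)
    steps: int          # from a phone or wearable
    social_events: int  # interactions logged that day

def should_alert(history: list[DayRecord], window: int = 21) -> bool:
    """Alert when the most recent three-week window shows sustained low mood,
    reduced movement, and social withdrawal compared with the window before it.
    Every threshold below is a placeholder, not a clinical cutoff."""
    if len(history) < 2 * window:
        return False  # not enough data to compare two windows
    recent = history[-window:]
    prior = history[-2 * window:-window]
    low_mood = mean(d.mood for d in recent) <= 4
    fewer_steps = mean(d.steps for d in recent) < 0.7 * mean(d.steps for d in prior)
    withdrawn = mean(d.social_events for d in recent) < 0.5 * mean(d.social_events for d in prior)
    return low_mood and fewer_steps and withdrawn
```

The thresholds would have to be tuned, or learned, per person, which is exactly the “optimal times” problem described above.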
Does it seem disingenuous for personal data to be interpreted and presented by a machine?
Hard questions, especially when AI hallucinates. The other day, a chatbot counted the number of words in a document and got it wrong. Not by a handful, but by a couple thousand.
There is a need for diligence, for streamlining, for creating space for resources that maybe weren’t known before.
It all becomes very important — the details, I mean.

