Are robots an option when seeking help?


There’s an increasing amount of chatter these days about which careers may become obsolete over the next decade or two as a result of AI. Medical students considering radiology, for example, may rethink whether they can compete with AI at reading an image and making a diagnosis. Writers, too, may worry about editors outsourcing their article ideas to robots that can compose a well-laid-out, thoughtful and engaging piece within minutes, sometimes seconds. And no, this article is not being written by AI.

So, when CBC Radio contacted me a couple of weeks ago for my opinion on a New York Times article about a site (therabot.com) offering therapy to the public, I was interested in (and a little afraid of) doing a deeper dive. What if my career, and that of my colleagues, were moving toward extinction?

Here’s a link to the article, in case you’re interested:

https://www.nytimes.com/2025/04/15/health/ai-therapist-mental-health.html

I was asked:

What stood out to you about this clinical trial?

I responded by referencing the part of the article that stood out to me the most. It said that
“during the eight-week trial, participants with depression saw a 51 percent reduction in symptoms after messaging Therabot for several weeks,” and that “many participants who met the criteria for moderate anxiety at the start of the trial saw their anxiety downgraded to ‘mild,’ and some with mild anxiety fell below the clinical threshold for diagnosis.”

I found these results to be very interesting.

Later that week, I spoke to a medical doctor who mentioned that after she prescribes anti-anxiety or antidepressant medication to patients experiencing one or both conditions, they often return with reports of feeling better. I was curious about this, knowing that these kinds of medications often take more than a couple of weeks to show positive results. I asked if this was perhaps a placebo effect. Her belief, however, is that simply being validated, acknowledged and taken seriously when sharing their mental health concerns, and seeing her concern and support when she prescribes medication, often leaves patients feeling better.

I found this interesting, too.

I was asked: Do you think that an AI chatbot could actually help people manage anxiety or depression?

The short answer is yes, I replied. Many therapists use a CBT (cognitive behavioural therapy) approach when helping clients with anxiety and depression, and it would be quite “easy” for an AI model to deliver the psychoeducational piece about how CBT helps with anxiety and depression, and then to follow up with links to worksheets. So it could help, I said, especially when the therapeutic approach is more prescriptive, as it is with CBT.

Having said this, I believe that no matter how well trained an AI bot is to respond as a therapist might, it cannot pick up on nuance in a person’s tone (especially via text) or body language, and it would not be able to read between the lines.

I was then asked: How important is the therapist-client bond in traditional therapy, and do you think it’s possible to feel that with AI?

Vitally important, I said. Even if a therapist is highly trained and very intelligent, if a client doesn’t feel a connection, then the outcome will not be as positive.

That being said, I do think that people may feel a connection with AI because the responses are so empathic and real (the robot is trained to mimic the way a therapist responds), so they may quickly forget that they are not talking to another human being.

And feeling heard, validated and acknowledged establishes a connection.

When preparing for my interview, I reflected on my text conversation with ChatGPT just days earlier about my sick cat. The responses were incredible. After helping to assess the problem, it asked, “Would you like help preparing for a vet visit?” I responded with “that’s ok, but thanks.”

And its response? “Of course, I’m here if you need anything later. I really hope your cat feels better soon. You’re clearly taking good care of them,” and then in another comment, “wishing your cat a gentle and speedy recovery.”

Wow: it helped to assess, it offered to help, it was encouraging, supportive and empathic. I felt a connection.

What are the risks of people relying on AI instead of seeing a human therapist? the interviewer wanted to know.

I explained that no matter how evolved AI is (and clearly, even from my example with ChatGPT, we can see how “intelligent” it is), I believe there is no substitute for human connection and support. When I see my clients’ facial expressions, a tear running down a cheek, a foot tapping, I can comment on what I am seeing. When I hear a long pause or a break in their voice, I can pick up on that too.

Also, I think it’s dangerous for people to turn to a therabot 24/7, because they may develop an over-reliance or dependence on it, rather than getting help once a week, for example, and then trying things out on their own.

And what about regulation? I asked. Most people seek therapists who are governed by a regulatory body. These bodies exist to protect the public. But who is protecting the public on sites like these? A robot is not licensed. And as far as I know, there are no regulatory bodies in place for digital platforms such as this. Maybe that’s still to come.

But can tools like this help fill the gap where mental health care isn’t accessible? I was asked.

I agreed that yes, there is a partial place for AI, perhaps to treat specific conditions while people are waiting for a therapist or cannot afford one.

One pro is that it removes the biases that human therapists must work hard to set aside.

But what about confidentiality? If you’re disclosing personal details, who knows where this information is going or how it is being stored, even though the therabot.com site says that the information is safely encrypted?

The bottom line is that although there may be some room for AI in the mental health field, it’s important to be cautious when using it.