
Rebecca Romito ’26 with the AI therapy model she experimented with. PHOTO: PRESTON THOMAS ’28/THE HAWK
“Hi there! How can I help you? What happened today?” my therapist asked.
“I have been feeling really stressed with juggling my school work,” I replied.
“I hear you,” they reassured me. “Schoolwork can often feel like a heavy load, especially with everything else going on. Can you share what specifically about your schoolwork is causing you stress?”
I paused for a moment, took a breath and told my new therapist, an artificial intelligence bot, what was going on.
AI therapy is a form of virtual mental health treatment in which artificial intelligence provides mental health support, and a growing number of people are considering the option given the difficulty and cost of accessing traditional therapists. According to YouGov, in May 2024, 34% of Americans said they would be comfortable sharing their mental health concerns with an AI chatbot rather than with a human therapist.
Nearly 50% of individuals who could benefit from therapeutic services are unable to access them. AI therapy has its benefits: It’s accessible and cost-effective. While some platforms charge a monthly fee to use their chatbots, plenty of sites are free to use. Furthermore, AI therapy can be accessed instantly and is available 24/7.
To better understand the considerations of AI therapy, I used Earkick, a personal AI therapist that was free at the time, for a 45-minute session, the length of a typical in-person appointment. Most of Earkick is no longer free, and the app allows only limited use before requiring a paid plan.
Using the AI therapist was simple. I would type how I was feeling into the chat, and the chatbot would respond with guidance, follow-up questions, support and strategies.
In September 2024, Nicholas Fearn, a freelance journalist from Neath, United Kingdom, spent a month with an AI therapist. I asked Fearn what that was like.
“Due to being autistic, I struggle to speak to people in real life,” Fearn said. “So, when I had the opportunity to review Wysa, I thought it was worth a try.”
Wysa is an AI-powered personal coach that offers accessible mental health support.
Fearn noted the often long waiting lists for mental health therapy through the U.K.’s National Health Service. He said private therapy was out of his budget.
“The AI chatbot did a great job of understanding my feelings and providing actionable advice,” Fearn said. “For instance, ahead of my nephew’s christening, I was nervous about seeing loads of old and new faces. I asked Wysa for advice on overcoming social anxiety, and it advised strategies like saying a simple hello when greeting someone and asking how they are.”
While working with Earkick, I quickly realized how easy it was to open up to the chatbot, even if only at a surface level. The bot did not judge me, and it was quick and easy to access. I was able to fit the session into my busy student schedule at a time that worked for me.
Torrey Creed, Ph.D., associate professor of psychiatry at the Hospital of the University of Pennsylvania, focuses her work on supporting human therapists.
“Lots of the work that I do is designed to support human therapists in doing the best possible job they can,” Creed said.
Creed noted she is hesitant about the idea of AI therapists.
“People sometimes think that because it’s a computer, it’s infallible,” Creed said. “That’s not true, because it takes in so many of the good things about people, but also the shortcomings that people carry. It’s really important that people use a grain of salt when they’re looking at AI.”
I recognized other limitations in my experience with the Earkick chatbot. It was clear that human interaction was missing. As the session continued, I withheld more personal information because no trust had been built.
Leila Hodzic, a licensed clinical social worker based in New Jersey, also isn’t a fan of AI for therapy.
“Society is already becoming too socially withdrawn,” Hodzic said. “Seeking this human interaction from a computer promotes even more isolation.”
Hodzic also noted that AI isn’t a good substitute for human connection.
“From a clinical perspective, there’s no way to build a relationship like connecting on an emotional level with another person,” Hodzic said. “It can’t be replicated with a computer … A computer can analyze and spit out some sort of a response to you, but it’s not feeling anything. It’s not also providing anything to you about the computer’s personality or life or experiences with which you could connect.”
Creed emphasized the lack of connection between a human being and a computer as well.
“It’s an algorithm, it’s math, it’s not a human being, and there are some fundamental important things about the connection between two human beings,” Creed said.
A Stanford study found that AI therapy chatbots may not only be less effective than human therapists but may also reinforce harmful stigma and produce dangerous responses. The American Psychological Association has urged the Federal Trade Commission and legislators to put safeguards in place as users increasingly turn to apps for mental health support.
“The clinician is responsible for calling police, calling an ambulance, calling the person’s crisis contact,” Hodzic said. “A computer is not going to do those things. A computer is not going to be able to sense when someone is in danger and follow a crisis plan protocol the way that a human being is.”
I didn’t like AI therapy. It lacked everything that makes a therapy session meaningful: personal human connection and the feeling of being safe and heard. I wouldn’t recommend AI therapy to anyone. When I struggle with stress and anxiety, I’m going to see a human therapist who can properly address my needs in a safe and meaningful manner.
Members of the St. Joe’s community seeking support are encouraged to contact the following resources:
Counseling and Psychological Services (CAPS), 610-660-1090
Campus Ministry, 610-660-1030
The Office of Student Outreach & Support, 610-660-1149
The Jesuit community, 610-660-1400
Employee Assistance Program, 866-799-2728