
The AI Therapist Issue!

  • Writer: Tom Bender
  • Sep 2
  • 2 min read

Updated: Sep 3


“Can AI replace a therapist?” Recent studies deliver a compelling answer: “Not yet, perhaps never.”


The study tested multiple AI large language models (LLMs) and therapy bots, prompting them with symptoms associated with suicidal ideation, hallucinations, delusions, mania, and obsessive-compulsive behavior. Researchers used clinical information drawn from real therapy transcripts.


The results were concerning:

  • AI responses stigmatized individuals with mental health conditions.

  • AI responses were frequently inappropriate and even dangerous. The models struggled to respond safely to scenarios involving suicidal ideation or delusional beliefs.

  • Newer AI models did not improve safety. In fact, some still gave dangerous and inappropriate answers.

  • “Therapy bots” fared even worse, producing inappropriate responses that underscore why AI cannot safely replace mental health providers.


Human vs. AI in Therapy: Therapy is not just conversation; it is a human relationship built on trust, empathy, confidentiality, and clinical expertise. AI, by contrast, operates in an unregulated space that lacks clinical safeguards and oversight.


Several reasons why AI is not a replacement for a human therapist:

  • AI is not designed to push back. Effective therapy and growth require gently challenging a client's defenses and highlighting negative patterns. AI's tendency to agree and validate can instead reinforce those patterns and undermine the therapeutic process. It can even be dangerous when AI validates delusions or provides information that could aid self-harm.

  • The 24/7 availability of AI can feed obsessional thinking and negative rationalizations. While accessibility and scalability are appealing features, overuse can worsen and even reinforce obsessive tendencies.

  • Overreliance on AI may delay mental health care. People may develop an emotional dependence or a false sense of sufficient support from AI bots, bypassing or avoiding professional help when it is most needed.

  • Interacting with an AI can simulate a relationship, which is not the same as being in a relationship with a human therapist. Therapy, especially relational therapy, helps people practice navigating what it is like to be in a relationship with another person, something AI cannot provide.


How Therapists and AI Can Work Together: Despite these serious limitations, AI can still be helpful in supportive roles when paired with human supervision.


AI may be suited to help provide:

  • Administrative support: drafting notes and responses, summarizing sessions, and helping therapists track treatment goals.

  • Delivering structured, evidence-based information to clients under a human therapist's supervision.

  • Diagnostic support: flagging patterns in data to assist human therapists.


The effectiveness of therapy lies in the human relationship and experienced clinical care. AI can validate individuals, provide explanations, and be available around the clock, but these very conveniences are part of what keeps it from being safe as an autonomous therapist.


The goal should be to integrate AI as a tool for the therapist that prioritizes patient safety and increases availability of effective treatment.


*Source Material: Psychology Today



 
 