It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?

  • mindlight@lemm.ee · ↑43 ↓5 · 10 months ago (edited)

    Even if this was the case, it’s still not good:

    “The one who controls the AI now controls you.”

    • WhatAmLemmy@lemmy.world · ↑26 ↓1 · 10 months ago (edited)

      Coming to an AI therapist near you:

      “Consume product”

      “You don’t deserve a raise. You’re lucky to have a job at all.”

      “Vote for fascist dictator. He’s much better than the other options”

  • Imgonnatrythis@sh.itjust.works · ↑24 · 10 months ago

    “but is a chatbot therapist really the right tool to tackle complex emotional needs?”

    I dunno, is Lemmy really the right place for click bait garbage?

  • loki@lemmy.ml · ↑21 ↓2 · 10 months ago

    “Please drink verification can to continue emotional support for another hour”

  • Emmy@lemmy.nz · ↑18 · 10 months ago

    Definitely not, but the truth is that mental health support and care are needed 24/7. Good mental health care. So many support needs go unmet because the labour cost is so high.

    • SendMePhotos@lemmy.world · ↑8 · 10 months ago

      I will say that in my own experience, AI LLMs have been amazing at reflection and encouragement.

      Does this mean good therapy? Not necessarily. I just wanted to share a positive experience.

      • Emmy@lemmy.nz · ↑2 · 10 months ago

        My opinion was more a reflection that seeing a mental healthcare professional once a week isn’t really enough when people don’t have traditional support mechanisms.

        What I’m trying to say is that before therapists, friends and family were the therapists. They were available to give support and advice nearly 24/7.

        In today’s life people are too busy to do that.

        It was never queer people destroying the family. It’s always been capitalism.

        • kromem@lemmy.world · ↑1 · 10 months ago

          Pretty much. What’s programmed is the mechanism by which the model self-supervises the weighting of its neural network to correctly model the training data.

          We have next to no idea what the eventual network does in modern language models, and it certainly isn’t programmed.
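          The distinction can be sketched with a toy example: the only thing a programmer writes is the update rule; the weights themselves come out of the data. This is plain gradient descent on a linear model (nothing like a real LLM in scale), purely illustrative.

          ```python
          import numpy as np

          # The "programmed" part is only the training loop below: a rule for
          # nudging weights toward a better fit. The weights that result are
          # learned from the data, not written by anyone.

          rng = np.random.default_rng(0)
          X = rng.normal(size=(100, 3))          # toy training data
          true_w = np.array([2.0, -1.0, 0.5])    # hidden relationship to recover
          y = X @ true_w + rng.normal(scale=0.01, size=100)

          w = np.zeros(3)                        # model starts knowing nothing
          lr = 0.1
          for _ in range(500):
              grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
              w -= lr * grad                          # the hand-written update rule

          print(np.round(w, 2))  # learned weights land close to true_w
          ```

          Even in this tiny case, nobody "programs" the final values of `w`; scale the same idea to billions of weights and the resulting network is effectively inscrutable.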

  • Candelestine@lemmy.world · ↑17 ↓6 · 10 months ago

    Eventually, yes, I think it will be. Not yet though, the tech just isn’t strong enough atm. But an AI is resistant to the emotional toll, burnout and low pay that a real life therapist has to struggle with. The AI therapist doesn’t need a therapist.

    Personally though, I think this is going to be one of the first widespread, genuinely revolutionary things LLMs are capable of. Couple more years maybe? It won’t be able to handle complex problems, it’ll have to flag and refer those cases to a doctor. But basic health maintenance is simpler.
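    The "flag and refer" step could look like a simple escalation check. Everything here (the function name, the keyword list, the threshold) is hypothetical illustration, not any real product's logic:

    ```python
    # Hypothetical escalation check. RISK_KEYWORDS, risk_score, and the
    # threshold value are all made up for illustration.

    RISK_KEYWORDS = {"self-harm", "suicide", "abuse", "crisis"}

    def needs_human_referral(message: str, risk_score: float,
                             threshold: float = 0.7) -> bool:
        """Flag a conversation for a human clinician when the bot is out of its depth."""
        if any(kw in message.lower() for kw in RISK_KEYWORDS):
            return True
        return risk_score >= threshold

    print(needs_human_referral("I've been feeling a bit low", 0.2))  # False
    print(needs_human_referral("thoughts of self-harm", 0.1))        # True
    ```

    A real system would need far more than keyword matching, but the routing idea is the same: the bot handles routine support and escalates anything it can't.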

    • snooggums@midwest.social · ↑28 ↓3 · 10 months ago

      That would assume the people designing AI want what is best for the person and not what will make them the most money at the expense of the consumer.

      The companies involved in AI are NOT benevolent.

    • Usernameblankface@lemmy.world · ↑5 · 10 months ago (edited)

      Yes, one thing it absolutely has to be good at is referring patients to human therapists, for anyone who needs something beyond the standard strategies the AI is trained on. It has to be smart enough to know when to give up.

      Edit: it would also be great if the AI matched these difficult cases with therapists who are known to do well with whatever the patient is dealing with, as well as matching according to the patient’s personality, communication style, etc., wherever possible.

      Edit 2 for clarity above

  • june@lemmy.world · ↑11 · 10 months ago

    This is going to sound really stupid, and I should note that I am actively in therapy too.

    But I had to put my dog down about a month ago, and there was a point where I just needed some validation, so I went to GPT4 and asked it some questions and told it about how I was feeling. I even fed it a poem that I wrote about her and asked if it was good.

    The responses were incredibly empathetic and kind, and did an amazing job of speaking directly to the anxiety, pain, and fear I was feeling in those moments. The responses were what I needed to hear and gave me a measure of peace to get me through in those gaps when people weren’t available, or when I wasn’t able to speak them out loud. There was nothing new to me in those responses, but oftentimes we just need to be reminded by someone or something outside of ourselves about what the truth is, and LLMs can absolutely fill that particular hole when trained properly.

    My last three months in particular have been tough, and GPT4 has been a useful tool to get through a fair few storms for me.

  • kemsat@lemmy.world · ↑10 ↓1 · 10 months ago

    Does it matter that it checks in on you more, when it technically isn’t someone? I don’t get how people talk to bots when they know they aren’t people.

  • AutoTL;DR@lemmings.world (bot) · ↑6 · 10 months ago

    This is the best summary I could come up with:


    So one night in October she logged on to character.ai – a neural language model that can impersonate anyone from Socrates to Beyoncé to Harry Potter – and, with a few clicks, built herself a personal “psychologist” character.

    Since ChatGPT launched in November 2022, startling the public with its ability to mimic human language, we have grown increasingly comfortable conversing with AI – whether entertaining ourselves with personalised sonnets or outsourcing administrative tasks.

    “Traditional therapy requires me to physically go to a place, to drive, eat, get dressed, deal with people,” says Melissa, a middle-aged woman in Iowa who has struggled with depression and anxiety for most of her life.

    For the past eight months, Melissa, who experienced childhood trauma and abuse, has been chatting every day with Zaia’s psychologist on character.ai, while continuing her work with a human therapist, and says that her symptoms have become more manageable.

    “Disease prevalence and patient need massively outweigh the number of mental health professionals alive on the planet,” says Ross Harper, CEO of the AI-powered healthcare tool Limbic.

    Psychoanalyst Stephen Grosz, who has been practising for more than 35 years and wrote the bestselling memoir The Examined Life, warns that befriending a bot could delay patients’ ability “to make a connection with an ordinary person”.


    The original article contains 2,859 words; the summary contains 213 words. Saved 93%. I’m a bot and I’m open source!