• orioler25@lemmy.world
    20 hours ago

    Well, I’m sure someone who uses “clanker” wouldn’t need therapy anyway.

    Seriously though, I doubt the claims about the efficacy of AI therapists and worry about the health implications, but we can’t just ignore the fact that there are people who use it, which means there’s something about it that makes it accessible or preferable to a human therapist.

    If you’ve ever had to get a psychotherapist, you know that it is prohibitively expensive for a large number of people, and that a human therapist may not actually be capable of treating you because of personal incompatibility, which often results in retraumatization for patients seeking therapy for particularly traumatic or sensitive issues. Since much of the value in therapy is learning management strategies that, while not standardized, are often consistent across different practitioners, those strategies do not necessarily need to come from a therapist for someone to learn what they are (even if practicing them does need one).

    I think if there is a need for it, that need is a consequence of the deeply dysfunctional, exploitative, and isolating system we live under, and I don’t think I’d ever accept it as a genuine alternative to human therapists. But, we can’t dismiss it out of hand if there are people who say it is useful for them and when we can’t maintain a system that can guarantee them access to treatment.

    • SpacetimeMachine@lemmy.world
      15 hours ago

      The problem with people saying they are useful is that it is nearly impossible to tell if that is actually true. If someone is mentally unhealthy, there are many ways to make them feel better, but not all of those will actually help the underlying issue; some could even make it worse. A lot of people seem to equate happiness and mental health, when it is very possible to be happy and mentally ill at the same time.

      This is especially worrisome with AI because it is literally designed to say what it “thinks” you want to hear. It has no real training in any of the disciplines a psychologist or therapist needs to be effective. You can’t just apply a cut-and-paste answer to a patient; you need to understand their personality, their history, and a multitude of other things to be a really effective therapist. The answer to this issue is increasing access to real mental health treatment, not giving snake oil to millions of people.

      • orioler25@lemmy.world
        13 hours ago

        Yes, I don’t get why so many of you appear not to understand that these problems coexist with the reality that people have been using it anyway. As I alluded to above when I said that a psychotherapist would be required to actually learn to practice those strategies, and when I expressed my disagreement with AI therapists on a treatment basis multiple times, there is no replacing a human therapist, nor any reasonable basis to even call AI therapists “therapists.”

        As I said, again multiple times: since people use it anyway and prefer it to nothing or to a bad therapist, we have to take its merits seriously and identify why. Reality does not care that you find it dumb and icky. I would love it if everything I know to be dumb and icky simply stopped being a problem because I found it dumb and icky.

        All of these people are clearly not just stupid, which is what you and the person I responded to seem to think, and which is just foolish. No, everyone else is not just dumber than you. There is clearly a material reason why people use these things and why some even say they want to. How many people do you know who do not go to therapy because they can’t afford it, or because they’ve been traumatized by it, or because they could get fucking institutionalized for it? Have you thought about, perhaps, the people as people?

        I swear to god, some of you see a long comment from someone you don’t like the sound of and you just make up whatever it says based on the shit you imagine people who disagree with you say. And they say reading levels are down, pshaw.

        • lifeinlarkhall@lemmy.world
          9 hours ago

          I think you’re making some interesting observations. I definitely agree that the easy answer is to just dismiss people who use AI for therapy, friendship, or relationships as stupid.

          You’re right that it says something about the system we live in, and I’d extend that to society in general. We have a society that criticizes people for answering “how are you” honestly, that doesn’t have time for each other, that uses terms like “trauma dumping” - so personally, I can see why some people are turning to machines, whether for therapy or connection. It’s really bloody sad and it’s not a good solution, but I can see the WHY behind it - which is what I think you’re also getting at.

          We do need to listen to why people turn to these services and figure out what people aren’t finding in human connection that they are finding, or think they are finding, in machines. I don’t buy that an individual’s intelligence has much to do with why people turn to AI.