• givesomefucks@lemmy.world
    2 days ago

    The suit says that Setzer repeatedly expressed thoughts about suicide to the bot. The chatbot asked him if he had devised a plan for killing himself. Setzer admitted that he had but that he did not know if it would succeed or cause him great pain. The chatbot allegedly told him: “That’s not a reason not to go through with it.”

    Yeah…

    They should be liable.