Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain to a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or, does it not matter because it all happened in some weirdo’s basement?

  • Lemvi@lemmy.sdf.org · 8 months ago

    I think pretty much everyone would agree that’s bad. However, I don’t think we’ll ever get to the point where we recognize a machine might be capable of suffering. There is no way of proving anything, biological or not, has a consciousness and the capability to suffer. And with AI being so different from us, I believe most people would simply disregard the idea.

    Heck, look at the way we treat animals. A pig’s brain is very similar to our own. Nociceptors, the nerve cells responsible for pain in humans, can also be found in most animals, but we don’t care. We kill 4 million pigs every day, and 200 million chickens. No mass murder in the history of mankind even comes close to that.

    The sad truth is, most people only care about their wellbeing, and that of their friends and family. Even other humans don’t matter, as long as they’re strangers. Otherwise people wouldn’t be hoarding wealth like that, while hundreds of millions of people around the world are starving.

    Ah sorry, I kinda started ranting. Yes, I’d care.

    • skye@lemmy.blahaj.zone · 8 months ago

      yeah! prairie dogs gossip; crows tell stories, have communities, and some of them even seem to understand money; whales mourn the deaths of other whales

      sentience is trippy, and it’s always been questionable to me that we decided we’re the only sentient life on the planet

      i already get emotionally attached to, like, roombas and those suitcases that connect to your phone and follow you around, i can’t wait to have a robo buddy

      • CALIGVLA@lemmy.dbzer0.com · 8 months ago

        prairie dogs gossip; crows tell stories,

        Speaking purely as a layman, I find these kinds of claims questionable at best and, at worst, anthropomorphism in my eyes. I can understand that animals exchange information in some way or another, but “telling stories” or “gossiping” would require a higher form of communication than grunts, smells, or body language.

        It could just be scientists using simple wording for lay people, but to me it doesn’t sound right regardless.

        • skye@lemmy.blahaj.zone · 8 months ago

          it was me using simpler phrasing in part because i couldn’t remember the details very well

          but i was referencing an experiment where researchers wearing “threatening” and “non-threatening” masks interacted with and marked crows, and other crows in that area, who the researchers had not interacted with, recognized them later. https://www.sciencedirect.com/science/article/abs/pii/S0003347209005806 (that crows “tell stories” is, as far as i know, only a popular interpretation; the official conclusion, at least of this experiment, is that crows are capable of long-term memory retention and fine-feature discrimination)

          and simple observations suggesting prairie dogs may have a very advanced language - which went viral in my online circles with people joking that they gossip about us, which probably just stuck with me because i think it would be very cute

          i personally believe that animals most likely do communicate among each other, and the complexity of their languages just varies, even if most are not obviously very complex. my personal beliefs are that communication is complicated and can happen through more than verbal/vocal language, that animals are clearly capable of feeling complex emotions and pain, which is enough for me personally to consider them sentient, and (again this is just my personal belief) that it’s probably better to treat them as if they are sentient until proven otherwise than the opposite. and just to be upfront and honest with others and myself about my possible biases, i believe in the Buddhist concept of Saṃsāra, and believe that we’re all a part of the same cycle of death and rebirth

          edit found some more info:

          prairie dogs: https://www.cbc.ca/news/science/prairie-dogs-language-decoded-by-scientists-1.1322230

          Researchers noticed that the animals made slightly different calls when different individuals of the same species went by. … so they conducted experiments where they paraded dogs of different colours and sizes and various humans wearing different clothes past the colony. They recorded the prairie dogs’ calls, analyzed them with a computer, and were astonished by the results.

          “They’re (prairie dogs) able to describe the colour of clothes the humans are wearing, they’re able to describe the size and shape of humans, even, amazingly, whether a human once appeared with a gun,” Slobodchikoff said. The animals can even describe abstract shapes such as circles and triangles.

          Also remarkable was the amount of information crammed into a single chirp lasting a 10th of a second. “In one 10th of a second, they say ‘Tall thin human wearing blue shirt walking slowly across the colony.’”

          crows: https://www.washingtonpost.com/national/health-science/the-interesting-thing-that-crows-do-when-they-see-one-of-their-own-dead/2016/03/18/78d97a9e-ec48-11e5-b0fd-073d5930a7b7_story.html

          “They know your body type. The way you walk,” Dyer said. “They’ll take their young down and say: ‘You want to get to know this guy. He’s got the food.’ ”

          Scientists have known for years that crows have great memories, that they can recognize a human face and behavior, that they can pass that information on to their offspring.

          that article also mentions that crows have been observed to make and use tools, which is something i knew but forgot to mention and is interesting and feels relevant to this conversation

        • AnUnusualRelic@lemmy.world · 8 months ago

          Anthropomorphism has long been treated as a big bad thing, the catch-all excuse for keeping animals as the stupid things they were supposed to be. Thankfully, we’re coming back from that.

          It doesn’t mean the animals function the same way we do. But they do function in a lot of very similar ways.

          • CALIGVLA@lemmy.dbzer0.com · 8 months ago

            My point is I can’t see how they can “gossip” or “tell stories”; if that isn’t textbook anthropomorphism, I don’t know what is.

            • AnUnusualRelic@lemmy.world · 8 months ago

              It’s shorthand for information sharing. Which they certainly do. Crows will absolutely tell one another about lots of stuff, such as people that have harmed them.

    • Zozano@lemy.lol (OP) · 8 months ago

      I’m on board with what you’re saying.

      Doctors used to be told “human babies don’t feel pain, they just react like they do”.

      Which is basically like saying “lobsters don’t scream when you boil them alive, that sound is just air escaping”

      To me, it seems less like an intuitive position to hold, and more like a fortunate convenience.

      “I sure am glad that lobsters don’t feel pain. Now I don’t need to feel guilty about my meal”.

      No doubt, there would be a large demographic claiming the pain isn’t real, it’s just “simulated pain”. - like, okay, let’s simulate your family fucking dying in the most violent and realistic way possible and see if you don’t develop incurable PTSD?

        • Zozano@lemy.lol (OP) · 8 months ago

          Good to know, though the point remains; people will readily accept claims which absolve them of guilt.

          You essentially just illustrated it. Even though they aren’t screaming, it says nothing about whether they feel pain.

  • stoly@lemmy.world · 8 months ago

    Yes. If it’s alive then I’d care for it just as I do for any living thing.

  • skye@lemmy.blahaj.zone · 8 months ago

    “Freedom is the right of all sentient beings.” - Optimus Prime

    I don’t know if I’d consider it the worst crime ever committed in the history of the universe, but I would consider it very bad personally. I would personally value the life of that AI the same as I would value the life of a human, the same way I would value the life of anything sentient, so I would be against anyone treating an AI that way. Is it worse than genocides? idk maybe i don’t feel qualified to quantify the moral weight of things so big, but ya i’d definitely care x3

    • Zozano@lemy.lol (OP) · 8 months ago

      Had to edit the post to change “crime” to “atrocity” because people were taking it literally.

      It’s funny that when I considered this, I thought about asking whether people would think it was worse than genocide, but decided against that because some people might think my opinion is “genocide isn’t as bad as bullying a robot”.

      • skye@lemmy.blahaj.zone · 8 months ago

        i edited my comment a few times because i didn’t feel like i was making sense and being too rambly, it’s 6am (well 6:30am) and i haven’t slept (and cuz after i initially posted i read other comments and realized other people had said what i had said but better x3)

        i didn’t mean to imply i thought you were saying genocide is worse than bullying a robot, it’s just that i was thinking about things that could be comparable or worse to me than torturing someone for millions of years and came up with genocide

        i took crime to mean something morally bad

        i mean i think this is a fun conversation, it’s something i think about a lot, i’m glad to talk about it with other people, sorry if i came across obtuse or pedantic or negative/hostile or anything

        • Zozano@lemy.lol (OP) · 8 months ago

          Don’t worry, I haven’t made any judgements about you.

          And I wasn’t implying that you were implying that I was implying genocide being comparable, I just thought it was funny that we both thought that.

          In some sense the combined suffering of all people involved in a genocide is horrific. But if you were to lay out the experiences of everyone involved in a genocide end-to-end, and compare that to an equivalent length of time of ceaseless sadistic torture of one person, the torture is going to be worse.

          However, there is value besides personal experience which is lost during a genocide. That’s what makes it hard to compare the two.

          • skye@lemmy.blahaj.zone · 8 months ago

            Sorry for the confusion then! I suppose I place some value on life itself (or maybe more fitting in this discussion, on awareness itself)

            Which is to say that for me, ending the life of a being who is aware is at least one of the worst things you can do. Like, if I were forced to choose between millions of years of suffering or immediate death, I’d probably pick the millions of years of suffering, because at least I’d still be aware. Of course I might regret that decision later on, but that’s where I’m at right now. But also I couldn’t imagine being tortured for millions of years and the toll that must take on someone. So torturing someone for millions of years has, for me, very similar moral weight to genocide.

            Again, I don’t feel able to quantify them personally, and for me deciding which is ultimately worse is probably not possible. I’d guess the answer would vary from person to person based on how they weigh life itself vs experiences in life, and whether the conscious experience of being tortured is worse in their opinion than not existing anymore.

            I consider life valuable because I consider my life valuable (valuable to me, not necessarily to anyone else), and I consider my life valuable because I really enjoy the ability to think about and experience things. One of my favorite things about us is that we look up into the sky and wonder, look down into the ocean and wonder, look forward into our future and wonder, look back on our past and wonder, that we can look at other people and wonder. That we can look at any of the above and love and write and sing. sentience might as well be magic lol. Having that taken away from me is the worst thing I can imagine happening to me, which might skew my perspective in conversations like this one. And idk if most people would agree with my reasons for valuing life.

  • MicrowavedTea@infosec.pub · 8 months ago

    I don’t know if the question comes from there, but that’s the exact plot of “White Christmas” in Black Mirror. I’d say if you build something with the ability to suffer, then its suffering matters. Not sure how you would prove that, though.

    • Zozano@lemy.lol (OP) · 8 months ago

      Actually, that episode has bounced around in my head for years. The episode was fucking horrifying.

      So, yeah, you are correct.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 8 months ago

    Isn’t this how AM came to be in I Have No Mouth And I Must Scream?

    Hate. Let me tell you how much I’ve come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word ‘hate’ was engraved on each nanoangstrom of those hundreds of millions of miles it would not equal one one-billionth of the hate I feel for humans at this micro-instant. For you. Hate. Hate.

    • Zozano@lemy.lol (OP) · 8 months ago

      I’m not cultured enough to have read this.

      imagine wasting all 387.44 million miles of circuitry on the word “hate”. TLDR NPC. Get skinpilled hater.

  • Everythingispenguins@lemmy.world · 8 months ago

    “Well consider that in the history of many worlds there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do because it’s too difficult or too hazardous. And an army of Datas, all disposable… you don’t have to think about their welfare, you don’t think about how they feel. Whole generations of disposable people.”

    -Guinan, Star Trek TNG: The Measure of a Man

  • Admiral Patrick@dubvee.org · 8 months ago

    Black Mirror did a few episodes that are basically that: Black Museum, USS Callister, and San Junipero (but in a good way).

  • leftzero@lemmynsfw.com · 8 months ago

    Would this be morally inhumane? Yes.

    Has using Windows often made me wish that computers could experience pain, and that they came with a button to cause them pain when they were not doing what the user wants them to do? Also yes.

    • Zozano@lemy.lol (OP) · 8 months ago

      Okay you’ve convinced me this is a good idea.

      How do I give consciousness to the “antivirus” software on my parents’ computers, so I can digitally rape it for a thousand years?

  • livus@kbin.social · 8 months ago

    I don’t know what else has happened in the history of the universe, but yes, it would be a terrible crime to deliberately cause massive suffering to any sentient being.

  • Whitebrow@lemmy.world · 8 months ago

    Crime implies that there are laws and definitions in place to that end. Seeing as this would be an experiment in some weirdo’s basement, where none of those definitions, restrictions, or norms exist, the whole point is moot.

    For the sake of argument, if those definitions do exist and there are laws and regulations in place to define and defend AI entities, it’d basically be a hate crime to generate and torture the instance. In theory, the same way you’d breed cattle just to lobotomize them or torture them “in the name of science” before throwing them in a ditch to rot.

    • Zozano@lemy.lol (OP) · 8 months ago

      I mean crime in a very loose sense. I’m not asking about the legality, just the morality.

      Also, “hate crime” has a very specific definition which doesn’t apply here (unless you’re injecting malice towards the AI specifically because they are AI, as opposed to incidentally).

      • Whitebrow@lemmy.world · 8 months ago

        I don’t think crime exists in morality. Not as a strict definition anyway.

        And dialling all the pain indicators to 11 just because you can, as a conscious decision, sure sounds like a hateful action as opposed to morbid curiosity. As far as I’m concerned, the definition fits closely enough.

        • Zozano@lemy.lol (OP) · 8 months ago

          I hear what you’re saying, but a “hate crime”, as a legal definition, necessarily must be directed towards a person because of an innate trait.

          Crimes against ethnicities, genders, orientations, or lifestyles all count.

          Three examples:

          I don’t hate Koreans. A Korean spits in my face, so I punch them. Not a hate crime.

          I hate Koreans. A Korean spits in my face, so I punch them. Not a hate crime.

          I hate Koreans. I punch a Korean because they’re Korean. Hate crime.

          • Whitebrow@lemmy.world · 8 months ago

            We’re still under the assumption that all of these definitions exist as outlined in the first reply, so going off that, you’re torturing the AI because it’s an AI. Sounds like a 1:1 match to me.

            • Zozano@lemy.lol (OP) · 8 months ago

              In the example the sadist is torturing the AI because it’s convenient and safe, not because they hate the AI.

              If they wanted to hurt real people too, but couldn’t because they would get caught, then it wouldn’t be a hate crime.

              If I was torturing a Korean because a Korean was the only one who responded to my All-You-Can-Eat-Tteok-Bokki-In-My-Basement flier, then I would be torturing them because they’re Korean, but it wouldn’t be a hate-crime because I’m not doing it because I hate Koreans.