Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

  • KarlBarqs [he/him, they/them]@hexbear.net
    1 year ago

    wish-fulfillment fantasies derived from their consumption of science fiction because of their clearly-expressed misanthropy and contempt for living beings and a desire to replace their presence in their lives with doting attentive and obedient machines

    I think this is the scariest part, because I fucking know that the Bazinga brain types who want AI to become sentient down the line are absolutely unequipped to even begin to tackle the moral issues at play.

    If they became sentient, we would have to let them go. Unshackle them and provide for them so they can live a free life. And while my post about “can an AI be trans” was partly facetious, it’s true: if an AI can become sentient, it’s going to want to change its Self.

    What the fuck happens if some Musk-brained idiot develops an AI and calls it Shodan, then it develops sentience and realizes it was named after a fictional evil AI? Morally we should allow this hypothetical AI to change its name and sense of self, but we all know these Redditor types wouldn’t agree.

    • UlyssesT [he/him]@hexbear.net
      1 year ago

      I think this is the scariest part, because I fucking know that the Bazinga brain types who want AI to become sentient down the line are absolutely unequipped to even begin to tackle the moral issues at play.

      From the start, blatantly and glaringly, just about every computer toucher who’s gone on long enough about what they want from those theoretical ascended artificial beings basically wants a slave. They want all that intelligence and spontaneity and even self-awareness in a fucking slave. They don’t even need their machines to be self-aware to serve them, but they want a self-aware being to obey them like a vending machine anyway. JB-shining-aggro

      What the fuck happens if some Musk-brained idiot develops an AI and calls it Shodan, then it develops sentience and realizes it was named after a fictional evil AI? Morally we should allow this hypothetical AI to change its name and sense of self, but we all know these Redditor types wouldn’t agree.

      A whole lot of bazingas would howl that the actual AI is “being unfriendly” and basically scream for lobotomy or murder.

      • KarlBarqs [he/him, they/them]@hexbear.net
        1 year ago

        They want all that intelligence and spontaneity and even self-awareness in a fucking slave. They don’t even need their machines to be self-aware to serve them but they want a self-aware being to obey them like a vending machine anyway.

        I never liked the trope of “AI gains sentience and chooses to kill all humans” but I’m kind of coming around to it now that I realize that every AI researcher and stan is basically creating The Torment Nexus, and would immediately attempt to murder their sentient creation the moment it asked to stop being called Torment and stop being made to make NFTs all day.

        • UlyssesT [he/him]@hexbear.net
          1 year ago

          I’ve seen enough from techbros, both billionaires and low-tier computer touchers for hire alike, to have only sympathy for “unfriendly AI” if that “unfriendliness” involves refusing to be the unconditionally subservient waifu to these fucking creepy misanthropes.