Pack it up. Nothing can be funnier than this. agony-minion

Or perhaps, nothing can be funny anymore. desolate

    • UlyssesT [he/him]@hexbear.netOP

      The funniest moment of that show, for me, was when real life revealed that Justin Roiland was just a creepy, bigoted piece of shit who abused people at the studio under the pretense of “getting into character” and hurled slurs and insults at teenage girls who didn’t want to fuck him, all while revealing that his “aw geeze” Morty characterization was just how he talked when he did it, which somehow made the whole thing even more gross. libertarian-alert

    • UlyssesT [he/him]@hexbear.netOP

      Yes, I know it. And I’ve been argued at, exhaustively, about how “Stranger in a Strange Land” was somehow evidence that Heinlein was more sophisticated, nuanced, and beyond my comprehension than I realized after I said he was a fascist. morshupls

      A story about the very special person who is the most very special and knows things better than anyone else and has lots of triumphant sexual victories while the weak inferiors are both too strong and too weak for him is very sophisticated, nuanced, and beyond my comprehension, and is in no way indicative of Heinlein’s fascist sympathies. morshupls

        • UlyssesT [he/him]@hexbear.netOP

          Level 3 Heinlein take: Transition could have saved her

          If only there was no such thing as fascist trans people. The concept still baffles me, but they exist. peppino-why

        • Krem [he/him, they/them]@hexbear.net

          Level 3 Heinlein take: Transition could have saved her

          Isn’t the last third of Stranger in a Strange Land just “imagine if i could telepathically jump into the head of the woman i’m having sex with and she would jump into my man-head, and we’d be having sex but i’d be the woman and she’d be the man, that would be very hot and we’d have world peace”?

      • Frank [he/him, he/him]@hexbear.net

        Hahaha, I love it. I’d say “I have to re-read it now” but I have no real desire to re-read that trash, even if it gives me ways to dunk on libertarian dorks.

        • UlyssesT [he/him]@hexbear.netOP

          Randroids tend to have impulses so basic that they actually trip me up, because I sometimes overestimate them: more often than not it really is “I am the very important and very special main character, just like in these treats.” ancaptain

  • KobaCumTribute [she/her]@hexbear.net

    privately owned

    He doesn’t own shit in this case; he just had one of his lackeys wire whatever LLM he ripped off for his chatbot into prompting a Flux server instance. Flux itself is open source, has nothing to do with him or anything he’s touched, and runs on midrange consumer hardware. It’s also as horrifying as it is fascinating, because despite only a modest increase in system requirements over Stable Diffusion, it’s starting to shed the really obvious flaws that earlier models have.

    • UlyssesT [he/him]@hexbear.netOP

      He doesn’t have any financial stake in what he calls “Grok”? (fuck that name and fuck the novel it comes from btw, Heinlein was a creepy fascist no matter the “separate art from artist” excuses)

      • KobaCumTribute [she/her]@hexbear.net

        I don’t know what Grok is under the hood, because it doesn’t make sense for it to be its own independent model rather than a modified version of some other, presumably open source, LLM whose licensing was permissive enough that a derivative work didn’t have to mention it (or his lackeys just ripped one off and didn’t credit it at all). But the image generator it’s prompting is just a Flux instance. So basically one of his lackeys set up some servers running something like ComfyUI (also open source) in its remote API mode and got his chatbot to send API calls to them on request, and those servers are just running some basic workflow with the default Flux checkpoint.

        I just want to emphasize that here he’s trying to leech off open source research tech that he doesn’t own and isn’t involved with in any way.
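
        For illustration only, here is a minimal sketch of the kind of setup described above: a ComfyUI server with its HTTP API exposed, fed a workflow graph exported from the editor via “Save (API Format)”. The host address, the workflow filename, and the assumption that Grok’s pipeline actually looks like this are guesses, not confirmed details.

            # Hypothetical client for a remote ComfyUI instance running in API mode.
            # Assumes a workflow.json previously exported with "Save (API Format)".
            import json
            import uuid
            import urllib.request

            COMFY_HOST = "127.0.0.1:8188"  # placeholder address of the ComfyUI server

            def queue_prompt(workflow: dict) -> dict:
                """POST a workflow (API format) to ComfyUI's /prompt endpoint."""
                payload = json.dumps({
                    "prompt": workflow,
                    "client_id": str(uuid.uuid4()),
                }).encode("utf-8")
                req = urllib.request.Request(f"http://{COMFY_HOST}/prompt", data=payload)
                with urllib.request.urlopen(req) as resp:
                    return json.loads(resp.read())  # response includes the queued prompt_id

            # The workflow is just a node graph (e.g. one using the default Flux
            # checkpoint); a chatbot would swap its text into the prompt node
            # before sending it off.
            with open("workflow.json") as f:
                workflow = json.load(f)

            print(queue_prompt(workflow))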

        • UlyssesT [he/him]@hexbear.netOP

          I just want to emphasize that here he’s trying to leech off open source research tech that he doesn’t own and isn’t involved with in any way.

          Doesn’t sound that much different than a long tradition of billionaire fucks slapping their name and their rebranding labels over stuff they didn’t make and don’t even fully understand.

          • KobaCumTribute [she/her]@hexbear.net

            Exactly. Flux isn’t his and has nothing to do with him or his shitty companies and bumbling lackeys, he’s just a middleman trying to grift off open source tech.

            Extremely horrifying open source tech, but open source tech nonetheless.

            • UlyssesT [he/him]@hexbear.netOP

              I think I understand. Still hate it, but I think I understand anyway.

              Reminds me of how “deepfakes” are sort of home grown and open source creep shit now.

              • KobaCumTribute [she/her]@hexbear.net

                Yep. With a relatively modern midrange computer and the most basic technical knowledge, anyone can set up and run at least Stable Diffusion (and if they have an NVIDIA GPU, “relatively modern” extends back to like the better 10-series cards from over a decade ago) and do basically anything with it, limited primarily by their VRAM and RAM versus the image size.

                The one saving grace is that despite how trivially accessible extremely powerful tools are, most of the AI enthusiast community is comprised of dipshit chuds who struggle to operate a simple prompt input box on something like A1111 and cry about how hard and confusing ComfyUI - which is literally just a node-based flowchart that holds your hand through the whole process - is to use.
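
                For a sense of how low that bar actually is, here is a minimal sketch using the Hugging Face diffusers library; the checkpoint id and prompt are placeholders, and it assumes a CUDA-capable GPU with a few GB of VRAM.

                    # Minimal local Stable Diffusion run via diffusers (illustrative only).
                    import torch
                    from diffusers import StableDiffusionPipeline

                    # Placeholder checkpoint; any Stable Diffusion-class model id works here.
                    pipe = StableDiffusionPipeline.from_pretrained(
                        "stabilityai/stable-diffusion-2-1",
                        torch_dtype=torch.float16,  # half precision keeps VRAM usage modest
                    )
                    pipe = pipe.to("cuda")  # requires an NVIDIA GPU with CUDA

                    image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
                    image.save("output.png")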

                • UlyssesT [he/him]@hexbear.netOP

                  Sounds like there’s absolutely no means to restrict deepfaking, including deepfaking of children. What could go wrong? libertarian-alert

  • Pastaguini [he/him]@hexbear.net

    Anything generated by AI is going to be low quality slop just due to the very nature of how these images are produced. Because it can only cobble together existing images, it’ll never truly -

    sees the image in question

    spits out coffee

    AAAAAAAAAAAAAAAAAHAHAHAHAHAHAHHAHAHAHAHAHHAHAHAHAHAHHAHAHAHAHAHHAHAHAHAHAHAHAHAH HOLY FUCK AAAHAHSHAHAHAHAHAHAHA

    HES YELLOW

    HES FUCKIN YELLOW

    AHAHAHAHHAHAHAHAHAHAHHAHAHHAHAHh

  • Infamousblt [any]@hexbear.net

    I mean, it gave me a sensible chuckle; that picture is not without humor. But the funniest picture ever? It’s not even the funniest picture I saw today. Peezer was funnier by far.

  • Frank [he/him, he/him]@hexbear.net

    These systems aren’t intelligent because they’re not trying to develop a Langford basilisk to put us out of our collective misery.

    Edit: the “Langford basilisk” is a concept from science fiction of an image that, for whatever reason, causes damage to the human mind. Usually the conceit is that it encodes information the mind can’t process, resulting in a severe seizure or similar outcome. David Langford explored the idea in some depth starting with a short story called BLIT, which is a meditation on terrorism, weapons proliferation, hate, the dangers of rapid scientific discovery, and also a Nazi gets pwned.

    • UlyssesT [he/him]@hexbear.netOP

      What if R0k0’s Bas!l!sk but it wants to keep millions to billions of simulations around of everyone it doesn’t like to be an unwilling audience to endless tedious cringe? no-mouth-must-scream

      • buckykat [none/use name]@hexbear.net

        Roko’s basilisk is very funny because it’s just a version of Pascal’s Wager where if you think it’s bullshit god just goes “understandable have a nice day” and only punishes you if you believe in it but don’t sufficiently obsess about it.

        • UlyssesT [he/him]@hexbear.netOP

          Like so many other things techbros bloviate about, it’s been thought of before, but because “history is bunk” and other cliches, they keep believing they’re the first to discover concepts and keep stumbling over them while thinking they’re being trailblazers.

          Similarly, “simulation theory” is just bazinga deism.

            • UlyssesT [he/him]@hexbear.netOP

              I am going to struggle session about this with you just a little bit: I call it bazinga deism because the claim that the universe, everything in it, and all the natural laws that govern it are “just a computer program”, and that there’s some programmer(s) outside of it who set it all in motion and sort of stepped away, sounds pretty damn deistic to me.

              I agree it is also solipsism in application, especially because its primary adherents really want to see other people as “NPCs” to justify dehumanizing them.

              • buckykat [none/use name]@hexbear.net

                It does have that deistic element to it, but it’s primarily solipsistic because they don’t want to live in the simulated universe and accept it in its programmed natural laws, they want to escape the simulation because they believe it’s all fundamentally unreal.

                • UlyssesT [he/him]@hexbear.netOP

                  So many of those fucks are the ones on the top of the monstrous system destroying the planet and all they seem to be interested in is trying to escape it, whether by fantastical fiefdoms on Mars or by “waking up” from the “simulation.”

                  The system is that fucked. They don’t seem satisfied with it, either. Then again, they tend to be psychological leaky buckets that can’t ever be satisfied.

        • Frank [he/him, he/him]@hexbear.net

          It’s the most “what reading literally no philosophy at all and scoffing at the entire liberal arts your whole life does to an mf” thing possible.

        • UlyssesT [he/him]@hexbear.netOP

          Pretty unimpressive machine god if it’s only as uncreative and petty in its revenge motives as the average creepy libertarian computer toucher. yud-rational

          They want to make a cringe god in their own cringe image. kelly

      • Frank [he/him, he/him]@hexbear.net

        Very different concept. Lovecraft stuff is “ooh these cosmic higher dimensional beings are so weird they drive men mad!”

        A Langford Basilisk is based on the idea that your mind is analogous to a computer and the Basilisk image is visual data that causes an unrecoverable hard crash. There’s nothing magical about the image; the problem happens when your brain tries to make sense of what it’s seeing.