• parpol@programming.dev
    3 months ago

    Learning how to build a bomb shouldn’t be blocked by LLMs to begin with. You can just as easily learn how to do it by googling the same question, and real, accurate information, even potentially dangerous information, shouldn’t be censored.

    • Fubarberry@sopuli.xyz
      3 months ago

      I’m not surprised that a for-profit company would want to avoid bad press by censoring stuff like that. There’s no profit in sharing that info, and any media attention over it would be negative.

      • Armok: God of Blood@lemmy.dbzer0.com
        3 months ago

        No one’s going after hammer manufacturers because their hammers don’t self-destruct if you try to use one to clobber someone over the head.

      • vithigar@lemmy.ca
        3 months ago

        I’m more surprised that a for-profit company is willing to use a technology that can randomly spew out unwanted content, incorrect information, or just straight gibberish in any kind of public-facing capacity.

        Oh, it let them save money on support staff this quarter. And fixing it can be an actionable OKR for next quarter. Never mind.

    • General_Effort@lemmy.world
      3 months ago

      They use the bomb-making example, but mostly “unsafe” or even “harmful” means erotica. It’s really anything anyone, anywhere, would want to censor, ban, or remove from libraries. Sometimes I marvel that freedom of the (printing) press ever became a thing. Better nip this in the bud, before anyone gets the idea that genAI might be a modern equivalent of the press.