Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the buyers behind them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.


God damn I hate this fucking AI bullshit so god damn much.

      • LostWon@lemmy.ca

        That’s highly subjective, but the fascinating book The Dawn of Everything argues otherwise. There are even parts about the anthropological evidence that some peoples simply up and changed systems every so often (yes, non-violently). Our problem as people in the modern era is that many can’t imagine anything else, not that no one ever did.

          • webghost0101@sopuli.xyz

            That’s an unintentional strawman; it misses the point.

            An economy is just a subsystem that serves an organized society.

            Not every society requires an economy; there are many ways to organize, and the foundational ideas go back to ancient Greece. Read up on them.

            The people with wealth and power have every incentive to keep things as they are. They own the planet’s resources and the means of production. They lobby our laws.

            To think we’re waiting on one person to have “a better idea” before things change is incredibly naive.

            I don’t know how the system will change or what the next one will look like, but the current one is mathematically not sustainable for another century.

          • LostWon@lemmy.ca

            It doesn’t. Graeber was an anthropologist and Wengrow is an archaeologist. It’s a review of existing evidence from past civilizations (the diversity of which most people are hugely ignorant about), making the case that the most common representations of “civilization” and “progress” are severely limited, probably to a detrimental extent, since we can often only base our conceptions of what is possible on what we know.

      • belated_frog_pants@beehaw.org

        God, this argument. It’s such history-washing to insist that no other functioning system where people have been happy has ever existed. People can’t even imagine life without capitalism.

        Capitalism enforces itself. It’s not pervasive because “it’s the best we can do”.

      • bl_r@lemmy.dbzer0.com

        That’s some serious capitalist realism you’ve got there; it would be a shame if you were incorrect.

        Real big shame…

        It would also be a shame if people were trying to put said other systems into action right now

        Real

        big

        shame

      • dustycups@aussie.zone

        Surely, in a liberal democracy, unfettered capitalism is restrained by laws. I think a big problem we have now is a combination of regulatory capture and (sometimes AI-generated) targeted, emotional, populist advertising by political brands.

      • JackGreenEarth@lemm.ee

        Many people have suggested better systems; they just haven’t been implemented. And even if no one had, people should still be allowed to criticise the current system, if only to get the discussion started on how to improve it.

  • Gaywallet (they/it)@beehaw.org

    I can’t help but wonder how deep fakes are going to change society in the long term. I’ve seen this article making the rounds on other social media, and inevitably some dude shows up claiming this will make nudes more acceptable because there will be no way to know whether a nude is deep faked or not. It’s sadly a rather privileged take from someone who faces no possible consequences from nude photos of themselves being on the internet, but I do think in the long run (20+ years) they might be right. Unfortunately, between now and some ephemeral then, many women, POC, and other folks will get fired, harassed, blackmailed, and otherwise hurt by people using tools like these to make fake nude images of them.

    But it also makes me think a lot about fake news and AI, and how we’ve increasingly been interacting in a world in which “real” things are just harder to find. Want to search for someone’s actual opinion on something? Too bad, for-profit companies don’t want that, and instead you’re gonna get an AI-generated website spun up by a fake alias which offers a “best of” list where their product is the first option. Want to understand an issue better? Too bad, politics is throwing money left and right at news platforms and using AI to write biased articles that poison the well with information meant to emotionally charge you to their side. Pretty soon you’re going to have no idea whether pictures or videos of things that happened really happened, and inevitably some of those will be viral marketing or other forms of coercion.

    It’s kind of hard to see all these misuses of information and technology, especially ones like this that are clearly malicious in nature, alongside the complete inaction of government and corporations to regulate or stop it, and not wonder how much worse it needs to get before people bother to take action.

    • tim-clark@kbin.social

      Flat earthers are on the rise. “I can only trust what I see with my own eyes; the earth is flat!”

      How will this affect the courts? How can evidence be trusted?

        • trev likes godzilla@beehaw.orgOP

          I believe Tim means to say that the spread of misinformation can be linked to the rise of Flat Earthers. That if we can only trust what we see before us, and we see a flat horizon, we can directly interpret this visual to mean that the Earth is flat. Thus, if we cannot trust our own eyes and ears, how can future courtroom evidence be trusted?

          “Up to the Twentieth Century, reality was everything humans could touch, smell, see, and hear. Since the initial publication of the chart of the electromagnetic spectrum, humans have learned that what they can touch, smell, see, and hear is less than one-millionth of reality.” -Bucky Fuller

          ^ basically that

      • DdCno1@beehaw.org

        I don’t know about you, but I started to notice that not everything that was printed on paper was truthful when I was around ten or eleven years old.

        • Storksforlegs@beehaw.org

          Ok, but acting like you can’t trust any sources of info on anything is pretty destructive also. There are still some fairly reliable sources out there. Dismiss everything and you’re left with conspiracy theories.

        • lad@programming.dev

          In the long run everything might be false; even your own memory changes over time and may be affected by external forces.

          So maybe there will be no way to tell the truth except by experiencing it firsthand, and we will once again live like the ancient Greeks, pondering things.

          To be fair, I hope that science and critical thinking might help us distinguish what is true from what is not, but that would only apply to abstract things, as all the concrete things might be fabricated.

        • floofloof@lemmy.ca

          I don’t remember saying all books were good. But it’s often possible to find well-edited and curated material in print.

      • Melmi@lemmy.blahaj.zone

        There are already AI-written books flooding the market, not to mention other forms of written misinformation.

        • floofloof@lemmy.ca

          They still seem relatively easy to filter out compared to the oceans of SEO nonsense online. If you walk into a library you won’t be struggling to find your way past AI crap.

        • tlf@feddit.de

          The flood of AI-generated anything is really annoying when you’re looking for an actual human-created source.

    • TexMexBazooka@lemm.ee

      I think AI might, eventually, stop people from posting their entire lives.

      Not having a ton of data floating around about your looks and your voice is the only way to protect yourself from AI-generated shit.

    • Tiltinyall@beehaw.org

      What if, one day, the people doctoring images like that had a fake, doctored profile photo of themselves on a dating app? AI writes a tailored bio for the victim to swipe on, should that victim be on a dating app. If a match is made, the profile reveals both the perp’s AI doctoring attempt on the victim and the perp’s real face.

    • kandoh@reddthat.com

      I think people are going to get very good at identifying fakes. Only older people will be tricked.

  • eveninghere@beehaw.org

    This one’s fixable. Just hold Meta accountable. It should be illegal for them not to manually review ads when they’re submitted.

  • some_guy@lemmy.sdf.org

    I would’t have thought to train an LLM to create nude but say nothing about that when presenting it to Apple for App Store review only to then advertise that feature elsewhere. Damn.