Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the buyers behind them, remained up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • Tylerdurdon@lemmy.world · 7 months ago

    AI gives creative license to anyone who can communicate their desires well enough. Every great advancement in the media age has been pushed in one way or another with porn, so why would this be different?

    I think if a person wants visual “material,” so be it. They’re doing it with their imagination anyway.

    Now, generating fake media of someone for profit or malice, that should be punished. There are going to be a lot of news cycles with some creative perversion and horrible outcomes intertwined.

    I’m just hoping I can communicate the danger of some of the social media platforms to my children well enough. That’s where the most damage is done with this kind of stuff.

    • abhibeckert@lemmy.world · edited · 7 months ago

      The porn industry is, in fact, extremely hostile to AI image generation. How can anyone make money off porn if users simply create their own?

      Also, I wouldn’t be surprised if it’s false advertising, and clicking the ad will in fact just take you to a webpage with more ads, and a link from there to more ads, and so on, until users eventually give up (or, hopefully for the advertiser, click on an ad).

      Whatever’s going on, the ad is clearly a violation of Instagram’s advertising terms.

      I’m just hoping I can communicate the danger of some of the social media platforms to my children well enough. That’s where the most damage is done with this kind of stuff.

      It’s not just your children you need to communicate it to. It’s all the other children they interact with. For example, I know a young girl (not even a teenager yet) who is being bullied on social media lately. The fact that she doesn’t use social media herself doesn’t stop other people from saying nasty things about her in public (and who knows, maybe they’re even sharing AI-generated CSAM based on photos they’ve taken of her at school).

      • archon@sh.itjust.works · 7 months ago

        How can anyone make money off porn if users simply create their own?

        What, you mean like amateur porn or…?

        Seems like professional porn still does great after over two decades of free internet porn so…

        I guess they will solve this one the same way, by having better production quality. 🤷

        • Petter1@lemm.ee · 7 months ago

          How old? My parents certainly understand this, my grandparents not so much, and my son not yet (he’s 5).

          • stewie3128@lemmy.ml · 7 months ago

            70 or older in my family. My dad’s wife just made an excited post on Facebook about a Tesla Concorde taking off, and I had to explain to her that it was a flight simulator. She’s 73.