A lawsuit filed by more victims of the sex trafficking operation claims that Pornhub’s moderation staff ignored reports of their abuse videos.


Sixty-one additional women are suing Pornhub’s parent company, claiming that the company failed to take down videos of their abuse as part of the sex trafficking operation Girls Do Porn. They’re suing the company and its sites for sex trafficking, racketeering, conspiracy to commit racketeering, and human trafficking.

The complaint, filed on Tuesday, includes what it claims are internal emails obtained by the plaintiffs, represented by Holm Law Group, between Pornhub moderation staff. The emails allegedly show that Pornhub had only one moderator to review 700,000 potentially abusive videos, and that the company intentionally ignored repeated reports from victims in those videos.

The damages and restitution they seek amount to more than $311,100,000. The plaintiffs demand a jury trial and seek damages of $5 million per plaintiff, as well as restitution for all the money Aylo, the new name for Pornhub’s parent company, earned “marketing, selling and exploiting Plaintiffs’ videos in an amount that exceeds one hundred thousand dollars for each plaintiff.”

The plaintiffs are 61 more unnamed “Jane Doe” victims of Girls Do Porn, adding to the 60 who sued Pornhub in 2020 over similar claims.

Girls Do Porn was a federally convicted sex trafficking ring that coerced young women into filming pornographic videos under the pretense of “modeling” gigs. In some cases, the women were violently abused. The operators told them that the videos would never appear online, so that their home communities wouldn’t find out, but they uploaded the footage to sites like Pornhub, where the videos went viral—and in many instances, destroyed their lives. Girls Do Porn was an official Pornhub content partner, with its videos frequently appearing on the front page, where they gathered millions of views.

read more: https://www.404media.co/girls-do-porn-victims-sue-pornhub-for-300-million/

archive: https://archive.ph/zQWt3#selection-593.0-609.599

  • HelloThere
    83 points · 8 months ago

    I agree that Pornhub, et al, should be liable for the abuse their platforms distribute, but how on earth is AI meant to help in sex trafficking?

    • @[email protected]
      link
      fedilink
      English
      678 months ago

      A lot of people have this very naive view that if we just build AI overlords to monitor all human activity, we can somehow automate good behavior and make the world a better place.

      Really we’ll just end up with RoboCop.

        • @[email protected]
          link
          fedilink
          English
          108 months ago

          That seems like an excellent idea, we should all make everything possible to make sure such AI overlords are built.

          Please don’t hurt me, or an eventual future indistinguishable facsimile of myself…?

      • @[email protected]
        link
        fedilink
        English
        38 months ago

        But RoboCop was the good guy.

        ED-209 was the bad guy.

        He looked much cooler, but he was kind of a dick. And bad at stairs.

    • Riskable
      1 point · 8 months ago

      AI will help with sex trafficking by generating all the porn so humans won’t need to be involved at all.

      In the future the equivalent lawsuit will be from the victims of hackers who used people’s PCs to generate porn.

      • HelloThere
        13 points · 8 months ago

        That’s like saying professional porn got rid of amateur / “real” sex porn. It didn’t.

        There will always be a demand for real humans actually doing the thing depicted. While I’m sure there will be very popular AI production houses, similar to hentai, etc., if you think AI-generated porn will completely remove the demand for human performers, then you do not understand why people watch porn.

    • El Barto
      -5 points · edited · 8 months ago

      Edit: I said “ideally,” as in utopian. In practice, corporations, governments, and overall greed are in the way.

      Ideally, sci-fi style, an effective AI could sift through all the reports and take down videos that are clearly suspicious (as opposed to popular, well-known videos of porn stars that could be found elsewhere, for example on DVD). It could message the reporter asking for more information, and then escalate to an actual human any videos it isn’t confident enough to deem abusive.

      It may even try to contact the victims and offer them options to report the perpetrators to the authorities. Or lead them to a safe house, etc.

      It could do this without ever being tired, ever being hungry, or ever feeling shocked.

      In practice, we’re not there yet. Close, but not there.

      • HelloThere
        14 points · 8 months ago

        Close? Pull the other one.

        And that’s long before we get the ethical quandary of sourcing training data, and implicit biases.

        • El Barto
          4 points · 8 months ago

          I know what you mean.

          My scenario was ideal, from the point of view of my 80s kid self looking forward to a promising future.

          That future is now, and I hate it, because governments and big corporations ruined it for all of us.

      • @[email protected]
        link
        fedilink
        English
        08 months ago

        This is why I can’t jibe with idealists. They put forth a proposal because “ideally…” and get people thinking “yeah, he’s right,” but they conveniently leave out the fact that, due to human nature, it is basically an impossible pipe dream, and you’re more likely to find true gnosis than to see it become reality.

        • El Barto
          0 points · 8 months ago

          The funny thing is that I’m a realist. But of course I like to think about what the ideal scenario would be.

    • Allseer
      -10 points · 8 months ago

      that’s for the bad guys to figure out in prison after being arrested