• Rikj000@discuss.tchncs.de · 10 hours ago

    Having no copyright law at all seems dangerous to me,
    why create content if anyone can just steal it,
    and profit off the back of the original creator without consequences?

    I think I’d rather see it updated instead,
    e.g. to hold AI companies and users accountable,
    so they need explicit approval from copyright holders before they’re allowed to train on / use their data.

    • verstra@programming.dev · 9 hours ago

      I mean, updating the rules would help - clarifying that feeding data to any model / doing analysis on it requires the copyright holder's permission - but I doubt that it would stop companies from doing it, because it is hard to prove in court that your work has been stolen.

      But there is no real way of enforcing the rules. How would we combat piracy? If you make the BitTorrent protocol illegal, people will just use HTTP or anything else to share copyrighted material.
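
      To illustrate how trivial that is, here's a minimal sketch (hypothetical usage, Python standard library only) that shares a whole folder of files over plain HTTP - the capability doesn't live in any one protocol:

      ```python
      # Minimal sketch: serve the current working directory over plain HTTP.
      # Outlawing BitTorrent would not remove this capability; any file
      # transfer mechanism can carry the same material.
      from http.server import HTTPServer, SimpleHTTPRequestHandler

      # Serves ./ at http://localhost:8000 until interrupted
      HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()
      ```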

      • Rikj000@discuss.tchncs.de · 7 hours ago

        If the fines for it are proportional to the revenue of the business, then it would likely make a lot of them think twice about doing so.
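
        To make that concrete, here's a toy comparison (all figures made up; the 4% rate just mirrors the spirit of GDPR-style penalties) of a flat fine versus a revenue-proportional one:

        ```python
        # Toy numbers, purely illustrative: a flat fine barely registers for
        # a giant company, while a revenue-proportional fine scales with it.
        FLAT_FINE = 10_000_000  # fixed penalty, same for everyone

        def proportional_fine(annual_revenue: float, rate: float = 0.04) -> float:
            # Hypothetical 4% of annual revenue, GDPR-like in spirit
            return annual_revenue * rate

        for revenue in (50e6, 5e9, 500e9):  # small, large, huge business
            print(f"revenue {revenue:>15,.0f}: "
                  f"flat {FLAT_FINE:>15,} vs "
                  f"proportional {proportional_fine(revenue):>15,.0f}")
        ```

        The flat fine never changes, while the proportional one grows with the business, which is exactly what would change the risk calculus for the big players.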

        I agree that it’s hard to enforce the rules,
        and that some would still ignore them.
        However, updating the rules gives the wronged creators a chance of getting justice/compensation for their stolen work, and diminishes the chance of companies breaking the rules.

        It would not combat BitTorrent (P2P) piracy.
        But that’s also not that important imo.
        Most pirates are rather poor folks,
        just trying to watch/play some content which they can’t afford; they make up a rather negligible amount of the profit that can be had.

        However, it would combat billion-dollar companies that use pirated content to train LLMs they then sell. All it takes is one internal whistleblower, and they could be fined an amount larger than the risk is worth.