• Politically Incorrect@lemmy.world · 8 months ago

    Watching a video or reading an article isn’t copyright infringement when a human does it, so why is it infringement when an “AI” does it? I believe any copyright infringement is committed through the prompt, and therefore by the user, not the tool.

    • echo64@lemmy.world · 8 months ago

      If you read an article and then copy parts of it into a new article, that’s copyright infringement. Same with AIs.

      • anlumo@lemmy.world · 8 months ago

        That depends on how much is copied; if it’s a small amount, it’s fair use.

        • echo64@lemmy.world · 8 months ago

          Fair use depends on a lot of factors, and using a small amount doesn’t by itself make something fair use. What matters is the actual use. Small amounts just tend to fly under the radar of legal teams.

        • FireTower@lemmy.world · 8 months ago

          Fair use is a four-factor test. The amount used is one of the factors, but using a small amount doesn’t strictly mean something is fair use. You could use a single frame of a movie and still not qualify.

    • Drewelite@lemmynsfw.com · 8 months ago

      This is what people fundamentally don’t understand about intelligence, artificial or otherwise. People feel like their intelligence is 100% “theirs”. While I would certainly advocate that a person owns their intelligence, it didn’t spawn from nothing.

      You’re standing on the shoulders of everyone who came before you. Take a prehistoric man or an alien who hasn’t had any of the same experiences you’ve had: they won’t be able to function in this world. It’s not because they’re any dumber than you; it’s because you’ve absorbed the hive mind of the society you live in. Everyone’s racing to slap their brand on things and copyright them to get ahead and carve out their space.

      “No, you can’t tell that story, it’s mine.” “That art is so derivative.”

      But copyright was only meant to protect something for a short period in order to monetize it, to fit the value of knowledge into our capital market. Our world can’t grow if all knowledge is owned forever and can’t even be drawn on when THINKING about new ideas.

      ANY VERSION OF INTELLIGENCE YOU WOULD WANT TO INTERACT WITH MUST CONSUME OUR KNOWLEDGE AND PRODUCE TRANSFORMATIONS OF IT.

      That’s all you do.

      Imagine how useless someone would be who’d never interacted with anything copyrighted, patented, or trademarked.

      • rottingleaf · 8 months ago

        Yes, so how come all these arguments weren’t popular before the current hype around text generators?

        Have some integrity.

        • dezmd@lemmy.world · 8 months ago

          They absolutely were, the entire time. You just weren’t interested in hearing about it and weren’t engaged with it.

          Learn what integrity means if you want to use it as a snarky one-liner.

          Have some common sense.

          • rottingleaf · 8 months ago

            They absolutely were, the entire time. You just weren’t interested in hearing about it and weren’t engaged with it.

            Why express your opinion on subjects where it’s not worth anything?

            You are saying these mutated cryptobros cared about copyright and patent laws being obsolete and harmful before “AI”?

            Learn what integrity means if you want to use it as a snarky one-liner.

            I know what every word I use means.

      • raspberriesareyummy@lemmy.world · 8 months ago

        That’s not a very agreeable take. Just get rid of patents and copyrights altogether and your point dissolves into nothing. The core difference is that derivative works by humans can respect the privacy rights of the original creators.

        Deep-learning bullshit software, however, will just regurgitate creators’ content, sometimes unrecognizably, but sometimes outright stealing their likeness or individual style to create content that may be associated with the original creators.

        What you are in effect doing is likening learning from the ideas of others to a deep-learning “AI” using images to create revenge porn, to give a drastic example.

        • Drewelite@lemmynsfw.com · 8 months ago

          Yes. Your last sentence is my point exactly. LLMs haven’t replicated everything about the human brain. But the hype is here because they crack one of our brain’s key features: how it learns. Your brain isn’t magic. It just records training data until it has enough to mash together into different things.

          A child doesn’t respect copyright; they’ll draw a picture of Mario. You probably would too if I asked you to. Respecting copyright is something we learn to do in specific situations, and we call the result “coming up with an original idea”. But that’s bullshit. There are no original ideas.

          If you come up with a product that’s a cold-brew cup that refrigerates its contents, I’d say that’s a very original idea. But you didn’t come up with refrigeration, you didn’t come up with cups, or cold brew, or the idea of putting technology in a cup, or the concept of a product you sell to people. Name one thing about this idea that you didn’t learn somewhere else. You can’t, because that’s not how people work. A very real part of business, which you will learn as you bring your new cup to market, is skirting around copyright. Somebody out there with a heated cup might come after you, for example.

          It’s difficult to learn precisely where that line is, mostly because it can’t work as a concrete rule. AI still has to be used, tested, and developed to learn the nuances here, and it will be. But what baffles me is that my example above outlines how every process of invention has worked since the beginning of humanity, yet if an LLM does it, people say, “That’s not a real idea. It just took a bunch of stuff it’s learned and mashed it together.” What I hear is, “My brain is 🪄magic✨ I’m special.”

    • topinambour_rex@lemmy.world · 8 months ago

      What is this human going to do with what they’ve read? Are they going to produce something using part of that book or article?

      If yes, that’s copyright infringement.

      • Drewelite@lemmynsfw.com · 8 months ago

        How do you expect people to create AI if it can’t do the things we do, when “doing the things we do” is the whole point?

    • Uninvited Guest@lemmy.ca · 8 months ago

      When a school professor “prompts” you to write an essay, and you, the “tool”, go consume copyrighted material and plagiarize it in the production of your essay, is the infringement committed by the professor?

        • ominouslemon@lemm.ee · 8 months ago

          Copilot lists its sources. The problem is that half of them are completely made up, and if you click the links they take you to the wrong pages.

        • Uninvited Guest@lemmy.ca · 8 months ago

          It definitely does not cite sources and use its own words in all cases, especially in visual media generation.

          And in the proposed scenario, I did write that the student plagiarizes the copyrighted material.

          • Politically Incorrect@lemmy.world · 8 months ago

            If you read a book or watch a movie and are inspired by it to create something new and different, is that plagiarism and copyright infringement?

            If that were the case, the majority of work made nowadays would be plagiarism and copyright infringement; generally, people get inspired by someone or something.

            • buffaloseven@fedia.io · 8 months ago

              There’s a long history of this, and you might find some helpful information by looking at “transformative use” of copyrighted materials. Google Books is a famous case where the technology company won the lawsuit.

              The real problem is that LLMs constantly spit out copyrighted material verbatim. That’s not transformative, and it’s a near-impossible problem to solve while maintaining the utility, because these things aren’t actually AI; they’re just monstrous statistical correlation databases generated from an enormous data set.
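
              To make the “statistical correlation database” point concrete, here’s a deliberately tiny caricature (my own sketch, not how transformer LLMs are actually built): a bigram model that is nothing more than a table of word-pair counts. Greedily sampling from it simply replays its only training sentence, a miniature version of the verbatim regurgitation problem.

              ```python
              from collections import defaultdict, Counter

              # "Training data": a single sentence with no repeated words.
              training_text = "language models are giant tables of statistical correlations"
              words = training_text.split()

              # "Training": count which word follows which.
              follows = defaultdict(Counter)
              for prev, nxt in zip(words, words[1:]):
                  follows[prev][nxt] += 1

              # "Generation": always pick the most frequent next word.
              out = [words[0]]
              while follows[out[-1]]:
                  out.append(follows[out[-1]].most_common(1)[0][0])

              print(" ".join(out))  # prints the training sentence verbatim
              ```

              Real models generalize far better than this toy, but when the underlying statistics are sharp enough, generation can collapse back onto memorized training text in the same way.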

              Much of the utility from them will become targeted applications where the training comes from public/owned datasets. I don’t think the copyright case is going to end well for these companies…or at least they’re going to have to gradually chisel away parts of their training data, which will have an outsized impact as more and more AI generated material finds its way into the training data sets.

              • stephen01king · 8 months ago

                How “constantly” does it spit out copyrighted material? Is there data on that?

                • buffaloseven@fedia.io · 8 months ago

                  More and more research is starting to happen on it, but I’ve seen figures anywhere from 20% to 60% of responses. Here’s a recent study where they explicitly try to coerce LLMs into breaking copyright: https://www.patronus.ai/blog/introducing-copyright-catcher
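
                  As a rough illustration of the kind of measurement involved (my own sketch, not the methodology of the linked study), you could flag a response as regurgitation whenever it contains a long verbatim run of words from a reference text; the 12-word threshold below is an arbitrary assumption.

                  ```python
                  def contains_verbatim_run(response: str, reference: str, min_words: int = 12) -> bool:
                      """Return True if `response` contains `min_words` consecutive words of `reference`."""
                      ref_words = reference.lower().split()
                      resp = " ".join(response.lower().split())
                      return any(
                          " ".join(ref_words[i:i + min_words]) in resp
                          for i in range(len(ref_words) - min_words + 1)
                      )

                  # Hypothetical usage: estimate a regurgitation rate over a batch of responses.
                  responses = ["...model output one...", "...model output two..."]
                  source_text = "full text of some copyrighted article goes here ..."
                  rate = sum(contains_verbatim_run(r, source_text) for r in responses) / len(responses)
                  print(f"verbatim regurgitation rate: {rate:.0%}")
                  ```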

                  I don’t have time to dig them up right now, but in many of the lawsuits brought against companies developing LLMs, the opening filings contain statistics on how frequently the models infringed by returning copyrighted material.

            • potustheplant@feddit.nl · 8 months ago

              You do realize that “AI” is just a marketing term, right? None of these models learn, have intelligence, or create truly original work. As a matter of fact, if people didn’t continue to create original content, these models would stagnate or enter a feedback loop, poisoning themselves with their own erroneous responses.

              AIs don’t think. They copy with extra steps.
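
              A toy sketch of that feedback-loop concern (an illustration under stated assumptions, not a claim about any real training pipeline): re-estimate a distribution from its own generated samples each “generation”, with a small factor standing in for curation and sampling bias, and the diversity of the output steadily shrinks.

              ```python
              import random
              import statistics

              random.seed(0)
              mean, stdev = 0.0, 1.0  # "generation 0": model fitted to human-made content

              for gen in range(1, 6):
                  samples = [random.gauss(mean, stdev) for _ in range(200)]  # model's own output
                  mean = statistics.mean(samples)           # next model sees only that output
                  stdev = statistics.stdev(samples) * 0.9   # assumed tail-trimming from curation bias
                  print(f"generation {gen}: diversity (stdev) ~ {stdev:.2f}")
              ```

              The 0.9 factor is the loud assumption here: it stands in for whatever filtering keeps only the “best” outputs between generations, which is what drives the collapse in this sketch.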

                • potustheplant@feddit.nl · 8 months ago

                  Except that the information it gives you is often objectively incorrect, and it makes up sources (this has happened to me many times). And no, it can’t do what a human can. It doesn’t interpret the information it gets, and it can’t reach new conclusions based on what it “knows”.

                  I honestly don’t know how you can even begin to compare an LLM to the human brain.