They are copying your intellectual property and digitizing its knowledge. It’s a bit different because it’s PERMANENT: with humans, knowledge can be lost, forgotten, or ignored, but with these LLMs that’s not an option. The skill factor is also a big issue, imo; it’s very easy to set up an LLM to make AI imagery nowadays.
They are copying. These LLMs are a product of their input, and solely a product of their input; that’s why they’ll often directly output their training data. Training on more data reduces this effect, which is why all these companies are stealing data while aggressively stopping others from stealing theirs.
Your first sentence is true.
Your first sentence is false.
Proof? I’m fairly certain I’m correct, but I’ll gladly admit fault. This whole LLM thing is indeed new to me as well.