Artists aim to thwart AI with data-poisoning software and legal action
With AI-generated content continuing to evolve, the advent of data-poisoning tools capable of shielding an artist’s works from AI could be a game changer.
As artificial intelligence (AI) has permeated the creative media space, particularly art and design, the definition of intellectual property (IP) appears to be evolving in real time, and it is becoming increasingly difficult to pin down what constitutes plagiarism.
Over the past year, AI-driven art platforms have pushed the limits of IP rights by utilizing extensive data sets for training, often without the explicit permission of the artists who crafted the original works.
For instance, platforms such as OpenAI’s DALL-E and Midjourney operate subscription-based services, indirectly monetizing the copyrighted material that makes up their training data sets.