
New Data ‘Poisoning’ Tool Empowers Artists to Combat AI Scraping

AI scraping, the practice by which AI companies collect data from online sources without the owners' consent to train their models, is a significant issue for generative AI.

Visual artists are hit particularly hard, since text-to-image models trained on their scraped works can generate new art in their styles. But now there might be a way out.

Nightshade, a new tool developed by University of Chicago researchers, lets artists "poison" their digital artwork to deter developers from using it to train AI systems.

According to MIT Technology Review, which received an exclusive preview of the research, artists can use Nightshade to make pixel-level changes to their artwork that are invisible to the human eye but cause "chaotic" and "unpredictable" failures in generative AI models trained on it.
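
The underlying idea is a small, bounded change to pixel values, so the image looks unchanged to a person but no longer "reads" the same way to a model. Nightshade optimizes its perturbations against a model; the minimal Python sketch below substitutes random noise purely to illustrate the bounded-change mechanics, and the file names are hypothetical.

```python
import numpy as np
from PIL import Image

def perturb(path_in: str, path_out: str, epsilon: int = 4) -> None:
    """Add a perturbation bounded by +/- epsilon (out of 255) to each channel.

    Illustrative stand-in only: Nightshade computes an optimized perturbation
    against a model, not random noise.
    """
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    delta = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    # Keep every pixel within +/- epsilon of the original and in valid range,
    # so the change stays imperceptible to a human viewer.
    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

perturb("artwork.png", "artwork_shaded.png")
```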

The attack is prompt-specific: by corrupting what the model learns about a targeted concept, it causes generative AI models to confuse subjects with one another and produce useless outputs.

For instance, the model might come to treat a dog as a cat, producing misleading visuals that don't correspond to the text prompt. The research paper claims that fewer than 100 poisoned samples are enough to corrupt a Stable Diffusion prompt.
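
To make the scale of that claim concrete, here is a hypothetical sketch of how such poisoned pairs would sit inside a scraped training set. The file names, caption, and counts are illustrative, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class TrainingPair:
    image_path: str
    caption: str

# A large scraped set of genuine dog images (paths are hypothetical).
clean = [TrainingPair(f"dog_{i}.png", "a photo of a dog") for i in range(10_000)]

# Per the paper's claim, fewer than 100 poisoned samples can corrupt the
# "dog" prompt: each image still looks like a dog to a human, but its
# perturbed features read as a different concept (e.g. "cat") to the model.
poisoned = [TrainingPair(f"dog_shaded_{i}.png", "a photo of a dog") for i in range(80)]

training_set = clean + poisoned  # under 1% poisoned, yet prompt-specific damage
```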

The poisoned data is tough to remove because the AI company would have to find and delete each poisoned sample from its training set manually. Beyond deterring AI firms from collecting data without authorization, Nightshade also encourages consumers to exercise caution when using these generative AI models.

Other efforts have been made to address the use of artists' work without their permission. Some AI image generators, such as Getty Images' image generator and Adobe Firefly, train only on images that artists have approved or that are openly licensed, and have compensation programs in place.