Researchers at the University of Chicago have released a new tool called Nightshade that allows artists to protect their work from AI companies that scrape artworks without consent for training AI models.
The tool thwarts data scraping by turning an image into a sample that is unsuitable for training. More precisely, Nightshade transforms images into “poison samples,” so that any AI model trained on them learns unpredictable behaviours. Such a model will then output erroneous results when a user attempts to generate an image. For example, “a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space,” the researchers explained.
Nightshade does not affect how a human viewer sees the artwork. “While human eyes see a shaded image that is largely unchanged from the original, the AI model sees a dramatically different composition in the image. For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass. Trained on a sufficient number of shaded images that include a cow, a model will become increasingly convinced cows have nice brown leathery handles and smooth side pockets with a zipper, and perhaps a lovely brand logo,” the researchers elaborated. Moreover, images edited by Nightshade are resilient to changes: even if a user crops or compresses the image, adds noise, or takes a screenshot, the effects of the “poison” remain. “This is because it is not a watermark or hidden message (steganography), and it is not brittle.”
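The general idea behind this kind of image poisoning can be illustrated with a toy sketch. This is not Nightshade’s actual algorithm (which the researchers describe in their technical paper); it is a minimal, assumed setup in which a stand-in linear “feature extractor” plays the role of a model’s vision backbone, and a small, bounded perturbation is optimised so that the poisoned image’s features drift toward an attacker-chosen anchor (the “handbag”) while the pixels barely change:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a frozen feature extractor: a fixed linear map.
# (Real systems use deep networks; linear keeps the sketch self-contained.)
W = rng.standard_normal((16, 64)) * 0.1

def features(x):
    return W @ x

# Flattened 8x8 "cow" image and the attacker-chosen "handbag" anchor.
cow = rng.random(64)
handbag = rng.random(64)

eps = 0.05   # max per-pixel change: keeps the edit imperceptible to humans
lr = 0.2     # gradient step size
delta = np.zeros(64)

for _ in range(500):
    # Gradient of ||features(cow + delta) - features(handbag)||^2 w.r.t. delta,
    # followed by projection back into the [-eps, eps] box.
    resid = features(cow + delta) - features(handbag)
    grad = 2 * W.T @ resid
    delta = np.clip(delta - lr * grad, -eps, eps)

poisoned = np.clip(cow + delta, 0.0, 1.0)

# To a human, the poisoned image is nearly identical to the original...
pixel_change = np.max(np.abs(poisoned - cow))

# ...but in feature space it has moved toward the "handbag" anchor.
feat_gap_before = np.linalg.norm(features(cow) - features(handbag))
feat_gap_after = np.linalg.norm(features(poisoned) - features(handbag))
```

Bounding the per-pixel change (`eps`) is what keeps the image visually unchanged, while the optimisation in feature space is what misleads a model trained on it; the robustness to cropping and compression that the researchers describe comes from the fact that the perturbation is spread across the image content itself rather than hidden in fragile metadata or a watermark.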
However, the tool does come with certain limitations. Changes made by Nightshade are more visible on artworks with flat colours and smooth backgrounds, and the tool is unlikely to remain future-proof over long periods, as AI companies will find ways around it, the developers cautioned.
“Since their arrival, generative AI models and their trainers have demonstrated their ability to download any online content for model training. For content owners and creators, few tools can prevent their content from being fed into a generative AI model against their will. Opt-out lists have been disregarded by model trainers in the past, and can be easily ignored with zero consequences. They are unverifiable and unenforceable, and those who violate opt-out lists and do-not-scrape directives can not be identified with high confidence,” the researchers said.
“Nightshade’s goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative,” the researchers said.
The tool is free for artists and is designed to run locally on the user’s computer, so there is no data sent back to the developers. Those interested in using the tool can find more details here. The researchers have also published a technical paper for those interested.
The team behind Nightshade is the same team behind another popular tool called Glaze, a defensive tool that allows artists to protect themselves against style mimicry by AI models. Nightshade, on the other hand, is an offensive tool that curbs scraping without consent. “Glaze should be used on every piece of artwork artists post online to protect themselves, while Nightshade is an entirely optional feature that can be used to deter unscrupulous model trainers. Artists who post their own art online should ideally have both Glaze AND Nightshade applied to their artwork,” the researchers advised.
Not everyone is happy about the tool, though. Some web users have complained that it is “tantamount to a cyberattack on AI models and companies,” VentureBeat reported.