Nightshade: The Artistic Shield Against AI Scraping

Written by: Molly-Anna MaQuirl | Posted: 19-02-2024

[Image: AI-generated with Midjourney by Molly-Anna MaQuirl]

Nightshade, a newly released tool designed to disrupt AI models that scrape artwork, hit 250,000 downloads within five days of its release in January 2024.

The tool was created by computer science researchers at the University of Chicago. It disrupts the models behind AI image generators by altering the artwork those models scrape for training.

“Nightshade hit 250K downloads in 5 days since release,” said the head of the project, Ben Zhao, a professor of computer science. “I expected it to be [met with] extremely high enthusiasm. But I still underestimated it… The response is simply beyond anything we imagined.”

The appetite for a tool like this comes from artists pushing back against advances in training AI models on existing images. There are over two million artists in the United States, and many of them don’t want their work used to train such models.

“We have not done geolocation lookups for these downloads,” Zhao wrote. “Based on reactions on social media, the downloads come from all over the globe.”

How does AI data poisoning work?

The software works by trying to “poison” AI image generator models: it alters artworks with pixel-level shading so that the models interpret an image as something entirely different from its actual content. The goal is to make the models produce errors and stop responding properly to prompts.
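
To make the data-poisoning idea concrete, here is a minimal, self-contained Python sketch of the general adversarial-perturbation technique. It is not Nightshade’s actual algorithm: the random linear “feature extractor”, the epsilon budget, and the cat/dog demo are all hypothetical stand-ins for the internals of a real image model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a vision model's feature extractor: a fixed random
# linear projection from 32x32 RGB pixels to a 16-dim "embedding".
W = rng.normal(size=(32 * 32 * 3, 16))

def extract_features(img):
    return img.reshape(-1) @ W

def poison(img, target_feat, epsilon=8 / 255, steps=50, lr=1 / 255):
    """Nudge pixels within +/- epsilon so that the extracted features
    drift toward those of a different ("target") concept."""
    delta = np.zeros_like(img)
    for _ in range(steps):
        diff = extract_features(img + delta) - target_feat
        # Gradient of the squared feature distance w.r.t. the pixels.
        grad = (W @ (2 * diff)).reshape(img.shape)
        delta -= lr * np.sign(grad)                   # step toward the target
        delta = np.clip(delta, -epsilon, epsilon)     # imperceptibility budget
        delta = np.clip(img + delta, 0.0, 1.0) - img  # keep pixels in [0, 1]
    return img + delta

# Demo: push a random "cat" image's features toward a "dog" target.
cat = rng.random((32, 32, 3))
dog_feat = extract_features(rng.random((32, 32, 3)))
poisoned = poison(cat, dog_feat)
print("max pixel change:", np.abs(poisoned - cat).max())
print("feature distance before:", np.linalg.norm(extract_features(cat) - dog_feat))
print("feature distance after: ", np.linalg.norm(extract_features(poisoned) - dog_feat))
```

The loop captures the essence of the attack: small, bounded pixel changes that move an image’s learned representation toward a different concept, so a model trained on many such images begins to confuse the two.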

On the Nightshade project page, Zhao, along with colleagues Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng, explained that they want to increase the cost of training AI on unlicensed data so that paying to license images becomes a viable alternative.

Last year, the team released a tool called Glaze, which works along similar lines: it alters pixels so that algorithms see an image as something other than what it is, preventing them from learning an artist’s “style”. Glaze has already received over two million downloads.
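
Conceptually, a defensive “cloak” in the spirit of Glaze flips the objective of the sketch above: instead of steering an image’s features toward a different target, it pushes them away from the features that characterise the artist’s own style. A hypothetical variant, reusing np, W, and extract_features from the previous sketch (again, not Glaze’s actual method):

```python
def cloak(img, own_style_feat, epsilon=8 / 255, steps=50, lr=1 / 255):
    """Push extracted features AWAY from the artist's own style
    (gradient ascent on the feature distance), within +/- epsilon."""
    delta = np.zeros_like(img)
    for _ in range(steps):
        diff = extract_features(img + delta) - own_style_feat
        grad = (W @ (2 * diff)).reshape(img.shape)
        delta += lr * np.sign(grad)                   # ascend: increase distance
        delta = np.clip(delta, -epsilon, epsilon)     # stay subtle to humans
        delta = np.clip(img + delta, 0.0, 1.0) - img  # keep pixels in [0, 1]
    return img + delta
```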

The Glaze Project is the moniker that Zhao and his fellow researchers have adopted, and they have previously stated their intention to release a tool combining both of their innovations: Glaze (defensive) and Nightshade (offensive).

“We simply have a lot of to-dos on our list right now,” Zhao said. “The combined version must be carefully tested to ensure we don’t have surprises later. So I imagine at least a month, maybe more, for us to get comprehensive tests done.”

Many artists feel they have no power over AI’s ability to recreate their work; Nightshade gives them an option for tackling the issue and safeguarding their work for the future.

With demand this high, similar products are likely to appear in AI news features in the near future.
