
The AI lab waging a guerrilla war over exploitative AI


Ben Zhao remembers well the moment he officially jumped into the conflict between human creativity and AI: when one artist asked for AI bananas.

A computer security researcher at the University of Chicago, Zhao had made a name for himself by building tools to shield images from facial recognition software. It was this work that caught the attention of an artist, who invited him to a Zoom call in 2022 hosted by the Concept Art Association, an advocacy organization for artists working in commercial media.

On the call, artists shared stories of how they had been hurt by the generative AI boom, which was then brand new. At that moment, AI was suddenly everywhere. The tech community was buzzing over image-generating AI models, such as Midjourney, Stable Diffusion, and OpenAI's DALL-E 2, which could follow simple prompts to depict fantasylands or whimsical chairs made of avocados.

But these artists saw this technological wonder as a new kind of theft. They felt the models were effectively stealing and replacing their work. Some had found that their art had been scraped off the internet and used to train the models, while others had discovered that their own names had become prompts, causing their work to be drowned out by AI knockoffs.

Zhao remembers being shocked by what he heard. “People are literally telling you they're losing their livelihoods,” he told me one afternoon this fall, sitting in his Chicago living room. “That's something that you just can't ignore.”

So on the Zoom, he made a proposal: What if, hypothetically, it was possible to build a mechanism that would help mask their art to interfere with AI training?

“I would love a tool that, if someone wrote my name and made a prompt, like, garbage came out,” responded Karla Ortiz, a prominent artist. “Just, like, bananas or some weird stuff.”

That was all the convincing Zhao needed: the moment he joined the cause.

Fast-forward to today, and millions of artists have deployed tools born from that Zoom call: Glaze and Nightshade, which were developed by Zhao and the University of Chicago's SAND Lab (an acronym for “security, algorithms, networking, and data”).

Arguably the most prominent weapons in an artist's arsenal against nonconsensual AI scraping, Glaze and Nightshade work in similar ways: by adding what the researchers call “barely perceptible” perturbations to an image's pixels so that machine-learning models cannot read them properly. Glaze, which has been downloaded more than 6 million times since it launched in March 2023, adds what's effectively a secret cloak to images that prevents AI algorithms from picking up on and copying an artist's style. Nightshade, which I wrote about when it was released almost exactly a year ago this fall, cranks up the offensive against AI by adding an invisible layer of poison to images, which can break AI models; it has been downloaded more than 1.6 million times.
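To get an intuition for what a “barely perceptible” perturbation is, here is a minimal sketch in Python. It is not the lab's actual method — Glaze and Nightshade compute their perturbations by optimizing against the features AI models extract — this toy version just adds random noise kept under a small per-pixel budget, to show why such changes are hard for a human eye to notice. The function name `perturb_image` and the `epsilon` budget are illustrative, not Glaze's API.

```python
import numpy as np

def perturb_image(image: np.ndarray, epsilon: float = 4.0, seed: int = 0) -> np.ndarray:
    """Toy illustration: nudge every pixel by at most `epsilon` intensity levels.

    Real tools like Glaze optimize the perturbation so that feature
    extractors misread the image; random noise here only demonstrates
    the "bounded, barely visible change" idea.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    perturbed = np.clip(image.astype(np.float64) + noise, 0, 255)
    return perturbed.astype(np.uint8)

# A flat gray "image" (values 0-255): after perturbation, no pixel has
# moved by more than epsilon levels out of 255 -- invisible to most viewers.
img = np.full((64, 64, 3), 128, dtype=np.uint8)
out = perturb_image(img, epsilon=4.0)
max_shift = int(np.abs(out.astype(int) - img.astype(int)).max())
```

On a 0-255 intensity scale, a shift of a few levels per pixel is below what most people can perceive, yet across millions of pixels it can be structured to systematically mislead a model — that asymmetry is what both tools exploit.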
