New tool lets artists fight AI image bots by hiding corrupt data in plain sight



Tuesday, 24 October 2023


From Hollywood strikes to digital portraits, AI's potential to steal creatives' work, and how to stop it, has dominated the tech conversation in 2023. The latest effort to protect artists and their creations is Nightshade, a tool that lets artists make imperceptible pixel-level changes to their work that can corrupt an AI model's training data, MIT Technology Review reports. Nightshade arrives as major companies like OpenAI and Meta face lawsuits alleging copyright infringement and the use of personal works without compensation.

University of Chicago professor Ben Zhao and his team created Nightshade, which is currently being peer reviewed, in an effort to put some of the power back in artists' hands. They tested it on recent Stable Diffusion models and an AI they personally built from scratch.

Nightshade essentially works as a poison, altering how a machine-learning model produces content and what that finished product looks like. For example, it could make an AI system interpret a prompt for a handbag as a toaster or show an image of a cat instead of the requested dog (the same goes for similar prompts like puppy or wolf).

A range of poisoned samples shows the original art and how an AI could interpret it.
(Image: Professor Ben Zhao | University of Chicago)

Nightshade follows Zhao and his team's August release of a tool called Glaze, which also subtly alters a work of art's pixels, but Glaze instead makes AI systems detect the initial image as something entirely different from what it is. An artist who wants to protect their work can upload it to Glaze and opt in to using Nightshade.

Damaging technology like Nightshade could go a long way toward encouraging AI's major players to license artists' work and compensate them properly (it seems like a better alternative to having your system rewired). Companies looking to remove the poison would likely need to locate every piece of corrupted data, a challenging task. Zhao cautions that some individuals might attempt to use the tool for malicious purposes, but that causing any real damage would require thousands of corrupted works.

This article originally appeared on Engadget at https://ift.tt/yCwlPp8

