{"id":9603,"date":"2023-10-28T16:37:32","date_gmt":"2023-10-28T11:07:32","guid":{"rendered":"https:\/\/farratanews.online\/poison-pill-tool-could-break-ai-systems-stealing-unauthorized-data-allowing-artists-to-safeguard-their-works\/"},"modified":"2023-10-28T16:37:32","modified_gmt":"2023-10-28T11:07:32","slug":"poison-pill-tool-could-break-ai-systems-stealing-unauthorized-data-allowing-artists-to-safeguard-their-works","status":"publish","type":"post","link":"https:\/\/farratanews.online\/poison-pill-tool-could-break-ai-systems-stealing-unauthorized-data-allowing-artists-to-safeguard-their-works\/","title":{"rendered":"Poison pill tool could break AI systems stealing unauthorized data, allowing artists to safeguard their works"},"content":{"rendered":"
A new image-protection tool is designed to poison AI programs trained on unauthorized data, giving creators a new way to safeguard their work and to damage systems they say are stealing it.
Nightshade, a new tool from a University of Chicago team, embeds data in an image's pixels that disrupts AI image generators scouring the web for pictures to train on. An AI program might interpret a Nightshade-protected image of a dog as a cat, for example, or a photo of a car as a cow, causing the model to malfunction, according to the team's research.
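To illustrate the general idea the researchers describe, the sketch below nudges an image's pixels so that a model associates it with a different label (a "dog" read as a "cat") while the change stays hard for a person to notice. It is a hypothetical example of this broad class of techniques, not Nightshade's actual algorithm; the classifier, the input file, and the step size are all stand-in assumptions.

```python
# Illustrative sketch only -- NOT Nightshade's algorithm. It shows the general idea:
# shift an image's pixels slightly so a model reads it as a different concept
# (e.g., "dog" -> "cat"). Model, file name, and step size are placeholder assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# A stand-in image classifier; Nightshade itself targets image *generators*.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])

image = preprocess(Image.open("dog.jpg").convert("RGB")).unsqueeze(0)  # hypothetical input
image.requires_grad_(True)

decoy = torch.tensor([281])  # ImageNet class 281 ("tabby cat") used as the decoy concept
loss = torch.nn.functional.cross_entropy(model(image), decoy)
loss.backward()

# Move each pixel a small step toward the decoy label (a targeted FGSM-style update).
epsilon = 0.02
poisoned = (image - epsilon * image.grad.sign()).clamp(0, 1).detach()

print("model now predicts class:", model(poisoned).argmax(dim=1).item())
```

In the data-poisoning setting the article describes, the effect would play out at training time rather than on a single prediction: a system that scrapes and trains on many such altered images would learn the wrong association between pictures and concepts.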