Data poisoning: how artists are sabotaging AI to take revenge on image generators
(theconversation.com)
So it sounds like they're altering the image data so that the picture looks the same to a human, but the underlying data is different. Couldn't the AI companies just take screenshots of the images to get around this?
They don't even need to do that: they can run the training dataset through a bulk image processor to undo it, because the way these perturbations work makes them trivial to reverse. Anybody at home could undo this with GIMP in a second or two.
In other words, this is snake oil.
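To make the "bulk image processor" point concrete, here is a minimal sketch of the kind of transformation being described: pixel-level perturbations tend to be fragile, so a downscale, a mild blur, and a JPEG round-trip can degrade them. This is a generic illustration using Pillow, not the actual pipeline any AI company uses; the function name and parameters are made up for the example.

```python
# Illustrative "scrubbing" pass: resize + blur + JPEG re-encode.
# Adversarial perturbations live in fine-grained pixel detail, so lossy
# re-processing like this tends to wash them out. Parameters are arbitrary.
from io import BytesIO
from PIL import Image, ImageFilter

def scrub(img: Image.Image, scale: float = 0.5, quality: int = 75) -> Image.Image:
    """Downscale an image, blur it slightly, and round-trip it through JPEG."""
    w, h = img.size
    small = img.resize(
        (max(1, int(w * scale)), max(1, int(h * scale))),
        Image.LANCZOS,
    )
    blurred = small.filter(ImageFilter.GaussianBlur(radius=1))
    buf = BytesIO()
    blurred.convert("RGB").save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf)
```

Running this over a whole dataset is just a loop over the image files, which is why the comment calls the poisoning trivial to reverse in bulk.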