These folks have made a tool that can apparently mess with the way image generators interpret specific pieces of art or images that are run through it.
The group behind it consists of well-established names, so it's a trustworthy program. I found a post listing them at one point, but I'm stuck on my phone for the time being and can't easily find it.
Here's the first artwork to be run through it:
There are some examples of visible artifacts on the protected images.
Replies
But how does it look when processed by AI? Because let's face it, the majority of people don't care if it looks slightly different, since slightly different is what they generate anyway.
There was an example posted somewhere within that Twitter thread- I can't find it, though.
They showed the version the generator recreated as something akin to a messed-up VHS tape: huge blocks of random colors and structural corruption.
Ah, here it is
Do we know if there's a paper to support this? It sounds far-fetched to me.
Even if it does work, I have serious doubts it'll continue to work for more than about a week after it gets in someone's way.
Here's the paper on it- still under peer review, though:
If it does get engineered around, I wonder if that might give artists some ammo in terms of legal options- maybe it would prove malicious intent on the image generator's end?
We've discussed the legal stuff ad nauseam on here. Like any copyright issue, it's fundamentally unenforceable if the person doing the bad stuff is somewhere that bad stuff isn't illegal.
As far as the tech goes...
I have trouble believing it could ever guarantee an imperceptible change, because the whole process relies on the source image being altered via style transfer. A rough sketch of the general idea is below.
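For anyone curious what "altered via style transfer" means in practice: the paper describes style-transferring a copy of the artwork and then optimizing a tiny perturbation so a feature extractor "sees" the cloaked original as the style target. This is not their code, just my minimal PyTorch sketch of that general idea; the VGG16 layer cut, pixel budget, and step count are all placeholder guesses on my part:

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights

# Frozen feature extractor standing in for whatever the generator uses.
features = vgg16(weights=VGG16_Weights.DEFAULT).features.eval()
for p in features.parameters():
    p.requires_grad_(False)

MEAN = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
STD = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

def embed(x):
    # Intermediate VGG activations as a stand-in "style" embedding.
    return features[:16]((x - MEAN) / STD)

def cloak(image, style_target, budget=8 / 255, steps=200, lr=1e-2):
    """Nudge `image` so its embedding matches the style-transferred
    target, while an L-infinity budget keeps the pixel change small."""
    target = embed(style_target).detach()
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.mse_loss(embed((image + delta).clamp(0, 1)), target)
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)  # cap how visible the cloak gets
    return (image + delta).clamp(0, 1).detach()

# Dummy tensors; in reality `style_target` would be a style-transferred
# copy of the artwork, not random noise.
art = torch.rand(1, 3, 224, 224)
cloaked = cloak(art, torch.rand(1, 3, 224, 224))
```

The artifacts people are noticing are basically that budget: the bigger you make it, the stronger the cloak, and the more visible the change.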
It certainly isn't imperceptible now
It's probably good enough for most use cases, though, e.g. a portfolio.
You get the feeling it's going to be like a doctor treating drug-resistant bacteria. Yes, there's a method to stop AI for now, but how long until the AI programs are able to overcome this band-aid? Then you're going to need a Glaze 2.0, then AI art finds a way past 2.0, and the process repeats.
Yeah, you can see the artifacts of the process- they don't really interfere with the art itself any more than a watermark does, though.
As it goes- thieves have always been in an arms race against defensive countermeasures.
As far as enforceability is concerned, shouldn't it be enough to outlaw the use of scraped data in AI tools, as well as putting a stop to the scraping itself? Presumably these companies will still want to do business on the open market and not hide out in North Korea or wherever.
Then wouldn't the issue simply go away? Private users might swap the cherished database containing most of ArtStation up to 2022 and keep old builds alive, but any new development would have to happen on a clean slate, and businesses would want to stay well clear of any possibly dodgy AI output.
Yes, for sure.
My point is that if someone decides it's worth their while to rip stuff off, they'll still do it.
There are lots of knockoff Spider-Man pyjamas etc. that people make a decent living off.
That's always been the case with all manner of thievery.
We still always lock our doors, as it were.
Here's how I would do it:
Unless it is part of the public domain, the data needs to be opt-in and completely transparent, and royalties need to be paid to the artist/writer/programmer each time it is used.
Because a large amount of data was used without consent, all AI models would have to be retrained from scratch, or they can't be used in commercial products; they were trained on data that was acquired unethically. That means no more scraping ArtStation and Sketchfab.
AI is nothing without our data. AI is here to stay, but so far Adobe are the only ones doing this ethically.
Would I be able to use it on Sketchfab? I'm not sure how it works: is it only for rendered images, or could I, let's say, use it on my textures?
Or maybe we could get a similar feature on Sketchfab as noise. It already has grain in post-processing; if a Glaze-like feature were added that could not be turned off, it might help the cause.
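To be fair, plain grain is trivially stripped by a denoiser, so it wouldn't protect anything the way Glaze is supposed to. But baking noise into the exported image itself, rather than applying it as a toggleable post-processing pass, is only a few lines. A rough sketch; the file names and strength value are made up:

```python
import numpy as np
from PIL import Image

def bake_grain(path_in, path_out, strength=8.0, seed=0):
    """Add luminance grain directly to the image file, so no viewer
    setting can turn it off afterwards."""
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
    # One noise value per pixel, broadcast across the RGB channels.
    grain = rng.normal(0.0, strength, size=img.shape[:2] + (1,))
    out = np.clip(img + grain, 0, 255).astype(np.uint8)
    Image.fromarray(out).save(path_out)

bake_grain("render.png", "render_grained.png")
```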
I have had my art added to one of those "AI dataset" collections before the NoAI tag was added.
That's a good question- does anybody know how 3D artwork is fed to generators?
You could always ask the folks on Twitter, or send an email through their website.