Hoping tumblr will roll back the idiocy, but in case it doesn't:
Glaze applies an invisible-to-humans filter that interrupts style mimicry. Link
Nightshade, by the same team, poisons training datasets. Link
Lastly, remember how it went when ArtStation tried to pull the same thing?
[ID: 3 images of various thumbnails from the ArtStation website. A large number of them are the same image of "AI" in a barred circle, over the text "No to AI generated images". On close inspection, several of the thumbnails are AI-generated images that scraped this anti-AI image and incorporated it into their output. End ID.]