this post was submitted on 19 Sep 2023
71 points (88.2% liked)

There's another round of CSAM attacks and it's really disturbing to see those images. It really bothered me that they weren't taken down immediately. There was even a disgusting shithead in the comments who thought it was funny?? the fuck

It's gone now, but it was up for like an hour?? This really ruined my day and now I'm figuring out how to download Tetris. It's really sickening.

[–] ryannathans@lemmy.fmhy.net 4 points 1 year ago (1 children)

Bingo, that's the issue. With an endless supply of fresh content, hash checking is dead
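
A minimal sketch of why that is, assuming a simple exact-hash blocklist (the `KNOWN_BAD_HASHES` set and `is_known_bad` helper are hypothetical names; production systems like PhotoDNA use perceptual hashes rather than exact digests, but brand-new generated content defeats both the same way, since it can't appear in any list of previously seen material):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of previously identified images.
KNOWN_BAD_HASHES = {
    "3b4c0d2f9a...",  # placeholder digest, not a real entry
}

def is_known_bad(image_bytes: bytes) -> bool:
    """Return True only if this exact file has been seen and listed before."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A freshly generated image hashes to a digest that appears in no blocklist,
# so this check returns False no matter how large the list grows.
```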

[–] fubo@lemmy.world 4 points 1 year ago (1 children)

On the other hand, if the people who want those images can satisfy their urges with AI fakes, that could mean less spread of images of actual abuse. It might even mean less abuse happening.

However, because they're terrible people, I have to suspect that's not the case.

[–] Facebones@reddthat.com 4 points 1 year ago

People who create the content are insane monsters, but a LOT of actual pedos (vs predators looking for a power play) are disgusted by their preference. I know a ton of them already look to cartoons for stimulation, so I think AI content could draw more people away from actual material. Hopefully, if demand drops, less new real content will be created as the potential profits fall relative to the risk.