70 points · submitted on 04 Feb 2024 by andrewrgross to c/meta

I don't know if you need this info, but I was pretty disturbed to come across unexpected child pornography in a casual community. Thankfully it wasn't hosted on SLRPNK.net directly, but if anyone has advice beyond leaving the community in question, let me know. I also wanted to sound the alarm to make sure we have measures in place to guard against this.

[-] andrewrgross 2 points 3 months ago

That's pretty shocking.

What tools are available to us to manage this?

[-] poVoq 13 points 3 months ago

The best tool currently available is lemmy-safety, an AI image-scanning tool that can be configured to check images on upload or to regularly scan the storage and remove likely CSAM images.

It's a bit tricky to set up, as it requires a GPU in the server and works best with object storage, but I plan to finish setting it up for SLRPNK sometime this year.
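
For anyone curious what a scheduled storage scan could look like in practice, here is a rough Python sketch. This is not lemmy-safety's actual code; the bucket name, threshold, and the classifier stub are all assumptions, and it only illustrates the "regularly scan object storage and remove likely hits" idea.

```python
# Hypothetical sketch of a scheduled object-storage scan (NOT lemmy-safety itself):
# list images in an S3-compatible bucket, run each through a placeholder
# GPU-backed classifier, and delete anything scoring above a cutoff.
import boto3

BUCKET = "pictrs-media"   # assumed bucket name
THRESHOLD = 0.9           # assumed confidence cutoff

s3 = boto3.client("s3")

def classify_image(data: bytes) -> float:
    """Placeholder for a GPU-backed CSAM classifier; returns a probability."""
    raise NotImplementedError

def scan_bucket() -> None:
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
            score = classify_image(body)
            if score >= THRESHOLD:
                # Remove the object and log it for moderator review.
                s3.delete_object(Bucket=BUCKET, Key=key)
                print(f"removed {key} (score {score:.2f})")

if __name__ == "__main__":
    scan_bucket()
```

The same classifier could instead be hooked into the upload path, so images are checked before they ever land in storage.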

[-] silence7 4 points 3 months ago

This is probably the best option; in a world where people use ML tools to generate CSAM, you can't depend on visual hashes of known-problematic images anymore.
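
To illustrate why: hash-list matching (the PhotoDNA-style approach) only flags images that already appear in a known-bad list. A minimal sketch, with a made-up hash list and distance threshold, using the Pillow and imagehash libraries:

```python
# Hypothetical illustration of perceptual-hash matching against a known-bad list.
from PIL import Image
import imagehash

# Assumed: hashes of previously identified images, loaded from some database.
KNOWN_BAD_HASHES = {imagehash.hex_to_hash("fd9a3c5e0f1b2a47")}
MAX_DISTANCE = 5  # assumed Hamming-distance tolerance for near-duplicates

def is_known_bad(path: str) -> bool:
    h = imagehash.phash(Image.open(path))
    return any(h - known <= MAX_DISTANCE for known in KNOWN_BAD_HASHES)

# A freshly ML-generated image has no entry in the list, so this check
# returns False -- which is why classifier-based scanning is needed instead.
```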
