Not a good look for Mastodon - what can be done to automate the removal of CSAM?

[–] gabe@literature.cafe 0 points 1 year ago (1 children)

Does the CSAM scanner hook into Lemmy properly, though?

It looks like it scans and flags on the outbound path (when a user downloads the image), so as long as it sits in front of your instance, it should work fine.
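
To make the outbound pattern concrete, here's a minimal sketch written as Python WSGI middleware. Everything in it is a stand-in for illustration, not Cloudflare's actual tool: the `BLOCKED_HASHES` set, the `on_flag` callback, and the exact SHA-256 matching are assumptions, and a real scanner matches against vetted perceptual-hash lists (e.g. PhotoDNA hashes) at the edge rather than exact digests.

```python
# Hypothetical outbound-scanning layer, sketched as WSGI middleware.
# NOT Cloudflare's actual tool: real scanners match vetted perceptual
# hashes (e.g. PhotoDNA), not the plain SHA-256 used here.
import hashlib

BLOCKED_HASHES: set[str] = set()  # populated from a vetted hash list


class ImageScanMiddleware:
    """Buffers image responses, hash-matches them, and blocks hits."""

    def __init__(self, app, on_flag=None):
        self.app = app
        self.on_flag = on_flag  # e.g. notify admins / file a report

    def __call__(self, environ, start_response):
        captured = {}

        def capture(status, headers, exc_info=None):
            # Defer the real start_response until the body is inspected.
            captured["status"], captured["headers"] = status, headers

        body = b"".join(self.app(environ, capture))
        ctype = next((v for k, v in captured["headers"]
                      if k.lower() == "content-type"), "")

        if ctype.startswith("image/"):
            digest = hashlib.sha256(body).hexdigest()
            if digest in BLOCKED_HASHES:
                if self.on_flag:
                    self.on_flag(environ.get("PATH_INFO", ""), digest)
                start_response("451 Unavailable For Legal Reasons",
                               [("Content-Type", "text/plain")])
                return [b"Blocked"]

        start_response(captured["status"], captured["headers"])
        return [body]
```

The point of the sketch is just the placement: because it wraps the response path, nothing about Lemmy itself has to change, which is why a scanner that sits in front of the instance "just works".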

You're still responsible for removing the material, complying with any preservation requirements and other legal obligations, and notifying Cloudflare once it's been removed.

It would be ideal if it could block on upload so the material never reaches your instance, but that would require a different tool, such as an integration with PhotoDNA. A rough sketch of what that hook could look like is below.
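
Here's a minimal sketch of an upload-time check, under heavy assumptions: the endpoint URL, the `match` response field, and the bearer-token auth are all invented for illustration. Real PhotoDNA access is vetted by Microsoft and its actual request format differs; the only idea being shown is checking the image before it's written to storage.

```python
# Hypothetical upload-time check (placeholder API, not real PhotoDNA).
# Goal: hash-match the image *before* accepting the upload, so flagged
# material never lands on the instance at all.
import requests

MATCH_SERVICE_URL = "https://example.invalid/v1/match"  # placeholder
API_KEY = "..."  # placeholder credential


def accept_upload(image_bytes: bytes) -> bool:
    """Return True only if the matching service clears the image."""
    resp = requests.post(
        MATCH_SERVICE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"image": ("upload", image_bytes)},
        timeout=10,
    )
    resp.raise_for_status()
    # Fail closed: anything flagged, or any unexpected response shape,
    # is rejected rather than accepted.
    return resp.json().get("match") is False


# In the upload handler:
# if not accept_upload(data):
#     reject_and_report(data)
```

Failing closed (rejecting on errors or missing fields) is the safer default for a check like this, even though it means occasional false rejections when the matching service is down.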