this post was submitted on 06 Nov 2023
15 points (66.0% liked)
Lemmy.world Support
Hey folks,
So what happened? We manage our defederation process with Fediseer, a system that lets us, as instance admins, follow the defederation decisions of other selected instances. We generally follow the censures of sh.itjust.works, lemmy.dbzer0.com, literature.cafe, lemmings.world and a few other instances that have proven trustworthy in the past. However, in most cases I also review them manually instead of blindly following them. That's why I added a censure for ani.social with the tag "Loli", as was being reported - in contrast to the "CSAM" that was reported by Lemmy.ml.
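For anyone curious what "following censures on Fediseer" looks like in practice, here is a minimal sketch of querying the Fediseer API for the censures given by a set of trusted instances. The endpoint path, the comma-separated domain list, and the response fields are assumptions based on Fediseer's public API and may differ, so check the docs before relying on this; the instance list is just the one mentioned above.

```python
# Sketch: list the instances censured by a set of trusted peers via Fediseer.
# Endpoint paths and response fields are assumptions - verify against the
# Fediseer API documentation before use.
import requests

FEDISEER = "https://fediseer.com/api/v1"

# Instances whose censures we follow (from the comment above).
TRUSTED = ["sh.itjust.works", "lemmy.dbzer0.com", "literature.cafe", "lemmings.world"]

def censures_given_by(trusted_domains):
    """Return the instances censured by the given trusted domains."""
    resp = requests.get(
        f"{FEDISEER}/censures_given/{','.join(trusted_domains)}",
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape:
    # {"instances": [{"domain": "...", "censure_reasons": ["..."]}, ...]}
    return resp.json().get("instances", [])

if __name__ == "__main__":
    for inst in censures_given_by(TRUSTED):
        reasons = ", ".join(inst.get("censure_reasons") or [])
        print(f"{inst.get('domain')}: {reasons or 'no reason given'}")
```

In this setup, the manual review step from the comment above would happen on that output before any block is actually applied.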
When I was informed that this might have been a mistake, I dug a bit deeper and saw there was indeed no proof of any CSAM or even loli content being posted, so I deleted the censure and we federated again. After this I made a post on the ani.social instance about what happened.
We always record censures in Fediseer to be as transparent as possible, but we usually don't announce blocks of CSAM, loli or spam instances. Why? Because for loli and CSAM we don't want to give those instances extra exposure by announcing them, and for spam there are simply too many and it would be hard to keep up.
Again, yes, we were quick on the buttons, but CSAM/loli blocking is handled in a "better safe than sorry" way. We do this to keep our users and team safe, and everything was corrected within an acceptable time imo.
Fair enough. Thank you for the transparency and for resolving it quickly.