this post was submitted on 04 Feb 2024
261 points (96.4% liked)

Lemmy.World Announcements

Dear Lemmy.world Community,

Recently, posts were made to the AskLemmy community that go against not just our own policies but the basic ethics and morals of humanity as a whole. We acknowledge the gravity of the situation and the impact it may have had on our users. We want to assure you that we take this matter seriously and are committed to making significant improvements to prevent such incidents in the future. I'm reluctant to say exactly what these horrific and repugnant images were, but you can probably guess what we've had to deal with and what some of our users unfortunately had to see. I'll name it in a spoiler at the end of the post, to spare the hearts and minds of those who don't know.

Our foremost priority is the safety and well-being of our community members. We understand the need for a swift and effective response to inappropriate content, and we recognize that our current systems, protocols and policies were not adequate. We are immediately taking steps to strengthen our moderation and administrative teams, implementing additional tools, and building enhanced pathways to ensure a more robust and proactive approach to content moderation. We are also making sure that these reports reach the moderation and admin teams more quickly.

The first step will be limiting the image hosting sites that Lemmy.world will allow. We understand that this may frustrate some of our users, but we hope you can understand the gravity of the situation and why we find it necessary, both to protect our users from seeing this material and to protect ourselves as a site. That said, we would like your input on which image sites we should whitelist. While we run a filter over all images uploaded to Lemmy.world itself, that filter doesn't apply to images hosted on other sites, which is why a whitelist is necessary.
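
Purely as an illustration of the concept, a whitelist check might look something like this minimal Python sketch (the hosts and names below are hypothetical placeholders, not the actual list or implementation):

```python
from urllib.parse import urlparse

# Hypothetical whitelist; the real list is still open for community input.
ALLOWED_IMAGE_HOSTS = {
    "lemmy.world",   # local uploads, which already pass through the filter
    "imgur.com",
}

def is_allowed_image_url(url: str) -> bool:
    """Return True if the URL points at a whitelisted image host."""
    host = (urlparse(url).hostname or "").lower()
    # Accept the listed host itself or any of its subdomains.
    return any(host == allowed or host.endswith("." + allowed)
               for allowed in ALLOWED_IMAGE_HOSTS)

# Example: is_allowed_image_url("https://i.imgur.com/example.png") -> True
```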

This is a community made by all of us, not just the admins, which leads to the second step: we will be looking for more moderators and community members in more diverse time zones. We recognize that coverage is currently concentrated in Europe and North America, and we want to strengthen other time zones to keep any future delays as short as humanly possible.

We understand that trust is essential, especially when dealing with something as awful as this, and we appreciate your patience as we work diligently to rectify this situation. Our goal is to create an environment where all users feel respected and, above all, safe. Your feedback is crucial to us, and we encourage you to continue sharing your thoughts and concerns.

Every moment is an opportunity to learn and build, even the darkest ones.

Thank you for your understanding.


Sincerely,

The Lemmy.world Administration


spoilerCSAM


[–] lurch@sh.itjust.works 31 points 9 months ago (3 children)

Well thanks for the spoiler thing, but I don't even know what the acronym (is it even an acronym?) means anyway, and now I'm too afraid to do a web search for it 😅

Well, maybe it's better that way.

[–] fubo@lemmy.world 32 points 9 months ago* (last edited 9 months ago) (1 children)

It's safe to look things up!

Looking up the name of a crime does not mean that you're doing that crime.

If you look up "bank robbery" that doesn't make you guilty of bank robbery. It doesn't even mean you're trying to rob a bank, or even want to rob a bank. You could want to know how bank robbers work. You could be interested in being a bank guard or security engineer. You could be thinking of writing a heist story. You could want to know how safe your money is in a bank: do they get robbed all the time, or not?

Please, folks, don't be afraid to look up words. That's how you learn stuff.

[–] PervServer@lemmynsfw.com 21 points 9 months ago* (last edited 9 months ago) (3 children)

::: spoiler CSAM is child sexual abuse material I believe. So yeah, better not to look it up :::

[–] humorlessrepost@lemmy.world 24 points 9 months ago (2 children)

I'm pretty convinced the initialism was created so that people could Google it in an academic context without The Watchers thinking they were looking for the actual content.

[–] tpihkal@lemmy.world 6 points 9 months ago (2 children)

You may be correct, although it seems like pretty dumb reasoning. I doubt any of those cretins would search the words "child sexual abuse material." That would require acknowledging the abuse part of it.

[–] forrgott@lemm.ee 16 points 9 months ago (1 children)

I think you may have misunderstood. The entire point is to have an academic term that would never be used as a search by one of those inhuman lowlifes.

I don't mean to be pedantic, so I hope my meaning came across well enough...

[–] tpihkal@lemmy.world -4 points 9 months ago

I think my point is that the acronym exists because of the search term, not the other way around. And it's pretty laughable that the academic term has to be distilled down to an acronym because it is otherwise considered a trigger word.

[–] TWeaK@lemm.ee 5 points 9 months ago

They already use codewords when chatting about it. I forget what the more advanced ones were beyond "CP", but there's a Darknet Diaries episode or two that go over it. In particular, the one about Kik, where the host interviews a guy who used to trade it on that platform.

CSAM is meant to differentiate actual children being abused from child porn more broadly, which also includes things like cartoons of children. Law enforcement has limited resources, so they want to focus where they can do the most good.

[–] TWeaK@lemm.ee 4 points 9 months ago

The initialism was created to focus the efforts of law enforcement. They have limited resources, so they want to address actual children being abused, rather than Japanese cartoons. Both are child porn, but CSAM involves real children.

[–] TWeaK@lemm.ee 5 points 9 months ago (2 children)

Your spoiler didn't work; apparently you need to write spoiler twice.

[–] Dave@lemmy.nz 7 points 9 months ago (1 children)

The second "spoiler" is actually text for the spoiler.

This is customisable text

spoiler This is customisable text

:::

[–] TWeaK@lemm.ee 4 points 9 months ago (1 children)

Ty, but it still seems like you need something there - at least for some apps.

[–] Dave@lemmy.nz 4 points 9 months ago* (last edited 9 months ago)

Yes, I don't think it works with nothing there. It needs something, but it can be pretty much any text (so long as the first word is spoiler).
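
So, sketching the full syntax as I understand it (the title after the word spoiler is arbitrary):

::: spoiler any title text you like
The hidden content goes here.
:::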

[–] PervServer@lemmynsfw.com 4 points 9 months ago

Works in my app :P I really wish stuff like this were more standardized across the platform. Not really much point in spoilering it now since everyone is chatting about it.

[–] lurch@sh.itjust.works 2 points 9 months ago

thanks for the info!

[–] bighatchester@lemmy.world 5 points 9 months ago

I'll just say illegal content involving minors.