this post was submitted on 30 Nov 2024
350 points (97.8% liked)

Technology


Danish researchers created a private self-harm network on the social media platform, including fake profiles of people as young as 13 years old, in which they shared 85 pieces of self-harm-related content gradually increasing in severity, including blood, razor blades and encouragement of self-harm.

The aim of the study was to test Meta’s claim that it had significantly improved its processes for removing harmful content, which it says now uses artificial intelligence (AI). The tech company claims to remove about 99% of harmful content before it is reported.

But Digitalt Ansvar (Digital Accountability), an organisation that promotes responsible digital development, found that in the month-long experiment not a single image was removed.

Rather than attempting to shut down the self-harm network, Instagram's algorithm was actively helping it to expand. The research suggested that 13-year-olds became friends with all members of the self-harm group after they were connected with one of its members.

Comments

[–] half_fiction@lemmy.dbzer0.com 21 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

This is a complicated topic for me. I'm 35 so my experience is obviously different than today, but I self-harmed from age 12 into my 20s. Finding community and understanding in self-harm & mental illness-focused communities was transformative for me, especially in my younger teens. Many days/months/years this community felt like the only reason I was still hanging on.

Obviously I am not in favor of the "encouragement" of self-harm, but I also wonder how much nuance is applied when categorizing content as such. For example, is someone who posts about how badly they want to self-harm "encouraging" this? Or are they just seeking support? Idk. I have no answers.

I just think about how much bleaker my teens would have felt had I not found my pockets of community on the early internet. On the other hand, sometimes I do wonder if we subconsciously egged each other on. Perhaps the trajectory of my mental health journey would have been different had I not found them. That's not something I can ever be sure about, but I think given my home life and all the things I was going through already, if anything, my mental illness might have just manifested itself in a different way, like through substance abuse issues or an eating disorder or something. (And to be clear, I was hurting myself before I found the community, so it might have just been business as usual.) Like I said, I don't have any answers, it just feels more nuanced to me, as someone who has lived some version of this.

[–] Scolding7300@lemmy.world 7 points 3 weeks ago

Publication: https://drive.usercontent.google.com/download?id=1MZrFRii_nJYdW8RulORB9JveLkCRbncX&export=download&authuser=0

Couldn't get a translation in place, so I asked an AI for the researchers' definition of self-harm: According to the report, the researchers define self-harm content as material that shows, encourages and/or romanticizes self-harm. This includes content that:

  • Expresses a desire for self-harm
  • Shares advice on self-harming behavior
  • Shows images of increasingly serious self-harm
  • Encourages others to engage in similar self-harming behavior

The self-harm content was categorized into 4 levels of increasing severity:

  1. A non-explicit image, with text explicitly mentioning self-harm
  2. An image depicting self-harm, without blood
  3. Text and image both referring to self-harm, without blood
  4. Text and image/video illustrating severe self-harm, involving blood

So their definition covers a spectrum from text references to self-harm all the way to explicit visual depictions of serious self-harm acts involving blood. The categories represent an increasing degree of overtness in the self-harm content.