This post was submitted on 01 Feb 2024
346 points (96.0% liked)

Mark Zuckerberg says sorry to families of children who committed suicide — after rejecting suggestion to set up a compensation fund to help the families get counseling

CEOs of Meta, TikTok, Snap, Discord, and X testified at hearing on child safety.

all 33 comments
[–] angelsomething@lemmy.one 48 points 9 months ago* (last edited 9 months ago) (1 children)

So if they look like lizard people, and speak like lizard people, and when they blink their eyelids move horizontally, doesn’t that make them lizard people? Bunch of cunts, the lot of them. Especially Zuck. Poison of this world and they know it. And by the way, by lizard people I mean literal people that are so distanced from reality that they may well be from another planet.

[–] SuckMyWang@lemmy.world 34 points 9 months ago (1 children)

In Mark's defence, he is a piece of shit. (That was the best defence I could come up with.)

[–] VampyreOfNazareth@lemm.ee 2 points 9 months ago

Haha to the point, nice.

[–] stoly@lemmy.world 25 points 9 months ago (2 children)

Why on earth would someone think that this douchenozzle is capable of empathy for humans? He literally stole facebook because he felt entitled to it and had no problems letting governments use it to coordinate genocides THAT HE WAS AWARE OF. No, if there is a hell, this person will be at the top levels of tortured souls and he fully deserves to suffer.

[–] WindowsEnjoyer@sh.itjust.works 4 points 9 months ago (2 children)

> He literally stole facebook because he felt entitled to it and had no problems letting governments use it to coordinate genocides

Sorry, but what? Can you elaborate more on this?

[–] ours@lemmy.world 10 points 9 months ago (1 children)
[–] Scotty_Trees@lemmy.world 3 points 9 months ago (1 children)

Don't forget about Facebook's responsibility in the Burma/Myanmar genocides too.

"How Facebook Became a Tool for Genocide" https://www.youtube.com/watch?v=K8B0bWO9u3M

[–] ours@lemmy.world 2 points 9 months ago

When a company is accused of aiding genocide and the answer is "which one?", yeah, that's bad.

[–] stoly@lemmy.world 1 points 9 months ago

Someone else posted a link to one instance, but I believe there have been 3 or 4 of them.

[–] zmrl@lemmy.zip 3 points 9 months ago

I'm not convinced he even has a soul

[–] demonsword@lemmy.world 15 points 9 months ago (2 children)

So what he said basically boils down to "sorry, not sorry"

[–] simonced@lemmy.one 2 points 9 months ago (1 children)

Sounds more like "not really sorry, and don't even care..." to me...

[–] xavier666@lemm.ee 1 points 9 months ago
[–] DannyMac@lemmy.world 5 points 9 months ago
[–] ItsAFake@lemmus.org 4 points 9 months ago
[–] Thcdenton@lemmy.world 4 points 9 months ago (5 children)

Not a fan of the reptilian, but this isn't fb's fault. This is on the abusers, the kids that killed themselves and the careless parents.

[–] 31337@sh.itjust.works 53 points 9 months ago

Meta could've done a lot of things to prevent this. Internal documents show Zuckerberg repeatedly rejected suggestions to improve child safety. Meta lobbies Congress to prevent any regulation. Meta controls the algorithms and knows they promote bad behavior such as dog-piling, but this bad behavior increases "engagement" and revenue, so they refuse to change it. (Meta briefly changed its algorithms for a few months during the 2020 election to decrease the promotion of disinformation and hate speech, because they were under more scrutiny, but then changed them back after the election.)

[–] Anyolduser@lemmynsfw.com 14 points 9 months ago (6 children)

Canada was not available to be blamed.

It's down to parenting, or lack thereof. No politician can say "parents of America, quit giving your children unrestricted internet access and being surprised when they see horrible shit" and keep their job.

Kids don't need smartphones.

Sites can be blacklisted on home and school routers.
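
For what it's worth, a minimal sketch of what that blacklisting can look like, assuming the router runs dnsmasq (as many home routers and Pi-hole setups do); the blocked domains here are just illustrative examples:

```
# /etc/dnsmasq.d/blocklist.conf
# Resolve these domains (and all their subdomains) to 0.0.0.0,
# blocking them for every device that uses this router for DNS
address=/tiktok.com/0.0.0.0
address=/instagram.com/0.0.0.0
address=/snapchat.com/0.0.0.0
```

(The usual caveat: DNS-level blocking only catches devices using the router's DNS; a phone on mobile data, or one with hardcoded DNS, walks right past it.)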

Strict parents can be blamed by kids if they catch flak from their peers for not being on social media.

It ain't rocket surgery, but you need to be willing to spend time with your kids instead of slapping a phone in front of them to keep them quiet.

I've got a kid that's magnetically attracted to any screen. I get the temptation but I don't need a study to tell me that unrestricted internet access is fucking horrible for kids.

[–] Doorbook@lemmy.world 11 points 9 months ago (1 children)

This ignores situations where the kids didn't have social media and abusers posted the content there anyway, like sexual assaults and exploitation of children.

Failing to moderate a platform, especially one with private spaces, is something the platform should be held responsible for.

Imagine you have a stadium full of fans waiting for the match to start, then someone comes in with a big screen playing a sexual abuse video and then leaves the stadium. It is normal to sue the stadium for lack of security along with suing the abuser.

Issues like bullying are harder, but when the social network doesn't remove abusive content, it is at fault.

Facebook removes moderation staff and systematically ignores reports of this kind because acting on them would affect its value.

Final note: the US government is useless; they do this for show, to look good in front of their voters. The EU has done more against these corporations.

[–] Anyolduser@lemmynsfw.com 2 points 9 months ago* (last edited 9 months ago)

I'm ignoring that situation because we've had laws on the books regarding CSAM and ferocious enforcement of them for decades.

[–] Meowoem@sh.itjust.works 7 points 9 months ago

There's a common thing parents do, though, where they don't notice the point at which they lose control entirely.

It's almost impossible to keep kids off the internet. They can't stop prisoners getting phones into prisons, so what hope do parents have? How do you stop them using an account accessed from school computers, or a secret second phone bought second-hand, or, even worse, bought for them by a creepy guy online? And if you block the services you know of, it'll push them onto ones you've never heard of: unmoderated and dangerous places.

And of course there's the dream of trust, but none of us tell our parents everything, especially when we've already gone too far and are embarrassed we broke that trust.

If you as a kid are going to miss out on what feels like everything that's happening with your friends, then you'll find a way. Or you'll get bullied at school by groups that form online, with online memes.

There need to be safe places where kids can access social media. Just saying they can't until they're a certain age won't work, and even if it did, it would set them up for a lot of issues on their first day.

A lot of it is down to parents teaching internet skills and awareness; it's also down to the major platforms that target kids as a key audience to ensure there are effective systems in place to combat and avoid negative situations that might result in a child being harmed.

[–] Fluffy_Ruffs@lemmy.world 5 points 9 months ago

"we've tried nothing and we're all out of ideas!"

[–] Speculater@lemmy.world 3 points 9 months ago

I think a lot of parents don't want to talk about what their children will encounter: grooming, NSFW content, bullying, and misinformation.

Parents of the current generation usually had unrestricted Internet access, if we had it at all, because our parents were Internet-ignorant on average. We can share those lessons.

[–] Steve@communick.news 2 points 9 months ago* (last edited 9 months ago)

Now I'm wondering. Is this a potential opportunity for the Fediverse?

Creating a walled-in, heavily moderated social network for kids and teens.
Parents could be mods.
Would need some kind of age verification.
Maybe parents could set up accounts for themselves and their kids.

Just thinking this over as I type. I don't know.
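
Thinking out loud in code: a rough, purely hypothetical Python sketch of the parent-managed account idea above (every class and field here is invented for illustration; nothing like this exists in Lemmy or the wider Fediverse today):

```python
from dataclasses import dataclass, field

@dataclass
class ParentAccount:
    username: str
    age_verified: bool = False  # would be set by some external verification step

    def create_child_account(self, child_username: str) -> "ChildAccount":
        # Only a verified adult can open an account for a kid
        if not self.age_verified:
            raise PermissionError("parent must pass age verification first")
        return ChildAccount(username=child_username, guardian=self)

@dataclass
class ChildAccount:
    username: str
    guardian: ParentAccount
    pending_posts: list[str] = field(default_factory=list)

    def submit_post(self, text: str) -> None:
        # Nothing goes live directly; posts queue for review by
        # parent-moderators before reaching the instance
        self.pending_posts.append(text)
```

(Just to show the shape of it; the hard parts, like actual age verification and federation, are exactly the bits hand-waved away here.)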

[–] angrymouse@lemmy.world 1 points 9 months ago* (last edited 9 months ago)

I agree with almost everything but

> It ain't rocket surgery

Got me thinking.

But I also think social networks could ban a lot more.

[–] flango@lemmy.eco.br 10 points 9 months ago

Don't be so naïve, man. Facebook makes money promoting violence.

https://youtu.be/TkYhCp64cPY?si=sVq3-llkzvPDqQib

[–] Honytawk@lemmy.zip 2 points 9 months ago

Because adding a complete blocking feature is not up to Facebook?

[–] stoly@lemmy.world -1 points 9 months ago

So what you're saying is that victims of bullying are the real problem, not the people being bullies.

[–] Lutra@lemmy.world 3 points 9 months ago

headline: "We're still asking some people what they think should be done about the harm they caused."

Must be nice to get asked what you think you might want to do about it.

[–] autotldr@lemmings.world 2 points 9 months ago

This is the best summary I could come up with:


During a Senate Judiciary Committee hearing weighing child safety solutions on social media, Meta CEO Mark Zuckerberg stopped to apologize to families of children who committed suicide or experienced mental health issues after using Facebook and Instagram.

A senator asked Zuckerberg if he had ever apologized and suggested that the Meta CEO personally set up a compensation fund to help the families get counseling.

Zuckerberg did not agree to set up any compensation fund, but he turned to address families in the crowded audience, which committee chair Dick Durbin (D-Ill.) described as the "largest" he'd ever seen at a Senate hearing.

Among these bills is the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act (STOP CSAM).

When that bill was introduced, it originally promised to make platforms liable for "the intentional, knowing, or reckless hosting or storing of child pornography or making child pornography available to any person.” Since then, Durbin has amended the bill to omit the word "reckless" to prevent platforms from interpreting the law as banning end-to-end encryption, Recorded Future News reported.

Durbin noted that X became the first social media company to publicly endorse the STOP CSAM Act when X CEO Linda Yaccarino agreed to support the bill during today's hearing.


The original article contains 414 words, the summary contains 208 words. Saved 50%. I'm a bot and I'm open source!