This post was submitted on 25 Aug 2023
262 points (90.2% liked)


Social media companies are receding from their role as watchdogs against conspiracy theories ahead of the 2024 presidential election.

[–] Jakeroxs@sh.itjust.works 21 points 1 year ago (3 children)

Why do we want social media companies to be the arbiters of truth anyway...

[–] TwilightVulpine@lemmy.world 20 points 1 year ago (1 children)

Because like it or not, that's where a lot of people get information these days. If it keeps pushing bullshit, people believe bullshit. For example, anti-vaxxers weren't so common until their bullshit was spread all over social media.

I would love for people to be wise enough to verify information against reliable sources and not just believe everything they see, but sadly that's not the world we live in.

[–] Jakeroxs@sh.itjust.works 1 points 1 year ago (1 children)

Antivax sentiment has been around for hundreds of years, long before the Internet, spread mostly through political rhetoric and religion. I'm not saying social media hasn't increased the spread, but people believe wrong information all the time.

[–] TwilightVulpine@lemmy.world 1 points 1 year ago

There will always be a nutball, but my point is that, yes, it has increased significantly. Vaccines were a settled matter; people far and wide trusted them. Now vaccination rates have gone down and diseases we had nearly eliminated are making a comeback. That happened because any stupid grifter can now have a worldwide platform and a following that actively spreads their nonsense.

[–] andrewrgross 17 points 1 year ago* (last edited 1 year ago) (1 children)

I think we need to pursue a strategy that discourages the spread of disinformation while avoiding making the platforms themselves the arbiters of truth.

I think social media platforms are like a giant food court. If you do nothing to discourage the spread of germs, your salad bars and buffets are all going to be petri dishes of human pathogens. That doesn't mean the food court needs to put in hospital-level sterilization measures. It just means the FDA requires restaurants to use dishwashers that get up to 71 °C, and employees are required to wash their hands.

In this case, I think we should experiment. What if platforms were required to let users flag something as disinformation and, if they like, attach a credible source? Other users could see all the flags and upvote or downvote them. The information would still be there, but you'd go to the InfoWars page and it would say, "Hey: you should know that 95% of people say this page posts mostly bullshit."
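Just to make the idea concrete, here's a minimal sketch in Python of how such flag tallies might be stored and surfaced. Everything here (the Flag and Post classes, the 90% threshold) is hypothetical, not any real platform's API:

```python
# Hypothetical sketch of community disinformation flagging; not a real platform's API.
from dataclasses import dataclass, field


@dataclass
class Flag:
    """A user-submitted disinformation flag, optionally citing a source."""
    flagger: str
    source_url: str | None = None  # e.g. a link to a credible debunking
    upvotes: int = 0               # other users endorsing the flag
    downvotes: int = 0             # other users disputing the flag


@dataclass
class Post:
    """A post that stays visible; flags only add a warning banner."""
    content: str
    flags: list[Flag] = field(default_factory=list)

    def add_flag(self, flagger: str, source_url: str | None = None) -> Flag:
        flag = Flag(flagger, source_url)
        self.flags.append(flag)
        return flag

    def disinfo_score(self) -> float | None:
        """Fraction of flag votes that endorse the flags, or None if unvoted."""
        up = sum(f.upvotes for f in self.flags)
        down = sum(f.downvotes for f in self.flags)
        total = up + down
        return up / total if total else None


# The post is never deleted; a high score just surfaces a notice.
post = Post("Vaccines contain microchips")
flag = post.add_flag("user123", "https://example.org/debunk")
flag.upvotes, flag.downvotes = 95, 5

score = post.disinfo_score()
if score is not None and score > 0.9:  # threshold is arbitrary for the demo
    print(f"Notice: {score:.0%} of voters say this post is disinformation.")
```

The key design choice is that nothing gets removed: the flags and their vote tallies just travel with the post.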

Something like that. I don't like the role these companies currently play, but disinformation does carry the potential to cause serious harm.

[–] root@precious.net -4 points 1 year ago (2 children)

Remember when social media was deleting news stories about a certain laptop?

[–] andrewrgross 3 points 1 year ago

Yes?

I can't tell if you're agreeing with me or not.

[–] Jakeroxs@sh.itjust.works 1 points 1 year ago

I am also against deleting valid news about wrongdoing by Democrats, if you're implying this stance is political in some way.

[–] transistor@lemdro.id 5 points 1 year ago

They shouldn't be the arbiters of truth anyway.