this post was submitted on 10 Jul 2023
329 points (93.2% liked)

About 59% of Americans say TikTok is a threat to the national security of the United States, according to a new survey of U.S. adults.

[–] NightOwl@lemmy.one 6 points 1 year ago (2 children)

I've come to the conclusion that it is algorithms that have become evil. There was a thread where someone was asking for help stopping YouTube from radicalizing their mother due to the videos it would suggest to her.

I use stuff like newpipe and freetube to try to get away from this personalized content, since there is still good content on YouTube. It's just that so many sites try to keep you there as long as possible and then start feeding you content that can warp people. But algorithms don't understand the impact of it, since it's either a 0 or a 1: user stays or user leaves.
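
To make that "0 or 1" point concrete, here is a minimal sketch of what an engagement-only ranker looks like (all names and data here are hypothetical, not YouTube's actual system): the only feedback it ever receives is whether the viewer stayed, so nothing in the signal can represent what the recommended content actually does to the viewer.

```python
# Hypothetical engagement-only recommender, for illustration only.
from collections import defaultdict
import random

class EngagementRanker:
    def __init__(self):
        self.shown = defaultdict(int)   # times each video was recommended
        self.stayed = defaultdict(int)  # times the viewer kept watching afterwards

    def recommend(self, candidates):
        # Rank purely by observed "stay" rate; unseen videos get a random score.
        # There is no notion of accuracy, health, or harm anywhere in this signal.
        def stay_rate(video):
            if self.shown[video] == 0:
                return random.random()
            return self.stayed[video] / self.shown[video]
        return max(candidates, key=stay_rate)

    def feedback(self, video, user_stayed):
        # The entire feedback loop: 1 if the user stayed, 0 if they left.
        self.shown[video] += 1
        self.stayed[video] += int(user_stayed)

# Usage: whatever keeps people watching wins, regardless of what it is.
ranker = EngagementRanker()
pick = ranker.recommend(["calm documentary", "outrage clip"])
ranker.feedback(pick, user_stayed=True)
```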

[–] SCB@lemmy.world 2 points 1 year ago (1 children)

If you can be radicalized by videos from YouTube, it isn't the algorithm, it's you

[–] NightOwl@lemmy.one 0 points 1 year ago (1 children)
[–] SCB@lemmy.world 1 points 1 year ago (1 children)
[–] NightOwl@lemmy.one 3 points 1 year ago (1 children)

The world doesn't exist in an individual vacuum. The people negatively influenced by disinformation go on to take a role in society and interact with others, affecting the people they encounter either negatively or positively. Congratulations on your individual resilience, but the world is not a population consisting of only you, with you alone determining the impact other people have on the world.

[–] SCB@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (2 children)

Yeah and, once again, those people are the problem.

Unless you want to ban any food that isn't fruits and vegetables, cars, not sleeping enough, not getting enough exercise, etc., at some point you have to accept that people do in fact make their own choices.

I'm not for banning things because some people are idiots.

[–] surewhynotlem@lemmy.world 1 points 1 year ago (1 children)

you have to accept that people do in fact make their own choices.

I feel bad that you've been radicalized into thinking this way.

[–] SCB@lemmy.world 1 points 1 year ago

I'm not concerned with your feelings.

[–] hglman@lemmy.ml 0 points 1 year ago (1 children)

It's not just all you or all YouTube. Both matter. It's harmfully reductionist to act like it's one and not both.

[–] SCB@lemmy.world 2 points 1 year ago (1 children)

Both really don't matter, since adults have a right to choose to consume any content they'd like.

If your grandma finds Q fascinating, that's on your grandma

[–] hglman@lemmy.ml 0 points 1 year ago (1 children)

It's also the fault of those producing Q content.

[–] SCB@lemmy.world 2 points 1 year ago (1 children)

No it isn't. Every demand will find a seller. That's just how reality works.

The overwhelming majority of "radical" content creators are just riding a grift train.

[–] hglman@lemmy.ml 0 points 1 year ago (1 children)

So you agree, they are also at fault.

[–] SCB@lemmy.world 2 points 1 year ago (1 children)

No. Snake oil salesmen are not to blame for people craving snake oil. They're merely filling the demand gap

[–] hglman@lemmy.ml 0 points 1 year ago* (last edited 1 year ago) (1 children)

Explain why people want novel new things.

[–] SCB@lemmy.world 2 points 1 year ago (1 children)
[–] hglman@lemmy.ml 1 points 1 year ago (1 children)

So you're saying that the supply caused a demand?

[–] SCB@lemmy.world 2 points 1 year ago (1 children)

No, I'm literally saying the opposite. You cannot manufacture demand.

[–] hglman@lemmy.ml 0 points 1 year ago* (last edited 1 year ago) (1 children)

Hahaha, ok, sure, buddy. Big news for advertising.

[–] SCB@lemmy.world 2 points 1 year ago (1 children)

If we can manufacture demand, then supply-side economics is the way to go, full stop. You're a big fan of Reagan, I take it?

[–] hglman@lemmy.ml 1 points 1 year ago (1 children)

That is not the implication of the ability to make others want things. Also, I clearly stated that the fault lies with both the producer of materials and the consumer.

[–] SCB@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Advertising does work, yes, but only for products in which demand already exists. No significant number of people will buy a hand-cutting-off machine no matter how it is advertised.

The radicalization began decades ago, initially couched in racial resentment, and has spread largely via Americans' innate dissatisfaction with government, which goes back literally hundreds of years.

This resentment has certainly been fostered, even engineered, and I would separate those who have directly engineered or manipulated such resentment (e.g. Benghazi trials, trans panic, Critical Race Theory) from those who merely profit from it.

Two very different groups, imo.

[–] stefenauris@pawb.social 2 points 1 year ago (1 children)

Algorithms can't "become evil" any more than your toaster can. They're directed and programmed by people who know exactly what they're intending to achieve.

[–] NightOwl@lemmy.one 2 points 1 year ago* (last edited 1 year ago)

But algorithms don't understand the impact of it, since it's either a 0 or a 1: user stays or user leaves.

What I mean is that algorithms, despite having no intent to be evil, have led to negative impacts because they have no care for the context of a recommendation. Someone can go in searching for health information, then go down a rabbit hole of being recommended pseudo-health advice, then flat earth, and so on. Not because the algorithm wants to turn people a certain way, but because it's just recommending videos that users who liked similar videos might find interesting. It's just broad categories.

I wasn't implying algorithms are sentient. At least not yet, until AI integration happens.
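
As a rough illustration of the "users who liked similar videos" logic described above (toy data and function names invented here, not any real platform's code), even a simple co-watch recommender can drift from legitimate health content into fringe content in a few hops, with no intent anywhere in the system:

```python
# Toy item-to-item recommender: "viewers who watched A also watched B".
# All video titles and counts are made up for illustration.

co_watch = {
    "home workouts":       {"nutrition basics": 90, "detox miracle cures": 15},
    "nutrition basics":    {"detox miracle cures": 60, "home workouts": 90},
    "detox miracle cures": {"big pharma exposed": 70, "nutrition basics": 60},
    "big pharma exposed":  {"flat earth proof": 55, "detox miracle cures": 70},
    "flat earth proof":    {"big pharma exposed": 55},
}

def next_video(current, seen):
    """Recommend whatever co-occurs most often with the current video,
    skipping videos the viewer has already watched."""
    neighbors = {v: n for v, n in co_watch.get(current, {}).items() if v not in seen}
    return max(neighbors, key=neighbors.get) if neighbors else None

# Follow the chain of recommendations a few hops.
video = "home workouts"
seen = {video}
for _ in range(4):
    video = next_video(video, seen)
    if video is None:
        break
    seen.add(video)
    print(video)
# nutrition basics -> detox miracle cures -> big pharma exposed -> flat earth proof
```

Each hop is just "people who watched this also watched that"; the drift toward fringe content falls out of the co-watch statistics, not out of any goal the code holds.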