this post was submitted on 17 Sep 2024

science


From the article:

This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren't 'too far gone' to reconsider their convictions and change their minds.

[–] livestreamedcollapse@lemmy.ml 5 points 1 month ago (1 children)

Given the inherent biases present in any LLM's training data, the hallucination issue you've brought up, and the cost of running an LLM at scale being prohibitive to anyone besides private-state partnerships, do you think this will allay conspiracists' valid concerns about the centralization of information access, à la the decline in quality of Google search results over the past decade and a half?

[–] Asafum@feddit.nl 3 points 1 month ago

I think those people might not change their minds, but I was once a "conspiracy nut," had a circle of friends who were as well, and know that for a lot of those kinds of people YouTube is the majority of the "research" they do. For those people, I think this could work, as long as it isn't hallucinating and can point to proper sources.