this post was submitted on 21 Nov 2023
995 points (97.9% liked)
Technology
you are viewing a single comment's thread
What always strikes me as weird is how trusting people are of inherently unreliable sources. Like why the fuck does a robot get trust automatically? It's a fuckin miracle it works in the first place. You double check that robot's work for years and it's right every time? Yeah okay maybe then start to trust it. Until then, what reason is there not to be skeptical of everything it says?
People who Google something and then accept whatever Google pulls from webpages and puts at the top as fact… confuse me. Like all machines, these systems fail. Why would we assume the opposite?
At least a Google search gets you a reference you can point at. It might be wrong, it might not. Maybe it points to other references that you can verify.
ChatGPT outright makes shit up and there's no way to see how it came to a given conclusion.
That's a good point... so long as you follow the links and read further. My girlfriend, for example, often doesn't.
Because the average person hears “AI” and thinks Cortana/Terminator, not a bunch of if statements.
People are dumb when it comes to things they don't understand. I'm dumb when it comes to mechanical engineering of any kind, but I'm competent with software. It's all about where people's strengths lie, but some people aren't aware enough to know what they don't know.
My guess, wholly lacking any scientific rigor, is that humans naturally trust each other. We don't assume the info someone shares with us is wrong unless there's "a reason" to doubt it. Chatting with any of these LLM bots feels like talking to a person (most of the time), so there's usually "no reason" to doubt what it spews.
If human trust wasn't so easy to get and abuse, many scams would be much harder to pull.
I think you might be onto something. Thanks for sharing!
People trust a squid predicting football matches.