[–] Wirlocke@lemmy.blahaj.zone 56 points 5 months ago (3 children)

I'm a bit annoyed at all the people being pedantic about the term "hallucinate."

Programmers borrow preexisting concepts as analogies for computer concepts all the time.

Your file isn't really a file, your desktop isn't a desk, your recycling bin isn't a recycling bin.

[Insert the entirety of Object Oriented Programming here]

Neural networks aren't really neurons, genetic algorithms aren't really genetics, and the LLM isn't really hallucinating.
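
For example (a minimal illustrative sketch, not anyone's actual library code): a neural network "neuron" is just a weighted sum pushed through a squashing function, no biology involved:

```python
import math

def neuron(inputs, weights, bias):
    # A "neuron": weighted sum of inputs plus a bias, squashed by a sigmoid.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))  # sigmoid "activation"

# "Fires" at ~0.88 for these made-up numbers; it's arithmetic, not a brain cell.
print(neuron([1.0, 0.5], [0.8, 2.0], 0.2))
```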

But it easily conveys what the bug is. It only personifies the LLM because the English language almost always personifies the subject. The moment you attach a verb to a noun, you imply it performed an action, unless you limit yourself to esoteric words/acronyms or use several words to overexplain every time.

[–] calcopiritus@lemmy.world 14 points 5 months ago* (last edited 5 months ago) (2 children)

It's easily the worst problem with Lemmy. Sometimes one guy has an issue with something and suddenly the whole thread is about that thing, as if everyone had been thinking about it. No, you didn't think about it; you just read another person's comment and wrote your own instead of replying to it.

I'd never heard anyone complain about the term "hallucination" for AIs, but suddenly in this one thread there are a hundred near-identical comments instead of a single upvoted one.

I get it, you don't like "hallucinate", just upvote the existing comment about it and move on. If you have anything to add, reply to that comment.

I don't know why this specific thing is so common on Lemmy, though; I don't think it happened on Reddit.

[–] ZILtoid1991@lemmy.world 3 points 5 months ago

"Hallucination" pretty well describes my opinion on AI generated "content". I think all of their generation is a hallucination at best.

Garbage in, garbage out.

[–] emptiestplace@lemmy.ml 3 points 5 months ago

> I don't know why this specific thing is so common on Lemmy, though; I don't think it happened on Reddit.

When you're used to knowing a lot relative to the people around you, learning to listen sometimes becomes optional.

[–] ZILtoid1991@lemmy.world 4 points 5 months ago

Nowadays it's being used to humanize neural networks and thus oversell their capabilities.

[–] abrinael@lemmy.world -1 points 5 months ago* (last edited 5 months ago) (2 children)

What I don’t like about it is that it makes the problem sound more benign than it is. Which also points to who decided to use that term: AI promoters/proponents.

Edit: it’s like all the bills/acts in Congress that they name something like “The Protect Children Online Act,” and you ask, “Well, what does it do?” And they say, “It lets local police read all of your messages so they can look for any dangers to children.”

[–] zalgotext@sh.itjust.works 16 points 5 months ago (1 children)

The term "hallucination" has been used for years in AI/ML academia. I reading about AI hallucinations ten years ago when I was in college. The term was originally coined by researchers and mathematicians, not the snake oil salesman pushing AI today.

[–] abrinael@lemmy.world 5 points 5 months ago (1 children)

I had no idea about this. I studied neural networks briefly over 10 years ago, but hadn’t heard the term until the last year or two.

[–] KeenFlame@feddit.nu 0 points 5 months ago

We were talking about when it was coined, not when you first heard it.

[–] Wirlocke@lemmy.blahaj.zone 7 points 5 months ago* (last edited 5 months ago) (1 children)

In terms of LLM hallucination, it feels like the name very aptly describes the behavior and severity. It doesn't downplay what's happening because it's generally accepted that having a source of information hallucinate is bad.

I feel like the alternatives would downplay the problem. A "glitch" is generic and common, "lying" is just inaccurate since that implies intent to deceive, and just being "wrong" doesn't get across how elaborately wrong an LLM can be.

Hallucination fits pretty well and is also pretty evocative. I doubt that AI promoters want to effectively call their product schizophrenic, which is what most people think of when they hear "hallucination."

Ultimately, all the sciences are full of analogous names to make conversations easier; it's not always marketing. No different than when physicists say particles have "spin" or "color," or that spacetime is a "fabric," or [insert entirety of String theory]...

[–] abrinael@lemmy.world 5 points 5 months ago* (last edited 5 months ago) (1 children)

After thinking about it more, I think the main issue I have with it is that it sort of anthropomorphises the AI, which is more of an issue in applications where you’re trying to convince the consumer that the product is actually intelligent. (Edit: in the human sense of intelligence rather than what we’ve seen associated with technology in the past.)

You may be right that people could have a negative view of the word “hallucination”. I don’t personally think of schizophrenia, but I don’t know what the majority think of when they hear the word.

[–] Knock_Knock_Lemmy_In@lemmy.world 5 points 5 months ago

You could invent a new word, but that doesn't help people understand the problem.

You are looking for an existing word that describes unintentionally providing incorrect information but is totally unrelated to humans. I suspect that word doesn't exist. Every word for thinking gets anthropomorphized.