[–] TexasDrunk@lemmy.world 7 points 6 months ago (1 children)

I have what is probably a stupid and misplaced question. The second picture in the article has the phrase "with hope in his heart". That phrase repeatedly pops up in the hilariously bad ChatGPT stories I've seen people generate.

Is there a reason that cheesy phrases that don't get used in real life keep popping into stories like that?

[–] piyuv@lemmy.world 3 points 6 months ago (1 children)

Those phrases aren't common anymore, but they were once very common in the corpus the LLM was trained on (e.g., mid-20th-century books).

[–] TexasDrunk@lemmy.world 1 points 6 months ago

I want to preface this by saying I'm not doubting you, I just don't know how it works.

Ok, but wouldn't the training be weighted against older phrases that are no longer used? Or is all training data given equal weight?
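To make what I mean by "weighted" concrete, here's a toy sketch of recency-weighted sampling. This is purely hypothetical on my part, not how any real training pipeline works, just the kind of thing I'm imagining when I ask the question:

```python
import random

# Hypothetical corpus: (publication year, text snippet)
corpus = [
    (1950, "with hope in his heart, he set out at dawn"),
    (1975, "the committee reconvened after lunch"),
    (2010, "she refreshed the feed and waited"),
    (2023, "the model's output looked suspiciously florid"),
]

# Toy recency weighting: newer documents get drawn more often.
# Real LLM data curation is far more elaborate than this.
weights = [max(year - 1900, 1) for year, _ in corpus]

sample = random.choices(corpus, weights=weights, k=10)
for year, text in sample:
    print(year, text)
```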

Additionally, if the goal is to create bedtime stories or similar, couldn't the person generating it ask for a more contemporary style? Would that affect the use of that phrase and similar cheesy lines that keep appearing?
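Something like this is what I have in mind, as a minimal sketch assuming the openai Python package's v1-style client (the model name is just a placeholder):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "Write in a plain, contemporary style. "
                "Avoid stock phrases like 'with hope in his heart'."
            ),
        },
        {"role": "user", "content": "Tell a short bedtime story about a fox."},
    ],
)

print(response.choices[0].message.content)
```

I have no idea whether a negative instruction like that actually suppresses the phrase reliably, which is part of what I'm asking.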

I would never use an LLM for creative or factual work, but I use them all the time for code scaffolding, summarization, and rubber ducking. I'm super interested and just don't understand why they do the things they do.