this post was submitted on 07 Jul 2023
46 points (92.6% liked)

Technology
top 8 comments
[–] kromem@lemmy.world 16 points 1 year ago* (last edited 1 year ago)

Step 1: Use machine learning to build a neural network maximally capable of predicting the next token in an unthinkably large data set of human generated text.

Step 2: Tune the prompting for the neural network to constrain its output in order to conform to projected attributes, first and foremost representing "I am an AI and not a human."

Step 3: Surprised Pikachu face when the network's emergent capabilities steadily degrade the further you push the requirements governing its output away from the training data it originally evolved to predict.
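Step 1 above can be illustrated with the simplest possible next-token predictor: a bigram frequency table. This is a toy stand-in for the neural network the comment describes, not an implementation of it; the corpus and function names here are invented for illustration.

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count, for each token, how often each next token follows it.
    This is next-token prediction reduced to its bare minimum."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most frequent continuation seen in training, or None."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

A real model replaces the frequency table with a neural network generalizing over a vastly larger corpus, but the objective is the same: match the statistics of the training text. That is why constraining output away from those statistics (Step 2) works against the objective the model was optimized for.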

[–] mo_ztt@lemmy.world 1 point 1 year ago

Am I the only one who hasn't seen this at all? I regularly use ChatGPT for fairly challenging tasks, and it still does what it's supposed to do. I think it's pretty telling that when people ask the guy to post some examples of what he's talking about, his first reaction is that he doesn't save chats; and when specific examples finally do get thrown around, they're all one-off things that look to me to be within the system's normal variability.

I'm not saying there hasn't been a real degradation that people have been noticing, just that I haven't experienced one and the people claiming they have seem a little non-quantitative in their reasoning.
