this post was submitted on 19 Jul 2024
440 points (98.5% liked)

Technology

58072 readers
4131 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] kometes@lemmy.world 11 points 1 month ago (2 children)

What happens if you make a mistake with your initial instructions?

[–] Avatar_of_Self@lemmy.world 7 points 1 month ago

You'd change the system prompt, just like now. If you mean in the session, I'm sure it'll ignore your session's prompt's instructions as normal but if not, I guess you'd just start a new session prompt.

[–] vxx@lemmy.world 3 points 1 month ago* (last edited 1 month ago)

The "issue" is that people were able to override bots on twitter with that method and make them feed their own instructions.

I saw it first time being used on a Russian propaganda bot.