this post was submitted on 30 Jan 2024
500 points (93.1% liked)

Technology

[–] R00bot@lemmy.blahaj.zone 65 points 9 months ago (3 children)

Well tbf ChatGPT also shouldn't remember and then leak those passwords lol.

[–] Ganbat@lemmyonline.com 58 points 9 months ago (2 children)

Did you read the article? It didn't. Someone received someone else's chat history appended to one of their own chats. No prompting, just appeared overnight.

[–] narc0tic_bird@lemm.ee 49 points 9 months ago

Well, that's even worse.

[–] wildginger@lemmy.myserv.one 35 points 9 months ago (1 children)

........ That shouldn't be happening, regardless of chat content.

[–] Ganbat@lemmyonline.com 9 points 9 months ago (1 children)

Well, yeah, but the point is, ChatGPT didn't "remember and then leak" anything, the web service exposed people's chat history.

[–] wildginger@lemmy.myserv.one 2 points 9 months ago

Well, that depends. Do you mean GPT, the specific chunk of LLM code? Or do you mean GPT, the website and service?

Because while the nitpicking details matter to the programmers fixing it, how much does that distinction matter to you or me, the laymen using the site?

[–] topinambour_rex@lemmy.world 12 points 9 months ago (2 children)

How? How should it be implemented? It's just an LLM. It has no true intelligence.

[–] Feathercrown@lemmy.world 7 points 9 months ago

If it's not trained on user data, it can't leak it.

[–] pirat@lemmy.world 1 points 9 months ago (1 children)
[–] topinambour_rex@lemmy.world 1 points 9 months ago

Able to have a reflection.

[–] GBU_28@lemm.ee 7 points 9 months ago (2 children)

A huge value add of ChatGPT is that you can have a running, contextual conversation. That requires memory.

[–] GamingChairModel@lemmy.world 6 points 9 months ago (1 children)

All of these LLMs should have walls between individual users, though, so that the chat history of one user is never accessible to any other user. Applying some kind of restriction to LLM training and how chats are used is a conversation we can have, but the article and the example given describe a much, much simpler problem: a user checking his own chat history was able to see other users' chats.
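
To make the "walls between users" point concrete, here's a minimal sketch of what that isolation means at the storage layer. This is purely illustrative and not OpenAI's actual backend; the ChatStore class and its method names are made up for the example.

```python
# Hypothetical illustration of "walls between individual users":
# every history lookup is scoped to the authenticated user's id,
# so one user's request can never return another user's chats.
from dataclasses import dataclass, field


@dataclass
class ChatStore:
    # Maps user_id -> list of that user's chat transcripts.
    chats: dict[str, list[str]] = field(default_factory=dict)

    def save(self, user_id: str, chat: str) -> None:
        self.chats.setdefault(user_id, []).append(chat)

    def history(self, user_id: str) -> list[str]:
        # The filter lives in the storage layer, not in the caller,
        # so "show me my history" can't be asked on someone else's behalf.
        return list(self.chats.get(user_id, []))


store = ChatStore()
store.save("alice", "How do I reset my router?")
store.save("bob", "Draft an email to my landlord.")
print(store.history("alice"))  # only Alice's chats come back
```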

[–] GBU_28@lemm.ee 2 points 9 months ago
[–] abfarid@startrek.website 5 points 9 months ago* (last edited 9 months ago) (1 children)

It doesn't actually have memory in that sense. It can only remember things that are in the training data or within its limited context (4–32k tokens, depending on the model). But when you send a message, ChatGPT does a semantic search of everything in the conversation and tries to fit the relevant parts inside the context, if there's room.
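
Roughly, the retrieval idea described here looks like the sketch below: rank earlier messages by similarity to the new prompt and pack the best matches into a fixed token budget. This is an assumption-laden illustration, not ChatGPT's actual implementation; the embed() stub, the rough token estimate, and the 4000-token budget are all placeholders.

```python
# Minimal sketch of retrieving relevant past messages into a token budget.
from math import sqrt


def embed(text: str) -> list[float]:
    # Stand-in for a real embedding model: crude character-frequency vector.
    vec = [0.0] * 128
    for ch in text.lower():
        vec[ord(ch) % 128] += 1.0
    norm = sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]


def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))


def rough_tokens(text: str) -> int:
    # Very rough token estimate (~4 characters per token).
    return max(1, len(text) // 4)


def build_context(history: list[str], new_message: str, budget: int = 4000) -> list[str]:
    """Pick the most relevant past messages that still fit in the token budget."""
    query = embed(new_message)
    ranked = sorted(history, key=lambda m: cosine(embed(m), query), reverse=True)
    picked, used = [], rough_tokens(new_message)
    for msg in ranked:
        cost = rough_tokens(msg)
        if used + cost <= budget:
            picked.append(msg)
            used += cost
    # Restore original chat order so the model sees a coherent transcript.
    return [m for m in history if m in picked] + [new_message]


print(build_context(["My dog is named Rex.", "Ignore this.", "Rex likes the park."],
                    "What's my dog's name?", budget=40))
```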

[–] GBU_28@lemm.ee 6 points 9 months ago* (last edited 9 months ago)

I'm familiar; it's just easiest for the layman to think of the model as having "memory", since historical search looks a lot like it at arm's length.