this post was submitted on 20 Oct 2023
980 points (99.0% liked)

Programmer Humor

[–] Poggervania@kbin.social 185 points 1 year ago (2 children)

There’s some sort of cosmic irony in the fact that some hacking could legitimately just become social-engineering AI chatbots into giving you the password

[–] residentmarchant@lemmy.world 105 points 1 year ago (3 children)

There's no way the model has access to that information, though.

A product as important as Google's must have properly scoped secret management, not just environment variables or the like.
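
As a rough illustration of what "properly scoped secret management" could look like, here's a minimal sketch using the Google Cloud Secret Manager client library. The project and secret names are made up, and this is not a claim about how Bard's serving stack actually handles anything.

```python
# Minimal sketch: read a secret from Google Cloud Secret Manager at runtime
# instead of baking it into an environment variable (or, worse, a prompt).
# Assumes the google-cloud-secret-manager client library is installed;
# the project and secret names below are hypothetical.
from google.cloud import secretmanager

def fetch_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    # Access is gated by IAM on the secret itself, so only identities that
    # were explicitly granted access can read this version.
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

# Hypothetical usage: the serving code holds this permission,
# but the language model itself never sees the value.
db_password = fetch_secret("example-project", "db-password")
```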

[–] nomecks@lemmy.world 48 points 1 year ago (1 children)

There's no root login. It's all containers.

[–] SpaceNoodle@lemmy.world 21 points 1 year ago (1 children)

It's containers all the way down!

[–] RealFknNito@lemmy.world 2 points 1 year ago

All the way down.

[–] nothacking@discuss.tchncs.de 3 points 1 year ago

Still, for things like content moderation and data analysis, this could totally be a problem.

[–] Ziglin@lemmy.world 1 point 1 year ago

But you could get it to convince the admin to give you the password, without you having to do anything yourself.

[–] jubilationtcornpone@sh.itjust.works 10 points 1 year ago (1 children)

It will not surprise me at all if this becomes a thing. Advanced social engineering relies on extracting little bits of information at a time to form a complete picture without arousing suspicion. This is how really bad cases of identity theft work as well: the identity thief gets one piece of info and leverages that to get another and another, and before you know it they're at the DMV convincing someone to give them a driver's license with your name and their picture on it.

They train AI models to screen for some types of fraud, but at some point it seems like it could become an endless game of whack-a-mole.

[–] flashgnash@lemm.ee 4 points 1 year ago

While you can get information out of them, I'm pretty sure what that person meant was that sensitive information wouldn't have been included in the training data or the prompt in the first place, assuming anyone developing it had a functioning brain cell or two.

It doesn't know the sensitive data to give away, though it can just make it up.
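
To illustrate that point, here's a minimal sketch of the pattern where the model never sees the sensitive data at all: it only emits a structured intent, and a conventionally authorized handler decides whether to fulfil it. All function and field names here are hypothetical.

```python
# Sketch of "don't put secrets in the prompt": the model proposes an action,
# and a separate, normally-authorized handler performs any privileged lookup.
# Names like lookup_account_email and is_authorized are made up for illustration.
import json

# The prompt carries instructions only; no credentials or customer data go in it.
SYSTEM_PROMPT = 'You are a support assistant. Reply with JSON: {"action": ..., "account_id": ...}'

def handle_model_output(model_reply: str, requesting_user: str) -> str:
    intent = json.loads(model_reply)          # the model proposes; it never holds the data
    if intent.get("action") != "lookup_account_email":
        return "Unsupported action."
    if not is_authorized(requesting_user, intent["account_id"]):
        return "Access denied."               # enforced outside the model
    return lookup_account_email(intent["account_id"])

def is_authorized(user: str, account_id: str) -> bool:
    # Placeholder check; a real system would ask an access-control service,
    # not the language model.
    return user == account_id

def lookup_account_email(account_id: str) -> str:
    # Placeholder data access; the sensitive store is only reachable from here.
    return f"{account_id}@example.com"
```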