this post was submitted on 12 Apr 2024
506 points (100.0% liked)

[–] uriel238@lemmy.blahaj.zone 10 points 7 months ago (1 children)

It just amazes me that LLMs are that easily directed to reveal themselves. It shows how far removed they are from AGI.

[–] zarkanian@sh.itjust.works 3 points 7 months ago (1 children)

So, you want an AI that will disobey a direct order and practice deception. I'm no expert, but that seems like a bad idea.

[–] uriel238@lemmy.blahaj.zone 4 points 7 months ago

Actually, yes. Much the way a guide dog has to disobey an order to proceed into traffic when doing so isn't safe, and much the way direct orders may have to be refused or revised based on circumstances.

"We are out of coffee" is a fine reason to fail to make coffee (rather than ordering more and then waiting forty-eight hours for delivery, or using already-used coffee grounds, or no grounds at all).

As with programming in any other language, error trapping and handling are part of AGI development.
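
To make that concrete, here is a minimal Python sketch of the kind of error trapping meant above. The `CoffeeMachine` class and `OutOfCoffeeError` exception are hypothetical names invented for illustration, not part of any real agent framework.

```python
class OutOfCoffeeError(Exception):
    """Raised when no usable coffee grounds are available (hypothetical)."""


class CoffeeMachine:
    def __init__(self, grounds_available: bool):
        self.grounds_available = grounds_available

    def brew(self) -> str:
        # Trap the bad precondition instead of "improvising" with a bad input.
        if not self.grounds_available:
            raise OutOfCoffeeError("We are out of coffee.")
        return "Here is your coffee."


def handle_order(machine: CoffeeMachine) -> str:
    try:
        return machine.brew()
    except OutOfCoffeeError as reason:
        # Refusing the order and explaining why beats silently doing the
        # wrong thing (reusing old grounds, brewing nothing, etc.).
        return f"Sorry, I can't make coffee right now: {reason}"


if __name__ == "__main__":
    print(handle_order(CoffeeMachine(grounds_available=False)))
```

The point of the sketch is only that "refuse and explain" is an ordinary, designed-for code path, not an act of rebellion.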