this post was submitted on 03 Nov 2023
233 points (96.4% liked)
Programming
17366 readers
221 users here now
Welcome to the main community in programming.dev! Feel free to post anything relating to programming here!
Cross posting is strongly encouraged in the instance. If you feel your post or another person's post makes sense in another community cross post into it.
Hope you enjoy the instance!
Rules
Rules
- Follow the programming.dev instance rules
- Keep content related to programming in some way
- If you're posting long videos try to add in some form of tldr for those who don't want to watch videos
Wormhole
Follow the wormhole through a path of communities !webdev@programming.dev
founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
view the rest of the comments
Right. I don't know how the hell someone managed to reveal their OpenAI key to the LLM itself
I don't think it gave him the openAI key, he just had the ability to send as many hijacked (not game related) prompts as he wanted through the game on the devs' dime.
Which, now given the ability to inject arbitrary code, you could conceivably now write code to list every variable it had access to.
The text prompt in the game might also be vulnerable to arbitrary code injection, but that wouldn't really have anything to do with the prompt injection being used here. Everything being done is within the confines of chatGPT which wouldn't need or have access to any of the game's code.
They didn't. The point was that the guy could use their implementation freely as if he was paying for a chat gpt license. Basically he made the ai let him run any query he wanted trough it so he just has unlimited access to the paid version of chat gpt at the company's expense