this post was submitted on 10 Aug 2023

Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items

[–] Hobo@lemmy.world 5 points 1 year ago

I think it was more poking fun at the developers, not the LLM: they basically didn't do any checks for edible ingredients and just piped the input straight to an LLM. What I find kind of funny is that you could probably have offloaded the input validation to the LLM itself by asking a few specific questions about whether each ingredient is safe for human consumption and/or traditionally edible. Aside from that, you'd think the devs would have had access to a database of food items to check against, since the bot was developed by a grocery store...
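Just to illustrate how little that check would take: here's a minimal sketch of an allowlist pre-filter, assuming the store's product database could be loaded into a set (the `KNOWN_FOODS` entries and the function name are hypothetical, not anything from the actual app):

```python
# Hypothetical allowlist built from the grocery store's product database.
KNOWN_FOODS = {"bread", "cheese", "onion", "rice", "chicken"}

def validate_ingredients(ingredients):
    """Split user input into items the store recognizes as food and
    items it doesn't, so nothing unknown ever reaches the LLM prompt."""
    recognized, rejected = [], []
    for item in ingredients:
        name = item.strip().lower()
        (recognized if name in KNOWN_FOODS else rejected).append(name)
    return recognized, rejected

ok, bad = validate_ingredients(["Bread", "Cheese", "ammonia"])
# only `ok` would be passed on to the recipe prompt; `bad` gets a refusal
```

Obviously a real product catalogue is messier than a set lookup, but even this crude gate would have stopped "ammonia" from ever being part of a recipe request.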

I do agree that people are trying to shoehorn LLMs into places they really don't belong. There also seem to be a lot of developers just piping raw user input into a templated ChatGPT query and spitting the output straight back to the user. It really does turn into a garbage-in, garbage-out situation for a lot of those apps.

On the other hand, I think this might be a somewhat reasonable use for LLMs if you spent a lot of time training it and did even the most cursory input validation. I'm pretty sure it wouldn't take a ton of work to avoid completely horrendous results like the "aromatic water mix" or "rat poison sandwich" called out in the article.