Honestly, fair enough; it would take a lot of work to have an LLM direct your game in the intended way.
That said, though, creating an AI system equivalent to an LLM would be even more work.
I think a lot of this comes down to AAA vs Indie, as well. For a AAA game, there is a lot more pressure to keep the LLM in line - which is of course very difficult if not impossible. For an indie game, though, the goofiness can be part of the charm, I think.
I guess my point is that I'm just excited to see what people can come up with. The point of games is to play, and I personally think LLMs are fun to play with.
I think the idea that it takes more work to just engineer the behavior in directly is flawed, to say the least.
Building LLMs is incredibly difficult. Building good training data is even more difficult. Then ironing out all the problems and biases in your model is even more difficult than that. You often need to build new models just to correct the old ones.
There are places where AI does well; it is, however, not a magic wand that just makes everything easier. It is very far from that.
Sure, but once the model is trained, the developer doesn't need to worry about any of that - it becomes a black box as far as actual implementation goes.
I don't think anyone is proposing that game devs create an LLM from scratch.
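To illustrate the "black box" claim above, here is a minimal sketch of how a game might wrap a pre-trained model behind a simple dialogue function. Every name here (`llm_generate`, `NPC`, the prompt format) is a hypothetical stand-in, not any particular engine's or vendor's API; a stub generator is used so the sketch runs without a real model.

```python
# Minimal sketch: treating a pre-trained LLM as a black box for NPC dialogue.
# `stub_llm_generate` is a placeholder for whatever local model or hosted API
# a developer would actually call; the game code only sees "prompt in, text out".

from typing import Callable


def stub_llm_generate(prompt: str) -> str:
    """Placeholder generator so the sketch runs without a real model."""
    return "Greetings, traveler. The mines to the north are not safe."


class NPC:
    def __init__(self, name: str, persona: str,
                 generate: Callable[[str], str] = stub_llm_generate):
        self.name = name
        self.persona = persona
        self.generate = generate  # the "black box": prompt in, text out

    def reply(self, player_line: str) -> str:
        # The game only builds a prompt and reads back text; it never touches
        # training data, weights, or bias mitigation.
        prompt = (
            f"You are {self.name}, {self.persona}. "
            f"Stay in character and answer briefly.\n"
            f"Player: {player_line}\n{self.name}:"
        )
        return self.generate(prompt)


if __name__ == "__main__":
    blacksmith = NPC("Hilda", "a gruff village blacksmith")
    print(blacksmith.reply("Is there any work around here?"))
```

Note that the wrapper only hides the training pipeline; it does not by itself keep the model's output "in line", which is the part the earlier comments call difficult.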
I don't really think you grasp the problem at hand here. You think that as long as there is a conversational LLM, everything else is solved. No.
Oooookay? Pretty sure I understand just fine, but agree to disagree I guess.