this post was submitted on 18 Oct 2023

PC Gaming

The Nvidia NV1 was released in 1995; it was one of the first 3D-capable graphics cards for the PC... and from there we know how things went.

Now it's 2023, so let's make a "retro-futuristic" prediction... what would you think of an AI board with an open-source driver and an open API (something like Vulkan, but for AI) that you could buy to power the AI in your video games? Would that make sense to you? What price range should it be in?

What is it supposed to do for your games... well, that depends on the game. The quickest example I can think of is having endless conversations with the NPCs in your average single-player fantasy RPG.

For example, the game loads your 4~5 companions with predefined psychologies/behaviours: they are fixated on the main quest goal (talking to them feels like talking to fanatics, which keeps the main quest as stable as possible), but you can "break" them by attempting to reveal certain truths (for example, breaking the fourth wall). If you go down that path, the game warns you that you will probably lock yourself out of the main quest (like in Morrowind when you kill an essential NPC).
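To make that idea concrete, here's a minimal Python sketch of how a game might wire a companion's "psychology" into a local model. Everything in it is hypothetical: `local_llm_chat` stands in for whatever inference call such an AI board would expose, and the "break the companion" trigger is just one naive way to do it.

```python
# Hypothetical sketch: a companion NPC whose personality is injected as a system
# prompt, and whose fanatic focus on the main quest can be "broken" by the player.
from dataclasses import dataclass, field

@dataclass
class Companion:
    name: str
    persona: str                      # fixed psychology/behaviour written by the devs
    broken: bool = False              # set once the player reveals "the truth"
    history: list = field(default_factory=list)

    def system_prompt(self) -> str:
        base = (f"You are {self.name}, a companion in a fantasy RPG. {self.persona} "
                "Stay fixated on completing the main quest.")
        if self.broken:
            base += " You now doubt the quest and speak openly about it being a game."
        return base

    def talk(self, player_line: str) -> str:
        # Naive trigger: certain player lines "break" the companion; the game would
        # warn the player that this may lock out the main quest (as in Morrowind).
        if "you are in a game" in player_line.lower():
            self.broken = True
        self.history.append({"role": "user", "content": player_line})
        reply = local_llm_chat(self.system_prompt(), self.history)  # hypothetical inference call
        self.history.append({"role": "assistant", "content": reply})
        return reply

def local_llm_chat(system: str, history: list) -> str:
    # Placeholder for the on-device model running on the imagined AI board.
    raise NotImplementedError
```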

[–] howrar@lemmy.ca 1 points 1 year ago

Right, I see where the confusion comes from. I mention current LLMs to say that the architectures and pre-training procedures we have today already produce models capable of generating the kind of output needed in this context. I make no claims about the quality of that output, but some additional fine-tuning on the game's specific story can take things very far.
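If it helps, here's roughly what that "additional fine-tuning on the game's specific story" could look like with off-the-shelf tooling (Hugging Face transformers plus PEFT/LoRA). The model name, dataset contents, and hyperparameters are placeholders, not a recipe:

```python
# Rough sketch of fine-tuning a base model on a game's own lore/dialogue via LoRA.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "some-open-7b-model"                      # placeholder base model
tok = AutoTokenizer.from_pretrained(base)
if tok.pad_token is None:
    tok.pad_token = tok.eos_token
model = AutoModelForCausalLM.from_pretrained(base)
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# Lore, quest text, and example NPC dialogue written by the game's writers.
story = Dataset.from_dict({"text": ["<npc lore and dialogue samples go here>"]})
story = story.map(lambda ex: tok(ex["text"], truncation=True, max_length=512),
                  remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="npc-lora", num_train_epochs=3,
                           per_device_train_batch_size=1),
    train_dataset=story,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
)
trainer.train()
model.save_pretrained("npc-lora")                # small adapter shipped with the game
```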

When you say LLMs are not AI, I'm guessing you mean they are not artificial general intelligence (AGI), and with that I agree. But AI is a very broad term, covering things as simple as A* search. Decision trees aren't any more AGI than LLMs, and they've been used to produce some very compelling stories, so this isn't a very good argument. We don't need AGI to write good stories.

The compute resources required for these models are something that can be addressed as well. On the hardware side, consumer hardware is continuously getting more powerful. On the software side, we're seeing a lot of great results from the smaller 7B-parameter models, and those are general-purpose language models. If you just need something for your one game, you can likely distill the model into something much smaller.
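On the distillation point, the usual approach is to train a small "student" model to match a large "teacher" model's output distribution on your game's own text. A toy sketch, assuming both models share a tokenizer/vocabulary; the model names and data are placeholders:

```python
# Toy knowledge-distillation loop: the per-game student learns to match the
# teacher's softened next-token distribution on the game's dialogue corpus.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("big-teacher-model")              # placeholder
teacher = AutoModelForCausalLM.from_pretrained("big-teacher-model").eval()
student = AutoModelForCausalLM.from_pretrained("tiny-student-model")  # much smaller

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)
temperature = 2.0
game_dialogue_corpus = ["<lines of your game's dialogue go here>"]    # placeholder data

for text in game_dialogue_corpus:
    batch = tok(text, return_tensors="pt", truncation=True, max_length=256)
    with torch.no_grad():
        t_logits = teacher(**batch).logits
    s_logits = student(**batch).logits
    # KL divergence between softened student and teacher distributions per token.
    loss = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(t_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```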

The training data used for the current generation of LLMs is already out there and curated. We know that dataset can achieve the performance of today's LLMs, and you can continue to train on that same data in the future. As long as you control where your new data comes from, this is not an issue.