this post was submitted on 19 Jan 2024
93 points (92.7% liked)

PC Gaming

top 18 comments
[–] GlitterInfection@lemmy.world 11 points 10 months ago (3 children)

Oh Kotaku.

AI has the potential to flesh out immersive worlds in video games in ways that are completely impossible for a team to accomplish today.

If it's used to augment scripted characters and stories it can only make the soulless NPCs we are used to into much more interesting characters.

I welcome, and in fact, long for that treatment in games like the Elder Scrolls.

There's absolutely no need for AI to replace Link from The Legend of Zelda, but hells yes it should be used to stop guards from talking about my stolen sweetroll.

This article and headline are just propaganda.

[–] bionicjoey@lemmy.ca 24 points 10 months ago (5 children)

The point is that right now language models are only good at generating coherent text. They aren't at the level where they can control an NPC's behaviour in a game world. NPCs need to actually interact with the world around them in order to be interesting. The words that come out of their mouths are only part of the equation.

[–] warmaster@lemmy.world 13 points 10 months ago (1 children)

Yes, language models are good for text. That's their sole purpose. They can't control characters. There are other models for that, and they are obviously not language models.

[–] kogasa@programming.dev 7 points 10 months ago* (last edited 10 months ago)

>3d navigation models

>look inside

>language models

[–] kakes@sh.itjust.works 1 points 10 months ago (1 children)

Well, they actually can, at least to an extent. All you need to do is encode the worldstate in a way the LLM can understand, then decode the LLM's response to that worldstate (most examples I've seen use JSON to good effect).

That doesn't seem to be the focus of most of these developers though, unfortunately.
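A minimal sketch of the encode/decode loop described above, assuming a tavern-NPC scenario (the worldstate fields, action names, and prompt wording are all invented for illustration, and the model's reply is hard-coded in place of a real API call):

```python
import json

def build_prompt(worldstate: dict, allowed_actions: list[str]) -> str:
    """Encode the worldstate as JSON and ask the model to reply in JSON only."""
    return (
        "You are an NPC. Current worldstate:\n"
        + json.dumps(worldstate, indent=2)
        + '\nRespond with JSON: {"say": <line of dialogue>, "action": <one of '
        + ", ".join(allowed_actions)
        + ">}"
    )

def decode_reply(reply: str, allowed_actions: list[str]) -> dict:
    """Decode the model's JSON reply, rejecting actions the game doesn't support."""
    data = json.loads(reply)
    if data.get("action") not in allowed_actions:
        raise ValueError(f"unsupported action: {data.get('action')!r}")
    return data

state = {"location": "tavern", "time": "night", "player_reputation": "hero"}
actions = ["pour_drink", "wave", "ignore"]
prompt = build_prompt(state, actions)

# A well-formed reply, hard-coded here in place of a real model call:
reply = '{"say": "The good stuff, for the hero!", "action": "pour_drink"}'
print(decode_reply(reply, actions)["action"])  # → pour_drink
```

The validation step is the important part: the game only ever executes actions from its own whitelist, so a malformed or off-script model reply fails loudly instead of driving the NPC.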

[–] bionicjoey@lemmy.ca 3 points 10 months ago (1 children)

That assumes the model is trained on a large training set of the worldstate encoding and understands what that worldstate means in the context of its actions and responses. That's basically impossible with the state of language models we have now.

[–] kakes@sh.itjust.works 1 points 10 months ago

I disagree. Take this paper for example - keeping in mind it's a year old already (using ChatGPT 3.5-turbo).

The basic idea is pretty solid, honestly. Representing worldstate for an LLM is essentially the same as how you would represent it for something like a GOAP system anyway, so it's not a new idea by any stretch.
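To illustrate the GOAP comparison: a GOAP-style system already keeps the worldstate as a flat set of facts, with actions defined by preconditions and effects, which is exactly the kind of structure you could also serialize for an LLM. A toy sketch (action names and facts are invented; real GOAP planners use costs and heuristics rather than plain breadth-first search):

```python
from collections import deque

# Each action lists the facts it requires ("pre") and the facts it changes ("eff").
ACTIONS = {
    "get_key":   {"pre": {"has_key": False}, "eff": {"has_key": True}},
    "open_door": {"pre": {"has_key": True, "door_open": False}, "eff": {"door_open": True}},
}

def applicable(state, pre):
    return all(state.get(k) == v for k, v in pre.items())

def plan(start, goal):
    """Breadth-first search over worldstates for an action sequence reaching the goal."""
    frontier = deque([(start, [])])
    seen = {tuple(sorted(start.items()))}
    while frontier:
        state, steps = frontier.popleft()
        if all(state.get(k) == v for k, v in goal.items()):
            return steps
        for name, action in ACTIONS.items():
            if applicable(state, action["pre"]):
                nxt = {**state, **action["eff"]}
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    frontier.append((nxt, steps + [name]))
    return None  # no plan reaches the goal

print(plan({"has_key": False, "door_open": False}, {"door_open": True}))
# → ['get_key', 'open_door']
```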

[–] jerkface@lemmy.ca -3 points 10 months ago

Right, there's no possible way actions can be represented by a stream of symbols.

[–] GlitterInfection@lemmy.world -5 points 10 months ago* (last edited 10 months ago) (1 children)

They're a massive and combinatorially exploding part of the equation, though.

Imagine a world where instead of using AI to undermine writers and artists, we use it to explode their output. A writer could write the details that make a character unique, and the key and side quest dialogs that they write now, which could be used to customize a model for that character.

The player could then have realistic conversations with those characters, which would make everything better. You could ask for directions to something and then follow up with more questions the NPC should know the answer to. Etc.

Now inconsequential filler characters, like the ramen shop owner in the example, become potentially memorable and explicitly useful in a way that could never possibly be hand-crafted.

This article shits on an incredible early attempt to allow for this by taking the fact that it isn't finished yet, crossing that with biased opining, and producing Kotaku-style clickbait from it.

[–] 50gp@kbin.social 9 points 10 months ago (1 children)

you do know quality over quantity, right? nobody likes bethesda's radiant fetch quests, and this is that but with exposition-dumping npcs

[–] bionicjoey@lemmy.ca 3 points 10 months ago

Not even just exposition. An NPC could easily go off script and start talking about stuff that breaks immersion. Like imagine you're sitting in a tavern in Skyrim and then some NPC comes up and is like "hey, you see any good movies lately?"

[–] Psionicsickness@reddthat.com -5 points 10 months ago

Did you watch the demo? The player literally told the bartender to break out the good stuff and he did just that…

[–] Kbin_space_program@kbin.social 8 points 10 months ago

You mention the trick yourself.

AI can augment a real actor and script. Not replace.

Skyrim has mods that add AI voices to non-voiced mod NPCs or lines. Works great. But it's only augmenting what is already there.

[–] ram@bookwormstory.social 4 points 10 months ago* (last edited 10 months ago) (2 children)

AI has the potential to enshittify what would be immersive worlds in video games. Nobody wants their crafted NPC dialogue turned into ChatGPT garbage. Your comment is just propaganda.

[–] kakes@sh.itjust.works 1 points 10 months ago

I would love to have a game - or a genre of games - using AI NPCs.

What I absolutely don't want is every game using AI NPCs.

[–] blindsight@beehaw.org 6 points 10 months ago

Just one thing to add re: the quip at the end about industry layoffs:

Layoffs are happening in the tech sector almost entirely because of interest rates. When interest rates go up, it's more expensive to invest in growth, so companies scale down their operations. That's exactly the point; central banks are trying to reduce aggregate demand across the entire economy to reduce the demand-side pressure on inflation.

It absolutely sucks that corporate greed and regulatory capture have fucked the economy and are screwing over workers, but that's just the general situation and isn't the cause of recent layoffs.

Anyway, not at all surprised to hear that AI chatbots suck in games even more than they suck in text boxes; adding voice and animations makes it an even harder problem.

LLMs aren't really capable of creating immersive text, at least not on their own. We'll at least need some sort of companion software to guide the LLM on what text to create. Sending raw user input into an LLM is never going to work well; it's just not how the technology works.