this post was submitted on 18 Oct 2023
PC Gaming
No, they definitely don't.
Even with how bad most video game writing is, current LLMs fall laughably short of being useful for the purpose you're implying. A game that replaced human writing with an LLM in real time would be a lock for the worst-written game ever made.
Current LLMs being bad at it doesn't mean they'll always be bad at it. Their current state is the worst they're ever going to be, and we're talking about a hypothetical future here. I don't see any reason why they can't be improved into a state usable for writing a story with all the worldbuilding details provided.
Your claim was about current LLMs.
But it's a fundamental limitation of what LLMs are. They are not AI. They do not have anything in common with intelligence, and they don't have a particularly compelling path forward.
They're also, even setting aside that they're terrible for almost every purpose, obscenely heavy. What we're calling "current" isn't something capable of being executed on consumer hardware, dedicated card or not.
Finally, the idea that they can't get worse is just as flawed. They're heavily poisoning the well of future training data, and ridiculous copyright nonsense has the very real possibility of killing training further even though training on copyrighted material doesn't in any way constitute copyright infringement.
Maybe open source LLMs aren't up to the task, but proprietary ones certainly are.
Also, you wouldn't really need an LLM, just an FM that you fine-tune for your specific purpose.
What's this thing you call FM?
It's a foundation model. Basically, it's a base model pretrained on broad data that you then adapt to your own task. LLMs are FMs that have been trained on an enormous amount of text, but a full LLM isn't necessary for every application, especially if you only need the AI/ML to perform a specific task.
Fine-tuning an FM is just continuing its training on your own data.
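To make that last point concrete, here's a minimal pure-Python sketch of the idea. This is a toy one-parameter linear model, not a real foundation model; the only point it illustrates is that fine-tuning starts from pretrained weights and continues training on a small amount of task-specific data, rather than training from scratch.

```python
# Toy illustration of "fine-tuning = continuing training on your own data".
# A one-parameter model y = w * x stands in for a foundation model.

def train(w, data, lr=0.01, epochs=200):
    """Plain gradient descent on mean squared error for y = w * x."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def loss(w, data):
    """Mean squared error of y = w * x on a dataset."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

# "Pretraining": broad, generic data following y = 2x.
pretrain_data = [(float(x), 2.0 * x) for x in range(1, 6)]
w_pretrained = train(0.0, pretrain_data)

# "Fine-tuning": a small task-specific dataset following y = 2.5x,
# starting from the pretrained weight instead of from zero.
task_data = [(1.0, 2.5), (2.0, 5.0), (3.0, 7.5)]
w_finetuned = train(w_pretrained, task_data)

# The fine-tuned model fits the task data better than the pretrained one.
print(loss(w_finetuned, task_data) < loss(w_pretrained, task_data))  # True
```

The mechanism is the same one real fine-tuning uses at vastly larger scale: the pretrained weights are the starting point, and a comparatively tiny dataset nudges them toward the specific task.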
No, they aren't. They aren't a little short of capable. You could multiply their capability overnight and it would still, without question, immediately be the worst-written game ever made.
There's a huge difference between stringing together words in the shape of a story and actually putting together something with a shred of cohesion. We're not talking mediocre here. We're talking laughably short of absolute dogshit.
Buddy, I have actual training in AI/ML from some of the leading engineers in the field, and my job leverages AI/ML very successfully to do a task really similar to what OP is looking for.
Maybe the versions available to the public to play with aren't up to the task, but using AWS Bedrock you can absolutely get results like OP wants.
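For what that would look like in practice, here's a sketch of calling a model through Bedrock's runtime API with boto3. The `invoke_model` call is real boto3, but the model ID, the request body (Anthropic's documented messages format for Claude on Bedrock), and the response parsing are illustrative assumptions; the actual call needs AWS credentials and is not exercised here, only the payload builder is.

```python
import json

def build_request(prompt, max_tokens=512):
    """Build a request body in Anthropic's messages format for Bedrock.
    (Field names follow Anthropic's Claude-on-Bedrock docs; illustrative.)"""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

def generate_dialogue(prompt, model_id="anthropic.claude-3-sonnet-20240229-v1:0"):
    """Call Bedrock. Requires boto3 and AWS credentials; not run here."""
    import boto3  # imported lazily so build_request stays dependency-free
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=model_id,  # hypothetical choice of model
        body=json.dumps(build_request(prompt)),
        contentType="application/json",
        accept="application/json",
    )
    # Response layout assumed from Anthropic's documented format.
    return json.loads(response["body"].read())["content"][0]["text"]

body = build_request("Write one line of dialogue for a grizzled blacksmith NPC.")
print(body["messages"][0]["role"])  # user
```

A real game would wrap something like `generate_dialogue` with worldbuilding context injected into the prompt, which is what the fine-tuning discussion above is getting at.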
You and the other 5 million companies hemorrhaging money on extremely heavy operations that are universally fucking terrible.
If you're willing to claim LLMs are even 1% of the way to what he asked for, you either have absolutely no clue what the tech is or you're a scammer trying to steal money from people.
The cutting edge of LLMs has nothing in common with intelligence.
Since you clearly can't read, I'm done discussing this with you. Maybe pick up a book and improve that reading comprehension a bit.
I can read perfectly fine.
You claiming to be an expert, when your assertions prove it's literally impossible for you to be one, is simply not persuasive. You're doing the equivalent of claiming to be a geologist while arguing for a flat earth. It's inherently proof that you're full of shit.