this post was submitted on 28 Aug 2024
In a couple of sentences? In a way that doesn't approach, equal, or exceed the effort of training the model on that data in the first place?
You insist these models can do new things out of nothing, and you keep saying "all you have to do is give them something".
You keep moving the goal posts and putting words in my mouth. I never said you can do new things out of nothing. Nothing I mentioned is approaching, equaling, or exceeding the effort of training a model.
You haven't answered a single one of my questions, and you are not arguing in good faith. We're done here. I can't say it's been a pleasure.
My argument was and is that neural models don't produce anything truly new: they can't handle anything outside what is outlined by the data they were trained on.
Are you not claiming otherwise?
You say it's possible to guide models into doing new things, and I can see how that's the case, especially if the model is a very big one, meaning it is more likely that it has relevant structures to apply to the task.
But I'm also pretty damn sure they have insurmountable limits. You can't "guide" an LLM into doing image generation, except by having it interact with an image generation model.
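The delegation point can be sketched as a toy tool-call loop (every name here is hypothetical and the models are stubs, but real systems that pair a chat model with an image model work analogously): the language model only ever emits text, and an external dispatcher routes a tool-call string to a separate image model.

```python
# Toy sketch: an LLM can only "do" image generation by emitting a
# tool call that a separate image model fulfils. All names are made up.

def toy_llm(prompt: str) -> str:
    # A text model can only output tokens; here it emits a tool-call string.
    if "draw" in prompt.lower():
        return 'TOOL_CALL image_gen("a cat in a hat")'
    return "I can only produce text."

def toy_image_model(description: str) -> bytes:
    # Stand-in for a diffusion model; returns fake image bytes.
    return f"<image of {description}>".encode()

def run(prompt: str):
    out = toy_llm(prompt)
    prefix = 'TOOL_CALL image_gen("'
    if out.startswith(prefix):
        description = out[len(prefix):-2]          # strip the trailing '")'
        return toy_image_model(description)        # delegation, not generation by the LLM
    return out

print(run("Please draw a cat in a hat"))
print(run("What is 2+2?"))
```

The image bytes come out of `toy_image_model`, never out of `toy_llm` — which is the limit being described: the text model's role ends at producing the instruction.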