this post was submitted on 03 Jun 2024
82 points (95.6% liked)
PC Gaming
For PC gaming news and discussion. PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let's Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments.
- Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)
founded 1 year ago
While itself consuming a metric ton of electricity. The system works 🤪
Not really. AMD's FSR upscaling can increase visual quality/fidelity while using less power than rendering at full resolution. This can be easily seen in the Steam Deck's battery life improvement when enabling it. Scaling this to millions of devices can indeed reduce energy usage.
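To see why upscaling saves power, it helps to look at the pixel counts involved. The sketch below is simple arithmetic, not a measurement: it assumes FSR's "Quality" mode, which renders at roughly 1/1.5 of the output resolution per axis before upscaling.

```python
# Rough illustration: FSR "Quality" mode renders at ~67% of the output
# resolution per axis, so the GPU shades far fewer pixels per frame.
# The upscaler then reconstructs the full-resolution image.

def pixels(width, height):
    return width * height

native = pixels(3840, 2160)                      # 4K output target
render = pixels(3840 * 2 // 3, 2160 * 2 // 3)    # internal render resolution

print(native)                     # 8294400 pixels shaded at native 4K
print(render)                     # 3686400 pixels shaded with FSR Quality
print(round(render / native, 2))  # 0.44 -> ~44% of the native pixel work
```

Shading cost scales roughly with pixel count, so even after paying for the upscaling pass, the GPU does much less work per frame, which is where the battery-life gain on handhelds comes from.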
When you read about "AI power consumption," it's mostly about training the models, not so much about usage after they're trained.
Training an AI is intensive, but using them after the fact is relatively cheap. Cheaper than traditional rendering to reach the same level of detail. The upfront cost of training is offset by the savings on every video card running the tech from then on. Kinda like how railroads are expensive to build but much cheaper to operate after the fact.
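The amortization argument can be put in numbers. Every figure below is a made-up placeholder chosen only to show the shape of the calculation (a one-time training cost paid back by per-device savings), not a real measurement of any model's energy use.

```python
import math

# Hypothetical break-even sketch: how many devices need to run the trained
# model before the one-time training energy cost is paid back in savings.
# Both inputs are placeholder values, not real data.

def breakeven_devices(training_kwh, kwh_saved_per_device):
    """Devices needed before the one-time training cost is recouped."""
    return math.ceil(training_kwh / kwh_saved_per_device)

# e.g. 1,000,000 kWh to train, 10 kWh saved per device over its lifetime
print(breakeven_devices(1_000_000, 10))  # 100000 devices to break even
```

Past the break-even point, every additional device running the upscaler is a net energy saving, which is the railroad-style argument being made here.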
It's pretty simple. If you can't understand delayed gratification, then you're right: school did fail you.
P.S.: the railroad comparison really breaks down when you consider that railroads are cheaper to build than the highways that trucks use, and that we don't, in fact, need to truck in the resources anyway. We've been building railroads longer than trucks have existed, after all.
Thanks for the totally made up figures. I'm glad we agree that training itself is quite costly. There's no data on how much energy AI will save vs rendering (since we don't know how much rendering we can avoid; there has to be a cap), so you can't really keep riding that horse.
You’re right tho, the rail analogy sucks. Not for the reasons you list tho, but rather because they will never stop training AI. Unless you feel AI will stop learning and needing to evolve.
FSR in this case doesn't need to be trained any further. It's already fully trained, so now it can be released to run on MILLIONS of devices and reduce their load. And then you knock railroads, which are one of the most efficient forms of land transportation we have. Just full of bad takes here.
No ...
No, I'm saying you are fundamentally misunderstanding what technology they're talking about, and are assuming every type of AI is the same. In this article she is talking about graphics AI running on the local system as part of the graphics pipeline. It is less performance-intensive, and therefore less power-intensive. There is no "vast AI network" behind AMD's presumptive work on a competitor to DLSS/frame generation.