this post was submitted on 11 Sep 2023
807 points (98.2% liked)
This is how games and drivers have been for decades.
There are huge teams at AMD and Nvidia whose job it is to fix shit game code in the drivers. That's why (a) the drivers are massive and (b) you need new drivers all the time if you play new games.
I read an excellent post a while ago here, by Promit.
https://www.gamedev.net/forums/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/5215019/
It's interesting to see that in the 8 years since he wrote it, the "solution" to SLI/Crossfire has simply been to abandon them entirely, and that we still seem stuck in the same position with DX12. Your average game dev still has little idea how to get the best performance out of the hardware, and hardware vendors are still patching things under the hood so they don't look bad in benchmarks.
I'll give a different perspective on what you said: DX12 basically moved half of the complexity that would normally be managed by the driver onto the game/engine devs, who already have too much to do: making the game. The idea is that "the game dev knows best how to optimize for their specific usage," but in reality game devs have no time to deal with hardware complexity, and this is the result.
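To make that concrete, here's a toy model in Python of the shift being described. Everything here is made up for illustration (these are not real D3D11/D3D12 calls, and real resource barriers are far more involved); it only sketches who does the bookkeeping:

```python
class ImplicitDriver:
    """Old model (DX11-era): the driver tracks each resource's state
    and silently inserts any needed transition (barrier) for you."""
    def __init__(self):
        self.states = {}           # resource name -> current state
        self.barriers_issued = 0

    def use(self, resource, needed_state):
        if self.states.get(resource) != needed_state:
            self.barriers_issued += 1          # driver does the work
            self.states[resource] = needed_state


class ExplicitAPI:
    """New model (DX12/Vulkan-style): the API only checks; the engine
    dev must have issued the correct barrier, or it's a bug."""
    def __init__(self):
        self.states = {}

    def barrier(self, resource, before, after):
        # the app declares the transition itself
        assert self.states.get(resource, before) == before
        self.states[resource] = after

    def use(self, resource, needed_state):
        if self.states.get(resource) != needed_state:
            raise RuntimeError(f"{resource} not in {needed_state}: missing barrier")


# Old model: just use the resource; the driver transitions it twice.
drv = ImplicitDriver()
drv.use("tex0", "render_target")
drv.use("tex0", "shader_read")
print(drv.barriers_issued)  # 2

# New model: forget a barrier() call and use() raises instead.
api = ExplicitAPI()
api.barrier("tex0", None, "render_target")
api.use("tex0", "render_target")
api.barrier("tex0", "render_target", "shader_read")
api.use("tex0", "shader_read")
```

The point of the sketch: the total amount of state tracking didn't go away with DX12, it just moved from driver teams (who do this for a living) into every engine's codebase.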
To attribute this most recent failure to an overabundance of hardware variety is a joke. This issue persists on all Nvidia and Intel cards. Why? Because it's an oversight pertaining to one thing they all share in common: their shared interaction with DirectX.
Let me repeat myself for the people in the back. The number of items they had to account for with this failure is one. One API.
This sounds more like hardware manufacturers haven't provided a good enough abstraction layer across their devices, or they did (Vulkan) but everyone is stuck on bad APIs that don't properly map to the hardware's abstractions. Or, even more likely, the publishers cheaped out and pushed something to release before it was ready, like they have been doing forever.
It's also a lack of specialized talent. There's plenty of great talent at game studios and even middleware shops; there's just not much great talent working on renderers and API development. The vast majority of devs just lean on the middleware developer's renderer codebase. In a situation like Bethesda running their own in-house engine, they just don't have the right people for it. This plagued the '90s, when people were trying to code for Glide, OpenGL, and DX5 through DX9. Many studios folded because they couldn't get their tech working with hardware acceleration.
*for current wage
Excellent point.
Lol
PC gaming is, and forever will be, way better than games on consoles.
Why?
I've 3 letters for you.
R G B
( ͡° ͜ʖ ͡°)
tbf, PC gaming was always a fight for performance. I never felt superior back in the day wrestling with QEMM, IRQs for the Sound Blaster, or Glide; it's always been a shitshow. It was a super shitshow in the nineties, a bit better in the 2000s, and nowadays it's again become a tad better.
But somehow I enjoyed that shitshow. Still do.