MudMan

joined 6 months ago
[–] MudMan@fedia.io 1 points 19 hours ago

Right, but now you're giving me arguments for using original hardware, not for modifying it.

See, that's the part that loses me. I use original hardware all the time, I have tons of original hardware and software for a whole bunch of platforms, including ones that are trivial to run in cycle-accurate emulators and FPGA reproductions. All good there. I even have some flashcarts and softmods to allow cross-region usage or to consolidate libraries. No problem with that.

But that is based on using the original hardware, unmodified. Once you start gutting it for mods you're working against your own argument that the complexities and subtleties of original hardware are important. I mean, yeah, I do care to at least have a way to go back and sanity check the subtle ways in which original hardware parses the code in a ROM. But for that same reason I want to see how the default composite or RF signal subtly interacts with that output and with a period-accurate CRT display. I want to hear the CD spinning when it's supposed to spin and the original loading times.

To be clear, I think this one is just a case mod, but I'm talking about the modding scene more generally. I don't see why you would think "total accuracy" is important in the interaction between the CPU, VDPs and RAM but not in the I/O. Wouldn't the CD drive and the video signal be part of "total accuracy"? Wouldn't the form factor of the shell and the controllers be part of that accurate experience as well? If you push me I'd even say a MiSTer FPGA setup with a correct analogue video output and an original controller feeding into a CRT is far more accurate to the original NES than the original Analogue Nt that was made from gutted NES parts, or even than an original console pushed through an HDMI scaler or mod.

I guess there is no accounting for taste, but I do struggle to follow the logic where running the original CPU and video chips on completely different I/O is justified by trying to maximize for accuracy.

[–] MudMan@fedia.io 3 points 1 day ago (1 children)

Okay, but you can do that with a softmod or a flash cart on the Saturn, too. You don't need to rip out its guts to transplant them into a different case. Even if you had a Saturn with a faulty drive, you could add an ODE without having to sacrifice the original form factor.

Plus, yeah, Saturn emulation is harder and less accurate than for other systems, but we're pretty much there these days for most of the stuff you want to play. You can do all sorts of cool cases and consolized devices to play old games these days, so why break apart an original Saturn for that?

[–] MudMan@fedia.io 11 points 1 day ago

Let me agree with you explicitly on loving the return to a sane power configuration here. I was watching Hardware Unboxed's retest of this after the patches and it takes almost fifteen minutes of them reiterating that the 9700X and the 14700K are tied for performance and price before they even mention the bombshell that the 9700X is doing that with about half the wattage.

The fact that we keep pushing reviews and benchmarks focused strictly on pedal-to-the-metal overclocked performance and nothing else is such a disgrace. I made the mistake of buying into a 13700K, and I keep it manually capped below the out-of-the-box power limits, both to prevent longevity issues and because this damn computer is more effective as a hair dryer than anything else.
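
To put the efficiency gap in rough numbers, here's a minimal performance-per-watt sketch; the scores and package power figures below are placeholder assumptions, not measurements from the review:

```python
# Back-of-the-envelope performance-per-watt comparison with illustrative numbers.
# Swap in real benchmark scores and measured package power to get actual figures.
def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

# Assumed: roughly tied scores, at very different power draw (hypothetical values).
score_9700x, watts_9700x = 100.0, 90.0
score_14700k, watts_14700k = 100.0, 180.0

advantage = perf_per_watt(score_9700x, watts_9700x) / perf_per_watt(score_14700k, watts_14700k)
print(f"9700X efficiency advantage: {advantage:.1f}x")  # ~2x if it really is half the wattage
```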

We don't mention it much because Intel was in the process of catching on actual fire at the same time, but the way this generation has been marketed, presented to reviewers, supported and eventually reviewed has been a massive trainwreck, considering the performance of the actual product.

[–] MudMan@fedia.io 3 points 1 day ago (2 children)

Okay, people will likely bash me for this hot take, and if this is for you feel free to enjoy it, but...

...why do this instead of using an FPGA or emulation-based solution?

If you don't want to run original media or output the original video signal, why bother using original hardware at all? I can see it as a way to upcycle heavily damaged Saturn units, but there aren't that many of those in the wild in the first place, so why dismantle an existing piece to make what is effectively a completely different product? I don't see the point.

[–] MudMan@fedia.io 4 points 1 day ago

I don't know. I mean, he does sound like he catches himself, and he isn't that good of an actor. But then, who the hell has that just... ready to go to the point where it just blurts out by itself? Like, how often do you have to say that out loud for it to just hijack your train of thought? It's almost less damning if he did it on purpose, honestly.

[–] MudMan@fedia.io 14 points 2 days ago

Well, that's a new one. I wasn't expecting the leopard to be resentful about all the face eating.

[–] MudMan@fedia.io 1 points 2 days ago

See, that I can get behind. Credit where it's due.

[–] MudMan@fedia.io 3 points 2 days ago (1 children)

MSAA is pretty solid, but it has its own quirks and it's super heavy for how well it works. There's a reason we moved on from it and towards TAA eventually. And DLSS is, honestly, just very good TAA, Nvidia marketing aside.

I am very confused by the concept of "fake performance". If the animation looks smooth to you then it's smooth. None of it exists in real life. Like every newfangled visual tech, it's super in-your-face until you get used to it. Frankly, I've stopped thinking about it in the games where I do use it, and I use it whenever it's available. If you want to argue about increased latency we can talk about that, but I personally don't notice it much in most games as long as it's relatively consistent.
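
For the latency angle, here's a minimal sketch of the trade-off, assuming a simple model where generated frames double the presented framerate while input is still sampled at the base render rate; the numbers are illustrative, not measured from any game:

```python
# Illustrative frame-generation arithmetic: presented smoothness roughly doubles,
# while input latency tracks the base render rate plus the cost of holding a frame.
base_fps = 40.0                       # hypothetical native framerate
base_frame_ms = 1000.0 / base_fps     # 25 ms between real frames

presented_fps = base_fps * 2          # generated frames inserted in between
held_frame_ms = base_frame_ms / 2     # assumption: added delay from buffering for interpolation

print(f"Presented: {presented_fps:.0f} fps ({1000.0 / presented_fps:.1f} ms per shown frame)")
print(f"Added latency vs native: roughly {held_frame_ms:.1f} ms")
```

Whether that extra handful of milliseconds is noticeable is exactly the consistency point above.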

I do understand the feeling of having to worry about performance and being hyper-aware of it being annoying, but as we've litigated up and down this thread, that ship sailed for PC gaming. If you don't want to have to worry, the real answer is getting a console, I'm afraid.

[–] MudMan@fedia.io 1 points 2 days ago

Yeah, optimizing for scalability is the only sane choice from the dev side when you're juggling hardware ranging from the Switch and the Steam Deck to the bananas nonsense insanity that is the 4090. And like I said earlier, often you don't even get different binaries or drivers for those, the same game has to support all of it at once.

It's true that there are still some set targets along the way. The PS5 is one, the Switch is one if you support it, the Steam Deck is there if you're aiming to support low power gaming. But that's beside the point; the PS5 alone requires two to three setups to be designed, implemented and tested. PC compatibility testing is a nightmare at the best of times, and with a host of display refresh rates, arbitrary resolutions and all sorts of integrated and dedicated GPUs from three different vendors expected to get support, it's outright impossible to do granularly. The idea that PC games have become less supported or supportive of scalability is absurd. I remember the days when a game would support one GPU. As in, the one. If you had any other one it was software rendering at best. Sometimes you had to buy a separate box for each supported card.

We got used to the good stuff during Nvidia's 900 and 1000 series basically running console games maxed out at 1080p60, but that was a very brief slice of time; it's gone and it's not coming back.

[–] MudMan@fedia.io 3 points 2 days ago

Yeah, although I am always reluctant to quantify visual quality like that. What is "65% better" in terms of a game playing smoothly or looking good?

The PS5 Pro reveal was a disaster, partially because if you're trying to demonstrate how much nicer a higher resolution, higher framerate experience is, a heavily compressed, low bitrate YouTube video that most people are going to watch at 1080p or lower is not going to do it. I have no doubt that you can tell how much smoother or less aliased an image is on the Pro. But that doesn't mean the returns scale linearly, you're right about that. I can tell a 4K picture from a 1080p one, but I can REALLY tell a 480p image from a 1080p one. And it's one thing to add soft shadows to a picture and another to add textures to a flat polygon.
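
Just to put the diminishing returns in numbers: both jumps are a similar multiple of raw pixels, yet the second one is far less noticeable. A quick sketch, assuming 854x480 for 480p:

```python
# Pixel-count ratios between common resolutions: the 480p -> 1080p and 1080p -> 4K
# jumps are similar multiples of raw pixels, but the perceived gain shrinks a lot.
resolutions = {
    "480p": (854, 480),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"480p -> 1080p: {pixels['1080p'] / pixels['480p']:.1f}x the pixels")  # ~5.1x
print(f"1080p -> 4K:   {pixels['4K'] / pixels['1080p']:.1f}x the pixels")    # 4.0x
```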

If anything, gaming as a hobby has been a tech thing for so long that we're not ready for the shift to being limited by money and artistic quality rather than processing power. Arguably this entire conversation is pointless in that the best looking game of 2024 is Thank Goodness You're Here, and it's not even close.

[–] MudMan@fedia.io 3 points 2 days ago

Yep. The thing is, even if you're on high end hardware doing offline CGI you're using these techniques for denoising. If you're doing academic research you're probably upscaling with machine learning.

People get stuck on the "AI" nonsense, but ultimately you need upscaling and denoising of some sort to render a certain tier of visuals. You want the highest quality version of that you can fit in your budgeted frame time. If that is using machine learning, great. If it isn't, great as well. It's all tensor math anyway; it's about using your GPU compute in the most efficient way you can.
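
To make the frame-time budget point concrete, here's a minimal sketch with assumed numbers; the per-pass millisecond costs are placeholders, not measurements from any particular GPU or upscaler:

```python
# Illustrative frame-time budgeting: whatever the upscaling and denoising passes cost
# has to fit, along with everything else, inside the per-frame budget for the target framerate.
target_fps = 60.0
frame_budget_ms = 1000.0 / target_fps   # ~16.7 ms per frame at 60 fps

# Hypothetical per-frame costs in milliseconds.
render_internal_ms = 11.0   # shading at a lower internal resolution
upscale_ms = 1.5            # ML or hand-tuned upscaler pass
denoise_ms = 2.0            # denoising ray-traced effects

used_ms = render_internal_ms + upscale_ms + denoise_ms
print(f"Budget: {frame_budget_ms:.1f} ms, used: {used_ms:.1f} ms, headroom: {frame_budget_ms - used_ms:.1f} ms")
```

Whichever pass delivers the best image for those couple of milliseconds wins, machine learning or not.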

[–] MudMan@fedia.io 8 points 2 days ago (2 children)

I mean... OK, but AMD just revealed a new set of AI-powered upscaling libraries along with Sony for the PS5 Pro, and is on record saying they're backing out of high-end gaming hardware to pivot to data center hardware, so... I hope you have more reasons than this, because I don't think they disagree.
