this post was submitted on 09 Sep 2024
297 points (99.0% liked)

[–] finley@lemm.ee 154 points 1 month ago* (last edited 1 month ago) (9 children)

Fucking do it. Anything that takes down Nvidia’s CUDA monopoly has my full support.

[–] henfredemars@infosec.pub 78 points 1 month ago (1 children)

Squish them like a bug. Show me you can do it, AMD.

[–] TheFeatureCreature@lemmy.world 30 points 1 month ago* (last edited 1 month ago) (3 children)

Anything to help them take on Nvidia and stay competitive is a good move. However, I wish they would also announce a recommitment to driver and software stability. I had to move to Nvidia for my workstation rig after having constant stability issues with numerous AMD cards across multiple builds. I can handle a few rough edges, or performance that isn't top-of-the-line, but I can't put up with constant crashes and driver timeout errors. It's annoying in games and devastating when I'm working.

I wish their GPU line received even a portion of the polish and care that their CPU line did.

[–] refurbishedrefurbisher@lemmy.sdf.org 14 points 1 month ago (1 children)

As a Linux user, I had to trade in my Nvidia laptop for one with an AMD GPU due to how unstable the Nvidia drivers were and how many problems they were giving me. With the AMD laptop, I have had zero issues.

[–] russjr08@bitforged.space 2 points 1 month ago

I did the same move for similar reasons! Although I still keep Windows around on another SSD, and even the Windows Nvidia drivers were being funky for me.

Nvidia shares a lot of logic between their Windows and Linux drivers as far as I'm aware, so I suppose it makes sense.

[–] TheGrandNagus@lemmy.world 13 points 1 month ago (1 children)

Damn, I've had the exact opposite experience. I had to move away from a 1080 Ti that I was having constant instability with, even after I went back to the retailer and got a new card.

Unfortunately at the time, AMD didn't have anything performance competitive. But it was worth the downgrade for the better drivers.

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 2 points 1 month ago (2 children)

> I had to move away from a 1080 Ti that I was having constant instability with, even after I went back to the retailer and got a new card.

Was it the card or was it something else? Any chance you have a 13th or 14th gen Intel CPU?

[–] TheGrandNagus@lemmy.world 10 points 1 month ago

It was the card, and nah, it long predates 13th/14th gen.

[–] GreyEyedGhost@lemmy.ca 3 points 1 month ago

Isn't it at meme levels when YouTubers' games have their screens go black and they mention Nvidia crashing?

[–] schizo@forum.uncomfortable.business 7 points 1 month ago (2 children)

The annoying part is that their drivers are stable... sometimes.

It's an endless game of seeing whether any specific version is broken in a way that annoys you, and rolling back if you find an issue.

Not exactly a premium experience.

[–] ayaya@lemdro.id 2 points 1 month ago (1 children)

Even on Linux, where their drivers are supposed to be better, my 7900 XTX has been crashing randomly for at least a month, and it was only fixed in yesterday's 6.10.9 kernel release.

Yeah, I've heard the 'AMD drivers are better!' thing for Linux and have always been confused, since I've never had driver issues with Nvidia cards on Linux or Windows.

AMD stuff, on the other hand, has been a nonstop mess, except for my ROG Ally, which for some reason is fine?

In short: computers suck and are unpredictable, or something.

[–] TheFeatureCreature@lemmy.world 1 points 1 month ago

Yeah, this too. My dad's last GPU was AMD and he had to flip-flop between driver versions to fix crashes. I wasn't as lucky; no driver version was able to calm the crashing.

[–] jaxiiruff@lemmy.zip 22 points 1 month ago (1 children)

This checks out after what they recently did to the ZLUDA project. As an owner of an AMD GPU, I agree that ROCm support is really bad. It works half of the time, and fairly poorly at that.
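
For context, ZLUDA's whole pitch is running unmodified CUDA programs on AMD hardware on top of ROCm. A minimal sketch of the sort of stock CUDA runtime code such a layer has to translate (only standard cudaGetDeviceCount / cudaGetDeviceProperties calls; nothing ZLUDA-specific is assumed here):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    // Ask the CUDA runtime how many devices it can see.
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        std::printf("CUDA runtime error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // Under a compatibility layer like ZLUDA, this is the AMD GPU
        // presenting itself as a CUDA device, with ROCm underneath.
        std::printf("Device %d: %s, %zu MB\n",
                    i, prop.name, prop.totalGlobalMem >> 20);
    }
    return 0;
}
```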

[–] DarkThoughts@fedia.io 4 points 1 month ago (11 children)

Even worse on Linux, and worse still on more exotic distros like Bazzite, where I still can't get koboldcpp to run; it was already kind of a hassle on my previous distro.

[–] AmidFuror@fedia.io 17 points 1 month ago (1 children)

CDNA is actually DNA made using RNA as a template. Very important for the viral ecosystem.

[–] thisfro 2 points 1 month ago (1 children)

Don't forget mRNA in the process!

[–] AmidFuror@fedia.io 6 points 1 month ago

Could be mRNA. Could be gRNA. But I think this article wants them in all caps.

[–] lustyargonian@lemm.ee 8 points 1 month ago (1 children)

> When AMD moved on from its GCN microarchitecture back in 2019, the company decided to split its new graphics microarchitecture into two different designs, with RDNA designed to power gaming graphics products for the consumer market while the CDNA architecture was designed specifically to cater to compute-centric AI and HPC workloads in the data center.

I wonder if CDNA will be more akin to the Tensor Cores on RTX GPUs, leading to better ray tracing performance in games.

[–] barsoap@lemm.ee 2 points 1 month ago (1 children)

Tensor cores have nothing to do with raytracing. They're cut-down GPU cores specialising in tensor operations (hence the name) and nothing else. Raytracing is accelerated by RT cores, which handle BVH traversal and ray intersections; the tensor cores are in there to run a denoiser that turns the noisy mess real-time RT produces into something that's, well, not messy. Upscaling, essentially; the only difference between denoising and upscaling is that in upscaling the noise is all square.
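
To make "tensor operations" concrete: the primitive a tensor core executes is a small fused matrix multiply-accumulate, D = A*B + C, on tiny tiles. A minimal CUDA sketch using the wmma API (the 16x16x16 fp16 tile size and row-major layouts are just the simplest documented configuration, picked for illustration):

```cuda
#include <mma.h>
#include <cuda_fp16.h>
using namespace nvcuda;

// One warp computes one 16x16 tile of D = A * B + C,
// which is exactly the operation a tensor core accelerates.
// Compile with nvcc -arch=sm_70 or newer.
__global__ void tile_mma(const half *a, const half *b,
                         const float *c, float *d) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> aF;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> bF;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc;

    wmma::load_matrix_sync(aF, a, 16);                       // leading dim 16
    wmma::load_matrix_sync(bF, b, 16);
    wmma::load_matrix_sync(acc, c, 16, wmma::mem_row_major);
    wmma::mma_sync(acc, aF, bF, acc);                        // the tensor-core op
    wmma::store_matrix_sync(d, acc, 16, wmma::mem_row_major);
}

// Launch with a single warp: tile_mma<<<1, 32>>>(a, b, c, d);
```

The denoiser (and DLSS) are neural nets whose layers boil down to piles of exactly these tile multiplies, which is why the same silicon serves both jobs.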

And judging by how AMD has done this stuff before: nope, they won't do separate cores, but will instead make sure the ordinary cores can do all of that well.

[–] lustyargonian@lemm.ee 1 points 1 month ago (1 children)

Oh, I see. So DLSS, and especially ray reconstruction, use tensor cores, would that be right?

I guess then it may be better to keep expectations low.

[–] barsoap@lemm.ee 2 points 1 month ago

Yep, that's what Nvidia marketing seems to be calling their denoiser nowadays. Gods spare us marketing departments.

[–] Disaster@sh.itjust.works 1 points 1 month ago (1 children)

Will I even be able to afford one? Still rocking a Radeon VII here...

[–] iopq@lemmy.world 4 points 1 month ago (1 children)

If you bought it for $700 back in the day, that's $1,000+ in current dollars.
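
(For anyone who wants to redo that math, the adjustment is just a ratio of consumer price index values; plug in the CPI for the purchase month and for today:)

```latex
\text{price}_{\text{today}} \approx \text{price}_{\text{then}}
\times \frac{\text{CPI}_{\text{today}}}{\text{CPI}_{\text{then}}}
```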

[–] Disaster@sh.itjust.works 2 points 1 month ago

Yep. My paycheck has sadly not scaled in a similar manner :(

Fermi 2.0?

Or is this going to be more like Kepler, where the consumer-grade stuff doesn't have all of the power-hungry features and the datacenter stuff gets them?
