GlowHuddy

joined 1 year ago
[–] GlowHuddy@lemmy.world 10 points 4 months ago

Unfortunately, their story didn't end well.

https://en.m.wikipedia.org/wiki/Katyn_massacre

[–] GlowHuddy@lemmy.world 20 points 6 months ago

And of course it's just to raise their stock a little bit :) Big corpos are real-life monsters

[–] GlowHuddy@lemmy.world 14 points 7 months ago (1 children)

Could be both of those things as well.

[–] GlowHuddy@lemmy.world 13 points 7 months ago (1 children)

Yeah, I'm currently using that one, and I would happily stick with it, but it seems AMD hardware just isn't up to par with Nvidia when it comes to ML.

Just take a look at the benchmarks for stable diffusion:

[–] GlowHuddy@lemmy.world 4 points 7 months ago (1 children)

Now I'm actually considering that one as well. Or I'll wait a generation I guess, since maybe by then Radeon will at least be comparable to NVIDIA in terms of compute/ML.

Damn you NVIDIA

[–] GlowHuddy@lemmy.world 7 points 7 months ago (6 children)

Yeah, I was just reading about it and it kind of sucks, since one of the main reasons I wanted to go Wayland was multi-monitor VRR, and I can see that's also an issue without explicit sync :/

75
submitted 7 months ago* (last edited 7 months ago) by GlowHuddy@lemmy.world to c/linux@lemmy.ml

I currently have an RX 6700 XT and I'm quite happy with it for gaming and regular desktop usage, but I was recently doing some local ML stuff and was made aware of the huge gap NVIDIA has over AMD in that space.

But yeah, going back to NVIDIA (I used to run a 1080) after going AMD... feels kinda dirty to me ;-; I was very happy to move to AMD and finally be free from the walled garden.

At first I thought about just buying a second GPU, keeping my 6700 XT for gaming and using the NVIDIA card only for ML, but unfortunately my motherboard doesn't have two PCIe slots I could use for GPUs, so I need to choose. I could get a used RTX 3090 for a fair price, since I don't want to go for the current gen because of its pricing.

So my question is: how is NVIDIA nowadays? I specifically mean Wayland compatibility, since I just recently switched and it would suck to go back to Xorg. Other than that, are there any hurdles, issues, or annoyances, or is it smooth and seamless nowadays? Would you upgrade in my case?

EDIT: Forgot to mention, I'm currently using GNOME on Arch (btw), since that might be relevant.

[–] GlowHuddy@lemmy.world 4 points 9 months ago

Interesting thought, maybe it's a mix of both of those factors? I mean, I remember using AI to work with images a few years back when I was still studying. It was mostly detection and segmentation though. But generation seems like a natural next step.

But improving image generation definitely doesn't suffer from a lack of funding and resources nowadays.

[–] GlowHuddy@lemmy.world 10 points 9 months ago (2 children)

I mean, we didn't choose it directly - it just turns out that's what AI seems to be really good at. Companies firing people because it's 'cheaper' this way (despite the fact that the tech is still not perfect) is another story though.

[–] GlowHuddy@lemmy.world 3 points 10 months ago

wow, TIL something as well I guess

[–] GlowHuddy@lemmy.world 9 points 10 months ago (3 children)

The frequency is not directly proportional to the wavelength - it's inversely proportional: https://en.wikipedia.org/wiki/Proportionality_(mathematics)#Inverse_proportionality

Think of it like this: the wavelength is the distance that light travels during one wave, i.e. one cycle. Light propagates at the speed of light, so the smaller the wavelength, the higher the frequency must be. If the wavelength gets two times smaller, the frequency increases two times. As the wavelength approaches 0, the frequency grows very quickly, approaching infinity.

The plot is not a straight line but a hyperbola.
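If it helps, here's a minimal sketch of the relation f = c / λ with a few made-up example wavelengths (the specific numbers are only illustrative; the point is that halving the wavelength doubles the frequency):

```python
# Frequency and wavelength are inversely proportional: f = c / wavelength
c = 299_792_458  # speed of light in m/s

for wavelength_nm in (700, 350, 175):  # halve the wavelength each step
    wavelength_m = wavelength_nm * 1e-9
    frequency_thz = c / wavelength_m / 1e12
    print(f"{wavelength_nm} nm -> {frequency_thz:.0f} THz")

# prints roughly:
# 700 nm -> 428 THz
# 350 nm -> 857 THz
# 175 nm -> 1713 THz  (each halving of the wavelength doubles the frequency)
```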

[–] GlowHuddy@lemmy.world 19 points 10 months ago (6 children)

Super + T is my favorite

[–] GlowHuddy@lemmy.world 4 points 1 year ago (1 children)

For Logitech devices there is also Solaar.

You can check if it has the functionality you want (not sure, since I haven't used it much and only for basic stuff).


On a sidenote, I believe this meme did not foresee webasm ;-;
