this post was submitted on 22 Jan 2025
103 points (91.2% liked)

Not The Onion

12852 readers
1047 users here now

Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Comments must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

founded 2 years ago

Couldn't make this shit up if I tried.

[–] unmagical@lemmy.ml 15 points 1 day ago (1 children)

24GB VRAM desktop

That's a $1000 GPU minimum if you go red team, or $1500 for green.

[–] brucethemoose@lemmy.world 7 points 1 day ago* (last edited 1 day ago) (1 children)

Dual 3060s are an option. LLMs can be split across GPUs reasonably well.
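[Editor's note: a rough sketch of the splitting idea. All numbers and the `split_layers` helper are illustrative, not any particular framework's API; real tools (e.g. Hugging Face's `device_map="auto"`) do something similar while also accounting for activations and KV cache.]

```python
# Sketch: assign a model's transformer layers to two GPUs in order,
# spilling onto the second card once the first one fills up.
# Figures are hypothetical; real planners reserve headroom for
# activations, the KV cache, and framework overhead.

def split_layers(layer_sizes_gb, gpu_capacity_gb):
    """Assign consecutive layers to GPU 0 until it fills, then GPU 1."""
    assignment = []
    gpu, used = 0, 0.0
    for size in layer_sizes_gb:
        if gpu == 0 and used + size > gpu_capacity_gb:
            gpu, used = 1, 0.0  # spill remaining layers onto the second GPU
        assignment.append(gpu)
        used += size
    return assignment

# e.g. 40 layers of ~0.55 GB each (a ~22 GB model) across two 12 GB 3060s:
layers = [0.55] * 40
plan = split_layers(layers, 12.0)
```

Only the layer outputs cross the PCIe bus between cards, which is why this kind of pipeline split works reasonably well for LLM inference.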

3090s used to be like $700 used, but ironically they've gone up in price. I got mine for around $800 a while ago and stuffed it into a 10L PC.

Some people buy used P40s. There are rumors of a 24GB Arc B580. Also, AMD Strix Halo APU laptops/mini PCs can host it quite well, with the right software setup... I might buy an ITX board if anyone ever makes one.

There are 12GB/6GB VRAM distillations too, but 24GB is a huge intelligence step-up.

[–] unmagical@lemmy.ml 5 points 1 day ago (1 children)

Totally forgot the 3090 had 24GB. It's definitely still enthusiast territory though.

[–] brucethemoose@lemmy.world 3 points 1 day ago

For sure.

The 14B distillation is still quite good, and usable on like 10GB GPUs. Maybe 8 with the right settings.
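[Editor's note: the back-of-the-envelope arithmetic behind that claim. The `est_vram_gb` helper and the overhead figure are illustrative assumptions; actual usage varies with context length and runtime.]

```python
# Rough VRAM estimate for a quantized LLM: weights take
# params * (bits / 8) bytes, plus some fixed overhead for the
# KV cache and runtime. The 1.5 GB overhead is an assumption.

def est_vram_gb(params_billion, bits_per_weight, overhead_gb=1.5):
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# A 14B model at 4-bit quantization: ~7 GB of weights plus overhead,
# which is why it fits on a ~10 GB card.
print(est_vram_gb(14, 4))  # → 8.5
```

Tighter settings (a smaller KV cache, or ~3-bit quantization) shave off the last gigabyte or two, which is how it can squeeze into 8GB.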