Quark's


Come to Quark’s, Quark’s is Fun!

General off-topic chat for the crew of startrek.website. Trek-adjacent discussions, other sci-fi television, navigating the Fediverse, server meta (within reason), selling expired cases of Yamok sauce, it’s all fair game.


[–] n3m37h@lemmy.world 5 points 1 year ago (1 children)

Can NASA start making motherboards for PCs? The average lifetime of one these days is under 10 years.

[–] lurch@sh.itjust.works 5 points 1 year ago (1 children)

They usually become outdated before they break, i.e. an important component like the CPU can't be upgraded any more because it won't fit, so you have to replace the whole thing.

[–] n3m37h@lemmy.world 2 points 1 year ago (2 children)

Yes, I know Intel works on a tick-tock system, so every two generations they change the pinout. AM4 lasted four generations; hopefully AM5 will too, but it's unlikely. Still, a CPU will last well over 10 years. Motherboards, on the other hand, don't, even though they should last a lot longer, especially considering all the tantalum and other SMD capacitors.

Voyager also only uses milliwatts of power, compared to the 25-300 W of current CPUs.

Also, I was joking about NASA doing anything other than space stuff.

[–] hips_and_nips@lemmy.world 5 points 1 year ago* (last edited 1 year ago) (1 children)

Just FYI, the tick-tock model followed by Intel doesn't directly have anything to do with sockets and pinouts.

The tick-tock model meant that each change of microarchitecture (a "tock") was followed by a die shrink of the process technology (a "tick"). While a new socket is often a consequence of these changes, it's a byproduct rather than an intentional part of the cadence.

Furthermore, Intel hasn’t used the tick-tock model since 2016.

However, trying to compare terrestrial consumer hardware with rugged, radiation-hardened hardware is futile. They have drastically different design and engineering specs with hard limits imposed by physics, including special process nodes for true radiation hardening by process (RHBP). I think those are only around 150 nm; I want to say there were some 65 nm RHBP FPGAs recently, but I'm not 100% sure.

I have a feeling, though, that if NASA were to make components, they'd all just be specialized embedded systems rather than anything consumer or enterprise. After all, computers are just tools for different jobs.

[–] n3m37h@lemmy.world 3 points 1 year ago (1 children)

Can ya not take a joke? I even stated as much lmao

[–] hips_and_nips@lemmy.world 3 points 1 year ago (1 children)

Hahaha, yeahhhhhh, sorry mate. I get going about space electronics and there goes the rest of the day!

[–] n3m37h@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

All good, I get a kick out of showing off my knowledge too, haha, or lack thereof in most cases.

I still want a mobo that can outlast my fkn CPU! HELP NASA! ;)

PS: thanks for correcting me about the end of Intel's tick-tock model, with their inability to shrink a node for five or so years.

[–] Cort@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

While AM4 lasted four generations, you can't put 4th-gen chips in a 1st-gen board or vice versa. There are even some 1st-gen boards that can't run 3rd-gen chips and vice versa.

I was fairly disappointed when I found out AMD blocked Asus from updating 1st- and 2nd-gen motherboards to use PCIe 4.0 with an AGESA update to the BIOS. "Blocked" is probably the wrong word, though, as Asus had already released the BIOS updates that unlocked PCIe 4.0 on 1st-gen boards with 3rd-gen CPUs. /Rant

Edited: verb tenses

[–] n3m37h@lemmy.world 2 points 1 year ago (1 children)

Sadly there is actually a good reason PCIe Gen 3 boards can't support Gen 4 even though it's a firmware lock: signal integrity.

[–] Cort@lemmy.world 2 points 1 year ago (1 children)

No, like Asus tested and certified half of their existing motherboards and released the update, and it worked fine for a couple of weeks before AMD removed that ability. I get why some people may not want to risk signal integrity, but that should be my choice, not AMD's.

[–] n3m37h@lemmy.world 1 points 1 year ago

Never heard about that. Pretty shady shit.