Is it just me or does this article seem a bit oversimplified, slightly inaccurate, and rant-y? The author makes a lot of claims I feel need verifying, which is why I haven't answered my own question yet.
For example:
The author goes on to discuss Moore's law being dead and Amdahl's law, which is essentially a law of diminishing returns on core counts: once the serial part of a workload dominates, extra cores stop helping.
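To be fair, Amdahl's law really does cap what extra cores can buy you. Here's a minimal sketch of it, with a made-up parallel fraction of my own choosing rather than anything from the article:

```python
# Minimal sketch of Amdahl's law (my own illustration with an assumed
# parallel fraction, not numbers from the article).
# Speedup on n cores: S(n) = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can run in parallel.

def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup for a workload with parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

if __name__ == "__main__":
    p = 0.90  # assume 90% of the work parallelizes
    for cores in (1, 2, 4, 8, 16, 64, 1024):
        print(f"{cores:5d} cores -> {amdahl_speedup(p, cores):6.2f}x speedup")
    # Even with unbounded cores the speedup tops out at 1 / (1 - p) = 10x here,
    # which is the "diminishing returns" the author is pointing at.
```

But that's a statement about parallel scaling, not about overall progress.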
While we may not be getting the performance improvements we used to, saying that computers today aren't much faster than a decade ago is sensationalized.
What does "much" mean in numbers? Take a look at CPU benchmarks from the last ten years and you can see just how much processor performance has improved; I find it noticeable.
And the author didn't mention the microcode optimisation work that's been done since the mid-2010s, which seems like an odd oversight.
More to the point of the article, another example is the comparison of Project Oberon vs. modern Ubuntu, meant to demonstrate how bloated modern code is. The facts cited may well be perfectly accurate, and the claim that modern software is bloated is quite likely true, but the comparison is disingenuous for several reasons.
First, Project Oberon was specifically dedicated to building a very lean, understandable system, and it was built in the late 1980s, when system memory was measured in kilobytes or maybe megabytes.
Multitasking operating systems of that era couldn't do anywhere near as much as they can now: computers were almost all text-based, or had very low-resolution graphics, and many modern protocols didn't exist yet. No HTTP, so no web servers; no SSH clients and daemons; and so on. So comparing modern Ubuntu to an ancient project intended to be lean for its day isn't useful.
There's also no mention of size in terms of machine instructions, only source code lines, and those lines are counted across two different languages, so that's also an unfair comparison.
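If you wanted a fairer size comparison you'd look at instruction counts or binary sizes rather than raw source lines. A quick, purely illustrative sketch (standard-library dis module, toy function of my own) of why line counts are a poor proxy:

```python
import dis

# One logical source line in the body...
def poly(x):
    return 3 * x ** 3 - 2 * x ** 2 + x - 7

# ...but it compiles to many more bytecode instructions, and the ratio
# depends entirely on the language and toolchain, which is exactly why
# comparing raw line counts across two different languages tells you little.
instructions = list(dis.get_instructions(poly))
print("source lines in the body:", 1)
print("bytecode instructions:", len(instructions))
```

Bytecode isn't machine code, of course, but the same mismatch shows up with native compilers too.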
And there's no mention of what was or wasn't included in the line counts for Ubuntu. Maybe it is the whole distro, maybe all packages, who knows.
Then there's the main claim, that systems are too big to understand. But that's the whole raison d'être of the systems engineering and software engineering approaches: break complex problems and solutions down into manageable chunks and distribute the work.
If we limited ourselves to solutions that a single person could understand in their entirety, we would be missing a lot of modern technology: cars, satellites, airplanes, televisions, smartphones, and so on.