this post was submitted on 03 Jul 2023
14 points (93.8% liked)

Technology

[–] Scooter411@lemmy.ml 2 points 1 year ago (16 children)

Someone smarter than me tell me if this is amazing and game changing or boring and niche.

[–] Slambert@lemmy.ml 2 points 1 year ago* (last edited 1 year ago) (10 children)

I wouldn't say I'm smarter than you, just that I know some things about how computer components work. What you're looking at is the latter: boring and niche.

The problem with trying to move to another type of computer is that modern software is designed solely for digital machines. Given that, how would you port those programs over?

The answer is that you don't. Porting between CPU architectures already takes significant effort for most programs; porting to a fundamentally different kind of computer would take far longer.

That is, if you can port anything at all. Because digital and analogue computers work in completely different ways, you couldn't translate existing programs directly; you'd have to build functional clones from scratch by referencing the source code. If you don't have the source, you're out of luck.
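To make the gap concrete, here's a minimal sketch (purely illustrative; the function name, equation, and step size are my own choices, not anything from the article): an analogue integrator solves an equation like dx/dt = -x continuously, as a property of its physics, while a digital machine can only approximate that same solution as a long sequence of discrete instruction steps.

```python
import math

# A digital computer approximates what an analogue integrator does
# physically. Here we solve dx/dt = -x, whose exact solution is
# x(t) = e^(-t), by stepping forward in small discrete chunks
# (Euler's method).

def euler_integrate(x0, rate, t_end, dt):
    """Discretely approximate an analogue integrator's output at t_end."""
    x = x0
    steps = int(t_end / dt)
    for _ in range(steps):
        x += rate(x) * dt  # one discrete update per step
    return x

approx = euler_integrate(1.0, lambda x: -x, t_end=1.0, dt=0.0001)
exact = math.exp(-1.0)
print(abs(approx - exact) < 1e-3)  # close to the continuous answer, never exact
```

The point of the sketch: the digital version is a loop over discrete instructions, which is exactly the kind of structure that has no direct equivalent on an analogue machine, so "porting" means re-deriving the underlying maths rather than translating the code.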

TL;DR: We've over-invested in digital computers and there's no going back.

[–] Scooter411@lemmy.ml 0 points 1 year ago (3 children)

Thank you. That helps clear it up!

[–] Slambert@lemmy.ml 1 point 1 year ago* (last edited 1 year ago)

No problem! I'm sorry if I came off as hostile towards analogue machines here. I actually think they're cool, just not in the way people think they are ("unraveling Moore's law" is a bit far-fetched, Microsoft).

Oh, and some advice for anyone who isn't well-versed in technology: the tech industry isn't known for honesty. For them, hype comes before everything, even profitability. Take any "revolutionary" or "innovative" new piece of tech with a grain of salt, especially now that tech companies are getting a bit goofy with their promises as investors realize that unprofitable companies aren't sustainable.

EDIT: The two deleted comments are duplicates of this one that were posted due to a bug in Jerboa. Sorry for any confusion!
