Also, lots of users aren't gonna want the main system memory on the CPU die. Aside from creating a clear path for vendors to artificially inflate prices through manufactured scarcity, product segmentation, and bundling, it also prevents end users from upgrading their machines.
I'm pretty sure this even goes against the stated goals of the EU in terms of reduction of e-waste.
I have no doubt that a handful of vendors cooperating could restrict their offerings and force the hand of end users, but I don't think that would be here to stay. Unless it provides such a drastic performance boost (like 2x or more) that it becomes enough of an incentive to convince the masses.
Outside of DIY, end users don’t care. See: Apple.
Also, if you have a laptop with LPDDR5, it is soldered. If it has DDR5 or some variant of DDR4, it is likely also soldered as most OEMs did away with DIMM slots.
I don’t like or agree with the practice.
Even people who build their own computers usually buy all the RAM they want at the time that they're building it.
The biggest difference to them is likely the feeling that they're losing the ability to upgrade, more than any upgrade they would actually do. I still think that feeling is an important factor, though.
Biggest difference is that defective RAM can cost you a lot more imo.
Yes, but statistically it’ll be caught during the return or warranty period, and then RAM failures are extremely rare after that.
Frame.work laptops have non-soldered, upgradeable DDR5 memory. In fact, you can buy a laptop with no memory, buy the RAM somewhere else, and install it yourself.
Yeah, but it is regular DDR5, which is less power efficient.
I do love Framework, however. They are at the top of my list when I eventually upgrade my laptop.
Hopefully they give us CAMM2 modules with LPDDR5 at that point.
I always think of my old Asus Eee PC netbook from 2010 that had a special compartment accessible from the outside, without opening up the netbook itself, just so that users would be able to upgrade their RAM. How did we go from "help the user get what he needs" to "help the user get what we need"? Personally I blame Apple for this, tbh.
This is how this looked: http://images.bit-tech.net/content_images/2007/12/add_more_storage_space_to_your_asus_eee_pc/panel.jpg
And the best part: my son is using this netbook now with a lightweight Linux. I actually swapped the RAM two months ago. It even plays Minecraft, and he draws on it with my drawing tablet.
In the case of LPDDR5, we don’t have removable memory due to tight signaling requirements and the fact that the DIMM slots themselves take up too much space when populated.
LPCAMM2 solves this, so I hope it is widely adopted going forward, because LPDDR5 offers a huge upgrade over the previous gen.
But even soldered ram isn’t as bad as in-cpu ram. Soldered ram can be replaced/upgraded by skilled technicians. I don’t think that’s possible at all with in-cpu ram.
Ok, I know it isn't the point of your comment, and I agree with the whole premise, but who, I say who, is soldering their own RAM? I admit that it should be possible, but between the limited upgradeability and the skill you'd need... I'd say it puts soldered RAM into the same echelon as "not upgradeable".
Can anyone speak to this? Am I wrong about the difficulty and hardware limits?
Exactly. Few people are willing to deal with the adhesive used in Macs and smartphones. Even fewer will deal with solder.
Yeah, I agree.
As for who those few are, well, I wouldn't myself... probably... but I'd definitely like the option of taking my laptop to someone like Louis Rossmann who can do such work. He's even shown that sometimes the RAM gets destroyed by Apple's weird circuit designs, and if it were just soldered on, the laptop and all your data would actually be salvageable.
Hmm, interesting. Thank you!
Yeah, but at least for now, we can still buy laptops with unsoldered RAM and storage.🤞
Besides, Apple is more of a cult than a tech company, so I am not convinced their customers should be taken as an example of natural customer behavior.
And I agree that most users don't care, although this is mostly true in corporate environments, where computers have an expected lifespan of 3 years tops. In that case, having the RAM soldered or not doesn't change anything, as the machine will get spec'ed according to what the company needs and will get replaced before it ever reaches obsolescence.
As for end users, many still plan to keep a machine 5+ years, and if you check the average "long lasting" (~2k USD) machine from 5 years ago, it's an 8th gen i5 (4 cores, 8 threads) with 8GB of DDR4 and a 256GB, or at best 512GB, SSD. Not that those are terrible specs by today's standards, but the people who spent 2k on a machine back then will probably want at least 16GB of RAM now. And a 1TB SSD. And if at all possible, more than 8 threads. Heck, I just got a workstation for 550 bucks that has a Ryzen 7 with 16 threads...
And that's where companies like Framework come in. I advocate for them as much as possible, along with companies like System76 and Purism. If we keep voting with our wallets for such companies, even if the CPU becomes an SoC entirely, we will still have upgrade paths thanks to the modularity of their laptops.
Edit: as expected, religious people got offended about me calling out their religion, thus proving my point. 🥲
Edit 2: don't get me wrong, I'm not denying that Apple has a good tech stack (as a BSD lover, that would be silly), and the Lemmy audience is likely aware of that too. But it is also abundantly clear that the overwhelming majority of Apple customers have absolutely zero idea what makes their "must have" tech stand out, and are merely in it for the cult part. If Apple stopped making sense technologically, it wouldn't make the slightest difference to them.
The iPhone I'm on right now feels more like a piece of tech than a religious symbol, but how would I know?
On-CPU RAM does provide much faster performance. That's the reason they are going that route.
It's part of the reason why RAM was always placed close to the CPU on the motherboard anyway. The farther they are apart, the more time and energy is used to transfer data and instructions between them.
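A rough back-of-envelope makes the scale obvious (the trace speed and DDR5-6400 timing below are illustrative assumptions, not vendor specs):

```python
# Signals on a PCB trace travel at roughly half the speed of light,
# so every extra centimetre of distance costs real time (and drive energy).
C = 3.0e8                 # speed of light, m/s
TRACE_SPEED = 0.5 * C     # assumed signal speed on a typical FR-4 trace

def one_way_delay_ns(distance_cm: float) -> float:
    """One-way propagation delay for a trace of the given length."""
    return (distance_cm / 100) / TRACE_SPEED * 1e9

# DDR5-6400 toggles at 3.2 GHz, i.e. one bit every ~0.3125 ns per pin.
bit_time_ns = 1 / 3.2e9 * 1e9

for cm in (1, 5, 10):     # roughly: on-package vs. DIMM-slot distances
    d = one_way_delay_ns(cm)
    print(f"{cm:>2} cm trace: {d:.3f} ns one-way "
          f"({d / bit_time_ns:.2f} bit times at DDR5-6400)")
```

At DIMM-slot distances a bit can spend more than a whole bit time just in flight, which is why keeping signals clean gets harder the further the RAM sits from the CPU.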
Right, it's a physics issue, not greed. I mean, they're going to make a margin off of it for sure, but that's not the sole reason to do this.
I'm imagining a world with desktops and laptops that have On-CPU-RAM and On-Motherboard-RAM with the traditionally slotted RAM acting as a swap for the On-CPU-RAM.
I mean, isn't that in principle how swap traditionally works? It takes up some space on your slower disk drive to "swap" data out of RAM when you run out of RAM. On-motherboard RAM, since it's slower than on-CPU RAM, could serve the same purpose, meaning limited on-CPU RAM wouldn't be as impactful.
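Something like this minimal sketch, where a small fast tier evicts least-recently-used pages to a bigger, slower tier (the tier names and sizes here are made up purely for illustration):

```python
# Toy model of a two-tier memory: a small fast tier ("on-CPU RAM") backed
# by a larger slow tier ("on-motherboard RAM") acting as swap space.
from collections import OrderedDict

class TieredMemory:
    def __init__(self, fast_capacity_pages: int):
        self.fast = OrderedDict()   # small, fast, kept in LRU order
        self.slow = {}              # bigger, slower "swap" tier
        self.fast_capacity = fast_capacity_pages

    def access(self, page: int) -> str:
        if page in self.fast:                 # hit in the fast tier
            self.fast.move_to_end(page)
            return "fast hit"
        # Miss: pull the page in (from the slow tier if it lives there),
        # evicting the least recently used page to the slow tier if full.
        data = self.slow.pop(page, object())
        if len(self.fast) >= self.fast_capacity:
            victim, victim_data = self.fast.popitem(last=False)
            self.slow[victim] = victim_data
        self.fast[page] = data
        return "swapped in"

mem = TieredMemory(fast_capacity_pages=2)
for p in (1, 2, 1, 3, 2):
    print(p, mem.access(p))       # 3 and the second 2 trigger evictions
```

It's the same idea the kernel already uses for disk-backed swap, just with a much smaller latency gap between the two tiers.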
Greed might not be the main driving force, but it's absolutely there too. I predict on-CPU RAM costing more than it should in the future due to lack of competition. (Yes, I know there aren't that many manufacturers of the actual chips even today, when consumers can choose from many brands of RAM sticks.)
Which makes a lot of sense as RAM speed is the one big bottleneck.
People who really care about computers buy handmade artisanal transistors.
I know you're kidding, but: http://www.homebrewcpu.com/
Wasn't there a guy who made his own GPU too?
Edit: Yes, https://www.youtube.com/watch?v=-vHwZhWoWkk