I agree with you, but you know how Apple operates, slapping a shiny new name on an already existing concept and making it sound premium.
It's not virtualization. Linux actually boots and runs on bare metal, the same way Windows runs on a normal Windows computer: proprietary, closed UEFI firmware handles the boot process and then loads an OS from the "hard drive" portion of non-volatile storage (usually an SSD on Windows machines). Whether you run Linux or Windows, that boot process starts the same way.
Asahi Linux is set up so that Apple's firmware loads a Linux bootloader instead of booting macOS.
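If you want to see that for yourself from a running install, a quick sketch like the one below works. It just shells out to systemd-detect-virt, which ships with systemd-based distros, so treat the exact tooling as an assumption about your setup rather than anything Asahi-specific:

```python
#!/usr/bin/env python3
"""Rough sketch: report whether the running Linux system is virtualized.

Assumes systemd-detect-virt is available (it ships with systemd-based
distros). On bare metal it prints "none"; under a hypervisor it prints
the detected VM type.
"""
import shutil
import subprocess


def detect_virtualization() -> str:
    # Bail out gracefully if the helper tool isn't installed.
    if shutil.which("systemd-detect-virt") is None:
        return "unknown (systemd-detect-virt not found)"
    result = subprocess.run(
        ["systemd-detect-virt"], capture_output=True, text=True
    )
    return result.stdout.strip() or "none"


if __name__ == "__main__":
    virt = detect_virtualization()
    if virt == "none":
        print("No hypervisor detected: running on bare metal.")
    else:
        print(f"Virtualization detected: {virt}")
```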
Apple's base configurations are generally cheaper than similarly specced competitors, because their CPUs/GPUs are so much cheaper than comparable Intel/AMD/Nvidia chips. The expense comes from exorbitant prices for additional memory or storage, and from the fact that they simply refuse to use cheaper display tech even in their cheapest laptops. The entry-level laptop has a 13-inch 2560x1600 screen, which compares favorably to the highest-end displays available on ThinkPads and Dells.
If you're already going to buy a laptop with a high-quality HiDPI display and are looking for high performance from the CPU/GPU, it takes a decent amount of storage/memory before a MacBook overtakes a similarly specced competitor in price.
For the most part, it isn't. The typical laptops you buy from the major manufacturers (Lenovo, HP, Dell) have closed-source firmware. They all end up supporting the open UEFI standard, but the implementation is usually closed source. Having the ability to flash new firmware that is mostly open source with closed-source binary blobs (like coreboot), or fully open source (like libreboot), gets open code closer to the hardware at startup, but it still sits on proprietary implementations.
There's some movement to open source more and more of this process, but it's not quite there yet. AMD has the openSIL project and has publicly committed to open-sourcing functional firmware for its chips by 2026.
Asahi uses the open source m1n1 bootloader to load U-Boot, which in turn loads desktop Linux bootloaders like GRUB (which generally expect a UEFI environment), as described here:
If you compare the role of iBoot (proprietary Apple code) to the closed-source firmware in the typical Dell/HP/Acer/Asus/Lenovo machine booting Linux, you'll see that the line between closed and open code is simply drawn at a slightly later stage, where closed-source code hands off to open-source code. No matter how you slice it, it's not virtualization, unless you want to take the position that most laptops can only run virtualized OSes.
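To make that line-drawing comparison concrete, here's a rough sketch of the two boot chains side by side. The Apple Silicon stages follow the Asahi docs; the x86 side is a generalization, and real vendor chains vary:

```python
# Rough sketch of the two boot chains being compared. The Apple Silicon
# stages follow the Asahi Linux documentation (simplified); the "typical
# x86 laptop" chain is a generalization and varies by vendor.
BOOT_CHAINS = {
    "typical x86 laptop": [
        ("UEFI firmware (vendor)", "closed source"),
        ("GRUB", "open source"),
        ("Linux kernel", "open source"),
    ],
    "Apple Silicon + Asahi": [
        ("iBoot (Apple)", "closed source"),
        ("m1n1", "open source"),
        ("U-Boot", "open source"),
        ("GRUB", "open source"),
        ("Linux kernel", "open source"),
    ],
}

if __name__ == "__main__":
    # Print each chain in boot order; neither chain involves a hypervisor.
    for machine, stages in BOOT_CHAINS.items():
        print(machine)
        for name, license_status in stages:
            print(f"  {name:<25} [{license_status}]")
```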
No, I mean that when you spec out a base model MacBook Air at $1,199 and compare it to similarly specced Windows laptops, ones whose CPUs/GPUs deliver comparable benchmark performance and that have a built-in display of similar quality, the MacBook Air is usually cheaper. The Windows laptops tend to become cheaper when you compare Apple to non-Apple at higher memory and storage (roughly 16GB/1TB), but the base model MacBooks do compare favorably on price.
Ah, I see where some of the disconnect is. I'm comparing U.S. prices, where identical Apple hardware is significantly cheaper (that 15" MacBook Air starts at $1300 in the U.S., or £1058).
And I can't help but notice you've chosen a laptop with a worse screen (a larger panel with lower resolution). Like I said, once you actually start looking at HiDPI screens on laptops, you'll find that Apple's prices are actually pretty cheap. 15-inch laptops with at least 2600 pixels of horizontal resolution generally start at higher prices. It's fair to say you don't need that kind of screen resolution, but the price for a device with those specs is going to be higher.
That laptop's CPU also benchmarks slightly behind the 15" MacBook Air, even though the Air is held back by having no fans to manage thermals.
There's a huge market for new computers with lower prices and lower performance than Apple's cheapest models. That doesn't mean Apple's cheapest models are a bad price for what they are: Dell and Lenovo have plenty of models roughly in Apple's price range, unless and until you start adding memory and storage. Thus, the reverse-engineered pricing formula is a pretty low price for the CPU/GPU and a very high price for the storage/memory.
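As a rough illustration of that formula, here's a sketch with placeholder numbers: the $1,199 base price is the figure from earlier in the thread, and the per-step upgrade costs are approximations of Apple's U.S. upgrade pricing, not quotes:

```python
# Rough sketch of how upgrade pricing dominates the total. The base price
# is the $1,199 figure from above; the per-step upgrade costs are
# approximations and will vary by model and region.
BASE_PRICE = 1199         # 13" MacBook Air-class base config (8GB / 256GB)
RAM_STEP_PRICE = 200      # approximate cost per additional 8GB of memory
STORAGE_STEP_PRICE = 200  # approximate cost per storage tier (256GB -> 512GB -> 1TB...)


def configured_price(ram_gb: int, storage_gb: int) -> int:
    """Estimate the configured price, starting from the 8GB/256GB base model."""
    ram_steps = max(0, (ram_gb - 8) // 8)
    # Number of storage doublings above 256GB: 256 -> 0, 512 -> 1, 1024 -> 2, ...
    storage_steps = max(0, (storage_gb // 256).bit_length() - 1)
    return BASE_PRICE + ram_steps * RAM_STEP_PRICE + storage_steps * STORAGE_STEP_PRICE


if __name__ == "__main__":
    for ram, storage in [(8, 256), (16, 512), (24, 1024)]:
        print(f"{ram}GB / {storage}GB  ->  ${configured_price(ram, storage)}")
```

With those placeholder increments, the 8GB/256GB config stays at $1,199 while the 24GB/1TB config lands around $1,999: most of the jump is memory and storage, not the CPU/GPU.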
Well, that's becoming less common. Lots of motherboards are now relying on soldered RAM, and a few have started relying on soldered SSDs, too.
Yes, but you're not addressing my point that the price for the hardware isn't actually bad, and that people who complain would often just prefer to buy hardware with lower specs for a lower price.
The simple fact is that if you tried to build a MacBook killer and compete on Apple's own turf by matching specs, you'd find that the entry-level Apple devices are basically the same price as other laptops you could configure with similar specs, because Apple's baseline/entry level has a pretty powerful CPU/GPU and a high-resolution display. So the appropriate response is not that they overcharge for what they give, but that they make choices that are more expensive for the consumer, which is a subtle difference I've been trying to explain throughout this thread.
Why not? Half of the software I use is available on both Linux and macOS, and frankly, a substantial amount of what most people do is in the browser anyway. If the software runs better on one device than another, that's a real-world difference that can be measured. If you'd prefer to use PassMark or whatever other benchmark you'd like to use, you'll still be able to compare specific CPUs.
I think the history is that a "PC" is a computer compatible with the "IBM PC," which Macs historically were not, and modern ones aren't either.
But I still like "Windows computer"; we can abbreviate that to "WC".
Another complication is that, at one point in time, machines running DOS weren't necessarily running Windows.
True to the letter, but not really in practice. "PC" is synonymous with a computer running Windows, or Linux at a push. I don't know whether that's because of Microsoft's early market dominance, or because Apple enjoys marketing itself as a totally different entity, or some combination of the two. But yeah, usage determines meaning more than what the individual words mean in a literal sense.
Originally, "PC" meant the IBM PC or a "PC compatible" (as in, compatible with IBM without using their trademark). An IBM PC could have run DOS, Windows, or even OS/2.
poop. it's poop.
I doubt it's the last time. Also, while "PC" means personal computer, it was a very specific brand name from IBM, not a general-purpose term. Their computers (and later clones) became synonymous with x86 Windows machines.
Even Apple themselves have always distanced themselves from the term ("I'm a Mac, and I'm a PC…").