"When Apple's servers go down you lose the ability to do low-level recovery on these machines anyway, since DFU flashing requires phoning home to get a ticket for your machine as well as low-level configuration data"
For comparison, note that you can't do low level recovery at all on e.g. Google Pixel phones. Lose the contents of your storage there and the phone is bricked, as Google does not provide the low level flashing tools and signed images to do that. Apple does, with the phone home caveat.
On x86 PCs, you can usually download a BIOS image, but to flash it you need hardware tools on most computers. So it's better in that you don't need to phone home, and worse in that you need specialized tools to do it. You might also lose serial numbers and other personalization information.
Note: "Factory" images aren't this (and that's a misnomer; they aren't what is used at the factory). Those are OS images. To flash them, your bootloader needs to be intact. We're talking about recovering from complete loss of writable storage here; that requires OEM tools to boot low-level recovery images from the boot ROM alone. Some of these have leaked, but for very few devices. Apple just gives all of this to you on macOS, and we have an open source reimplementation of their DFU tooling (idevicerestore) that you can run on any OS to restore any M1 Mac or iDevice to any (signable) OS version. To this day, all non-beta macOS releases for M1 are signable; I imagine they'll only revoke old ones if a major security flaw is found that impacts device backdoorability or allows SEP compromise or so, and even then you could still install an older OS in reduced security mode, just not as the first OS installed during low-level recovery.
Google Pixel phones are phones though, and not personal computer devices like M1 Macs are. Phones have traditionally been more locked down. A fairer comparison would be to Chromebooks, which do have a screw to let you flash a different BIOS.
Chromebooks have a different threat model; they give up once an attacker has physical access. M1s do not; their security model covers the case of preventing an attacker from installing a backdoor on a machine using physical access. That's why Chromebooks have the screw; once you have the machine in front of you, sure, go ahead and install your own firmware. They can afford that in their threat model.
We in security like to throw around that "once the attacker has physical access it's game over", but that isn't true in practice, as evidenced by the success of locked platforms like iPhones. You can be secure against physical access, at least assuming your attacker can't decapsulate your CPU, rewire it with a FIB workstation, and recapsulate it. Apple's security model aims for this, while still allowing the user to install their own OS.
> We in security like to throw around that "once the attacker has physical access it's game over", but that isn't true in practice, as evidenced by the success of locked platforms like iPhones.
People find iPhone jailbreaks all the time. Apple patches them, but if you want a jailbroken iPhone, they exist.
Moreover, the vulnerabilities they keep finding were there before they find them, and the ones they'll find tomorrow are there right now, if you have the resources to look.
And this is a different class of resources from what's needed against other forms of computer security, where attacks are generally considered impractical for anyone: e.g. AES replaced 3DES in 2001, but to my knowledge nobody -- not even the NSA or other state actors -- has ever broken 3DES.
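To make "impractical for anyone" concrete, here's a back-of-the-envelope sketch. The key-test rate is a made-up, generous figure; the ~112-bit effective strength is the standard estimate for 3DES against the best generic (meet-in-the-middle) attack:

```python
# Rough cost of brute-forcing 3DES. Figures are illustrative, not from
# any real benchmark: assume an absurdly capable attacker testing
# 10^12 keys per second against 3DES's ~112-bit effective keyspace.
effective_bits = 112
keys_per_second = 10**12            # hypothetical, very generous rig
seconds_per_year = 60 * 60 * 24 * 365

keyspace = 2 ** effective_bits
years = keyspace / (keys_per_second * seconds_per_year)
print(f"{years:.3e} years")         # on the order of 1.6e14 years
```

That's about ten thousand times the age of the universe, which is why "nobody has broken 3DES" is a different kind of claim than "nobody has jailbroken this iPhone yet".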
If you ask whether a large corporation could break into an iPhone today given physical access, the answer is going to be yes. And without decapsulating your CPU. The physical vulnerabilities often end up being related to USB and things like that.
Remember the whole farce where the FBI wanted to get into some iPhone and "couldn't" because Apple refused to help them? Then they just paid somebody to break into it.
> You can be secure against physical access, at least assuming your attacker can't decapsulate your CPU, rewire it with a FIB workstation, and recapsulate it
Or steal your phone and leave you with an identical looking one that transmits your authentication to the real one as soon as you try to unlock it.
And that's really the physical access problem. If you have a device with full disk encryption and have cleared the keys from memory, the attacker might not be able to get the contents even with physical access. But if they've had physical access and then return the device to you, can you still trust it? Nope.
> People find iPhone jailbreaks all the time. Apple patches them, but if you want a jailbroken iPhone, they exist.
Most of which are tethered, which means they do not persist through reboots. And even if they're untethered, you can recover the platform with a DFU restore.
Those jailbreaks only exist due to flaws in Apple's software, which is completely outside the premise that physical access means game over. If Apple's software were bug free there would be no practical jailbreaks, and physical access has nothing to do with it. Some of those jailbreaks work through remote access, like the ones used by NSO and company.
And again, this is why they have the signing feature when installing an OS - so you can't install an older version vulnerable to a jailbreak and then give the phone to someone.
> The physical vulnerabilities often end up being related to USB and things like that.
Those aren't physical, they're software - checkm8 was a bug in the boot ROM. And this is why iBoot on M1 Macs doesn't support USB host mode to boot from USB disks, because the attack surface there is massive and Apple decided to find another way to allow macOS boots from USB without exposing their more critical bootloader to that.
Nobody jailbreaks iPhones via physical vulnerabilities, but if they tried, they'd find Apple have been much more proactive in this area than pretty much everyone else too. They have things like memory encryption and authentication for the SEP memory area, environmental monitors, hardware design that avoids exposing things to software that could have a security impact, etc.
> Remember the whole farce where the FBI wanted to get into some iPhone and "couldn't" because Apple refused to help them? Then they just paid somebody to break into it.
With every new iPhone and every bug fixed, this gets harder. The attack they used for that phone no longer applies to newer ones. It's not just a game of whack-a-mole either; Apple have been at the forefront of attack mitigations and iDevices are some of the most secure consumer devices in existence (the Xbox One might be close).
Doesn't mean they get it perfect; I have an Apple security report open for a bypass for one of their mitigations. But they try harder than pretty much everyone else in the industry.
> Or steal your phone and leave you with an identical looking one that transmits your authentication to the real one as soon as you try to unlock it.
That might work to get your PIN, but if nothing else you'd immediately notice it doesn't work properly or all your data is gone. That sure beats a silent backdoor in your phone.
The screw hasn't been a thing for years; you prove ownership by sitting there for like 15 minutes pressing the power button when told :D
But it's a lot more than just "go ahead".
For stock users: enabling dev mode wipes storage and reinstalls CrOS, so trying to "attack" via dev mode a) won't give the attacker any data and b) will be extremely noticeable because the data is gone (and there will also be dev warning screens all the time unless they then make fully custom FW and CrOS that aren't secure but look exactly like stock)
For custom users: you can lock the debug interface with your password! Plus do your own vboot if you're concerned about.. ehh.. flashing over SPI directly bypassing the debugger (not easy on some devices) or somehow some vulnerability allowing internal flashing from the CPU.
So Chromebooks do have decent measures against physical access and allow for full CPU & EC firmware customization.
Thanks for pointing this out. My Chromebook is from the era when they still had a screw... :-)
Can you recover a modern Chromebook that has had all of its Flash memories tampered with or wiped externally, with just a USB cable? With a debug cable? That seems to be one of the major things that sets apart iDevices from others.
At least this design doc still says local attacks involving e.g. replacing writable storage are out of scope (which they very much aren't for iDevices):
From block diagrams, it seems the Cr50 security chip in recent Chromebooks just interfaces with the AP Flash and the write protect pin, and thus verified boot isn't tied to silicon tamper resistance. That puts Chromebooks at around the same level as T2 Macs in terms of tamper resistance -- in fact a bit worse (on T2s, the T2 chip is the Flash from the AP's point of view, so you can't just tamper with an external Flash memory. You could MITM it in principle, though I heard the T2 Mac boards are also designed to make this difficult to pull off, with buried traces).
Digging a bit more (the docs on this are really scattered...), it seems Cr50 RW storage can be recovered from the RO loader by using a physical USB-UART cable, but that requires taking apart the machine, since it only works over UART, not USB. There is no mention of any Boot ROM recovery mechanism or of how the RO firmware is authenticated; RO is supposed to be "Read Only" in on-chip Flash, but eliminating every avenue for compromise of writable Flash is very hard (e.g. glitching the Boot ROM; I've seen too many "secure" microcontrollers compromised to bypass read or write protection in various ways...). So this approach is quite a bit weaker than Apple's: on these devices you need specialized hardware to reach the lowest level of recovery, and you can't recover at all from certain kinds of writable storage loss or compromise. On Apple devices, meanwhile, any end user with no specialized knowledge can follow a few simple documented steps with another machine, a USB cable, and an app in the App Store (or idevicerestore) to fully, securely, and confidently restore all the writable firmware on the machine to factory condition, and the process relies only on truly immutable mask ROM*.
* I can't say for sure there isn't any auxiliary writable flash that would let you brick the machine in some other way, but I haven't found anything yet. The prime suspect, the USB-PD controller chip that's in the path of the DFU USB connection (and thus could break it if it doesn't work), smells like it boots from ROM until the OS instructs it to launch the application image from Flash. It reports being in "DFU" mode until we issue the first power state switch command to bring it up, and we know these chips are custom Apple versions with Apple ROMs and the ROM does contain most of the USB-PD stack... so this sounds like the kind of thing Apple would get TI to do, since they really care about being resistant to bricks and persistent compromise.
(if you have unlocked CCD and didn't lock yourself out of it; you could e.g. forget the password) you can do whatever you want using flashrom with the SPI flash chips for both AP and EC.
> this design doc
Hmm, there's no timestamp for the page itself but the picture attachments are dated 2009.
> Cr50 RW storage can be recovered from the RO loader by using a physical USB-UART cable
That sentence sounds weird, not sure why it wouldn't be recoverable via the SBU cable — the RO firmware is just an earlier build of the same firmware that does offer the GSC/Cr50 UART among its consoles via SBU.
"RO public key in ROM, RW public key in RO" -- so they thought of that; the root of the root of everything is truly immutable silicon.
> e.g. glitching the Boot ROM
There are likely measures against this, since this is a "secure chip" designed in modern days by Google rather than some shitty vendor.
But either way, attaching probes for voltage-glitching the root of trust is far beyond the capabilities of the proverbial "evil maid" :)
And the glitching question should apply to Apple and whoever else as well.
> any end user with no specialized knowledge can follow a few simple documented steps with another machine and a USB cable and an app in the App Store (or idevicerestore) to fully, securely, and confidently restore all the writable firmware
What Google does for the "normal people" AP flash recovery: only the RW CBFS partition is writable from the AP, and for recovery it boots from an RO partition: https://doc.coreboot.org/security/vboot/index.html#recovery-... (there's a theme to everything the CrOS team does haha… surprise, the EC also uses the RO+RW mechanism)
Yes, this doesn't protect against the "I'm a normal regular user with no debug access and someone desoldered my SPI flash, tampered with it and soldered it back" but… come on :) Especially considering hard-to-open laptop cases and WSON-8 flash chips, that's just a ridiculous scenario to care about.
Tell that to all the governments who keep trying to get Apple to break into their own devices... It's not ridiculous at all, the NSA does this all the time (supply chain intervention). And WSON-8 really isn't that hard to swap, a hot air rework station will do it in a couple minutes. I've done much worse than that ;)
Good to hear the RO partition is also authenticated, though. My mention of glitching was mostly around whether there was trust placed in Flash at any point, since then a one time glitching attack could compromise the machine. You could still brick it by screwing up RO, but at least not compromise it. And again, those attacks aren't nearly as hard as you think. People have productionized these things for console modchips; that's at the level where someone with a cheap soldering iron can do it.
As for chip design, I'm sure Google did a better job than most random vendors (they all keep making the same mistakes...), but Apple has been doing chip security for much longer so they also have had quite a lot of time to build up experience in this area. I would trust Apple's chips over Google's (at least design quality wise), and I say this as a xoogler ;)
Actually, I think I can come up with an apt comparison. Google has crazy good production system security, probably better than anyone else; they've just been doing it for long enough, have been targeted by governments (China incident, NSA wiretaps), and have the motivation to just keep pushing things forward. Apple is on a very similar level with system/silicon security, in part for similar reasons. They're both just in their own league in those categories.
No, it's just Flash so it's a point of persistency. If you rely on on-chip flash integrity for security, a one-time glitch can compromise the system forever. If you rely on it for availability, then a random glitch (or even just cell decay) can brick your system forever.
The screw meme just doesn't die… Modern Chromebooks use a debug USB-C cable for flashing and serial console access, and a "sit around for like 15 minutes pressing power button when told" process to prove ownership to unlock dev mode.
Thanks for that information. I am not very involved in Chromebook development, so I didn't know that. Do you have a link to documentation that outlines the current steps?
> On x86 PCs, you can usually download a BIOS image, but to flash it you need hardware tools on most computers.
Even then, it isn't always obvious how to write that image to flash. Sometimes you get lucky and the downloadable BIOS is a full image of the serial flash chip, but sometimes it's a delta update, or the flash chip contains other data (like configuration) which isn't part of the public image.
And even then, there's a ton of other updatable firmware in a typical PC that isn't part of a BIOS update -- most video adapters and network devices will have at least a small flash chip on board, which may not be repairable at all if it's damaged.
GPUs are often the easiest to recover actually – boot with another GPU, keeping the screwed-up one plugged in – it will still be flashable, just run atiflash or whatever.
Also, high end "gamer" hardware often includes "dual BIOS" (on both mainboards and graphics cards) and/or "BIOS flashback" (special button makes a microcontroller flash an image from a USB stick inserted into a marked port, even without a CPU present).
Yup. On M1s Apple tries to make all external chips either use runtime loaded firmware or be bootloadable, both for security and recoverability reasons. I don't know if it's possible to truly brick an M1, but if it is you have to try really hard.
Unless, of course, you need to perform a low level recovery when Apple's servers are unavailable, for whatever reason. Then it seems that it would be quite the problem.
The details seem ok, but the nomenclature isn’t: iBoot only handles hardware initialization and kernel bootstrapping. Once you’ve got system logging set up the kernel is in control and you’re into XNU, which is doing all the MACF, BSD setup and whatnot. You’re not going to get iBoot logs unless you set up a serial console.
This is correct. All the logs shown there are from XNU. The author confused verifying the root filesystem seal hash and the auxiliary kernelcache (containing kexts), which is what iBoot2 does, with the processes of validating the APFS snapshot and metadata against that hash and loading dexts, which is what the kernel and other parts of macOS do.
FWIW, you don't get iBoot logs even if you set up a serial console on M1s; these days it seems Apple only log over their "dockchannel" feature in their proprietary USB debug protocol, and we haven't figured out how that works yet. Even if you were to get them, on retail builds the logs are just message hashes, not readable text.
(I tried pointing this out in a comment to the article, but I guess it must be caught in some moderation queue)
It does not, not fresh and without boot time actions to take. Certainly not the laptops.
The Mac Mini used to have a larger delay dependent on the monitor attached, due to monitor detection times; about 7 seconds to the XNU kernel being launched, or 13 without a monitor (I guess it waits for monitor detection to time out), but now that Apple removed boot-time display support on those I imagine it no longer matters. The laptops have a much shorter time since they only have to deal with a well controlled internal display.
So it's possible that an M1 Mac Mini with no monitor attached pre macOS 12.1 took 20 seconds to boot to a login screen, but that's the worst case configuration.
(Yes, removing the bootloader framebuffer is stupid; I have a bug filed with them about this, but haven't gotten a response yet)
Just to follow up, it's about 16 seconds on a MacBook Air. 7 seconds of that are iBoot, and 9 are macOS. If you turn off the startup chime sound (which is a checkbox in system settings), that cuts out 2 seconds of iBoot time and makes it 14 seconds total.
For comparison, a ThinkPad X230 running Arch takes 7 seconds to GRUB (which I skipped through as fast as possible), FDE unlock prompt at the 12 sec mark, and after typing in my password it takes a further 13 seconds to get to a graphical login prompt (so 25 seconds total time).
Honestly, adjusting for the fact that Linux does fewer things before the FDE unlock than macOS, and more things after it and before the login prompt (I've got a few extra daemons and things installed), I'd say they're quite comparable. The MacBook is a bit better (2s faster firmware without the chime).
Of course, half the point of these machines is they can run weeks while in sleep mode (or even while powered on as long as the CPU is idle and the display is off) and they have practically instant wake. So boot time isn't really relevant, since you rarely have to shut down.
> If you turn off the startup chime sound (which is a checkbox in system settings), that cuts out 2 seconds of iBoot time and makes it 14 seconds total.
Lol, wait, so the startup chime blocks the rest of the boot process?! I would have expected the system to be working on booting while the chime was playing.
Now I'm not sure whether I should keep the chime turned on. I like the chime, but I'm not sure I want to pay two extra seconds for it. :)
I think it's mostly that by the time the system is ready to play the chime, it takes less than two seconds to do whatever is left before transferring control to iBoot2, and of course you can't leave audio playing as you call a new piece of code... so even if the playback happens in the background, it would still have to wait for it to finish.
How does the boot wallpaper get selected in Big Sur or Monterey? If you boot an M1 iMac, it will use a wallpaper that matches the colour of your Mac.
I can see all the wallpapers are part of the blessed/sealed volume; what I can't figure out is how it's choosing which wallpaper to use.
To be clear I’m talking about the very early boot wallpaper before file vault is unlocked. This is not the (user configurable) login screen wallpaper. This is a fixed choice.
My best guess is an NVRAM key, but I didn't spy an obvious one.
As I understand it product information and iBoot1 are stored on an SPI flash chip in the device (which is hard to mess with from an OS unless you're quite determined to, but basically unrecoverable if you do), iBoot2 reads this information and uses it to select and fill in a device tree that is passed to your kernel.
I think you might be able to find this information by poking at the runtime device tree with ioreg.
This is correct. As I understand it, Apple have a database of product information and if you do wipe your SPI flash, it will be restored by phoning home during the DFU restore process (by identifying your machine based on its fused in ECID), though I've yet to try this. So it should be in fact fully recoverable.
This information is in the system configuration area, and gets inserted into the device tree template by iBoot2. You can see the template DT and what sysconfig entries it pulls data from if you extract it from an OS restore image instead of looking at the runtime DT with ioreg.
There are mac keyboards that don't have function keys? I suppose the ones without numeric keypads don't - still boggles my mind those abominations exist...
Could anyone explain how the firmware (implementing UEFI, I assume) interacts with the components described here? My knowledge is limited to Tianocore edk2 on amd64 platforms.
Well, if you want to distrust Apple software you probably shouldn't be trusting their hardware, either.
That being said, I actually think this is a reasonable way to do secure boot. The default OS the device ships with can be validated, but there's still a proper owner override so you can boot into Linux or whatever. They even use the SEP to validate that the owner override has been tripped by the owner. The first user account you make gets handed a key generated by the SEP that can be used to sign kernels, so only that account can actually use the owner override. This is a good way to stop evil-maid attacks in their tracks while still not locking the user out of their property.
My only real complaint is that Apple's gone to great lengths to ensure the iOS side of their business is completely unaffected by owner overrides:
- If you boot into an owner-signed OS volume, macOS disables its iOS support
- iPad-fused M1s won't generate or respect owner keys
This is silly. If individual iOS applications are sensitive to owner overrides, then they already have devicecheck APIs to get a cryptographic attestation that they haven't been tampered with. The SEP could flag those attestations as coming from an owner-signed kernel and picky banking apps[0] could check for that.
[0] And Pokemon GO, because it's easier to blacklist jailbroken users than to enforce a rate limit on GPS jumps
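A sketch of what that suggestion could look like, assuming a hypothetical token format (the HMAC construction, field names, and API here are all made up for illustration; this is not Apple's actual DeviceCheck protocol):

```python
import hashlib
import hmac
import json

# Toy model: the secure element mints an attestation whose claims include
# whether the booted kernel was owner-signed; a picky app backend can then
# decide for itself whether to accept such devices. All names hypothetical.
SEP_KEY = b"device-unique-attestation-key"  # shared with the verifier in this toy

def make_attestation(owner_signed_kernel: bool) -> dict:
    claims = {"device_ok": True, "owner_signed_kernel": owner_signed_kernel}
    payload = json.dumps(claims, sort_keys=True).encode()
    mac = hmac.new(SEP_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "mac": mac}

def backend_accepts(att: dict, allow_owner_signed: bool) -> bool:
    payload = json.dumps(att["claims"], sort_keys=True).encode()
    expected = hmac.new(SEP_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, att["mac"]):
        return False                        # forged or tampered attestation
    if att["claims"]["owner_signed_kernel"] and not allow_owner_signed:
        return False                        # app opts out of owner overrides
    return True

print(backend_accepts(make_attestation(False), allow_owner_signed=False))
print(backend_accepts(make_attestation(True), allow_owner_signed=False))
```

The design point is that the policy decision moves into each app's backend instead of being enforced platform-wide by disabling iOS support outright.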
For the record, I'm in favor of a legal mandate that hardware owners have the buy-time option to enable adding their own keys to any root trust stores on their devices. However, that would be in addition to Apple's keys, and wouldn't change the fact that Apple is part of the fundamental trust foundation if you buy a Mac or iDevice. Period. The devices are massively vertically integrated, right down to the core silicon, which is completely custom. Apple has absolutely unfettered low-level access opportunity up and down the stack. If you completely distrust Apple, then you absolutely should not use their hardware at all. At some level, "trust Apple" is simply a security axiom on this platform.
And they've shown that to be not unreasonable, at least when it comes to something like root private keys. The fact is they've been operating for a long time now, and as with the rest of the big players, key leaks haven't been an issue. It's not that big a deal for a big player to physically secure such things to a high enough degree that they're unlikely to be the limiting factor: dedicated rooms, fully offline storage, hardware-backed Shamir's secret sharing for m-of-n key signing ceremonies, etc.
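The m-of-n ceremony idea can be illustrated with textbook Shamir secret sharing. This is a toy demo, not something a real key ceremony would use (those rely on vetted, hardware-backed implementations), but it shows why no m-1 custodians can learn anything about the key:

```python
import random

# Textbook Shamir m-of-n secret sharing over a prime field.
P = 2**127 - 1  # a Mersenne prime, comfortably larger than our demo secrets

def split(secret: int, m: int, n: int):
    # Degree m-1 polynomial with the secret as the constant term;
    # each share is a point (x, f(x)). Any m points recover f(0).
    coeffs = [secret] + [random.randrange(P) for _ in range(m - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    # Lagrange interpolation evaluated at x = 0 recovers the secret.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, m=3, n=5)
assert combine(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
assert combine(shares[2:]) == 123456789   # which 3 doesn't matter
```

In a real ceremony the "secret" would be a root signing key (or a key wrapping it), the shares would live in separate HSMs or smartcards, and m custodians would have to convene to sign anything.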
>If you completely don't trust Apple, then you absolutely should not use their hardware at all. So some level "trust Apple" is simply a security axiom on this platform.
It is not about trusting Apple or any other company, for that matter. It is about the tendency to normalize (and legalize) selling personal computers without respecting the owner's right to full control over their own computer. If the owner cannot fully control their own computer, that computer cannot be called 'personal' anymore.
This practice needs pushback, as it is completely unacceptable. It should be made illegal to sell such devices, if that is not already the case, because you can be left without a working computer just because the link to the company is unavailable for some reason.
The company goes away and you are left without a working computer. The Internet isn't available and you have a brick instead of your computer. This is crazy, and it's even crazier that there are people brainwashed enough that they do not even perceive it as a problem. Probably because they can't think three steps ahead.
We’re talking about Apple, one of the most valuable companies in the world, sat on over $100bn in cash just “going away”, in what, the lifetime of a laptop? For me that’s 3-5 years, for others maybe 10. That’s an absurd premise. The probability of that is so close to zero it doesn’t bear consideration.
What if it's broken by legislators and the pieces are named something differently. Want to bet no apple.com links get broken? And their certificates?
The point is, if I want to buy a personal computer and stuff it in the closet for 50 years to use later, that's between me and the creator. Not Tim Cook.
> It is about tendency and attempt to make it a norm/legalize to sell personal computers without respecting right of the owner to have a full control over their own computer. If owner cannot fully control own computer this computer cannot be called 'personal' anymore.
I have bad news about Intel CPUs.
>[Intel] processors are running a closed-source variation of the open-source MINIX 3. We don't know exactly what version or how it's been modified since we don't have the source code. We do know that with it there, neither Linux nor any other operating system has final control of the x86 platform.
And this isn't the case on M1 machines. On M1s, all blobs that remain after you launch Linux are sandboxed behind IOMMUs, so they cannot take over the system. Ignoring hardware backdoors (which you can never be sure don't exist, on any system), you can be reasonably confident that an M1 system doesn't have a (functional) backdoor running while you're running your own OS on it. Very few systems have this property; mostly only fully open boot systems like the Pinebook or Talos workstations. ~No x86 system does, not even the ones running Libreboot, since they almost always have hardware with full DMA access running blobs.
Full control of the devices you own is absolutely essential. It requires complete transparency of basic components like CPU microcode, firmware, and hardware; otherwise it can and will be abused.[0]
...unless everything is absolutely transparent, including microcode and hardware, it is not acceptable as a freedom-respecting solution.[1]
Then I got unexpected opposition from the very person making Linux work on the M1 (marcan_42). If even he fails to understand the consequences of accepting such a hostage situation with Apple devices and claims "Freedom isn't the answer." [2], and if even he is ready to downgrade the discussion to personal disrespect toward people like me [3], who are merely trying to point out the danger of this hostage situation, while going 'easy' on Apple and justifying all of their current mistakes, then we have a serious problem. I do not wish to use the term "doomed", but we are probably observing the limited ability of even highly technical minds to resist the primitive brainwashing and manipulation big companies deploy by presenting it as the norm to trade 'freedom' for 'safety'. Some people can't even think a few steps ahead and see that by helping companies promote such an agenda, we'll end up losing both 'safety' and 'freedom'.
I love how every time you bring this up you link every post except the one where you compared Apple to a dictatorship, then went on to insult every user who chooses their hardware over others.
I'm happy to debate the pros and cons of different security approaches, and I want every prospective buyer of these machines to be informed about the decisions and trade-offs that went into their design, and what to expect. I'm not interested in debating someone who immediately dismisses all technical discussion and just invokes references to authoritarianism, brings up boiling frogs with no evidence, immediately dismisses my arguments as wrong and assumes they need correction, and ends with ad hominem attacks.
As I said, go buy a Pinebook and please leave the rest of us alone. We're trying to give the users of these machines choice. You're trying to take away our choice to use them through moral arguments.
marcan seems to be part of a new breed of hacker, less interested in the "why" we do it and more interested in the "how" of it. Works pretty well for tackling a challenge like blindly picking at a black-box ISA/SIP, but I don't think his project has the kind of ideological understanding that keeps the libre desktop alive. Getting it to work is one thing; building a community to maintain your work is another.
Unfortunately, that's going to constitute a lot of the people you encounter these days. Half-measures are better than no-measures, but I really do miss the days of vigilant software development instead of cleaning up Apple's scraps.
You're not giving Hector Martin (marcan) enough credit. He was on Team Twiizers and fail0verflow, groups that did a lot of hacking to open up closed systems. It's not like he's unaware of the customer abuse that happens in the proprietary world.
The "look beyond freedom" quote probably should also be looked at with the context that he's talking about the FSF, which has an odd habit of being extremely absolutist in ways that actually hurt the user. Like, they'll point out that Wi-Fi cards with proprietary firmware are bad, but then endorse very similar hardware where the firmware blob is in ROM or some features are lasered off just to conform to the "proprietary ROMs don't count" rule. Marcan is arguing for creating a gradual sliding scale of "proprietary, user-hostile, and/or insecure" to "Free, user-respecting, and/or secure" and then looking at the trade-offs between them, rather than just creating a really high bar based on what made sense in the late 1980s and sticking to it forever.
I'm giving the dude all the credit he deserves. fail0verflow is amazing, the stuff they did with Nvidia Tegra/Nintendo Switch was nothing short of miraculous and insane; that doesn't change the cards at the table though, and it doesn't make me any less skeptical of where all this leads. Again, I've got no intention of stopping people who are making progress, even if it's progress I disagree with, but he still has to prove himself here, and I'm not entirely confident that we're going to end up with "Linux, but on the M1" without a number of asterisks trailing the statement. That was the case with the Switch, that was the case with the PS4, and it's unfortunately crawling in that direction for the M1 as well.
I fail to see how you can equate the Switch and PS4, platforms designed to disallow the user from running their own OS, and where the manufacturer actively works to stop any such attempts, to the M1, where the manufacturer actively invested in developing the infrastructure required to allow users to securely boot their own OS.
If you're talking about building a sustainable community so the end product is polished and upstreamed and ready for end users, I'd say we have that with the M1. Things are already getting upstreamed and there is more than enough momentum. This wasn't the case with PS4, which was just me and a few other fail0verflow folks putting together a proof of concept. It helps when you don't have to spend cycles finding exploits and can focus on delivering a working OS :)
> less interested in the "why" we do it and more interested in the "how" of it.
I mean, to be honest, someone who only knows the "why" is kind of disappointing too. Combine that with the narcissism you often see in developers and you get someone who doesn't understand the problem domain confidently spewing bullshit about it. You can see plenty of examples of that under any Firefox, Wayland, or systemd thread.
I agree it's "your own device", but Apple's EULA makes it really clear it's only your own device insofar as you can choose to destroy it. They retain a residual right over the hardware, a partial ownership if you will, when it comes to what software is on it. You aren't buying hardware. You're buying an experience. You don't have the right to experience arbitrary software running on it, even if you trust it.
It's one of the reasons I'm not using Apple products anymore.
Apple's EULA explicitly allows you to replace the open source components of the OS with your own. One such open source component is the XNU kernel itself. This is what allows Asahi Linux to exist not just technically, but also in full compliance with Apple's EULA.
> hardware owners have the buy-time option to enable adding their own keys to any root trust stores on their devices
Would you really be more comfortable knowing that your hardware vendor had the capability to produce machines with a low-level, unremoveable backdoor? I'm not sure I would. A feature like that can be used against users more easily than it can be used by those users.
>Would you really be more comfortable knowing that your hardware vendor had the capability to produce machines with a low-level, unremoveable backdoor?
What? They do. Apple absolutely has the capability to build any or all machines with low-level, unremovable backdoors; like, in the freaking processor if they wanted. I'm not clear on what your issue is here. The current state of affairs is that for devices like the iPhone, the manufacturer can set up a secure software tree where the root of trust contains only their keys. And for many (if not most) of their customers that's a good thing, because in their threat model, running their own arbitrary code is of lower utility and much higher risk than getting socially engineered into bypassing key protections or the like. There is important power in grouping buying decisions together in an unbypassable way; it's why the likes of Facebook, for example, cannot insist on bypassing iOS privacy protections. They can't pick people off, because people literally do not have the choice. Facebook must deal with Apple for the ~97% majority of users (or whatever the share is who don't/can't jailbreak).
However there are real issues with that too for a sizable number of owners. So all I want is that there be an option at purchase time which allows owners to load their own root keys. The whole chain of trust infrastructure is still there, but technical users or those with specific needs can then run their own (and still be better off). Making it buy-time means that users who want to ensure they cannot be compelled later can still do that too. Nobody loses.
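A toy model of that buy-time option, with entirely hypothetical names (this is not Apple's design, just a sketch of the proposal): the trust store always contains the vendor key, and an owner key can be enrolled only while a one-time "sealed" flag is still unset, so an owner who wants to guarantee they cannot be compelled to add keys later simply seals the device at purchase.

```python
import hashlib


class RootTrustStore:
    """Toy sketch of a buy-time owner-key option (hypothetical names).

    The vendor key is always trusted. An owner key may be enrolled
    only before the one-time 'sealed' flag is set at purchase time.
    """

    def __init__(self, vendor_key: bytes):
        # Store key fingerprints rather than raw keys, fuse-style.
        self.trusted = {hashlib.sha256(vendor_key).hexdigest()}
        self.sealed = False  # set once the device leaves "buy-time"

    def enroll_owner_key(self, owner_key: bytes) -> bool:
        if self.sealed:
            # Too late: nobody can compel the owner to add keys after sealing.
            return False
        self.trusted.add(hashlib.sha256(owner_key).hexdigest())
        return True

    def seal(self) -> None:
        self.sealed = True

    def is_trusted(self, key: bytes) -> bool:
        return hashlib.sha256(key).hexdigest() in self.trusted


store = RootTrustStore(vendor_key=b"vendor-public-key")
store.enroll_owner_key(b"owner-public-key")      # allowed at buy time
store.seal()
print(store.enroll_owner_key(b"attacker-key"))   # False: window closed
print(store.is_trusted(b"owner-public-key"))     # True: owner key stays trusted
```

The point of the one-way seal is exactly the "nobody loses" claim above: defaults stay vendor-only, and the enrollment window closes permanently for everyone, including attackers and coercers.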
> A feature like that can be used against users more easily than it can be used by those users.
How? Most people will stick with defaults, and I'd be ok with Apple or whomever qualifying an open device with reduced software support or some small charge for example too. And once it's been sold, it's the same as currently. I think that's a reasonable tradeoff.
You're missing my point. What I'm saying is that, as things currently stand, all of the CPUs that Apple ships in products are functionally identical -- all of them share the same root of trust in ROM (afaik?), and it would take a significant effort for Apple to produce devices which differ from that specification.
> How? Most people will stick with defaults
By having an attacker deliver a system to a user with a custom root of trust -- which could mean anything from a state-sponsored attacker to an abusive partner.
The OP's statement was about the insecurity that comes with code being signed by anyone other than the owner.
It doesn't matter how secure the communication between Apple and an Apple device is, because even if it's perfect, the owner is not secured from Apple itself, or from those whom Apple would love to communicate with. For instance, oppressive governments. (Here's the result of such communication: an app blocked because an oppressive government didn't like it: https://apps.apple.com/us/app/%D0%BD%D0%B0%D0%B2%D0%B0%D0%BB...)
It really depends on the threat you are planning against. If for some reason I'm a target of the US government, I'm screwed anyway. If my concern is trusting the laptop after I left it at a train station and got it back from some random dude, it's good enough.
>It really depends on the threat you are planning against.
What about an oppressive government, say Russia's, while you travel in, say, Ukraine, and then an occupation occurs? Not a far-fetched scenario, by the way...
It really doesn't depend on the threat at all. It's about the model of the society you wish to have and what values you promote.
It's about who you wish to be responsible: the "big company" caring about your safety and taking your freedom along the way, or you caring for your own safety and preserving your freedom along the way. I do not really think there is a choice here, because the first option will always be abused at some point.
Freedom does matter, and it comes with responsibility. THIS is the main issue here. THIS is what separates a society of responsible citizens from a society of "irresponsible people" who wish to trade their freedom for "safety", losing both as a result (and democracy itself after some time).
All sentiments like this one and those similar to it elide the facts that 1) we’ve tried relying on “user responsibility” before, and excusing the comically bad outcomes through victim blaming doesn’t change them; and 2) we didn’t get together and vote Apple the only manufacturer of computers.
If you don’t like their model, choose someone else. Why should average users who would otherwise be served perfectly well by Apple’s solution be required to be “responsible” for some subset of personal security you think denotes a “responsible” citizen from an “irresponsible” one?
>If you don’t like their model, choose someone else.
Many follow their example, and without pushback there will be no someone else, because average users may not understand the consequences unless they are educated by people who do understand them. As with many other areas, a certain level of expertise is required to understand the consequences of certain decisions.
> we’ve tried relying on “user responsibility” before,
>Why should average users who would otherwise be served perfectly well by Apple’s solution be required to be “responsible”
Do you believe in choice? If you do, then average users should have a choice whether to rely on Apple or to switch such functionality off. Without having such a choice, people become less and less responsible. You can say they choose by buying such machines, but I do not think that qualifies as a choice, any more than accepting an EULA does. It's not really a choice.
User responsibility and device safety are not mutually exclusive. You can keep the iPhone exactly as-is and add a developer mode that would pretty much shut up every nerd this side of the Mississippi.
Which is exactly what they did with M1 (add a developer mode that doesn't put their normal users at risk by allowing for persistent supply chain compromise attacks), but it doesn't seem to be enough to make some people happy...
To be fair to the other side of the argument, I think people are mostly upset about the iPhone. There's an implicit fear (which I don't agree with!) that if Apple is so insistent on keeping the iPhone locked down, that must be their ultimate goal for their other platforms as well.
I think if Apple was to add a developer mode to the iPhone, 99% of people would actually shut up.
People seem to forget that the iPhone is 15 years old, and Macs are still an open platform. It hasn't happened yet, but somehow it's always "going to happen"...
I 100% agree with you and have argued the same point! I expect to get an Apple Silicon Mac at some point and to put Asahi Linux on it. :)
I just don't think that the counterargument is completely spurious. Craig Federighi taking the stand in court and saying that Mac security is at a place they "don’t find acceptable" doesn't exactly make me feel all warm and fuzzy about Apple's future plans. And so if someone says they don't want to buy an M1 Mac, even if it's open today, because they see the iPhone as indicative of the direction Apple is going, I think that's fair, even if I disagree about Apple's intentions.
By contrast, if Apple added a way to unlock the bootloader on iPhones tomorrow, this argument would immediately evaporate. :)
Well, there's always the part where trying to lock down existing devices would run afoul of various consumer protection and warranty laws. Sony already got sued and lost for locking down the PS3, and this would be a much higher profile case, especially once Asahi Linux gets to the point where we have a significant user base...
Apple could certainly choose to lock down future Mac iterations (though I don't think they will), but I think fears that they might retroactively lock down existing Macs are just unfounded and ignore the realities of the situation.
Of course people are free to buy or not buy machines for whatever reason; that's why I want everyone to be informed about the details. My beef is with those who disagree with this stance, and think people shouldn't buy these machines period because Apple is evil and those who buy their devices sheep, and anyone who thinks otherwise is mistaken, and there is no room for having different priorities when choosing hardware because Freedom™ is the only priority that matters. For whatever definition of Freedom™ they feel like using that day.
I think the irony would be less dramatic if the free software wasn't running on hardware constructed with slave labor. Put your eggs in whichever baskets you choose, but people's worries about the conflict of interest here are fully justified. Apple and open source are not friends, and while they'll be happy to tip their hat every once in a while (much like Microsoft's platitudes with WSL and GitHub), their ultimate goal is to stomp you out and expand control. Once again, nobody here has much interest in stopping you; just don't be surprised when your blood, sweat and tears ultimately end up being used to grease the gears of their production line. We're talking about a trillion-dollar company that doesn't release their own device drivers or schematics; it's ridiculous that we even need to finish the job for them in the first place. It's hard to see this work as "noble" in the same way other free software projects are, at least to me.
> I think the irony would be less dramatic if the free software wasn't running on hardware constructed with slave labor.
Why do you feel the need to direct criticism of Apple's business practices at me? I do not work for Apple.
> but people's worries about the conflict of interest here are fully justified.
Then don't buy the machines and move on with your life?
> Apple and open source are not friends
Neither are they enemies. The world isn't binary. I do not exclude from my life everything that is related to everyone who isn't explicitly my friend, do you?
> their ultimate goal is to stomp you out and expand control
Ah yes, stomp us out by... building machines we can use to run our own OS? They could've just not done that and we wouldn't exist.
> just don't be surprised when your blood, sweat and tears ultimately end up being used to grease the gears of their production line.
So which is it: are they going to stomp us out, or are they going to embrace the extra business we bring? You can't have it both ways, you know.
> We're talking about a trillion-dollar company that doesn't release their own device drivers or schematics; it's ridiculous that we even need to finish the job for them in the first place.
I find it fun finishing the job for them. This is exactly the kind of project I enjoy doing. If you don't, then choose a different free software to contribute to.
> It's hard to see this work as "noble" in the same way other free software projects are, at least to me.
So our project is inherently morally inferior to others because you simultaneously think Apple should've done the work for us and Apple are our enemies. ????????
Seriously, this makes no sense. You know Linux itself started out as a hobby OS with no corporate backing and tons of other drivers are reverse engineered, right? Do you also think Nouveau isn't noble? What about LineageOS? What about XBMC/Kodi when it started? Freedreno, Panfrost, and friends? All of those projects are or were about bringing free software to devices designed and manufactured by giant corporations without any support.
Heck, I got sued by a multinational for bringing Linux back to the PS3, and I still don't regret it. That work got upstreamed, by the way.
> Well, there's always the part where trying to lock down existing devices would run afoul of various consumer protection and warranty laws. Sony already got sued and lost for locking down the PS3, and this would be a much higher profile case, especially once Asahi Linux gets to the point where we have a significant user base...
How about a much simpler scenario, with no threat at all? Just a dumb bug in software that puts your computer in DFU mode and says: please connect it to another Mac. Nice, isn't it? And then you have to run around and find "another Mac". What if there are no other Macs around? What if you're traveling and have no internet connection, or only a limited one? This is not a hypothetical situation; this is exactly what happened in my case. And then you are stuck in the field without any way to recover your machine. Nice, isn't it?
"When Apple's servers go down you lose the ability to do low-level recovery on these machines anyway, since DFU flashing requires phoning home to get a ticket for your machine as well as low-level configuration data"
> Just dumb bug in software that puts your computer in DFU mode that says, please connect it to another Mac. Nice isn't it? And then you should run and find 'another mac'.
If your fundamental firmware-stuff is screwed up on any platform, you are going to have a bad time. Being able to plug into an off-the-shelf machine and fix it, or to plug into another PC running special software, is much better than I'm accustomed to.
>If your fundamental firmware-stuff is screwed up on any platform
Sure, I just have the impression, after some googling, that this DFU situation happens much more frequently than one would expect. I certainly didn't expect it to happen on the first day after purchase, but it did. So perhaps this pleasing "much better" ability to fix it by just connecting it to another device, which you may not possess (as in my case), comes with the added pleasure of having to do it more frequently. If that is the case, then I really prefer the state you are accustomed to.
I have never had to deal with firmware on Apple hardware (excepting "zapping the PRAM" on classic Macs). I've had to deal with it dozens of times on other platforms.
We have 3 Apple Silicon based Macs in the house, and there's 4-5 others that I support. So far 0 incidents in about 3 device years. I don't think it's tremendously common like you imply.
In the same time period, I built two Ryzen machines, and had to swap in older processors to run BIOS updates on each, and the laptops in my wife's classroom all decided to take themselves out of service for an hour one day to do BIOS updates that were delivered by Windows update and then only triggered on the second reboot after update when we all thought we were safe.
I've bought one of every major M1 model for testing purposes and have done all kinds of crazy things to them, and the only time something weird happened was with the original firmware version where I managed to break recovery mode by messing with diskutil, but I was able to fix it from macOS without requiring a DFU flash. It's never happened again and I've done the same thing dozens of times, so I think that was some silly bug in the shipping firmware version that has long since been fixed. I never actually had to resort to DFU recovery (though I still tested it a bunch as part of improving support for it in idevicerestore).
Yes, if you don't have internet access you have a problem, but I'm personally happy enough with the benefits of this security model that I'm willing to accept the tradeoff.
The Mac has existed for 37 years, and the iPhone for 15 of those, and the Mac is still open to running whatever OS users choose. You really need to find an argument other than an unqualified "the future is doom and gloom" when, after all this time, that future hasn't come and the platform remains open.
>The Mac has existed for 37 years and the iPhone for 15 of those
So the iPhone has been closed for 15 years already, and thus "the future is doom and gloom" has been happening for 15 years already. The more important question is what comes next.
>You really need to find an argument other than an unqualified "the future is doom and gloom" when after all this time that future hasn't come and the platform remains open.
An argument can be qualified or unqualified depending on the topic. It is unclear which topic is assumed here.
> So iPhone is closed for 15 years already and thus "the future is doom and gloom" is happening for 15 years already. The more important question is what will be next.
"I disagree with one product's direction, therefore all other products from the same company are doomed to that direction" is not a valid argument, especially not after 15 years of it not happening. Companies are capable of producing products targeting different markets and use cases.
"Domino's added a pizza I don't like to their menu, what will be next? Their entire line up will be a horrible inedible mess in a few years!"
See how stupid that sounds?
All you have to do is not buy an iPhone (like I didn't either) and stop spreading FUD about Macs.
There is a difference between the topic of "independence and freedom" and the topic of "security". They are somewhat related, but not the same. If the discussion keeps drifting from "independence and freedom" toward "security", and the "independence and freedom" topic is presented as useless and unrelated to real life, then intentionally or not, that can be seen as contributing to the "doom and gloom". So I am not sure I need to find another argument, since one is provided every time the conversation switches from a request for independence to "it's good security".
>... not a valid argument, especially not after 15 years of it not happening.
It's not about trusting or not trusting a particular company. The idea is to protect and insist on a certain level of respect toward the owners of computers, one that no company would dare to, or be able to, diminish without consequences.
>"Domino's added a pizza I don't like to their menu, what will be next? Their entire line up will be a horrible inedible mess in a few years!"
>See how stupid that sounds?
"what will be next" question was addressing the issue of trusting to any company and about level of dependency from any company for any owner of any computer. The context of the question was not about trusting particular company. When the context is switched then anything may appear "stupid" but then it is interpretation to blame not the question.
>All you have to do is not buy an iPhone (like I didn't either) and stop spreading FUD about Macs.
The issue I am discussing cannot be resolved by market and buying preferences, and therefore "just buy something else" would not work and cannot help.
Like I said above, it's not about trusting Macs or company. It's about pushing back against the tendency to make personal computers more dependent and less personal.
It's about respecting independence for the owner not about security, not about trust to a particular company. It's about making trust to the company irrelevant enough. It's about preserving level of respect to the owner which reduces dependency and importance of the trust to a particular company.
> It's about respecting independence for the owner not about security, not about trust to a particular company. It's about making trust to the company irrelevant enough. It's about preserving level of respect to the owner which reduces dependency and importance of the trust to a particular company.
I already explained to you how you have to trust the company if you're using their silicon. There is no way around that for modern devices. You're trying to fight a fight against the laws of physics.
Again, if you need absolute trust, then I suggest you order a Precursor, and then you'll have to be content doing all your computing on a 100MHz CPU.
> The idea is to protect and insist on certain level of respect toward owners of the computers that no company would dare/able to diminish without consequnces.
So you're saying companies shouldn't be allowed to have secure systems that benefit the average user; they should all be forced to do what you say is "respect" users, which means requiring that they develop their own secure boot infrastructure, in your world, since that's the only way they can be in control (or just give up security altogether).
Do you realize how hypocritical it is for someone advocating for freedom to try to take away everyone's freedom of choice by making it illegal or immoral to trade off your own personal vision of freedom for something they might care more about? Seriously, you brought up authoritarian governments, but you're the one using their propaganda tactics here. You're not just saying users should be able to buy devices they control; you're saying they shouldn't have the choice but to be in "control", even if it hurts them in other ways. The Way of the Freedom Party is the One True Way.
Have you considered that maybe, just maybe, there are actual practical consumer-protecting arguments to be made here without resorting to ridiculous extremist positions? Here's one: companies should be required to allow users to run their own software on devices that have reached end of support and are no longer receiving security updates. See? That is the kind of useful policy position that'll get people interested and might even have a chance at becoming law. Not "iPhones are evil and should be illegal".
Additionally, many of these security measures are put in place to prevent rootkits/malware from compromising the firmware, boot loader, or operating system.
How do you secure something when others know the secret? There has to be some "secret" (a.k.a. a key) that only some definition of "you" knows, which the system then tests against (hopefully via some kind of asymmetric system or hash).
> BoorishBears - What key is shared between you and the manufacturer here? There's signing keys and there's passcodes, which ones are you "not the only one with"?
Because you don't even have the key? Not sure where passcodes came from.
https://news.ycombinator.com/item?id=29704923
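To make the "asymmetric system or hash" question above concrete: real secure boot verifies asymmetric signatures, where only the public key (or its hash) is baked into ROM and the private signing key never leaves the vendor, so nothing secret ships on the device at all. The chain-of-trust structure can be sketched with plain hashes; this is a conceptual toy, not Apple's actual boot implementation:

```python
import hashlib


def digest(data: bytes) -> str:
    """SHA-256 fingerprint, standing in for signature verification."""
    return hashlib.sha256(data).hexdigest()


# Stand-ins for boot stages (bootloader, kernel, etc.).
kernel = b"kernel image bytes"
# The bootloader embeds the fingerprint of the next stage it will load.
bootloader = b"bootloader bytes " + digest(kernel).encode()

# The only thing "in ROM": the expected fingerprint of the first
# mutable stage. Nothing here is secret; it just can't be changed.
ROM_PINNED_HASH = digest(bootloader)


def boot(bootloader_img: bytes, kernel_img: bytes) -> bool:
    """Each stage verifies the next before handing over control."""
    if digest(bootloader_img) != ROM_PINNED_HASH:
        return False  # bootloader was tampered with
    if digest(kernel_img).encode() not in bootloader_img:
        return False  # kernel doesn't match what the bootloader vouches for
    return True


print(boot(bootloader, kernel))               # True: chain intact
print(boot(bootloader, b"malicious kernel"))  # False: rejected
```

The answer to "what if others know the secret" is that, with this structure, there is no shared secret on the device to know: an attacker would need the vendor's private key (or a hash collision) to forge a stage that the immutable ROM check accepts.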