Pixel Binary Transparency: verifiable security for Pixel devices (googleblog.com)
205 points by transpute on Aug 21, 2023 | hide | past | favorite | 114 comments


On the one hand, this is fantastic — vendors should absolutely publish proper commitments like this of their firmware images.

On the other hand, this is completely missing a proof that your device is running the firmware it claims to be running. The check of the device firmware is:

    # Both values below are ordinary system properties, read from the running
    # OS and reported back to the host by adbd:
    FINGERPRINT=$(adb shell getprop ro.build.fingerprint)
    VBMETA_DIGEST=$(adb shell getprop ro.boot.vbmeta.digest)
This verifies nothing: a compromised adbd (or any compromised system image) can simply report whatever property values it likes.

The Pixel is (or at least should be [0]) capable of attesting to its running firmware. There would be additional bonus points for having the bootloader stages flash QR codes containing the hashes of the next stages, which would enable very straightforward verification.

With a secure element-based approach, the device would need to be able to convince adb that it has a genuine Google secure element and that the secure element says the fingerprint is such-and-such. The former would likely need additional data in the Merkle tree to avoid a situation in which Google pushes out a targeted, properly signed, but malicious secure element firmware that lies about the fingerprint. (And the entire point of binary transparency is to make this type of attack difficult.)

With a bootloader approach, in principle the entire verification could be chained from ROM and no secure element would be needed.

[0] I haven’t dug through exactly what the Google secure element does. I’m quite confident that it can attest, to Google, something about the running firmware, because this is useful for DRM and the various (horrible and mostly useless) safety checks and attestations available through the Android APIs. But maybe it actually can’t attest to the firmware fingerprint to any party other than Google.


I worked in Android Security. Pixel phones (and all other phones running Tiramisu+) can attest to the full DICE chain's integrity to any app that requests it. This can be done through the KeyStore API.
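
For anyone curious, here is a minimal Kotlin sketch of requesting hardware-backed attestation through that API (the key alias and challenge handling are illustrative; parsing the returned attestation record is omitted):

    import android.security.keystore.KeyGenParameterSpec
    import android.security.keystore.KeyProperties
    import java.security.KeyPairGenerator
    import java.security.KeyStore
    import java.security.cert.Certificate

    // Generate an EC key in the hardware-backed keystore with an attestation
    // challenge; the returned certificate chain embeds the attestation record
    // (verified boot state, patch level, ...) in its leaf certificate.
    fun requestAttestation(challenge: ByteArray): Array<Certificate> {
        val spec = KeyGenParameterSpec.Builder("attest-demo", KeyProperties.PURPOSE_SIGN)
            .setDigests(KeyProperties.DIGEST_SHA256)
            .setAttestationChallenge(challenge)  // binds the chain to a fresh nonce
            .build()
        KeyPairGenerator.getInstance(KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore").run {
            initialize(spec)
            generateKeyPair()
        }
        val ks = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
        return ks.getCertificateChain("attest-demo")
    }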


>> Pixel phones (and all other phones running Tiramisu+) can attest to the full DICE chain's integrity to any app that requests it. This can be done through the KeyStore API.

I do not see this - the KeyStore API available to apps still only returns an attestation as a normal X509 chain anchored into a local key, which is certified by Google. That is not DICE. Actually, there is no mention of DICE at all in any recent Android API docs.

Or is this documented somewhere else?


IIRC when your phone requests the attestation cert from Google it uploads its DICE chain. Then Google verifies the chain and gives you a 30 day cert (with a chain starting from a Google root, then an intermediary, then your leaf).

You might be able to see some custom Google extensions on the X.509 cert which will have some extra info. But that might get stripped when the cert is shown to an app.

I don’t remember all of the details. I worked on the infrastructure for the key acquisition but most of it was already set up when I joined and I was only on the team for a few months.


It is entirely irrelevant because cell phone companies add “analytics” (spyware) as part of their firmware stack… which Google and anyone else cannot block.

Use no cell towers or agree to be tracked. Lol.


There's a BIG difference between "My mobile carrier, who knows how much I spend with them on a monthly basis and my home address, probably date of birth and other customer details, also happens to know where I physically am" and "This piece of malware on my phone is intercepting every message I send"


> It is entirely irrelevant because cell phone companies add “analytics” (spyware) as part of their firmware stack… which Google and anyone else cannot block.

Who is that third party cell phone company adding spyware when I buy my Pixel phone from Google?

(Of course, Google can add spyware themselves; but they can also block it.)


That's if you buy your phone from a carrier. If you buy directly from the vendor, I don't see how carriers would be able to inject it. Radio firmware, however, is, as I understand it, a completely different thing...


Radio firmware has been a misnomer for a very long time, because the so-called baseband processor it runs on has access to and can control much more than just the radio. As far as I know, radio firmware on current phone SoCs has full, unlimited read/write access to all of the available RAM and can preempt and modify any process running on the application processor, so any indications regarding authenticity of any software running on the application processor - as provided by software running on that same processor - are effectively moot.

TL;DR: He who controls the radio (firmware) pwns your phone.


That's not true anymore, at least not for all phones: https://grapheneos.org/faq#baseband-isolation


Pretty sure Pixel already verifies the entire boot chain including firmware.


Yes, but the point of binary transparency is to assure device owners that their devices are running official Google firmware, even if Google’s signing key is compromised or if a court orders Google to sign a malicious image.

A verified boot chain does not provide this assurance.


You need source access to verify this. Much of the Pixel firmware is not available to users.


That’s a slightly different issue. Right now, Google could push a customized malicious firmware just to you. No one else would have a copy, and no one would be able to reverse engineer it to detect that it’s malicious.

With a fully functioning binary transparency system, Google would have to publish the fingerprint of the malicious firmware. An even stronger system would require that they publish the entire image as well. Then the attack could be detected.


https://www.wired.com/story/google-pixel-binary-transparency...

According to Wired, Android Verified Boot (AVB) prevents the computer owner from changing their mind about an OS update. No going back to the previous version, leaving no escape from the latest improvements in Google's data collection to support its advertising services racket.


The point of anti-rollback technology is to make it so security holes are patched permanently: an attacker can't unwind updates back to the point at which the known exploit works.


"Anti-rollback technology" apparently cannot distinguish between an "attacker" and the computer owner.


No technology can.


Such is the nature of evil maid attacks and the like


I believe that's only the case if the update increments the anti-rollback counter.
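
For illustration, the core of the anti-rollback policy is tiny. A rough Kotlin sketch (illustrative only, not the actual libavb code):

    // Each image declares a rollback index; the device keeps the highest index
    // it has booted in tamper-evident storage and refuses anything older. An
    // update that doesn't bump the index therefore remains freely reversible.
    fun mayBoot(imageIndex: Long, storedIndex: Long): Boolean =
        imageIndex >= storedIndex

    fun afterSuccessfulBoot(imageIndex: Long, storedIndex: Long): Long =
        maxOf(imageIndex, storedIndex)  // the counter ratchets forward, never back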


The details page [1] of the transparency log explains the exact threat model that they are trying to address with this:

> Transparency systems can be used to detect—and thus deter—supply chain attacks. Let's address some examples:

> Suppose an attacker maliciously modifies a Pixel image and even manages to sign it with the key that Google owns. Anyone who receives the malicious image can query the binary transparency log to verify the image's authenticity. The user will find that Google has not added the corresponding image metadata to the log and will know not to trust the compromised image. Since publishing to the log is a separate process from the release process with signing, this raises the bar for the attacker beyond just compromising the key.

So effectively, this seems to secure against malicious actors messing with Google's (or AOSP's) own build process, i.e. by somehow inserting an MITM between the build and the signing stage.

I don't know how Google's or AOSP's build systems are set up, but I'd suspect that not many entities are able to mount a successful supply chain attack on internal networks. So (conspiracy hat on), I wonder if there is something more behind this, i.e. some recent hacking incident or a warning of one.

[1] https://developers.google.com/android/binary_transparency/ov...
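
For the curious, once you have an image's digest, checking it against a log like this reduces to an RFC 6962-style Merkle inclusion proof. A minimal Kotlin sketch (hash prefixes per the RFC; fetching the proof from the log is out of scope):

    import java.security.MessageDigest

    fun sha256(vararg parts: ByteArray): ByteArray =
        MessageDigest.getInstance("SHA-256").run {
            parts.forEach { update(it) }
            digest()
        }

    // RFC 6962 domain separation: 0x00 prefixes leaves, 0x01 interior nodes.
    fun leafHash(entry: ByteArray) = sha256(byteArrayOf(0x00), entry)
    fun nodeHash(l: ByteArray, r: ByteArray) = sha256(byteArrayOf(0x01), l, r)

    // Verify that `entry` is leaf number `index` of a tree with `size` leaves
    // and root hash `root`, given the audit path `proof` (ordered leaf to root).
    fun verifyInclusion(entry: ByteArray, index: Long, size: Long,
                        proof: List<ByteArray>, root: ByteArray): Boolean {
        if (index >= size) return false
        var fn = index
        var sn = size - 1
        var hash = leafHash(entry)
        for (sibling in proof) {
            if (sn == 0L) return false
            if (fn % 2 == 1L || fn == sn) {
                hash = nodeHash(sibling, hash)
                while (fn % 2 == 0L && fn != 0L) { fn = fn shr 1; sn = sn shr 1 }
            } else {
                hash = nodeHash(hash, sibling)
            }
            fn = fn shr 1
            sn = sn shr 1
        }
        return sn == 0L && hash.contentEquals(root)  // all levels consumed, roots match
    }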


> I don't know how Google's or AOSP's build systems are set up, but I'd suspect that not many entities are able to mount a successful supply chain attack on internal networks.

This classic article might be worth your while:

https://medium.com/@alex.birsan/dependency-confusion-4a5d60f...


It’s much better than that.


Too bad it's paywall'ed


Here you go, it wasn't always like that:

https://web.archive.org/web/20210227105146/https://medium.co...

:)


One option is that it's not Google protecting you against Google, it's Google attempting to drive other Android vendors towards the same infrastructure in order to protect users against vendors being compelled to produce targeted backdoored firmware images.


Not only that, but it also protects you from law enforcement or any state actor.

This mainly protects journalists and human rights lawyers. I think that's the idea here, after the NGO shitstorm.


Maybe just lessons learned from SolarWinds



> I wonder if there is something more behind this, i.e. some recent hacking incident or a warning of one.

They may just be trying to get people to "trust" them more.

Remember that according to Google and the rest of Big Tech, "malicious actors" also includes the user.


Presumably there's something special about ro.boot.vbmeta.digest that would prevent a malicious ROM from lying about it? As in the ADB protocol being served by literal ROM code?


Yeah this seems to be a fairly important missing piece. Is there some special boot mode with a ROM-hosted unmodifiable adb?


That value comes from the firmware and is not writable by Android. As for whether the daisy chain of strcpy() calls that propagates it is robust and free of a rich history of 0-days, that is something I don't know.


> That value comes from the firmware and is not writable by Android.

But it's returned to the host by adbd that's part of Android. What's stopping someone from creating a modified adbd that would lie about this value?


I don't know that much about the security model. But I think if you really want to verify device integrity you should use an attestation key. That is supposed to be un-spoofable (although bugs can still occur in any software, which would defeat the integrity guarantees).
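
On the verifier's side, the rough shape is: pull the certificate chain off the device, check that it terminates in a pinned Google attestation root, and only then trust the attested properties. A simplified Kotlin sketch (root pinning and attestation-record parsing are reduced to stubs):

    import java.security.cert.X509Certificate

    // OID of the Android key attestation extension in the leaf certificate.
    const val KEY_ATTESTATION_OID = "1.3.6.1.4.1.11129.2.1.17"

    fun chainLooksValid(chain: List<X509Certificate>, pinnedRoot: X509Certificate): Boolean {
        if (chain.isEmpty()) return false
        // The leaf must carry the attestation record; a real verifier must also
        // parse it (verified boot state, patch level, challenge), omitted here.
        if (chain.first().getExtensionValue(KEY_ATTESTATION_OID) == null) return false
        // Each certificate must be signed by its successor.
        for (i in 0 until chain.size - 1) {
            runCatching { chain[i].verify(chain[i + 1].publicKey) }.getOrElse { return false }
        }
        // The chain must end in a pinned Google root obtained out of band.
        return chain.last().encoded.contentEquals(pinnedRoot.encoded)
    }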


SMH at the lack of thought or understanding some ppl have for layered device security...

This has an obvious clear benefit: all of the people who have said "oh well Google could be compelled to sign a malicious update for a single user"... This is an attempt at solving that via a transparency log.

Granted, I think for this to matter much all-up, it would need to apply to PSF, Apex, general app updates, etc.... Which I'm pretty sure this doesn't even attempt to touch.

I'd love to hear Google speak to that but that seems like a huge can of worms compared to the image based hashing, signing, verification that is already part of the tooling, ecosystem and consciousness.


Recently I developed a presentation* about immutability as a design concept in computer security. As part of it, I have slides which cover Certificate Transparency implementation[0], which uses Trillian[1] as a distributed ledger. Part of Trillian's documentation includes a Firmware Transparency[2] example. For the year or so I've been aware of it, I've thought that it's a great idea, and wondered if it would ever grow as a project/practice. Digging through the links in this announcement, it appears Trillian is the basis for the distributed ledger here. Glad to see the idea has been taken further by Google!

* Sorry, the presentation is non-public.

[0] https://certificate.transparency.dev/

[1] https://transparency.dev/

[2] https://github.com/google/trillian-examples/tree/master/bina...


Am I getting old if for me Trillian is an instant-messaging app? 0o


You're not alone. My initial reaction was the same.


Yeah, it was an IRC or ICQ thing, like 15 years ago :)


Well, any security-aware owner of a Pixel phone is already on GrapheneOS, not only to be protected from attacks but also to be protected from Google itself. So I don't consider this some huge benefit.

https://grapheneos.org/features


This is cool for Pixels, but the problem with the Android ecosystem is that most people are running customized OS images from the manufacturer (Samsung, Huawei etc). These images will also frequently contain insecure bloatware from telcos that can't be installed.


Agreed. This really is the biggest issue in regard to security consistency on Android-based phones. It's really quite a shame.

Personally, I use Pixel devices and install GrapheneOS on them.


* ...can't be _un_installed


Yes, ty


My concern is over e-waste. If it becomes impossible to install custom firmware on a device then a lot of electronics will just become garbage when support ends or the company disappears.


This is firmware verification done right: it lets you verify that you have Google firmware, but it does not force you to run Google firmware.


Yeah, but running stock firmware is enforced in other ways. SafetyNet, for one.

Having mechanisms like this in place unfortunately makes it easier for others to abuse them for DRM.


How many people actually extend the lifetime of their phone via custom firmware? I imagine most people with old phones simply continue using their devices after support ends.


I used the OnePlus One way past its EOL. Replaced the battery and screen multiple times, ran custom ROMs based on LineageOS, tuned by the community to revive important features like RAW capture and 60s shutter speed and to introduce some new ones not unlocked by the OEM, like manual focus. This included firmware modifications of various subsystems to improve quality of life, like introducing stereo playback via the speaker and tuning the speaker curves to sound acceptable.

There is a lot that can be done and there is a lot that people do.


That's something only enthusiasts do in the developed world. I'm curious whether poorer people in the developing world use alternative ROMs. Is there any store that supports ROM flashing?


We'll never be able to judge that from the POV of our fast-moving first-world tech sphere and its one-time-use consumerism. That makes it all the more important to keep those options and avenues of modification not only alive, but strong.

As individuals we can of course offer mere anecdotal evidence. Coming from Belarus, I can say that the number of talented hackers repurposing old tech was always strong there, though I don't have a strong connection there anymore. I recently covered the QuadCore mod for ancient ThinkPads and the firmware replacement that entails: https://youtu.be/Fs4GjDiOie8 I received multiple requests for details and walkthroughs from individuals in Latin America performing similar upgrades. To me, the spirit of reviving trash-grade tech to our modern sensibilities is alive and well.


Going off how people treat cars, the majority would do this if the practice were ever allowed on a wide scale (to facilitate commercial services that assist them for a small fee).

The issue is that most phone manufacturers don't allow owners to do this, and most that do, do pull shenanigans that make it unpredictable and difficult.

Poorer people would absolutely choose to pay 25 dollars to extend their phone's life (replace battery + reflash to a supported firmware/os) vs buying another 200 dollar phone.

I'm hoping the EU (and the rest of the world) steps up and does what the US should have done a long while ago.


I used to do this for as long as the batteries and soldered flash would last. Though I've given up trusting randos on XDA devs and fiddling with ROMs to keep everything working.


That's kind of circular reasoning: nobody uses them because Google and the manufacturers made them as inconvenient as possible.


More people would do this if it was easier to do.


>Pixel Binary Transparency responds to a new wave of attacks targeting the software supply chain—that is, attacks on software while in transit to users. These attacks are on the rise in recent years, likely in part because of the enormous impact they can have.

They say it's "on the rise", but the linked report in the blog talks about transitive OSS dependencies (among other related things), not binary/firmware-level tampering. Can someone explain how this would help avoid the Log4j vulnerability?


I wonder how much of the inspiration for these append-only Merkle trees of attestations came from blockchains.

We now have Certificate Transparency, and now this, both of which work this way. They seem to be direct logical descendants.


I think it could be argued that the basic form of this was around in X.509 certificates long before blockchains. I'm not sure what the history would be before that, though.

Also, in my mind at least, the idea of doing secure attestable history comes from git. Once you have a crypto-DAG it's a fairly easy transition into other data structures, although ordering becomes an issue with trees. I'm almost sure something else must have existed, because it's not a far leap to a crypto-DAG when you are doing things like HMAC.


That sounds like a nothingburger. The bootloader always shows a warning if the image isn't signed by Google.

Google itself should be able to sign whatever code they want and mount whatever attack they want.

Could someone explain if this provides any value over signature check in the bootloader?

I believe that the bootloader can't be updated with a non-Google-signed version. And if there is a vulnerability and a malicious actor exploits it, there would be no way to safely get the hash to verify against the log.


> Could someone explain if this provides any value over signature check in the bootloader?

If every release has its checksum entered into an immutable log, and can't be installed if it's not in the log, it makes it somewhat detectable if someone infiltrates, tricks or forces Google into signing a backdoored version for a targeted attack.

It's unlikely anyone would infiltrate Google to make a custom-signed image to target me - but if you were Obama or Trump or Snowden or Khashoggi you might be worried about that.

I say "somewhat detectable" because if there was an unexplained signed update logged, Google could just say "sorry, bug/misclick/new guy" and that'd sound plausible to a lot of us.


> if there was an unexplained signed update logged, Google could just say "sorry, bug/misclick/new guy" and that'd sound plausible to a lot of us.

Precisely. And, practically, you won't be able to audit such an update, at least not quickly. Even if you find some malicious code they can always blame it on a rogue engineer.


> Even if you find some malicious code they can always blame it to a rogue engineer.

And good luck finding anything in something as big as an Android ROM...


This is awesome and a great job by Google's Android/Pixel devs.


I love, on the one hand, that Google takes security seriously. On macOS, Windows and Linux I can easily set up a device in a way that makes it (given a decent passphrase) all but impossible for any attacker to retrieve the data on the device or modify the OS if the device is at rest. LUKS/BitLocker/FileVault encryption and UEFI Secure Boot (x86)/SIP (Apple) make sure of that.

The problem is that way too many mobile applications these days go to extraordinary lengths to make sure it's impossible to exercise the freedoms granted by the GPL in practice. If you "root" your phone, it's a constant game of whack-a-mole to keep banking, Netflix, Google Pay and other applications running.

On top of that, it's impossible to put a new root of trust in place... I don't want a secure boot warning message when firmware is running that I flashed on the device myself; I want it visible when someone else has placed manipulated firmware on it. And that point applies to Apple just as well.


> I don't want a secure boot warning message if a firmware is running that I flashed on the device, I want it visible when someone else placed a manipulated firmware on it.

And how exactly do you propose achieving that, when that someone else might have tampered with the phone before you got it?

The goal of Google's security architecture is that a dodgy phone seller/repair shop can't pre-root the phone and siphon all your private data to Mr Evil unless they have access to a silicon fab to remake the main CPU with a new trust root.


> And how exactly do you propose achieving that, when that someone else might have tampered with the phone before you got it?

Wipe the device as a condition of unlocking the bootloader root trust keyset. Easy, and more secure than any classic x86 UEFI bootloader. That gets rid of the threat of dodgy repair shops.

The only issue will be manipulating devices before they're sold the first time, but tamper-proof packaging resolves that.


Tamper-proof packaging is a poor replacement for a first-time boot warning. Not to mention the sheer impracticality of properly implementing tamper-proof packaging (the factory would have to cover the packaging in shiny nail polish or something, then encrypt and send a high-res picture of it somehow to the final buyer across the supply chain, at which point the final buyer makes sure the glitter aligns). Much better to do it the way it's currently done.


If a repair shop wipes someone's phone they'll be pissed, but they aren't going to throw out the phone. As soon as they get that phone back they'll reinstall all their apps and log back into all their accounts, and any malicious firmware added by that repair shop will wreak havoc.

I 100% agree that we should have ways of getting rid of these warnings on our own devices, but this isn't a simple problem.


> If a repair shops wipes someone's phone they'll be pissed, but they aren't going to throw out the phone. As soon as they get back that phone they'll reinstall all their apps [...]

This depends on whether consumers are made aware that a repair shop that "accidentally" wipes your phone might be trying to steal your bank account etc.

While education is difficult, the consumer has an advantage in this scenario because the event itself is impossible to miss and very disruptive and could lead them to start searching on the internet for advice.


Apple frequently tells customers that their data will be wiped if they send their devices in for repair; I don't see why customers would challenge a repair shop's assertion - it doesn't seem implausible either!


I guess the lesson is (or would be) less "all resets are signs of nefarious intent" and more "if it seems reset, always reset it again yourself to be safe."


Or maybe just have the phone tell you it’s been tampered with?


Depends on what "wipe the phone" means. That could involve clobbering early-stage bootloaders and firmware on daughter microcontrollers - the kinds of things that can only be replaced through JTAG and a good bit of tribal knowledge. It doesn't stop the most sophisticated attackers, but it certainly would disincentivize a large-scale attack of this variety, especially when you consider the wild variations that exist between Android phones at a hardware level.


>And how exactly do you propose achieving that

A signed-by-Google first-stage bootloader could display a message warning the user before handing off to an unsigned second-stage bootloader.

>The goal of Google's security architecture is that a dodgy phone seller/repair shop can't pre-root the phone

I'm curious how big a problem this was with refurbished second-hand laptops that often come with a pre-installed OS. At the very least, I have the freedom to reinstall Windows/Linux.

We need to find real solutions to the e-waste problem; it's unacceptable to be throwing away so many working phones simply because their manufacturer has decided to stop publishing OS updates after 2/3/4 years. I own a few computers that are almost a decade old and run the latest version of Debian/Ubuntu. There is no reason phones should be treated any differently.


That's an easily solved problem. We already have the pre-boot warning, and it fixes that problem just fine. Add a reboot on initial setup and make the warning scarier if you're just setting up the phone, and you'll be fine. A week after you've set up the phone, there's no reason why you'd keep it.


> Mr Evil

Erm, it's Dr. Evil to you sir.


On Google Pixel devices you can load your own verified boot key into the "avb_custom_key" partition and then it will only boot OSes signed by it (it will also say that you are running a different operating system in the boot screen).

GrapheneOS for instance uses this mechanism.

Unfortunately you can only register one key and you have to wipe the device to change it, but that's still fine for most use cases.
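
For context, this maps onto AVB's documented boot states. A rough Kotlin sketch of the policy (illustrative, not the actual libavb logic):

    // AVB boot states as documented for Android Verified Boot:
    //   GREEN  - locked, image verified against the OEM key
    //   YELLOW - locked, image verified against the user-enrolled avb_custom_key
    //            (this is the "different operating system" boot message)
    //   ORANGE - bootloader unlocked, no verification enforced
    //   RED    - verification failed on a locked device; refuse to boot
    enum class BootState { GREEN, YELLOW, ORANGE, RED }

    fun bootState(unlocked: Boolean, oemKeyOk: Boolean, customKeyOk: Boolean): BootState =
        when {
            unlocked -> BootState.ORANGE
            oemKeyOk -> BootState.GREEN
            customKeyOk -> BootState.YELLOW
            else -> BootState.RED
        }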


Your phone will still display a warning during bootup, though.


Because it's also the way to preinstall a malicious ROM. Fair warning.


Some devices allow you to sign your own images and relock the bootloader [0].

This allows you to modify your image, sign it, flash it and relock your bootloader. If you have the infrastructure in place, future updates could be rolled out as OTA updates for your custom ROM.

You'll still fail hardware attestation, and afaik whatever API returns the boot status differentiates between a vendor-signed and a custom-signed image.

So not perfect, but you lose the bootloader unlocked nag.

[0] https://android.googlesource.com/platform/external/avb/+/pie...


Sadly those devices are only Google Pixels and Nexuses, so expensive.


Part of the issue there is that they’re focused on the 99% of users who aren’t flashing firmware onto their phones.


This is probably the largest (but far from the only) reason why I'm ditching smartphones entirely.


Perhaps you might like GNU/Linux phones, which have no such problem (and also solve many other problems).


None of them appeal to me for various reasons. Instead, I'm switching to a dumb phone and will be carrying a Linux pocket computer for my mobile computing needs.


Could you share which computer that is?


I haven't decided for certain, but it will very likely be one of my home-built ones. I know that's not very helpful to you, and I apologize.


It's the corporations that get to trust your device. You get nothing.


This is how the Jargon File defines "malware":

    malware n.
    [Common] Malicious software. Software intended to cause consequences the unwitting user would not choose; especially used of {virus} or {Trojan horse} software.

What if a program installed by Google causes consequences the user would not choose? For example, "Google Play" or Chrome.

Google has been fined 4.125 billion euros because it forces computer manufacturers to install these programs by agreement. Imagine if Google had to pay computer owners (ad targets) not to modify/remove the spyware.

https://ec.europa.eu/competition/antitrust/cases/dec_docs/40...

https://curia.europa.eu/jcms/upload/docs/application/pdf/202...

https://theplatformlaw.blog/2022/10/03/general-court-largely...

https://www.clearygottlieb.com/news-and-insights/publication...

Also in India, Google was fined for these agreements.

https://pib.gov.in/PressReleseDetailm.aspx?PRID=1869748

https://www.thehindu.com/sci-tech/technology/indias-antitrus...

https://indianexpress.com/article/technology/tech-news-techn...

https://www.bqprime.com/business/google-loses-appeal-against...

Also in South Korea, Google was fined for these agreements.

https://www.reuters.com/technology/skorean-antitrust-agency-...

https://www.aljazeera.com/economy/2021/9/14/south-korea-fine...

Projects exist to remove the spyware, so-called "de-Googled" Android. Clearly some computer owners would not choose these programs.

These are "witting users" under the malware definition.

Proponents of Google's practices will sometimes argue that witting users, e.g., people commenting on HN of their dissatisfaction with Google's practices, are not relevant. Only the "majority" is relevant. They will frequently use the phrase "most people".

However, these are "unwitting users" according to the malware definition. They are not "choosing" Google Play or other Google spyware pre-installed on their computers. Rather, they are not presented with a choice.

"Millions of users trust Google." Well, considering Google pays other companies to pre-install their software on millions of computers and to set the default search to Google, that's not surprising. We are all forced to "trust" the things we cannot change. What other choice do we have.


People buy Android phones with the expectation of using the play store to install applications lol


> > Google has been fined 4.125 billion euros because it forces computer manufacturers to install these programs by agreement.

> People buy Android phones with the expectation of using the play store to install applications lol

Maybe the people who decided on that fine did not think about that and the corresponding lol-factor.


The elephant in the room, for Google and Microsoft, is that verifiable security is worthless if no one actually trusts the organization that released the verified firmware.

Android should have been split off from Google a long time ago.


If you're using a Pixel with the stock OS, you trust them anyway - it's about making sure your phone hasn't been tampered with by another party.


Trust isn't binary, there are several dimensions and varying levels of trust along those.


It doesn't mean that, it just means you selected it out of the available options.


How does this relate to verifying that the software running is the same as what google ships to everyone?


I'm responding to "you trust them anyways"

How does using a device mean you trust the vendors?

That's wrong. It's like saying you trust your governor because you live in a particular U.S. state.


If you don't trust them, why verify the integrity of the software blob they ship with their hardware at all?


that was the point of the original comment


I still want to make sure it's the correct download. And yes I think of the preloaded OS as a sort of download.


And who would fund that version of Android?


Hypothetical: Computer owner modifies software to make it more secure. Google falsely declares computer is now less secure.

Who should win the argument, and why?

(Note: Google is not the owner of the computer in this hypothetical.)


Being more secure for Google and [ad-/tracking-supported] app-vendors is more-or-less independent of being more secure for computer owners.

If some change happens to benefit both groups, I assume it's a happy coincidence. I like to think that most Google employees try to implement win-win stuff like that, but it's pretty clear that they frequently worsen user security/privacy to improve their bottom line (nearly all their revenue comes from spying on people, and preventing them from effectively opting out).


Neither. The security profile is the same, the device is still used or managed by a human, and (most?) humans hate rubber hoses.

(Therefore all security should be based on the users preferences, as you can't protect a user 24/7)


So the phone has none of Google's spyware on it? Finally, a secure phone! /s


Seems like a bit of a nothingburger: now you have two ways to verify your binaries came from Google and are unmodified instead of one. Doesn't change much.


what is the other way?



Pretty soon Android will be as locked down as iOS.


Perhaps, but this has little to do with it.

In what scenario is user-auditable traceability of firmware images anything but a good thing?


I think we're well past that point ;)


It already is


Call me a skeptic, but I see this as political theater. If Google themselves wanted to peek at you, they would never have to look as deep as the firmware. If a _foreign_ government wanted to and could 'poison the well', this obviously helps.

I feel like this is akin to Apple's cover story for excessive serialization: 'we just want to make sure the parts in your phone are the parts we own'.



