Hacker News

It looks like finally users may have complete control over their Intel computers without Intel having the final say. I, for one, am quite happy about this.


This sentiment seems to be rooted in a misunderstanding of what trusted computing is trying to achieve on a fundamental level.

The idea is not to "take control over people's computers", i.e. to undermine your trust in your own computer. It is rather to enable somebody to gain some level of trust in the computations that are happening on somebody else's computer.

Yes, this technology is commonly used for DRM, and that was one of its earliest applications. But it's not limited to that. Trusted computing can switch the roles and give you as a user certainty over the computations a third party provider performs in the cloud on your behalf. The Signal team is doing a lot of very interesting experiments there [1].

If your concern is a hardware backdoor or something similar, this is less of a question of trusted computing, and rather one of trust in hardware vendors. Your hardware vendor can screw you over entirely without TPM, TEE, secure elements and the like.

On the other hand, Intel's trusted computing platform being horribly broken does not magically give you FOSS replacements for all the firmware, ROMs and microcode running on the dozens of peripherals in your computer.

[1] https://signal.org/blog/secure-value-recovery/


That's all correct, yet it doesn't consider the politics. The people on this thread are concerned about a huge power imbalance between customers and companies; specifically, customers have zero bargaining power so they should expect trusted computing to be used "against" them far more often than it's used "for" them.


> Your hardware vendor can screw you over entirely without TPM, TEE, secure elements and the like.

Yes they can, but as I understand it they are using TPM to screw us over, hence people celebrating its being popped. No misunderstanding of trusted computing necessary: there aren't, in practice, other vendors to choose from here.


Could you elaborate on the ways that people are being screwed over by TPM?

I do see that the existence of both trusted and untrusted systems could exert some pressure on consumers to adopt the latter, due to the unavailability of certain services on the former (e.g. DRM, banking apps on rooted Android phones etc).

The danger here is a loss of "freedom to tinker", which I do appreciate very much, and I share that concern. But has that actually happened with TPM?


Certain 4K/HDR content is only available on PCs if you use Edge.


That's not TPM.


What about the fact that the newest Intel CPU that can run Coreboot is from 2012 since all the subsequent ones have been locked by Intel? Isn't the TPM directly responsible for loss of that freedom to tinker?


Coreboot runs on the latest Intel CPUs (and work is underway for CPUs that haven't even been released yet) but it uses binary blobs. Those blobs have nothing to do with TPMs, the ME, or whatever.


As already mentioned the coreboot bit is false, but also:

the TPM is a PASSIVE component. It only responds to your requests, and you can do cool things with it.

https://media.ccc.de/v/36c3-10564-hacking_with_a_tpm


Just a few months ago I bought a comet lake laptop that shipped with coreboot, so I have no idea what you’re talking about here.


I'm curious, where are TPMs being used to screw people over in your opinion?


I think we need to come up with solutions to problems like key escrow (i.e. in Signal's case) that don't require trusted computing because a single root of trust for hardware is a single point of failure and depends on trusting the hardware manufacturer.

There are a lot of possibilities with distributed computing.


Do you have any narrative of how to do key recovery safely without an enclave or a human in the loop?

I’ve spent a lot of time thinking about this and I don’t really know how to do it without one of those two things.

Edit: like I hear you saying there are possibilities in a distributed computing world, but I don’t have any idea what distributed computing enables for key recovery (except possibly k of n schemes but that’s just replication, not safety).

Edit 2: also, presume that users suck at key management and can’t remember long password strings, 24 words, or be trusted to store a key for a meaningful period of time.
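For reference, the k-of-n scheme mentioned above is usually Shamir secret sharing. A toy sketch (illustrative only, with made-up parameters; real deployments use vetted libraries), which also shows why it is "just replication, not safety": any k colluding share-holders recover the secret.

```python
import random

P = 2**61 - 1  # prime field large enough for this toy demo

def split(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, k=3, n=5)
assert combine(shares[:3]) == 123456789
```

Fewer than k shares reveal nothing about the secret, but the scheme does nothing to stop k holders from cooperating against the user, which is the gap an enclave (or a human in the loop) is meant to fill.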


It would still be an enclave of sorts, but white box cryptography is generally trying to achieve a similar goal as trusted computing, without relying on trusted hardware.


I don’t think the enclave you’re describing exists, nor do I believe there is an enclave that is untrusted hardware.

Do you have an example of such an enclave and how it would operate without a remote attestation service, in the model where a user can trust that a distributed network they don't control is safeguarding their key?


If homomorphic encryption advances to the point where it's usable, that would be an example of security on untrusted hardware.

(But I suppose that just proves your point.)


Yes. I think if homomorphic encryption existed in a way that was super fast, we would be using that. It's really far off, as far as I can tell.


It's definitely real, but as you stated, incredibly slow (and noisy, though that's dealt with by refreshing). It has limitations currently, but there's good work right now lifting those, so it's definitely something to keep an eye on moving forward.


There is a solution: Fully Homomorphic Encryption (FHE). However, it's unfortunately not practical yet with current hardware (and it's unclear if it will ever be). Meanwhile some partially homomorphic schemes might address specific applications (say for instance a 3rd party provides an encrypted list; I believe there are nearly-practical algorithms to sort this encrypted list without information leaks).

Trusted platforms are pretty interesting IMO in their ability to essentially provide FHE by means of tamper-resistance instead of mathematical security. Objections should be more directed at the control of keys being with Intel; maybe some other orgs should be in charge, maybe there could be a number of trust vendors you could choose, or at least veto (and allow external users of your external platform to choose). Something more in line with TLS authentication: we all need to trust 3rd parties to use the internet, and nobody protests -- with good reason. It's a well designed, open, decentralized system with good oversight.


> like key escrow (i.e. in Signal's case)

Is signal moving towards some sort of key escrow policy?


that sounds vastly overgenerous. if that were truly the case, then TEE wouldn't be built into every consumer system. if it were actually to protect against malicious cloud providers, then TEE would be only available for special (read: expensive) processors. see: Intel and ECC memory. the goal of TEE is to benefit big corp and fuck the user, and just because it can in theory be used for other purposes is barely a consolation.


In current smartphones, the TEE is used mainly for two things: DRM and hardware key storage.

DRM lets users watch Netflix on their phones while on an airplane.

Hardware key storage significantly decreases the attack surface for malware trying to extract them, compared to storing them on the application processor.

How is the average user being fucked here, exactly?


> DRM lets users watch Netflix on their phones while on an airplane.

No, DRM exists to restrict users, it does not enable anything for them.


That is definitely true. In a way, the user of DRM is the content provider, not the owner of the playback device.

But this is exactly the idea of trusted computing:

"Prove to me that I can trust your hardware to run my software according to my specifications, and I will use it to compute things (for our mutual benefit) that I would otherwise only compute on my own hardware."

DRM is the canonical example, but wouldn't it be nice to be able to actually know that cloud service provider has to adhere to their terms of service, rather than having to take their word for it?

(The big "if" here is that the terms of service are expressible and enforceable in the context of some piece of software.)
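The "prove to me" step above is typically implemented via remote attestation: each boot stage is hashed into a register that can only be extended, never overwritten, and the final value is signed for the remote party. A minimal sketch of the extend-and-verify logic (component names are made up; the quote signature is omitted):

```python
import hashlib

def extend(pcr, measurement):
    """TPM-style PCR extend: new = SHA-256(old || measurement)."""
    return hashlib.sha256(pcr + measurement).digest()

# The device measures each boot component into the register, in order.
pcr = b"\x00" * 32
for component in [b"bootloader-v1", b"kernel-v5.4", b"app-v2"]:
    pcr = extend(pcr, hashlib.sha256(component).digest())

# The verifier recomputes the chain from reference measurements it
# trusts; a match (plus a signature over pcr, not shown) implies the
# device booted exactly that software stack, in that order.
expected = b"\x00" * 32
for component in [b"bootloader-v1", b"kernel-v5.4", b"app-v2"]:
    expected = extend(expected, hashlib.sha256(component).digest())

assert pcr == expected
```

Because the register can only be extended, a later stage cannot erase the evidence of what ran before it; that one-way property is what turns a hash chain into a trust statement.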


> "Prove to me that I can trust your hardware to run my software according to my specifications, and I will use it to compute things (for our mutual benefit) that I would otherwise only compute on my own hardware."

As a user - why should I trust your software with my computer?

There is a great imbalance of power; the companies aren't necessarily the good guys. It already got unfair with DRM: for example, together with the DMCA it effectively blocks fair use, like the ability to make your own backup copy of a purchased medium, or to purchase content once and play it on multiple devices.

It also prevents one from selling their copy to someone else, which is also allowed by law.


so what you're saying is that without TEE, Netflix would shut down? come on. Netflix would clearly keep operating with or without DRM, all TEE does is make it harder for the user to access their legitimate (non-Netflix) content in anything but the most approved way. it entrenches mainstream operating systems and makes it harder to use FOSS. sure, I'll concede that Netflix is not the most damaging to user freedom, but that's not what OP is about. nobody would give a shit about this vulnerability if it was just Netflix, because Netflix is broken against hardcore attackers anyways. TEE proponents want to expand its use to more user-hostile applications. that's my concern.

hardware encryption is arguably a better use of TEE, but as far as I know, no actual implementations use SGX for that purpose. the TPM is used, but it's not fast enough for actual encryption. the OS loads the keys from the TPM and does the encryption in regular software.


Does this announcement mean it's finally possible to run FOSS firmware like Coreboot on modern Intel hardware? If so, this is a huge finding


Coreboot already runs on modern Intel hardware. This vulnerability doesn't eliminate the blobs or the need for the ME to initialize hardware before Coreboot runs if that's what you're thinking of.


Well, I think people care about whether this can be used to bypass bootguard. If yes, it would enable coreboot support on every laptop.


Could this vulnerability be used to bypass the ME? I'd like to run coreboot on my thinkpad X1 carbon gen7


The ME is irrelevant for you. It's bootguard that is preventing you from running coreboot.


That's what I am wondering too! Perhaps Libreboot can make progress and we could have completely free systems with more up to date hardware after all? That would be so great.


Looks like it could make it easier to get around DRM also?


"the scenario that they feared most", yet the scenario everyone was sure would happen.


I wasn't sure it would happen, but I'm sure happy it has!



