
The secret is held by the metadata server that the CI instance has access to

Or: the deployment service knows the identity of the instance, so its secret is its private key

Or, how PyPI does it: the deployment service coordinates with the trusted CI/CD service to learn the identity of the machine (like its IP address, or a trusted assertion of which repository it's running on), so the secret lives in whatever that out-of-band verification step is. (PyPI communicates with GitHub Actions about which pipeline from which repository is doing the deployment, for example.)

It’s still just secrets all the way down

> The secret is held by the metadata server that the CI instance has access to

But how does the metadata server know that the CI instance is allowed to access the secret? Especially when the CI/CD system is hosted by a third party, it needs to present some form of credentials. The CI system may also need credentials for a private repository of packages or artifacts used during the build.

For me, a CI/CD system needs two things: secret management and the ability to run Bash.


Yeah, I was confused about that bit too. AWS and GCP's metadata servers know which instances were deployed, so they presumably verify the instance's identity out-of-band, for example by matching the request to an internal job or machine identifier.
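To make the "metadata server" leg concrete, here's a minimal sketch of how a workload on a GCP instance fetches a short-lived credential from the metadata server. The endpoint and `Metadata-Flavor` header are GCP's real interface; everything else (function names, the split between building and sending the request) is just for illustration.

```python
import json
import urllib.request

# GCP's metadata server is only reachable from inside the instance; the
# mandatory "Metadata-Flavor" header prevents a browser or naive SSRF
# proxy from being tricked into making this request on an attacker's behalf.
METADATA_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/token"
)

def build_token_request(url: str = METADATA_URL) -> urllib.request.Request:
    """Construct (but do not send) the metadata-server token request."""
    return urllib.request.Request(url, headers={"Metadata-Flavor": "Google"})

def fetch_access_token() -> str:
    """Fetch a short-lived OAuth2 access token for the instance's default
    service account. This only succeeds when run on a GCP instance."""
    with urllib.request.urlopen(build_token_request()) as resp:
        return json.load(resp)["access_token"]

if __name__ == "__main__":
    print(fetch_access_token())
```

Note that the instance never holds a long-lived secret: the "secret" is the network position plus the identity the platform assigned at deploy time, which is exactly the out-of-band verification being discussed.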

As for deploying from a trusted service without managing credentials, PyPI calls this "trusted publishing": https://docs.pypi.org/trusted-publishers/

From the docs:

1. Certain CI services (like GitHub Actions) are OIDC identity providers, meaning that they can issue short-lived credentials ("OIDC tokens") that a third party can strongly verify came from the CI service (as well as which user, repository, etc. actually executed);

2. Projects on PyPI can be configured to trust a particular configuration on a particular CI service, making that configuration an OIDC publisher for that project;

3. Release automation (like GitHub Actions) can submit an OIDC token to PyPI. The token will be matched against configurations trusted by different projects; if any projects trust the token's configuration, then PyPI will mint a short-lived API token for those projects and return it;

4. The short-lived API token behaves exactly like a normal project-scoped API token, except that it's only valid for 15 minutes from time of creation (enough time for the CI to use it to upload packages).

You have to add your GitHub repository as a "trusted publisher" for your PyPI package.

Honestly, the whole workflow bothers me -- how can PyPI be sure it's talking to GitHub? What if an attacker could tamper with PyPI's DNS? -- but that's how it's done.


PyPI is sure that it's talking to GitHub because it establishes trust in GitHub's IdP public keys over HTTPS. You could then question the security of HTTPS itself, but that seems like a significant rabbit hole to jump down, given that OAuth etc. all depend on the same basic scheme.
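To show where that trust anchor sits, here's a stdlib-only sketch around a GitHub Actions OIDC token. The issuer URL is GitHub's real one (its signing keys are published at `/.well-known/jwks` under it, fetched over HTTPS); the helper functions are illustrative, and note the decoder deliberately does NOT verify the signature, which a real verifier must do first.

```python
import base64
import json

# GitHub Actions' OIDC issuer. A verifier fetches the issuer's JWKS
# over HTTPS -- that HTTPS connection is the trust anchor in question.
EXPECTED_ISSUER = "https://token.actions.githubusercontent.com"

def decode_claims(token: str) -> dict:
    """Decode a JWT's claim set WITHOUT verifying its signature.
    A real verifier must first check the RS256 signature against the
    issuer's published JWKS (e.g. with the PyJWT library) before
    trusting any of these claims."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

def issuer_ok(token: str) -> bool:
    """Check the (unverified) issuer claim against the expected IdP."""
    return decode_claims(token).get("iss") == EXPECTED_ISSUER
```

A DNS attacker could redirect the JWKS fetch, which is why the fetch happens over HTTPS with certificate validation: the attacker would also need a valid certificate for the issuer's hostname.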

It would be good if it could also scan build output like code coverage and test results. But that’s about all it should do.

I keep meaning to write a partially federated CI tool that uses Prometheus for all of its telemetry data, but I never get around to it. I ended up carving out a couple of other things I'd like to be part of the process as a separate app, because I was still getting panopticon vibes and some data should just be private.


That is secret management.

Yes, that's what I'm saying. I'm agreeing with your response to amluto.


