Instead of wget-ing localhost, it may be better to use Frozen-Flask[1]. I use it to implement my own static site generator for my personal website and it's working great.
Heh, I worked on a site for a brand during the 2006 World Cup that did basically exactly this (although a PHP site, and a slightly rickety manual build process). They had bought pitchside advertising to be shown during the final, and were anticipating enormous traffic, which sadly never materialized. But we totally could have handled it!
Other over-optimisations on the same project included going all the way and building functionality to handle drawn lots as tie-breakers. This also never happened.
I was trying to look into GitHub Actions but didn't really understand it. I just straight up called CLI commands, e.g. `git commit -m "..."`, and ran them via cron after putting a key on the machine (a Pi in my case). It was just publishing sensor data to my GitHub README. But it's cool that this works.
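That cron-driven approach can be sketched roughly like this (the repo path, file names, and sensor reading are all illustrative placeholders; it assumes `git` is on PATH and that push credentials such as a deploy key are already configured):

```python
# Sketch of a cron-driven commit script, e.g. run from a Pi's crontab.
# Everything here (paths, messages, the reading) is a placeholder.
import subprocess
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def commit_reading(repo: Path, reading: str) -> None:
    """Append a sensor reading to README.md and commit it."""
    readme = repo / "README.md"
    stamp = datetime.now(timezone.utc).isoformat()
    with readme.open("a") as f:
        f.write(f"- {stamp}: {reading}\n")
    subprocess.run(["git", "-C", str(repo), "add", "README.md"], check=True)
    subprocess.run(
        ["git", "-C", str(repo), "commit", "-m", f"sensor update {stamp}"],
        check=True,
    )
    # The real cron job would also push:
    # subprocess.run(["git", "-C", str(repo), "push"], check=True)

# Demo against a throwaway local repo (no network needed).
demo = Path(tempfile.mkdtemp())
subprocess.run(["git", "init", "-q", str(demo)], check=True)
subprocess.run(["git", "-C", str(demo), "config", "user.email", "pi@example.com"], check=True)
subprocess.run(["git", "-C", str(demo), "config", "user.name", "Pi Sensor"], check=True)
commit_reading(demo, "21.4 C")
```

A crontab entry like `*/15 * * * * python3 /home/pi/commit_reading.py` would then run it every 15 minutes.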
The site already existed, but they wanted to serve it statically to save on complexity and money.
If you dig into the repo you'll see that it's maintained using a YAML file, which is a smart way to deal with small sites that still contain structured content.
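As a sketch of that pattern (the schema and event names below are made up for illustration, not the repo's actual file; assumes PyYAML via `pip install pyyaml`):

```python
# Sketch: small-site structured content kept in a YAML file, which
# templates can iterate over to render pages. Hypothetical schema.
import yaml

EVENTS_YAML = """
events:
  - title: "Pub Standards CCXIV"
    date: 2024-06-13
    venue: "The Bricklayer's Arms"
  - title: "Pub Standards CCXV"
    date: 2024-07-11
    venue: "The Bricklayer's Arms"
"""

data = yaml.safe_load(EVENTS_YAML)
# A template layer would loop over data["events"] to render each page.
for event in data["events"]:
    print(event["title"], "on", event["date"])
```

Non-developers can edit a file like this far more comfortably than a database or templated code.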
1. Because the people writing the content are not developers.
2. There's dynamic content, but when you know you're going to get a large influx of traffic from a single region or event, you can precompute it for a short period.
It's smart because it solves the problem (freezing dynamic content) in one line of code: the wget command. It should work for most cases, and it's easy to understand and deploy.
1. Spin up a Python Flask web server on localhost
2. Run "wget --mirror" against it to crawl the site and save it as static files
3. Publish the resulting static files to GitHub Pages
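The steps above can be sketched end to end with the standard library alone (this is a stand-in for step 1's Flask app and step 2's `wget --mirror`, which crawls links rather than fetching a known list; the page content and output directory are illustrative):

```python
# Sketch of the freeze workflow: serve a tiny "dynamic" app on localhost,
# crawl it, and save each response as a static file. All names illustrative.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

PAGES = {"/": b"<h1>Home</h1>", "/about": b"<p>About us</p>"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        if body is None:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Step 1: run the site on localhost (port 0 picks a free port).
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Step 2: fetch each page and save it as a static file.
out = Path("site")
for path in PAGES:
    html = urllib.request.urlopen(f"http://127.0.0.1:{port}{path}").read()
    dest = out / ("index.html" if path == "/" else path.lstrip("/") + "/index.html")
    dest.parent.mkdir(parents=True, exist_ok=True)
    dest.write_bytes(html)

server.shutdown()
# Step 3 would be committing ./site/ and publishing it to GitHub Pages.
```

The `path + "/index.html"` layout means the static site keeps the same clean URLs the dynamic one had.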
The workflow is here. It's genius: https://github.com/pubstandards/pubstandards-london/blob/899...