Yes, but is that (obvious) point addressed in the paper? On a first skim, it just says that a "sufficiently souped-up laptop" could, in principle, compute the future of the universe (i.e. Laplace's demon), but I haven't seen anything about the follow-on question of time scales.
You can have the data safely on-prem, connected to computers that are connected to the internet, or safely in the cloud, connected to computers that are connected to the internet. The threats are not that different.
I searched for "Sparkasse" on the German Apple App Store, the good old staid conservative public savings bank. The top result was an ad for Crypto.com: Buy BTC, ETC, ...
Yes, I know, I was there again very recently and it's still mostly as nice as I remembered: a lovely city all around, with fascinating buildings and infrastructure :) It's quite different from other metropolitan cities, and I didn't mean it as a jab or anything, I was just trying to "paint a picture", no offense meant :)
In fairness here, when it comes to large distributed networks, this type of scaling is generally unacceptable.
But yes, I agree it's really sloppy for them to say exponential. I'd actually call it linear, since what matters (mostly) is how many connections each node has to maintain, not the total number of connections in the system.

Nonetheless, imagine if email worked by making a connection to every computer in the world to check whether it had mail for you. It would obviously not work.
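To put rough numbers on that (purely illustrative, assuming a full mesh where every node connects to every other node): each node holds n - 1 connections, which grows linearly, while the network as a whole holds n(n - 1)/2, which grows quadratically. A quick sketch:

```python
# Back-of-the-envelope: in a fully connected mesh of n nodes,
# each node maintains n - 1 connections (linear per node),
# while the whole network has n * (n - 1) / 2 connections (quadratic total).
# The node counts below are made up, just to show the growth rates.

for n in (10, 1_000, 1_000_000):
    per_node = n - 1
    total = n * (n - 1) // 2
    print(f"{n:>9} nodes: {per_node:>9} per node, {total:>17} total")
```

Either way it's polynomial, not exponential, which is the sloppiness being pointed out above.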