My point is that the instruction set of these large CPUs (not AVRs, etc.) is itself just an abstraction over an inscrutable micromachine. That abstraction lets you develop a complex program without knowing the details.
Unix was specifically designed for small, resource-starved machines and did not have the powerful abstractions of mainframe OSes like Multics or OS/360. It's OK, but as modern CPUs and IO systems have grown and embraced the mainframe paradigms that had been omitted from the minicomputers (e.g. memory management, channel controllers, DMA, networking, etc.), Unix and Linux have bolted on support that doesn't always fit their own fundamental assumptions.
That's fine, it's how evolution works, but "cloud computing" is a different paradigm, and 99.99999% of developers should not have to be thinking at a Unix level any more than they think of the micromachine (itself a program running on a lower-level instruction set) that is interpreting the compiler's output at runtime.
As I said in the other comment, maybe people thought that "cloud computing" was a different paradigm back in the '70s, but it turns out that no, it's all the same distributed stuff.
If you have two processes on the same machine, locking a shared data structure takes microseconds. You can easily update hundreds of shared maps and still provide great performance to the user.
If you have datacenters in NY and Frankfurt, the ping is 90 ms, and the fundamental speed-of-light limit says it will never be below about 40 ms (the roughly 6,200 km between them, doubled for a round trip, works out to about 41 ms at c).
So "lock a shared data structure" is completely out of the question, you need a different consistency model, remote-aware algorithms, local caches, and so on.
There are people who are continuously trying to replace Unix with completely new paradigms, like Unison [0], but it is not really catching on, and I don't think it ever will. Physics is tough.