Suppose at some point in your universe there is an event whose consequences you don't like. You'd like to roll it back.
Due to the speed limit, said rollback affects only a limited region of the Universe. The "people" inside this region won't feel anything unusual during the operation, because their memory will be rolled back, too. You also need reversible physical laws for this to be feasible :)
It's simply a consequence of our physical laws rather than something that's "enforced". And Relativity tells us that anything travelling faster than that would allow information to travel back in time, screwing up cause and effect; our universe wouldn't exist as it does.
Do you know any simple thought experiment that shows this "ftl means information traveling back in time"?
What I've seen is that ftl travel breaks causality in our MODELS of spacetime.
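For what it's worth, the usual thought experiment reduces to a two-line calculation inside those same models: Lorentz-transform a faster-than-light signal into another inertial frame and the time ordering of emission and reception flips. A minimal sketch (the 2c signal speed and 0.75c observer speed are just illustrative choices of mine):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def boosted_dt(dt, dx, v):
    """Time separation of two events as seen from a frame moving at v
    along the x-axis: dt' = gamma * (dt - v*dx/c^2)."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * (dt - v * dx / C**2)

# Event A: a signal is emitted.  Event B: it is received 1 s later,
# 2 light-seconds away -- i.e. the signal travelled at 2c.
dt = 1.0
dx = 2.0 * C * dt

# An ordinary sub-light observer moving at 0.75c sees B happen BEFORE A:
print(boosted_dt(dt, dx, 0.75 * C))  # about -0.76 s: reception precedes emission
```

The sign flips whenever the signal speed u and the observer speed v satisfy u·v > c², so for any u > c there is a perfectly legitimate sub-light observer for whom the signal arrives before it leaves; and if that observer can send a superluminal reply the same way, the reply can reach the original sender before the original signal was sent.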
The universe doesn't enforce logic in the form of twin paradoxes and all that tomfoolery. Spacetime is a physical object, not a logically consistent, human-centric clone enforcer.
These are fairly good questions, and are the topics of study in a variety of fields touching on fundamental physics (e.g. physical cosmology, foundations of physics, ultra-high energy particle physics, etc.).
> Where does [the speed limit] come from?
Initial conditions. More on that in a moment.
> What enforces it?
We can parameterize c in our fundamental theories (or expansions thereof) and take a rigorous, if purely mathematical, approach to questions like: what if the (arbitrary) value of c were not constant everywhere -- for example, if it were different in the past of every point we can currently observe, or different in one spacelike direction than in another? We can also fix various sets of units and adjust c's arbitrary value up or down everywhere in spacetime. It turns out that astrophysical observables are highly sensitive to the universality of c: it would be virtually impossible for even a very small gradient in c since the very early universe to have escaped our notice, and when we use just about any set of units to describe physics and then adjust the value of c in those units up or down, we likewise get strongly different observables in astronomy and in laboratory physics.
So it's not so much that the limit is "enforced" as that a different value of c, or a non-universal value of c, is strongly constrained even by physics achievable in Victorian-era laboratories or by modern amateur enthusiasts.
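As a toy illustration of that sensitivity (my example, not a claim about which constants "really" vary): the fine-structure constant α = e²/(4πε₀ħc) sets the spacing of atomic spectral lines, so if we hold the other SI constants fixed, even a percent-level change in c shifts every spectral line by a comparable amount, well within reach of nineteenth-century spectroscopy:

```python
import math

E = 1.602176634e-19       # elementary charge, C
HBAR = 1.054571817e-34    # reduced Planck constant, J*s
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m

def alpha(c):
    """Fine-structure constant with e, hbar, eps0 held fixed; scales as 1/c."""
    return E**2 / (4 * math.pi * EPS0 * HBAR * c)

c0 = 299_792_458.0
print(alpha(c0))                       # about 0.00730, i.e. ~1/137
print(alpha(1.01 * c0) / alpha(c0))    # a 1% larger c shifts alpha by ~1%
```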
There are further types of "breaking" of the invariance of c, wherein one can have some fundamental interaction constrained by a constant other than c. Most variable-speed-of-light theories are written directly as (or are clearly equivalent to) bimetric theories of gravitation, in which some microscopic component of the Einstein Field Equation couples to a metric other than the standard one that everything else couples to.
A toy example would be some form of exotic matter moving superluminally in the Schwarzschild black-hole spacetime, such that there is an "inner" horizon that affects this exotic matter, and at high energies an interaction between normal matter and this exotic matter that transfers information from the former to the latter between the two horizons, allowing that information to escape to infinity encoded in the exotic matter. There are other examples from cosmology designed to do away with some aspects of the observed universe that support Cosmic Inflation (e.g. some exotic matter couples to a metric that lets it spread heat evenly across the very early universe faster than heat could propagate if constrained by "c").
Such examples again are highly constrained: in both cases the second metric has to decay away so as to avoid being readily detected by our modern instruments. In the cosmological case, it has to be gone well before primordial nucleosynthesis, or it would leave obvious fingerprints in the cosmic microwave background and in the distribution of galaxies on our sky; the BH toy requires at least a cutoff that depends on the mass of the black hole, and so fares badly when one tries to apply the "toy" to real astrophysical situations involving collapsing stars.
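To make the black-hole toy slightly more concrete (the 2c limiting speed, and the assumption that the second metric's horizon is also Schwarzschild-like with r = 2GM/v², are my own illustrative choices, not part of the argument above):

```python
G = 6.674e-11        # Newton's constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # ordinary limiting speed, m/s
M_SUN = 1.989e30     # solar mass, kg

def horizon_radius(mass, limiting_speed):
    # Schwarzschild-style horizon, r = 2*G*M / v_lim^2, for a metric
    # whose null cones are set by the given limiting speed
    return 2.0 * G * mass / limiting_speed**2

m = 10 * M_SUN
r_outer = horizon_radius(m, C)        # horizon seen by ordinary matter (~30 km)
r_inner = horizon_radius(m, 2.0 * C)  # hypothetical horizon for matter limited by 2c
# The exotic horizon sits 4x deeper, so there is a shell between r_inner
# and r_outer: inside the ordinary horizon but outside the exotic one.
print(r_outer, r_inner)
```

That shell is where the hand-off in the toy would happen: normal matter there can no longer signal out, but anything it passes to the exotic field still can.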
An anthropic argument answer is that the state of the universe around us humans is highly sensitive to conditions in our distant past, and thus our own existence is strong evidence supporting c as a constant everywhere in the past of the stuff that makes us human. Since that includes the views of objects in our sky as we build better and better telescopes, that is further evidence that c is very likely a universal constant. Is that enforcement? That's probably more a metaphysical question than a physical one.
(We can tone down the anthropic argument a bit and ask for evidence for a statement like: if c takes on an experimental value in one point in a spacetime filled with fields like ours, it must take on the same value at every other point in that spacetime too).
Finally, "where does [the constant c and its value] come from": we don't know yet, but there are scientists working in the subdisciplines listed in my first paragraph (and more) who are trying to find out. On the one hand, the parenthetical comment above suggests that we bend our own thinking and just accept that it doesn't "come from" anywhere; it just is. On the other hand, we're culturally pretty biased about the sequencing of cause and effect and about there being a real difference between past and future, so we like to slice spacetimes up into space and time and then think about how each spacelike slice is related to its neighbours, and then to their respective neighbours, and so on. This cultural habit may be fruitful, or it may be a handicap, when it comes to answering questions about c.

However, returning to "initial conditions": our present spacelike slice was determined by its immediate predecessor in the past, and that by its immediate predecessor, and so on. If we keep regressing we might expect to come to "the start of time", and find some mechanism which sets c on that initial spacelike hypersurface.
However, there are lots of ways to avoid having such an initial spacelike hypersurface even in a big bang cosmology! So while "initial conditions" is culturally the most favoured answer, and is well supported by evidence from physical cosmology, that may not be a sufficiently full answer. And that's going to be a topic for scientific research for some years to come...
It comes from the inherent sloppy irrelevance of the notion of speed compared with the actual behavior of reality. If you cared about a different variable instead, it would not be so limited.
Like conservation of energy, this is the modern mythology. Not sure who is insulting whom here; surely our ancestors "knew" all kinds of things now known to be nonsensical. They too believed themselves advanced beyond that point. It's actually rather funny, this stacking of "if we assume"s into facts.