The low-level "what makes it all tick" of calculus is known as real analysis — set theory and the topology of the real line — and is the subject of upper-division college math classes. The stuff about "infinitely small pieces" is a mnemonic meant to help you remember and intuit the tools of calculus.
You know, I never understood this. Limits (how a function behaves as dx approaches zero) are much easier (for me) to understand than infinitesimals (an idea that obviously can't even exist on the real line), and with limits it's much harder to end up in a contradiction.
Yet, the idea of infinitesimals is very popular in applications. For example, in undergrad physics courses, when deriving formulas (essentially, doing math!) they keep talking about a "small area" or a "small volume" (over which a given function is assumed to stay constant).
When you're actually computing things, such as numerically approximating an integral, you do so by breaking it into small pieces and adding them up. Infinitesimals are the limit of that process as the smallness goes to zero. I think the key problem here is that students learn the symbolic rules but never actually do numerical computations of this kind.
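To make this concrete, here's a minimal sketch of that "break into small pieces and add them up" process — a midpoint Riemann sum (the function name and test integral are my own choices, not from the thread):

```python
# Midpoint Riemann sum: approximate the integral of f over [a, b]
# by summing f evaluated at the midpoint of each of n small pieces
# of width dx = (b - a) / n.
def riemann_sum(f, a, b, n):
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

# Integral of x^2 on [0, 1] is exactly 1/3; the approximation
# gets closer as the pieces shrink (n grows).
for n in (10, 100, 1000):
    print(n, riemann_sum(lambda x: x * x, 0.0, 1.0, n))
```

Running this, you can watch the sums converge toward 1/3 as dx shrinks — which is exactly the limit that the integral sign is shorthand for, and what the "small volume" hand-waving in physics derivations is implicitly appealing to.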