Those techniques were precursors of calculus. They're often taught in introductory calculus courses (at the university level). I consider them part of the field of mathematics we now call calculus. Indeed, if you study the history of mathematics you'll find that calculus isn't just something Newton invented out of whole cloth, as he himself was well aware:
"If I have seen further, it is by standing on the shoulders of giants."
You could argue that algebra was a precursor of calculus and that basic arithmetic operations were precursors of algebra. That's a rather reductionist argument. Zero and algebra hadn't even been invented when people found ways to compute square roots, or discovered that you could make interesting ratios from the sides of a triangle (yes, I know about the history of zero, and that is a slight exaggeration, but no symbol for zero existed in Mesopotamia when they were calculating inverse proportions and roots).
My real point is that these things don't require calculus (though calculus does require them). Most people who take trigonometry (or "precalculus," as it's usually called today) couldn't tell you anything about limits, let alone derivatives, integrals, and the fundamental theorem of calculus.
As for Newton inventing calculus: he was a vehement antagonist of Leibniz, claiming that Leibniz had stolen his work on calculus. That's not the attitude of a man who believed calculus was obvious from looking at previous works.
"Most people who take trigonometry (or 'precalculus' as they usually call it today) couldn't tell you anything about limits let alone derivatives, integrals, and the fundamental theorem of calculus."
And to those people, the calculator is a magic “black box” that spits out cosines. My original point stands: if you are using a tool professionally you should understand how it works, at least on a basic level. For a calculator, that means calculus, Taylor series, Newton’s method, etc.
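To make that concrete: here's a toy Python sketch of a Taylor (Maclaurin) series for cosine. This is only meant to show the flavor of the math involved, not how any particular calculator actually computes it (real firmware worries about range reduction, fixed precision, and error bounds):

    import math

    def cos_taylor(x, terms=10):
        # cos(x) = sum over n of (-1)^n * x^(2n) / (2n)!
        # Reduce x into [-pi, pi] first; the series converges
        # fastest near zero.
        x = math.remainder(x, 2 * math.pi)
        term = 1.0   # the n = 0 term
        total = 1.0
        for n in range(1, terms):
            # Build each term from the previous one instead of
            # recomputing powers and factorials from scratch.
            term *= -x * x / ((2 * n - 1) * (2 * n))
            total += term
        return total

    print(cos_taylor(1.0))   # ~0.5403023, matches math.cos(1.0)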
Did you know that lots of calculators (including the famous TI-83) actually use CORDIC, and that the basics of that method predate calculus by a hundred years? Did you fully understand your tool? Did that keep you from using it?
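For anyone who hasn't seen it: CORDIC gets sine and cosine by rotating a vector through a fixed table of angles, using nothing fancier than shifts, adds, and that precomputed table. Here's a floating-point sketch in Python (real hardware uses fixed-point integers; I'm assuming radians and an input in roughly [-pi/2, pi/2]):

    import math

    def cordic_cos_sin(theta, iterations=32):
        # Table of rotation angles: arctan(2^-i). In hardware this
        # table is precomputed and stored, not calculated on the fly.
        angles = [math.atan(2.0 ** -i) for i in range(iterations)]
        # The rotations stretch the vector by a known constant gain;
        # starting at x = 1/gain cancels it out.
        k = 1.0
        for i in range(iterations):
            k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
        x, y, z = k, 0.0, theta
        for i in range(iterations):
            # Rotate toward the target angle by +/- arctan(2^-i);
            # multiplying by 2^-i is just a bit shift in fixed point.
            d = 1.0 if z >= 0 else -1.0
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * angles[i]
        return x, y   # (cos(theta), sin(theta))

    print(cordic_cos_sin(math.pi / 3))   # ~(0.5, 0.8660)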
If you know the purpose of a trig function, it doesn't matter HOW the answer is calculated so much as that you know the answer is accurate. This doesn't require calculus.
I took calculus. As an engineering major, I actually had real-world applications of calculus across my coursework. How many times have I found calculus essential outside of college? Surprisingly few. Meanwhile, I've found a LOT of use for trig or linear algebra. There are things where the underlying theory is very important, but in my experience, this is not one of them.
My original point used calculus as an example. You turned this whole discussion into a referendum on calculus, which I have no interest in continuing. Substitute linear algebra or even the basic theory of electronics and my point, which you agree with, still stands. Furthermore, you studied calculus, so you understand the principles behind the tools you are using, even if you aren’t using those principles directly, and that is valuable. People who don’t understand their tools risk being owned by them.
"If I have seen further, it is by standing on the shoulders of giants."