Everything in life is estimation. A calculator that tells you the perfect answer in some highly unusual situation probably isn't fixing the most common source of error.
E.g. if I measure an angle and I'm not sure whether it's 45 degrees or 46, then an answer like this is pointless:
0.7071067811865476
cos of 46 (if I've converted properly to radians) is
0.6946583704589973
so my error is about 0.01, and those long strings of digits imply a precision I don't have.
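For concreteness, the same arithmetic in a couple of lines of Python, just reproducing the figures above:

```python
import math

# cos(45°) vs cos(46°), converting to radians first.
cos45 = math.cos(math.radians(45))       # 0.7071067811865476
cos46 = math.cos(math.radians(46))       # 0.6946583704589973
print(abs(cos45 - cos46))                # ~0.0124, i.e. roughly 0.01
```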
I think it would be more useful for most people if the calculator told them how much error there is in their results, either by guessing or by letting them assign an estimated error to their inputs.
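A minimal sketch of what that could look like, assuming a made-up helper that evaluates cos over the interval implied by the input uncertainty (the name and interface are purely illustrative, not anything the article proposes):

```python
import math

def cos_deg_with_error(angle_deg, err_deg):
    """Propagate an input uncertainty through cos() by evaluating at both
    ends of the angle interval (cos is decreasing here, so the low end of
    the output comes from the high end of the angle)."""
    lo = math.cos(math.radians(angle_deg + err_deg))
    hi = math.cos(math.radians(angle_deg - err_deg))
    return (lo + hi) / 2, (hi - lo) / 2     # midpoint, half-width

value, err = cos_deg_with_error(45.5, 0.5)  # "45 or 46 degrees" -> 45.5° ± 0.5°
print(f"{value:.2f} ± {err:.2f}")           # 0.70 ± 0.01 — digits we can actually trust
```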
Not everything is estimation, though: examples include finance and resource accounting, and mathematical proofs (yes, they sometimes involve numbers), etc.
Even in engineering and carpentry it isn't always true. The design process is an idealization, done without real-world measurements, and it's conceptually useful for exact numbers to sum properly on paper. For example, it's common to divide a length into fractional amounts which are expected to sum back to the whole.
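To make the "fractions summing back to the whole" point concrete, here's a small Python comparison of exact rationals against ordinary floats (the tenths are an arbitrary illustration, not an example from the article):

```python
from fractions import Fraction

# On paper, ten 1/10 shares of a length should sum back to exactly the whole.
print(sum([Fraction(1, 10)] * 10) == 1)   # True  — exact rational arithmetic
print(sum([0.1] * 10) == 1.0)             # False — binary floating point drifts
```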
> tell them how much error there is
But once again, we know how to build calculators that do most calculations with 0 error. So why are we planning for an estimation problem we don’t have?
Read the article. Yes, if you want to output sqrt(2) in decimal form, it will be an approximation. But you can present it as sqrt(2).
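As an illustration of keeping sqrt(2) exact until output (using SymPy here purely as an example; the article's calculator presumably has its own exact representation):

```python
import sympy

x = sympy.sqrt(2)
print(x)              # sqrt(2)   — kept in exact symbolic form
print(x * x)          # 2         — simplifies exactly, no rounding error
print(sympy.N(x, 8))  # 1.4142136 — a decimal approximation only when you ask for one
```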
>> tell them how much error there is
>But once again, we know how to build calculators that do most calculations with 0 error. So why are we planning for an estimation problem we don’t have?
We accepted a lack of perfection from calculators long ago. I cannot think of a use-case among anyone I know that needs this. Perhaps some limited number of people out there really do need a calculator that can do these things, but I suspect that even if they do, there's a good chance they don't know it can handle that sort of issue.
I have more trouble with the positions of the buttons in the UI than with sums that don't work out as expected, and getting the buttons right seems like far less effort to me.
I can think of useful things I'd like when I'm doing real-world work which this feature doesn't address at all, and I wonder why so much emphasis was put on something that isn't really that transformational.
I understand that if you use American units you might be calculating things in fractions of an inch, but since I've never had to use those units, it's never been necessary to do that sort of calculation. I suppose if that helps someone then yay, but I can only sympathise to an extent.
Where I have problems is with things that aren't precise - where the bit of wood that I cut turns out a millimetre too short and ends up being useless.