A look at the accuracy of computers and calculators, and how the two compare.
One of the bugbears of developing calculator emulators is that the floating point arithmetic on a computer is fundamentally different from that in a pocket calculator. This is due to the general purpose nature of computers (and to some extent the vagaries of the IEEE floating point standard), and the very application-specific way that calculator chips are designed: calculators typically work in decimal (binary-coded decimal) rather than binary. So if you rely on the computer's arithmetic functions (which nowadays are almost invariably implemented in hardware) you can get some nasty surprises.
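A concrete illustration of the difference: binary floating point cannot represent most decimal fractions exactly, whereas a decimal representation, like the BCD arithmetic used in calculator chips, can. A minimal sketch using Python's decimal module as a stand-in for calculator-style arithmetic:

```python
from decimal import Decimal

# Binary floating point: neither 0.1 nor 0.2 has an exact binary
# representation, so their sum picks up a visible rounding error.
binary_sum = 0.1 + 0.2
print(binary_sum)         # 0.30000000000000004
print(binary_sum == 0.3)  # False

# A decimal representation holds these values exactly, as a
# BCD-based calculator chip would.
decimal_sum = Decimal("0.1") + Decimal("0.2")
print(decimal_sum)                    # 0.3
print(decimal_sum == Decimal("0.3"))  # True
```

This is exactly the kind of nasty surprise an emulator inherits if it leans on the host computer's native floating point.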
But how accurate are "real" pocket calculators? You can do some interesting tests and get surprisingly different results between calculator models, even for very simple calculations. Try this:
1 / 9 X 9 =
You should of course get the answer 1. But many calculators, particularly the simpler four-function ones, will give an answer like 0.999999... It depends on the number of digits of accuracy of the calculator and how many "guard" digits the calculator maintains, which are extra digits of accuracy that are not displayed. Generally scientific calculators will have no problem with this kind of calculation, but you can use up this reserve of accuracy by repeatedly carrying out operations that each eat into the guard digits.
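Both behaviours can be mimicked with Python's decimal module. The eight-digit display and two guard digits below are illustrative assumptions, not a model of any particular chip:

```python
from decimal import Decimal, getcontext

# An eight-digit four-function calculator with no guard digits:
# 1 / 9 rounds to 0.11111111, and multiplying back by 9 exposes the error.
getcontext().prec = 8
result = Decimal(1) / Decimal(9) * Decimal(9)
print(result)  # 0.99999999

# The same calculation with two guard digits: ten digits internally,
# of which only eight would appear on the display.
getcontext().prec = 10
result = Decimal(1) / Decimal(9) * Decimal(9)
displayed = round(result, 8)  # what an eight-digit display would show
print(displayed)  # 1.00000000 -- the error is absorbed by the guard digits
```

Internally the ten-digit result is still 0.9999999999, but rounding to the eight displayed digits hides the discrepancy, which is why scientific calculators sail through this test.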
A way of finding the number of guard digits is to perform a calculation such as 1 / 9, subtract the displayed digits from the result, and multiply by a power of ten: the hidden digits are shifted up into view on the display. More sophisticated expressions can be used to analyse the accuracy of a calculator, and Mike Sebastian has discussed this very thoroughly elsewhere, to the point where the exact calculator chip used can often be deduced.
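The guard-digit test can be sketched in the same decimal-based model. Again, the ten internal digits and eight-digit display are assumptions for illustration:

```python
from decimal import Decimal, getcontext

getcontext().prec = 10             # assumed internal precision: 10 digits
DISPLAYED = Decimal("0.11111111")  # the 8 digits the display shows for 1 / 9

internal = Decimal(1) / Decimal(9)  # 0.1111111111 held internally

# Subtract the displayed digits, then shift the remainder up by a
# power of ten to bring the hidden guard digits onto the "display".
hidden = ((internal - DISPLAYED) * 10**8).normalize()
print(hidden)  # 0.11 -- the two guard digits revealed
```

On a real calculator you would key in the subtraction and multiplication directly; the same principle applies, and the digits that appear tell you how much hidden precision the machine carries.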