To add a new wrinkle to the discussion, this fact has made me question the process of "rounding".
To "round" a number to a desired place value, look at the digit immediately to the right of that place. If that digit is between 0 and 4 inclusive, truncate everything after the desired place value. If it is between 5 and 9 inclusive, add 1 to the digit in the desired place value, and truncate everything to the right of that digit.
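As a sketch of that digit-based rule, Python's `decimal` module happens to implement the same convention under the name `ROUND_HALF_UP` (the function name `round_half_up` here is just an illustrative wrapper, not anything from the discussion above):

```python
from decimal import Decimal, ROUND_HALF_UP

def round_half_up(value: str, places: int) -> Decimal:
    """Round a decimal string to `places` digits after the point,
    using the look-one-digit-right rule: 0-4 truncates, 5-9 bumps
    the target digit up by one."""
    exponent = Decimal(1).scaleb(-places)  # places=1 -> Decimal('0.1')
    return Decimal(value).quantize(exponent, rounding=ROUND_HALF_UP)

print(round_half_up("0.14", 1))  # 0.1  (digit right of the tenths is 4)
print(round_half_up("0.15", 1))  # 0.2  (digit right of the tenths is 5)
```

Note that this only accepts finite decimal strings, which is exactly where the wrinkle below comes in: there is no way to hand it an infinite expansion like .149999999...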
For most numbers, rounding is easy. But then...
How would one round .149999999... to the nearest tenth? By the algorithm described above, one would necessarily round it to ".1". Yet, we know that this number is equivalent to ".15", which rounds to ".2".
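The equivalence itself can be checked with exact rational arithmetic. A small sketch using Python's `fractions` module, summing the geometric tail of repeating 9s:

```python
from fractions import Fraction

# .149999... = 14/100 plus the geometric series 9/10^3 + 9/10^4 + ...
# The tail sums to (9/1000) / (1 - 1/10) = 1/100, so the total is 15/100.
tail = Fraction(9, 1000) / (1 - Fraction(1, 10))
value = Fraction(14, 100) + tail
print(value)                       # 3/20
print(value == Fraction(15, 100))  # True
```

So the value is exactly 3/20 = .15, which the digit rule sends to .2, while the digit rule applied to any finite prefix .1499...9 of the other spelling gives .1.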
This is more of a qualm with the rounding algorithm than with the 1=.(9) fact. But I *could* see a form of math where this is disallowed. Not that I'd believe it; a host of other problems would arise from allowing it.
____________________________
First Delver! (I was the first non-tester/dev to conquer TCB.)
d/dy