Why don't computers store decimal numbers as a second whole number?

Posted by SomeKittens on Programmers, published 2012-10-02.


Computers have trouble storing fractional numbers whose denominator is not a power of two. This is because the first bit after the binary point is worth 1/2, the second 1/4 (that is, 1/(2^1) and 1/(2^2)), and so on.
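This is easy to see in any language that uses IEEE-754 doubles; a quick Python check (illustrative, not from the original post):

```python
# 0.1 and 0.2 have no finite binary expansion, so the stored doubles
# are only the nearest representable values:
print(f"{0.1:.20f}")        # shows the stored value is not exactly 0.1
print(0.1 + 0.2 == 0.3)     # False: the two rounding errors don't cancel
```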

Why deal with all those rounding errors when the computer could simply store the decimal part of the number as a second whole number (which would be exact)?
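The idea the question describes (keep the fractional part as a whole number) is essentially scaled, or fixed-point, decimal arithmetic, and some languages do ship it as a library type. A minimal sketch in Python, where the scaled-integer variables are hypothetical names for illustration:

```python
from decimal import Decimal

# Hypothetical scaled-integer representation: store 0.1 as the integer 1
# with an implicit scale of 10**-1, so base-10 arithmetic stays exact.
a_tenths = 1                  # represents 0.1
b_tenths = 2                  # represents 0.2
print(a_tenths + b_tenths)    # 3 tenths, i.e. exactly 0.3

# Python's decimal module packages the same idea (an integer coefficient
# plus a base-10 exponent), so the textbook failure case goes away:
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

The trade-off is that this exactness costs speed: hardware does binary floating point natively, while decimal types are implemented in software.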

The only problem I can think of is repeating decimals (in base 10), but there could be a special-case solution for that (as we currently have for infinity).
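The repeating-decimal worry is real: 1/3 has no finite expansion in base 10 either, so a decimal representation must round it somewhere, whereas a rational type that stores numerator and denominator as two whole numbers stays exact. A short Python illustration using the standard `decimal` and `fractions` modules:

```python
from decimal import Decimal
from fractions import Fraction

# Decimal must cut off the repeating expansion 0.333... at its precision,
# so tripling the rounded value does not get back to exactly 1:
print(Decimal(1) / Decimal(3) * 3 == 1)   # False

# A rational type (two whole numbers: numerator and denominator) is exact:
print(Fraction(1, 3) * 3 == 1)            # True
```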

