Yeah. Back then they really didn't understand abstraction, so they thought that if a computer's hardware did arithmetic a particular way, that would be visible to the user (despite mediation by software). And business people were, I guess, thought to be too timid to handle the idea of binary. Or, well, if you use binary floating point to represent an amount of money in dollars, then (because 100 isn't a power of two) one cent wouldn't be represented exactly, and the economy would crash and burn.

However your computer hardware does arithmetic, the right way to handle money is as an (exact) integer number of cents. Then, if you want to display it as dollars, you concatenate floor(cents/100), ".", and (cents mod 100), zero-padding the last part so that a remainder under ten still shows two digits.
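For concreteness, here's a minimal sketch of that formatting in Python (the helper name `format_cents` is mine, not from anything above). The assert at the bottom also demonstrates the inexactness point: one cent as a binary float really isn't exactly 0.01.

```python
def format_cents(cents: int) -> str:
    """Render an exact integer number of cents as a dollar string.
    (format_cents is a hypothetical name, just for illustration.)"""
    sign = "-" if cents < 0 else ""
    dollars, remainder = divmod(abs(cents), 100)  # floor(cents/100), cents mod 100
    # Zero-pad the remainder so 105 cents renders as "1.05", not "1.5".
    return f"{sign}{dollars}.{remainder:02d}"

assert format_cents(105) == "1.05"
assert format_cents(9) == "0.09"
assert format_cents(-250) == "-2.50"

# The binary-floating-point problem the comment describes:
# 1/100 has no finite base-2 representation, so 0.01 is stored inexactly.
assert f"{0.01:.20f}" == "0.01000000000000000021"
```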