13 Jun '20, 2:57 p.m.
As a computer guy, I consider real numbers and integers to be two very different things. The English language shares this distinction, in that real numbers go with "less" and integers go with "fewer." I'll use 0 to mean the integer zero and 0.0 to mean the real zero. I would say that 0^0 = 1, and 0.0^0.0 is undefined. If anyone disagrees, I'd like to know why. I'd also like to know if anyone has an opinion on 0^0.0 or 0.0^0, and whether sticking a minus sign in anywhere would change anything. Thanks.
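To make the distinction I have in mind concrete, here's a small Python sketch (the particular values of x are just illustrative): integer 0^0 is the empty product, so it comes out as 1, while the two-variable limit of x^y at (0, 0) depends on the path you approach along, which is why I'd call 0.0^0.0 undefined.

```python
# Integer exponentiation: repeated multiplication, so 0**0 is the empty product.
print(0 ** 0)  # 1 in Python's integer arithmetic

# Real exponentiation: the limit of x**y at (0.0, 0.0) is path-dependent.
# Along y = x the values head to 1.0; along x = 0.0 they stay at 0.0.
for x in (1e-1, 1e-3, 1e-6):
    print(f"x = {x:g}:  x**x = {x**x:.6f},  0.0**x = {0.0**x:.6f}")
# x**x -> 1.0 as x -> 0+, while 0.0**x stays 0.0, so no single limiting value exists.
```

For what it's worth, I believe IEEE 754 splits the difference in a similar way: pown (integer exponent) gives pown(0, 0) = 1, while powr (defined via e^(y·ln x)) treats powr(0, 0) as invalid, though the general-purpose pow(0, 0) is specified to return 1.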