If I were explaining this to a skeptical student or a curious non-scientist friend, I would say that x^0 represents an expression in which x appears zero times, i.e. not at all, and thus must be the same for all x: it is the empty product, whose value is 1. That would suggest 0.0^0 = 1, with 0^0.0 undefined.

On Sat, Jun 13, 2020 at 2:57 PM Keith F. Lynch <kfl@keithlynch.net> wrote:
> As a computer guy, I consider real numbers and integers to be two very
> different things. The English language shares this distinction, in that
> real numbers go with "less" and integers go with "fewer."
>
> I'll use 0 to mean the integer zero and 0.0 to mean the real zero.
>
> I would say that 0^0 = 1, and 0.0^0.0 is undefined. If anyone disagrees,
> I'd like to know why. I'd also like to know if anyone has an opinion on
> 0^0.0 or 0.0^0. And if sticking a minus sign in anywhere would change
> anything. Thanks.
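A quick sketch of the empty-product view (my illustration, not anything from Keith's post), in Python. With an integer exponent, x^n is a product of n copies of x starting from 1; when n = 0 the loop never runs, so the base is never even consulted, which is why 0^0 and 0.0^0 come out as 1 on this reading. For comparison, the last line shows what Python's own ** operator answers for all four zero-on-zero cases, minus sign included:

```python
def pow_int(x, n):
    """x**n for a non-negative int n, built as a product of n copies of x.

    n = 0 yields the empty product, 1, regardless of x -- even x = 0 or 0.0.
    """
    result = 1
    for _ in range(n):
        result *= x
    return result

print(pow_int(0, 0))    # 1 (empty product; the base is never consulted)
print(pow_int(0.0, 0))  # 1 (same: the loop body never runs)
print(pow_int(2, 10))   # 1024 (sanity check on a nonzero case)

# Python's built-in ** answers 1 in every zero-on-zero case,
# including with a minus sign on the base:
print(0**0, 0.0**0, 0**0.0, 0.0**0.0, (-0.0)**0)  # 1 1.0 1.0 1.0 1.0
```

Note that languages don't settle the question so much as pick a convention: Python and C's pow() return 1 for 0^0, and, if I recall correctly, IEEE 754-2008 makes exactly the distinction drawn in this thread by offering both pown (integer exponent, pown(0, 0) = 1) and powr (defined via exp(y*log(x)), where powr(0, 0) is NaN).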
_______________________________________________
math-fun mailing list
math-fun@mailman.xmission.com
https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun