-----Original Message-----
From: Marc LeBrun [mailto:mlb@fxpt.com]
Sent: Friday, March 07, 2003 12:24 PM
To: math-fun@mailman.xmission.com
Subject: Re: [math-fun] Why is e the "best" base? + A symmetrical way to treat digits 0,1,2 base e
=mcintosh@servidor.unam.mx
Entropy is -p ln(p) summed over alternatives. Each term vanishes for p = 0 and p = 1, with a maximum in between, at p = 1/e.
An interesting thing about this formula is that the location of the maximum is independent of the base used for the logarithm.
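For anyone who wants to check the base-independence, here is a quick derivation (a sketch in LaTeX notation; p is the probability of a single alternative and b > 1 the base of the logarithm):

    \frac{d}{dp}\left[-p \log_b p\right] = -\log_b p - \frac{1}{\ln b} = 0
    \;\Longrightarrow\; \ln p = -1
    \;\Longrightarrow\; p = \frac{1}{e}.

The base only rescales the curve by the constant factor 1/ln b, which changes the height of the maximum (it is 1/(e ln b)) but not its location.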
However, are you sure about that p factor? I thought the information content of a message was, roughly, how "surprising" it was, hence simply -ln(p) (which is why information gets called "negative entropy"). Getting an impossible message would be a miraculous epiphany, containing infinite information (alas of the form "everything you know is wrong!") <;-)
You get the quantity -ln(p) of information from a message that occurs with probability p, but you only get that information with probability p, so the expected amount of information received is the sum of -p ln(p) over all possible messages you might receive.

Andy Latto
andy.latto@pobox.com
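To make the expected-surprisal reading concrete, here is a small self-contained Python sketch (the distribution and function names are my own, for illustration): it computes the surprisal -ln(p) of each message, weights each by its probability, and also confirms numerically that a single term -p ln(p) peaks near p = 1/e.

    import math

    def surprisal(p):
        """Information, in nats, gained from a message of probability p."""
        return -math.log(p)

    def entropy(probs):
        """Expected surprisal: the sum of p * -ln(p) over all messages."""
        return sum(p * surprisal(p) for p in probs if p > 0)

    # A made-up three-message source.
    print(entropy([0.5, 0.25, 0.25]))               # ~1.0397 nats

    # The single-term contribution -p*ln(p) is largest near p = 1/e.
    grid = [i / 10000 for i in range(1, 10000)]
    best = max(grid, key=lambda p: p * surprisal(p))
    print(best, 1 / math.e)                         # ~0.3679 vs 0.36787...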