In Volume 2 of Knuth's Art of Computer Programming, page 193, he mentions some research concerning questions about irrational radices and gives a reference: W. Parry, On the beta-expansions of real numbers, Acta Math. Acad. Sci. Hungar. 11 (1960), 401-416.

I recently began wondering whether anything special happens when one uses the usual scheme of decimal, binary, etc. representation -- a sequence of nonnegative integer digits less than the base -- but with an irrational base like e. Of course, the representation has much more leeway for nonuniqueness than with an integer base, unless an algorithm is specified that makes it unique. So we use the obvious "greedy" algorithm to get a well-defined representation.

With base e the possible digits are 0, 1, and 2, and heuristically a generic number's representation would in the long run have a fraction 1/e of its digits = 0, 1/e of its digits = 1, and (e-2)/e of them = 2.

I've taken a look at a few numbers base e but haven't found any statistical or other abnormalities. For example, I don't know what distinguishes an integer >= 2 from a noninteger, a rational from an irrational, a 1/prime from a 1/nonprime, etc.

Even without an evident pattern, it's still amusing to see pi expressed "base e". It starts:

pi = 10.101002020002111120020101120001010200010012101202012100211200...

Does anyone know of research on representing numbers via real irrational bases?

--Dan
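
P.S. In case anyone wants to experiment, here is a minimal Python sketch of the greedy expansion described above. The mpmath precision, the 60-digit cutoff, and the name base_e_digits are arbitrary choices of mine, not part of any standard routine.

# Greedy base-e expansion: at each position take the largest digit d
# with d * e^i <= remainder.  The digits are necessarily 0, 1, or 2.
# Sketch only -- mp.dps and n_frac are arbitrary choices.
from mpmath import mp, floor

mp.dps = 120   # decimal digits of working precision

def base_e_digits(x, n_frac=60):
    """Greedy expansion of x > 0 in base e, returned as a digit string."""
    e = mp.e
    k = 0
    while e ** (k + 1) <= x:   # find the highest power of e not exceeding x
        k += 1
    int_part, frac_part = [], []
    for i in range(k, -1, -1):
        d = int(floor(x / e ** i))
        int_part.append(d)
        x -= d * e ** i
    for _ in range(n_frac):
        x *= e
        d = int(floor(x))
        frac_part.append(d)
        x -= d
    return "".join(map(str, int_part)) + "." + "".join(map(str, frac_part))

# Should reproduce the 10.101002020002... expansion of pi quoted above.
print("pi  =", base_e_digits(mp.pi))
print("1/7 =", base_e_digits(mp.mpf(1) / 7))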