The distribution of the continued fraction partial quotients of a random number, say 2^(1/3), was worked out already by Gauss:

    p(a) = log_2(1 + 1/a) - log_2(1 + 1/(a+1))

If you compute the entropy of this distribution (in nats) you get, numerically,

    H(A) = 2.3792…

(A is my symbol for the random variable.) On the other hand, the Lyapunov exponent of the continued fraction map x -> 1/x - floor(1/x) can be worked out analytically:

    L = zeta(2)/log(2) = 2.3731…

(log is the natural log; all the entropies here are in nats). Why are these numbers so close?

Consider the joint probability of consecutive partial quotients (a, b):

    p(a, b) = log_2(1 + 1/(a + 1/(b+1))) - log_2(1 + 1/(a + 1/b))

I don’t know if Gauss worked this out too, but I’m sure someone (Knuth?) has. It should be obvious from the formula how it generalizes to triples, etc. In any case, from this you can compute, numerically, the mutual information between consecutive partial quotients:

    I(A,B) = 0.0056…

That accounts for most of the small difference H(A) - L = 0.0061. Why? Consider the conditional entropy

    H(A|B) = H(A) - I(A,B) = 2.3736…

This is closer to what we mean by the “entropy of partial quotients”, because a partial quotient A (of a random number) always has a predecessor B, and the two are not independent. Surely someone (references welcome!) has proved

    L = H(A|BCD…)

My guess is that this is just Rokhlin’s entropy formula for the Gauss map: the partial quotients generate, so H(A|BCD…) is the Kolmogorov-Sinai entropy, which equals the integral of log|T'|, i.e. the Lyapunov exponent.

Trivia question: What real number, distinguished by its partial quotients, is also (like L) the ratio of two “special” functions at argument 2?

-Veit
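P.S. In case anyone wants to check the numbers, here are some quick Python sketches. The cutoffs and variable names are my own arbitrary choices, and entropies are in nats throughout. First, H(A) from Gauss's distribution, next to the closed form for L:

import math

# Gauss's distribution of a single partial quotient:
# p(a) = log_2(1 + 1/a) - log_2(1 + 1/(a+1))
def p(a):
    return math.log2(1 + 1/a) - math.log2(1 + 1/(a + 1))

# Entropy H(A) in nats; p(a) ~ 1/(a^2 log 2), so truncating the sum
# at 10^6 is (just) enough for the four decimals shown.
H_A = -sum(p(a) * math.log(p(a)) for a in range(1, 10**6))

# Lyapunov exponent of the Gauss map, in closed form: zeta(2)/log(2).
L = (math.pi**2 / 6) / math.log(2)

print(f"H(A) = {H_A:.4f}")   # 2.3792
print(f"L    = {L:.4f}")     # 2.3731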
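Second, L measured directly on the orbit of 2^(1/3) under the Gauss map. Floating point loses the true orbit after a few dozen iterations, so this sketch runs exact rational arithmetic on a high-precision rational stand-in for 2^(1/3); 3000 digits is my (generous) safety margin for 2000 steps. The orbit average converges like 1/sqrt(n), so expect only two or three digits:

from decimal import Decimal, getcontext
from fractions import Fraction
import math

# Rational approximation to 2^(1/3), good to ~3000 digits.
getcontext().prec = 3000
x = Fraction(Decimal(2) ** (Decimal(1) / 3))
x -= int(x)              # start from the fractional part

# Average log|T'(x)| = -2 log x along the Gauss-map orbit.
n, total = 2000, 0.0
for _ in range(n):
    total += -2 * math.log(x)
    x = 1 / x
    x -= int(x)          # Gauss map: x -> 1/x - floor(1/x)

print(f"L estimate = {total / n:.3f}")   # ~2.37, vs 2.3731 exactly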
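Finally, the pairwise quantities. The dependence between neighboring quotients is concentrated at small values, so the double sum converges quickly and a modest cutoff suffices:

import math

def p1(a):
    return math.log2(1 + 1/a) - math.log2(1 + 1/(a + 1))

# Joint probability of consecutive quotients (a, b): the Gauss
# measure of the set of x whose expansion starts with a, b.
def p2(a, b):
    return (math.log2(1 + 1/(a + 1/(b + 1)))
            - math.log2(1 + 1/(a + 1/b)))

N = 1000
P1 = [0.0] + [p1(a) for a in range(1, N)]   # cache the marginals
I_AB = sum(p2(a, b) * math.log(p2(a, b) / (P1[a] * P1[b]))
           for a in range(1, N) for b in range(1, N))

H_A = -sum(p1(a) * math.log(p1(a)) for a in range(1, 10**6))
print(f"I(A,B) = {I_AB:.4f}")        # 0.0056
print(f"H(A|B) = {H_A - I_AB:.4f}")  # 2.3736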