Here is a better assessment of the "information loss" when a random function F is iterated on some universe of large cardinality U.

If Y = F(X), then the number k of different pre-images X of Y should essentially follow a Poisson distribution with expectation 1,

   Prob(k) = exp(-1) / k!,   k = 0, 1, 2, ...

whose entropy is

   -Sum[ exp(-1)/k! * log(exp(-1)/k!), k=0..infinity ] / log(2) = 1.882489 bits,

except that if we restrict to k>0 only (conditioning on Y actually having a pre-image), the distribution becomes

   Prob(k) = 1 / ((e-1) * k!),   k = 1, 2, 3, ...

(note the normalization 1/(e-1), which makes the probabilities sum to 1), whose entropy is

   -Sum[ 1/((e-1)k!) * log(1/((e-1)k!)), k=1..infinity ] / log(2) = 1.476711 bits.

So I presume that means that with each iteration we lose about 1.4767 bits of information. In contrast, a perfect coin flip has Prob = 1/2 and entropy = 1 bit.

Re my conjecture about the hardness of cycle-membership (and cycle cardinality presumably also is hard), here is a specific case. Let x be a 64-bit unsigned integer with arithmetic mod 2^64. What is the cycle length of

   x := Rotate(3141592653589793239*x mod 2^64, 32)

starting from x=1? It might be beyond the ability of humankind to answer this. And that example was not even information-losing: the multiplier is odd, so multiplication by it mod 2^64 is a bijection, and rotation is too, hence the whole map is a permutation. The answer is somewhere in [2^45, 2^64], probably toward the high end.

A 128-bit info-losing example would be

   x := Rotate(31415926535897932384626433832795028841*x mod 2^128, 64) - x mod 2^128.

What is the period of that starting from x=12345678987654321? Is x=12345678987654321 a cycle member? Can anybody see how to answer such questions any faster than with about 2^64 operations? They seem harder than hell, and dwarf the hardness of factoring a 128-bit or even a 256-bit number.

-- 
Warren D. Smith
http://RangeVoting.org  <-- add your endorsement (by clicking "endorse" as 1st step)
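P.S. The entropy figures above are easy to check numerically. A short Python sketch (my code, not part of the original claim); note that the k>0 case must be normalized by 1/(e-1) so the conditional probabilities sum to 1:

```python
import math

def poisson_entropy_bits(nmax=60):
    # Entropy (in bits) of Poisson(1): P(k) = e^-1 / k!, k = 0, 1, 2, ...
    h = 0.0
    for k in range(nmax):
        p = math.exp(-1) / math.factorial(k)
        if p > 0.0:
            h -= p * math.log2(p)
    return h

def conditioned_entropy_bits(nmax=60):
    # Entropy (in bits) of Poisson(1) conditioned on k >= 1:
    # P(k) = 1/((e-1) * k!), which sums to 1 over k = 1, 2, 3, ...
    h = 0.0
    for k in range(1, nmax):
        p = 1.0 / ((math.e - 1.0) * math.factorial(k))
        h -= p * math.log2(p)
    return h

print(poisson_entropy_bits())      # ~ 1.8825 bits
print(conditioned_entropy_bits())  # ~ 1.4767 bits
```

Truncating the sums at k = 60 is harmless since the tail terms underflow far below any displayed precision.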
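P.P.S. The 64-bit iteration can be stated precisely in code, and the obvious attack is constant-memory cycle finding. A Python sketch: the step function uses the post's multiplier and rotation; running Brent's algorithm on the full 64-bit map directly would take up to ~2^64 steps, which is exactly the point, so the demo below runs it on a scaled-down 16-bit analog (multiplier 31415 and rotation 8 are my own choices, not from the post):

```python
MASK64 = (1 << 64) - 1

def rot64(v, r):
    # rotate the 64-bit value v left by r bits
    return ((v << r) | (v >> (64 - r))) & MASK64

def step64(x):
    # one iteration of x := Rotate(3141592653589793239*x mod 2^64, 32)
    # (the multiplier is odd, hence invertible mod 2^64, so this is a permutation)
    return rot64((3141592653589793239 * x) & MASK64, 32)

def brent_cycle_length(f, x0):
    # Brent's algorithm: length of the cycle eventually entered by the
    # sequence x0, f(x0), f(f(x0)), ... using O(1) memory.
    power = lam = 1
    tortoise, hare = x0, f(x0)
    while tortoise != hare:
        if power == lam:        # start a new power-of-two search window
            tortoise = hare
            power *= 2
            lam = 0
        hare = f(hare)
        lam += 1
    return lam

# Scaled-down 16-bit analog: solvable instantly, since its cycle
# length is at most 2^16.
MASK16 = (1 << 16) - 1
def step16(x):
    y = (31415 * x) & MASK16
    return ((y << 8) | (y >> 8)) & MASK16   # rotate by 8 of 16 bits

print(brent_cycle_length(step16, 1))
```

Since step16 is a permutation, the starting point 1 lies on its cycle, so the value returned really is the period of the orbit of 1. For the genuine 64-bit map one would write the same loop in C with uint64_t; the obstacle is purely the number of steps, not the per-step cost.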
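The 128-bit info-losing map is equally short to state; Python's arbitrary-precision integers make the mod-2^128 arithmetic trivial. A sketch (my code; the constant and starting point are from the post). Because the map need not be injective, the orbit of a starting point is rho-shaped in general: a tail leading into a cycle, with no guarantee the starting point itself is a cycle member:

```python
MASK128 = (1 << 128) - 1
C128 = 31415926535897932384626433832795028841  # odd constant from the post

def step128(x):
    # one iteration of
    #   x := Rotate(C128*x mod 2^128, 64) - x  (mod 2^128)
    y = (C128 * x) & MASK128
    y = ((y << 64) | (y >> 64)) & MASK128   # rotate left by 64 of 128 bits
    return (y - x) & MASK128

# Iterate a few steps from the post's starting point, just to show the
# map in action; finding the period or deciding cycle membership is the
# hard part.
x = 12345678987654321
for _ in range(5):
    x = step128(x)
print(hex(x))
```

For an information-losing map, Brent's or Floyd's algorithm still finds the eventual cycle length, but it reports the period of the cycle the orbit falls into, which answers "is the start a cycle member?" only after comparing tail length to zero, and for 128 bits both are presumably far out of reach anyway.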