Consider probability densities P(x) on the positive real half-line with a given mean. The one maximizing the entropy-like functional -integral P(x) * log(P(x) * x^k) dx is a general gamma distribution. I don't think I like this as much as the other maxent characterization in Wikipedia, though: the gamma distribution is maxent on the half-line given that E(X) and E(lnX) are specified.

===========

An idea attributed to Chris Wallace for generating random normal or random exponential deviates is as follows.

RANDOM NORMAL:

1. Maintain a pool of N standard normal (mean=0, variance=1) deviates for some moderately large N, such as N=256.

2. To update the pool:
   A. Choose a small subset among those N randomly. Most simply, choose 2 of them.
   B. Randomly rotate this 2-vector using a 2x2 rotation with a random uniform angle in [0, 2*pi), and/or a random reflection in a line through the origin of the XY plane.

Other updates are also possible. For example, you might only use the 8 rotations and reflections of the dihedral group D4. Or you might choose 4 deviates, not 2, and perform a rotation/reflection in 4 dimensions, which has the advantage that you can do it with quaternions and only rational operations. Indeed, you can make other normals in the pool select the quaternions defining the 4D rotation for you, so you do not have to generate it yourself.

RANDOM EXPONENTIAL:

1. Maintain a pool of N standard exponential (mean=1) deviates for some moderately large N, such as N=256.

2. To update the pool:
   A. Choose a small subset among those N randomly. Most simply, choose 2 of them.
   B. Replace the two chosen deviates (X,Y) with (R*S, (1-R)*S), where S=X+Y and R is uniform random on (0,1).

The point of these update operations is that they preserve the sum of squares (for normal) or the plain sum (for exponential) of the pool, while increasing entropy. As a result, after a lot of updates we approach the maxent distribution with given mean (i.e. exponential) or given variance (i.e. normal).
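The two pairwise updates above can be sketched in Python. The function names and the use of `random.Random` are my illustrative choices, not part of the original description:

```python
import math
import random

def update_normal(pool, rng):
    # Rotate a randomly chosen pair through a uniform angle in [0, 2*pi).
    # This preserves the pool's sum of squares while increasing entropy.
    i, j = rng.sample(range(len(pool)), 2)
    t = 2.0 * math.pi * rng.random()
    c, s = math.cos(t), math.sin(t)
    x, y = pool[i], pool[j]
    pool[i], pool[j] = c * x - s * y, s * x + c * y

def update_exponential(pool, rng):
    # Replace a randomly chosen pair (X, Y) by (R*S, (1-R)*S), where
    # S = X + Y and R is uniform on (0,1).  This preserves the pool's
    # plain sum while increasing entropy.
    i, j = rng.sample(range(len(pool)), 2)
    s = pool[i] + pool[j]
    r = rng.random()
    pool[i], pool[j] = r * s, (1.0 - r) * s
```

Seed each pool once with a conventional generator (e.g. `rng.gauss(0.0, 1.0)` or `rng.expovariate(1.0)`); after that, draws are simply pool entries read out after an update. Note the sin/cos here is exactly the transcendental work the dihedral-group and quaternion variants are designed to avoid.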
This kind of generator is extremely fast and simple. No transcendental functions are needed. Unfortunately, a slight defect is that the sum of squares (or plain sum) of the entire pool remains constant. This is correctable because really this sum should be a gamma-distributed random variable with known parameters, so every N updates you can alter the entire pool sum (or sum of squares) by calling a gamma generator just once. If N is large this does not slow things down much. However, this leads us to ask: "how do we generate random samples from gamma distributions?"

=============================

Here is a gamma generator based on the same Wallace-like maxent+pool approach, using Wikipedia's maxent characterization. To generate gammas from some fixed gamma distribution (fix its parameters once and for all at the start):

1. Maintain a pool of N gamma deviates for some moderately large N, such as N=256.

2. To update the pool:
   A. Choose three (X,Y,Z) among those N randomly.
   B. Replace (X,Y,Z) with a "random" (x,y,z) chosen to have the same sum S=X+Y+Z and the same product P=X*Y*Z.

Once x is chosen, y and z are the two solutions of the quadratic t^2 - (S-x)*t + P/x = 0 (since y+z = S-x and y*z = P/x); x must lie in the interval where this quadratic has two positive real roots. The set of these (x,y,z) forms a closed, approximately-circular-shaped curve X*Y*Z=P drawn on the subset X>0, Y>0, Z>0 of the 2-dimensional plane X+Y+Z=S and centered at X=Y=Z=S/3. One could pretend the distribution along this curve was uniform in angle and thus generate a random direction within this plane, then find the point in that direction on the curve (by solving a cubic). Anyhow, it does not terribly matter how we generate a "random" point (x,y,z) on this curve -- virtually any reasonable method and any reasonable definition of random will do, because all we need is that this update be entropy-increasing.

Incidentally, gammas can be used to generate betas...
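Step B above can be sketched as follows. Sampling x by rejection over (0, S) is just one "reasonable method" in the sense described; the function name and the rejection strategy are my own choices:

```python
import math
import random

def update_gamma_triple(pool, rng):
    # Pick three pool entries (X, Y, Z) and replace them with a new
    # point (x, y, z) on the curve x+y+z = S, x*y*z = P.
    i, j, k = rng.sample(range(len(pool)), 3)
    X, Y, Z = pool[i], pool[j], pool[k]
    S, P = X + Y + Z, X * Y * Z
    while True:
        x = rng.random() * S          # candidate first coordinate
        if x <= 0.0:
            continue
        b, c = S - x, P / x           # y+z and y*z
        disc = b * b - 4.0 * c
        if disc >= 0.0:               # t^2 - b*t + c has two real roots,
            break                     # both positive since b > 0, c > 0
    r = math.sqrt(disc)
    y, z = (b + r) / 2.0, (b - r) / 2.0
    if rng.random() < 0.5:            # random order, for symmetry
        y, z = z, y
    pool[i], pool[j], pool[k] = x, y, z
```

The rejection loop always terminates, since the feasible x-interval is nonempty (x = X itself is feasible). Both the sum and the product of the pool are invariants of this update, up to rounding.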
================

In these maxent/pool methods, the parameters of the distribution are chosen by making the initial pool have the desired variance (for normal), mean (for exponential), or mean and mean-logarithm (for gamma).
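The pool-sum correction mentioned earlier -- calling a gamma generator once every N updates to reset the otherwise-constant pool sum -- might look like this for the exponential pool. Python's `gammavariate` stands in for whatever gamma generator is available, and rescaling the whole pool by one ratio is my illustrative way of imposing the new sum:

```python
import random

def refresh_pool_sum(pool, rng):
    # The sum of n standard exponential deviates should itself be
    # Gamma(n, 1) distributed, not constant.  Draw one fresh gamma
    # deviate and rescale the pool so its sum equals that draw.
    n = len(pool)
    target = rng.gammavariate(n, 1.0)
    scale = target / sum(pool)
    for i in range(n):
        pool[i] *= scale
```

Since this costs one gamma call and one pass over the pool per N pairwise updates, the amortized overhead is small for large N, as noted above.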