="J.P. Grossman" <jpg@alum.mit.edu> There's a good paper on parallel random number generation here: http://www.thesalmons.org/john/random123/papers/random123sc11.pdf
Very interesting, thanks for the link! I plan to read it more carefully, but already I'm intrigued that one of the authors is *that* David Elliot Shaw: http://en.wikipedia.org/wiki/David_E._Shaw Small valley!
It won the best paper award at SC11. The basic idea is to construct a function f such that the nth random number is f(n), instead of generating the numbers sequentially; this makes the "parallel" part trivial. The paper describes how f can be chosen so that generation is fast on modern processors and the output passes the BigCrush battery of statistical tests.
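To make the idea concrete, here is a minimal counter-based sketch in Python. The paper's actual generators (Threefry and Philox) are keyed block ciphers; the mixer below is a splitmix64-style bit finalizer standing in for them, so treat the constants and the key injection as illustrative assumptions rather than the paper's construction:

    MASK64 = (1 << 64) - 1

    def counter_rng(key: int, n: int) -> int:
        """Return the n-th 64-bit pseudorandom value for the given key.

        f(key, n) is a bijection of the counter for a fixed key, so any
        worker can compute element n without generating elements 0..n-1.
        """
        # Inject key and counter (this is exactly the splitmix64 state
        # at step n, with the key as the seed).
        z = (key + (n + 1) * 0x9E3779B97F4A7C15) & MASK64
        # splitmix64 finalizer: two xor-shift-multiply rounds.
        z = ((z ^ (z >> 30)) * 0xBF58476D1CE4E5B9) & MASK64
        z = ((z ^ (z >> 27)) * 0x94D049BB133111EB) & MASK64
        return z ^ (z >> 31)

    # "Parallelism" is trivial: each worker just evaluates its own
    # disjoint range of counters, in any order.
    stream_a = [counter_rng(key=42, n=i) for i in range(4)]
    stream_b = [counter_rng(key=42, n=i) for i in range(1_000_000, 1_000_004)]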
I don't know yet how this compares to what's in the cited paper, but another thing Bill Gosper noted about the word-vector shift-register upstream (Algorithm A) part of the generator I described is that each step is effectively a multiplication of the state vector by a sparse 0-1 matrix whose non-zero entries are simply the coefficients of the primitive polynomial (or the feedback taps in the delay line, to use signal-processing lingo). Thus one can jump n steps simply by multiplying by the n-th power of that matrix instead. Of course the downstream (Algorithm M) part is specifically intended to mess up even this vestigial "autocorrelation".
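A small sketch of that jump-ahead trick, using a plain 16-bit LFSR (taps for the primitive polynomial x^16 + x^14 + x^13 + x^11 + 1, a standard example) rather than the word-vector generator from the thread. Matrices over GF(2) are stored as lists of row bitmasks; the n-th power comes from repeated squaring:

    WIDTH = 16
    TAPS = [16, 14, 13, 11]  # exponents of the primitive polynomial

    def step(state: int) -> int:
        """One LFSR step: shift right, feed back the parity of the taps."""
        fb = 0
        for t in TAPS:
            fb ^= (state >> (WIDTH - t)) & 1
        return (state >> 1) | (fb << (WIDTH - 1))

    def mat_vec(m, v):
        """Matrix times bit-vector over GF(2): out bit i = parity(row_i & v)."""
        out = 0
        for i, row in enumerate(m):
            out |= (bin(row & v).count("1") & 1) << i
        return out

    def mat_mul(a, b):
        """Matrix product over GF(2): row i of C = XOR of b's rows
        selected by the set bits of a's row i."""
        c = []
        for row in a:
            acc, j = 0, 0
            while row:
                if row & 1:
                    acc ^= b[j]
                row >>= 1
                j += 1
            c.append(acc)
        return c

    def mat_pow(m, n):
        """n-th power of m over GF(2) by repeated squaring."""
        result = [1 << i for i in range(WIDTH)]  # identity matrix
        while n:
            if n & 1:
                result = mat_mul(result, m)
            m = mat_mul(m, m)
            n >>= 1
        return result

    # The one-step matrix: row i of the new state reads bit i+1 of the
    # old state (the shift), and the top row reads the feedback taps --
    # the sparse 0-1 matrix whose entries are the polynomial coefficients.
    M = [1 << (i + 1) for i in range(WIDTH - 1)]
    M.append(sum(1 << (WIDTH - t) for t in TAPS))

    state = 0xACE1
    jumped = mat_vec(mat_pow(M, 1000), state)  # jump 1000 steps at once
    stepped = state
    for _ in range(1000):                      # same thing, one step at a time
        stepped = step(stepped)
    assert jumped == stepped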