Your algorithm is about as serial as one could possibly get: processing photons one at a time and checking the CDF. This procedure also has an ever-so-slight drift error depending upon the order in which you process the CDF; the higher-order statistics will be slightly skewed.

The procedures in this thought experiment are quite similar to those for computing more-or-less "optimum" halftone images in computer graphics (see the sketch below). The traditional double-slit physics experiment produces a halftone image of the underlying probability wave.

The reason for the question about parallelism is fairly obvious. We assume that the universe operates in parallel; indeed, many discussions of special relativity appeal to the inherent parallelism of the universe, which is elegant precisely because it doesn't require any synchronization -- it is "embarrassingly parallel". But when you throw QM into the mix, you suddenly seem to lose all or most of that inherent parallelism. As I said before, perhaps this bug could be turned into a feature, by harnessing QM to handle whatever synchronization is required in a large-scale parallel computer. Since QM seems to handle this synchronization without much effort or fuss, perhaps synchronization is the most useful feature of a quantum mechanical computer?
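For concreteness, here is a minimal sketch of Floyd-Steinberg error diffusion, the classic serial halftoning algorithm -- assumed here as a stand-in for the "optimum" halftoning procedures mentioned above. It visits pixels strictly one at a time, and its output depends on the traversal order: the same kind of order-dependent drift as sequential CDF sampling.

    import numpy as np

    def floyd_steinberg(intensity):
        # Serial error-diffusion halftoning of a 2-D image with values
        # in [0, 1].  Pixels are visited one at a time, left to right,
        # top to bottom; each quantization error is pushed onto
        # not-yet-visited neighbors, so the result depends on the
        # traversal order.
        img = intensity.astype(float).copy()
        h, w = img.shape
        out = np.zeros_like(img)
        for y in range(h):
            for x in range(w):
                old = img[y, x]
                new = 1.0 if old >= 0.5 else 0.0   # quantize: dot or no dot
                out[y, x] = new
                err = old - new
                # Total "ink" is (approximately) conserved by diffusing
                # the quantization error onto forward neighbors.
                if x + 1 < w:
                    img[y, x + 1] += err * 7 / 16
                if y + 1 < h:
                    if x > 0:
                        img[y + 1, x - 1] += err * 3 / 16
                    img[y + 1, x] += err * 5 / 16
                    if x + 1 < w:
                        img[y + 1, x + 1] += err * 1 / 16
        return out

At 10:51 AM 7/1/2014, meekerdb wrote: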
On 7/1/2014 2:20 AM, Henry Baker wrote:
Brent> "But why not just decide, probabilistically, for each photon, where it will be absorbed?"
OK, I'm listening. How does this work, exactly?
You calculate the interference pattern and from that the probability at each point. You generate a random number and use the inverse of the cumulative probability with respect to position to assign an absorption point. But you knew that -- so I'm not sure what you're asking?
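A minimal sketch of that procedure -- the cos^2-fringes-under-an-envelope intensity below is an assumed stand-in for whatever interference pattern you calculated; only the resulting density over position matters:

    import numpy as np

    rng = np.random.default_rng(0)

    # Screen positions and an assumed two-slit intensity pattern
    # (cos^2 fringes under a single-slit envelope).
    x = np.linspace(-1.0, 1.0, 2001)
    intensity = np.cos(40 * x) ** 2 * np.sinc(4 * x) ** 2

    # Normalize to a probability distribution over the sample points,
    # then form the cumulative distribution with respect to position.
    p = intensity / intensity.sum()
    cdf = np.cumsum(p)
    cdf /= cdf[-1]

    def absorb_one_photon():
        # One uniform random number, inverted through the CDF, assigns
        # this photon's absorption point.
        u = rng.random()
        return x[np.searchsorted(cdf, u)]

    hits = [absorb_one_photon() for _ in range(10_000)]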
Also, where is the parallelism?
I guess I don't understand the significance of parallelism. To make the calculation efficient? Once you've calculated the probability distribution you could easily parallelize the calculation of absorption points.
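Indeed, given a fixed distribution (reusing x, cdf, and rng from the sketch above), each photon's draw is independent, so generating absorption points is embarrassingly parallel -- one vectorized call, or one chunk per worker:

    # All n draws are independent given the fixed cdf, so a single
    # vectorized call (or one call per worker in a process pool)
    # computes them.
    def absorb_photons(n):
        u = rng.random(n)                    # n independent uniforms
        return x[np.searchsorted(cdf, u)]    # invert the CDF in bulk

    hits = absorb_photons(10_000)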
Brent
At 10:47 PM 6/30/2014, meekerdb wrote:
On 6/30/2014 9:24 PM, Henry Baker wrote:
Cramer's "transactional" interpretation of QM is very computer sciencey, at least to my eyes.
Consider a thought experiment in which you're going to construct a parallel computer simulation of the double-slit experiment, and you compute the probability density of photons at a uniform sampling of points in space.
The problem comes when you have to flip a probability-weighted coin at each point of the photon absorber in order to decide whether a photon is absorbed at that point. You can't decide _locally_ and _independently_ at each absorbing point whether to absorb a photon there, because you have to guarantee the conservation of energy. Energy isn't conserved merely _probabilistically_, but _exactly_, so if you choose badly, you can have more photons being absorbed than were emitted -- a clear violation of the conservation of energy.

But why not just decide, probabilistically, for each photon, where it will be absorbed? That guarantees conservation of photon number.
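A minimal sketch of the contrast, with an arbitrary assumed density p over absorber points: deciding independently at each point conserves photon number only on average, while deciding per photon conserves it exactly.

    import numpy as np

    rng = np.random.default_rng(1)
    n_photons = 1000
    n_points = 5000

    # An arbitrary normalized probability density over absorber points.
    p = rng.random(n_points)
    p /= p.sum()

    # Local scheme: an independent, probability-weighted coin flip at
    # each point, scaled so the *expected* total is n_photons (this
    # assumes n_photons * p <= 1 everywhere).  The actual total
    # fluctuates from run to run -- photon number is conserved only
    # probabilistically.
    absorbed = rng.random(n_points) < n_photons * p
    print(absorbed.sum())        # ~1000, but varies run to run

    # Per-photon scheme: each photon independently picks one absorption
    # point (a single multinomial draw).  Exactly n_photons absorptions
    # every run -- photon number is conserved by construction.
    counts = rng.multinomial(n_photons, p)
    print(counts.sum())          # exactly 1000, always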