Re: [math-fun] Quantum Mechanical Pilot waves: they're baaaack !
Cramer's "transactional" interpretation of QM is very computer sciencey, at least to my eyes. Consider a thought experiment in which you're going to construct a parallel computer simulation of the double slit experiment, and you compute the probability density of photons at a uniform sampling of points in space. The problem comes when you have to flip a probability-weighted coin at each point of the photon absorber in order to decide if a photon will be absorbed at that point. The problem is that you can't decide _locally_ and _independently_ at each absorbing point whether to absorb a photon at that point, because you have to guarantee the conservation of energy. The problem is that energy isn't just conserved _probabilistically_, but _exactly_, so if you choose badly, you can have more photons being absorbed than were emitted -- a clear violation of the conservation of energy. What a computer simulation program would do would be to utilize _transactions_, which would guarantee that energy was exactly conserved on a global basis; such transactions would certainly slow down the parallel simulation program due to contention of the locking mechanisms on the shared resource which keeps the energy in exact balance. For a computer scientist, who expects the universe to be "embarrassingly parallel", this transaction/locking mechanism is extremely inelegant. Yet the possibility of _entanglement_ virtually guarantees that some sort of transaction mechanism will be required to guarantee consistency of the simulation. There are several ways to look at this problem. One is to assume that it is a wart, and attempt to remove it with Bohm-like and pilot wave-like models. The other is to turn this bug into a feature, and attempt to utilize QM itself to do the dirty work in computer simulations by having QM entanglement handle whatever transactional guarantees are required (I'm not sure exactly how this might be done, but there are lots of computer scientists looking into the usefulness of quantum computers). BTW, there are several different flavors of computer transaction implementations, which are also mirrored in discussions about the philosophy of QM. "Conservative" transaction implementations refuse to do any work until a process has _exclusive_ access to a shared resource, while "speculative" transaction implementations are willing to do quite a lot of work, so long as they are also willing to throw away work that proves not to be consistent when the transaction is "closed". QM seems to be quite content to do its thing obliviously until a "measurement" is made, at which point all transactions must be closed, so that the observer sees a consistent picture. At 02:12 PM 6/30/2014, Jeff Caldwell wrote:
I hope to better understand the curious case of the photon.
It has no frame of reference and, were it conscious, would perceive itself to be emitted and absorbed simultaneously, i.e. no time passing between the events.
With zero time between emission and absorption, whimsy allows me to think of emitter and absorber as in some sense touching, albeit one is an ancient star and the other a cone in my living eye.
Zero time means zero distance, in my book, although applying that rule to the no-frame-of-reference photon is probably a category error.
Cramer's transactional interpretation, inspired by Wheeler-Feynman time-symmetric theory, has both forward- and backward-in-time waves between emitter and absorber, agreeing upon the transaction before (as? timey-wimey words ...) it takes place. That leaves precious little room for the free will that electrons have if you or I do, as John Conway and Simon Kochen argue, and no room at all for deciding whether or not to slide a detector into a photon's path after the photon has been emitted.
Emitters, detectors and photons have united, and their agreements will be kept!
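For concreteness, a minimal sketch (in Python) of the lock-based "conservative" bookkeeping described above. The thread pool, the scalar "energy budget", and all the numbers are invented for illustration; a real simulation would of course track amplitudes rather than a single counter.

import random
import threading
from concurrent.futures import ThreadPoolExecutor

N_EMITTED = 1_000                  # photons actually emitted by the source
N_CELLS = 100_000                  # sample points on the absorbing screen
P_HIT = N_EMITTED / N_CELLS        # naive local absorption probability per cell

budget = N_EMITTED                 # shared resource: photons not yet accounted for
budget_lock = threading.Lock()
hits = [0] * N_CELLS

def try_absorb(i):
    """Each cell flips its own coin locally, but must commit through the lock."""
    global budget
    if random.random() < P_HIT:            # local, independent decision ...
        with budget_lock:                  # ... globally serialized commit
            if budget > 0:
                budget -= 1
                hits[i] += 1               # transaction succeeds
            # else: transaction refused -- the energy is already spent

with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(try_absorb, range(N_CELLS)))

assert sum(hits) <= N_EMITTED              # exact, not merely statistical, conservation

The "speculative" variant would skip the lock, let every cell absorb freely, and then roll back the excess absorptions when the books are balanced at "measurement" time.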
On 6/30/2014 9:24 PM, Henry Baker wrote:
Cramer's "transactional" interpretation of QM is very computer sciencey, at least to my eyes.
Consider a thought experiment in which you're going to construct a parallel computer simulation of the double slit experiment, and you compute the probability density of photons at a uniform sampling of points in space.
The problem comes when you have to flip a probability-weighted coin at each point of the photon absorber in order to decide if a photon will be absorbed at that point. The problem is that you can't decide _locally_ and _independently_ at each absorbing point whether to absorb a photon at that point, because you have to guarantee the conservation of energy. The problem is that energy isn't just conserved _probabilistically_, but _exactly_, so if you choose badly, you can have more photons being absorbed than were emitted -- a clear violation of the conservation of energy.
But why not just decide, probabilistically, for each photon, where it will be absorbed? That guarantees conservation of photon number.

The "interpretation" problem of QM arises where there is a transition to classical-physics notions like "absorbed at a point". Its absorption isn't something we can see directly. The theory says something like: the photon can strike any one of many silver-halide molecules, each with a different probability, and the molecules are put into a superposition of states with a different probability weight for each. Then, somehow, we see only one spot of silver on the film.

Brent
What a computer simulation program would do would be to utilize _transactions_, which would guarantee that energy was exactly conserved on a global basis; such transactions would certainly slow down the parallel simulation program due to contention of the locking mechanisms on the shared resource which keeps the energy in exact balance. For a computer scientist, who expects the universe to be "embarrassingly parallel", this transaction/locking mechanism is extremely inelegant. Yet the possibility of _entanglement_ virtually guarantees that some sort of transaction mechanism will be required to guarantee consistency of the simulation.
There are several ways to look at this problem. One is to assume that it is a wart, and attempt to remove it with Bohm-like and pilot wave-like models. The other is to turn this bug into a feature, and attempt to utilize QM itself to do the dirty work in computer simulations by having QM entanglement handle whatever transactional guarantees are required (I'm not sure exactly how this might be done, but there are lots of computer scientists looking into the usefulness of quantum computers).
BTW, there are several different flavors of computer transaction implementations, which are also mirrored in discussions about the philosophy of QM. "Conservative" transaction implementations refuse to do any work until a process has _exclusive_ access to a shared resource, while "speculative" transaction implementations are willing to do quite a lot of work, so long as they are also willing to throw away work that proves not to be consistent when the transaction is "closed". QM seems to be quite content to do its thing obliviously until a "measurement" is made, at which point all transactions must be closed, so that the observer sees a consistent picture.
At 02:12 PM 6/30/2014, Jeff Caldwell wrote:
I hope to better understand the curious case of the photon.
It has no frame of reference and, were it conscious, would perceive itself to be emitted and absorbed simultaneously, i.e. no time passing between the events.
With zero time between emission and absorption, whimsy allows me to think of emitter and absorber as in some sense touching, albeit one is an ancient star and the other a cone in my living eye.
Zero time means zero distance, in my book, although applying that rule to the no-frame-of-reference photon is probably a category error.
Cramer's transactional interpretation, inspired by Wheeler-Feynman time-symmetric theory, has both forward and backward-in-time waves between emitter and absorber, agreeing upon the transaction before (as? timey-wimey words ...) it takes place, which leaves precious little room for the free will electrons have if you or I do, John Conway and Simon Kochen say, and leaving no room at all for deciding whether or not to slide a detector into a photon's path after the photon has been emitted.
Emitters, detectors and photons have united, and their agreements will be kept!
Brent> "But why not just decide, probabilistically, for each photon, where it will be absorbed?" OK, I'm listening. How does this work, exactly? Also, where is the parallelism? At 10:47 PM 6/30/2014, meekerdb wrote:
On 6/30/2014 9:24 PM, Henry Baker wrote:
Cramer's "transactional" interpretation of QM is very computer sciencey, at least to my eyes.
Consider a thought experiment in which you're going to construct a parallel computer simulation of the double slit experiment, and you compute the probability density of photons at a uniform sampling of points in space.
The problem comes when you have to flip a probability-weighted coin at each point of the photon absorber in order to decide if a photon will be absorbed at that point. The problem is that you can't decide _locally_ and _independently_ at each absorbing point whether to absorb a photon at that point, because you have to guarantee the conservation of energy. The problem is that energy isn't just conserved _probabilistically_, but _exactly_, so if you choose badly, you can have more photons being absorbed than were emitted -- a clear violation of the conservation of energy.
But why not just decide, probabilistically, for each photon, where it will be absorbed? That guarantees conservation of photon number.
On 7/1/2014 2:20 AM, Henry Baker wrote:
Brent> "But why not just decide, probabilistically, for each photon, where it will be absorbed?"
OK, I'm listening. How does this work, exactly?
You calculate the interference pattern and from that the probability at each point. You generate a random number and use the inverse of the cumulative probability with respect to position to assign an absorption point. But you knew that - so I'm not sure what you're asking?
Also, where is the parallelism?
I guess I don't understand the significance of parallelism. To make the calculation efficient? Once you've calculated the probability distribution you could easily parallelize the calculation of absorption points. Brent
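A minimal sketch of that recipe, with a made-up two-slit geometry: compute the fringe intensity, normalize it into a probability per screen bin, and push uniform random numbers through the inverse CDF. The draws are independent, so this step is trivially parallel (here it is simply vectorized).

import numpy as np

# Toy two-slit intensity: cos^2 fringes under a single-slit (sinc^2) envelope.
# Wavelength, slit separation d, slit width a, and screen distance L are arbitrary.
x = np.linspace(-0.02, 0.02, 4001)                 # position on the screen (m)
wavelength, d, a, L = 500e-9, 50e-6, 10e-6, 1.0
beta = np.pi * a * x / (wavelength * L)
intensity = np.cos(np.pi * d * x / (wavelength * L))**2 * np.sinc(beta / np.pi)**2

p = intensity / intensity.sum()                    # probability per screen bin
cdf = np.cumsum(p)

# Inverse-transform sampling: one uniform deviate per photon, mapped through
# the CDF to an absorption bin.  The photons are i.i.d., so the draws are
# embarrassingly parallel; numpy just does them as one vectorized operation.
n_photons = 100_000
u = np.random.default_rng(0).random(n_photons)
idx = np.minimum(np.searchsorted(cdf, u), x.size - 1)
positions = x[idx]

counts, edges = np.histogram(positions, bins=200)  # the fringe pattern builds up here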
At 10:47 PM 6/30/2014, meekerdb wrote:
On 6/30/2014 9:24 PM, Henry Baker wrote:
Cramer's "transactional" interpretation of QM is very computer sciencey, at least to my eyes.
Consider a thought experiment in which you're going to construct a parallel computer simulation of the double slit experiment, and you compute the probability density of photons at a uniform sampling of points in space.
The problem comes when you have to flip a probability-weighted coin at each point of the photon absorber in order to decide if a photon will be absorbed at that point. The problem is that you can't decide _locally_ and _independently_ at each absorbing point whether to absorb a photon at that point, because you have to guarantee the conservation of energy. The problem is that energy isn't just conserved _probabilistically_, but _exactly_, so if you choose badly, you can have more photons being absorbed than were emitted -- a clear violation of the conservation of energy.

But why not just decide, probabilistically, for each photon, where it will be absorbed? That guarantees conservation of photon number.
Your algorithm is about as serial as one could possibly get: processing photons one by one and checking the CDF. This procedure also has an ever-so-slight drift error depending upon the order in which you process the CDF; the higher-order statistics will be slightly skewed.

The procedures in this thought experiment are quite similar to those for computing more-or-less "optimum" halftone images in computer graphics. The traditional double-slit physics experiment produces a halftone image of the underlying probability wave. (A toy halftoning sketch follows the quoted message below.)

The reason for the question about parallelism is fairly obvious. We assume that the universe operates in parallel; indeed, many discussions of special relativity appeal to the inherent parallelism of the universe, which is elegant precisely because it doesn't require any synchronization -- it is "embarrassingly parallel". But when you throw QM into the mix, you suddenly seem to lose all or most of that inherent parallelism.

As I said before, perhaps this bug could be turned into a feature, by harnessing QM to handle whatever synchronization is required in a large-scale parallel computer. Since QM seems to handle this synchronization without much effort or fuss, perhaps synchronization may be the most useful feature of a quantum-mechanical computer?

At 10:51 AM 7/1/2014, meekerdb wrote:
On 7/1/2014 2:20 AM, Henry Baker wrote:
Brent> "But why not just decide, probabilistically, for each photon, where it will be absorbed?"
OK, I'm listening. How does this work, exactly?
You calculate the interference pattern and from that the probability at each point. You generate a random number and use the inverse of the cumulative probability with respect to position to assign an absorption point. But you knew that - so I'm not sure what you're asking?
Also, where is the parallelism?
I guess I don't understand the significance of parallelism. To make the calculation efficient? Once you've calculated the probability distribution you could easily parallelize the calculation of absorption points.
Brent
At 10:47 PM 6/30/2014, meekerdb wrote:
On 6/30/2014 9:24 PM, Henry Baker wrote:
Cramer's "transactional" interpretation of QM is very computer sciencey, at least to my eyes.
Consider a thought experiment in which you're going to construct a parallel computer simulation of the double slit experiment, and you compute the probability density of photons at a uniform sampling of points in space.
The problem comes when you have to flip a probability-weighted coin at each point of the photon absorber in order to decide if a photon will be absorbed at that point. The problem is that you can't decide _locally_ and _independently_ at each absorbing point whether to absorb a photon at that point, because you have to guarantee the conservation of energy. The problem is that energy isn't just conserved _probabilistically_, but _exactly_, so if you choose badly, you can have more photons being absorbed than were emitted -- a clear violation of the conservation of energy.

But why not just decide, probabilistically, for each photon, where it will be absorbed? That guarantees conservation of photon number.
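To make the halftoning analogy concrete, here is a toy 1-D error-diffusion pass over the same kind of fringe pattern. The pattern, the threshold, and the single left-to-right pass are all invented for illustration; note that, as the message above complains, the pass is inherently serial and its output depends on the processing order.

import numpy as np

# Toy 1-D error diffusion ("halftoning") of a fringe pattern: each screen cell
# is forced to an all-or-nothing dot, and the rounding error is pushed into the
# next cell, so the local dot density tracks the underlying intensity -- much
# as accumulating photon hits "halftone" the probability wave.
x = np.linspace(-0.02, 0.02, 2000)
intensity = np.cos(np.pi * 50e-6 * x / (500e-9 * 1.0))**2   # toy cos^2 fringes
target = intensity / intensity.max()                        # values in [0, 1]

dots = np.zeros_like(target)
err = 0.0
for i in range(target.size):                                # inherently serial pass
    want = target[i] + err
    dots[i] = 1.0 if want >= 0.5 else 0.0
    err = want - dots[i]                                    # diffuse the residual forward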
On 7/1/2014 11:17 AM, Henry Baker wrote:
Your algorithm is about as serial as one could possibly get; processing photons one by one & checking the CDF.
But the interesting thing about Young's slit experiment is that you get the same interference pattern even when the photons go through one at a time. So what's wrong with simulating them one-at-a-time?
This procedure also has an ever so slight drift error depending upon which order you process the CDF; the higher order statistics will be slightly skewed.
Then why not alternate order? Brent Meeker
I'd rather not choose an order at all. In fact, I'd like the process to be so parallel that no order is required; indeed, it might be difficult or impossible to determine whether there ever was an ordering.

(As an aside, I'd like to mention the Bitcoin "blockchain" mechanism for establishing a _discrete linear order_ for transactional events. Bitcoin had to develop a quite significant mechanism to construct this linear ordering in real time. Perhaps there is a more elegant mechanism using QM to achieve the same effect. A toy hash-chain sketch follows the quoted message below.)

I don't mind so much God playing dice with the universe, so long as he has 10^200 pairs of dice; serializing the entire universe through a single dice-throwing process really sucks!

At 12:57 PM 7/1/2014, meekerdb wrote:
On 7/1/2014 11:17 AM, Henry Baker wrote:
Your algorithm is about as serial as one could possibly get; processing photons one by one & checking the CDF.
But the interesting thing about Young's slit experiment is that you get the same interference pattern even when the photons go through one at a time. So what's wrong with simulating them one-at-a-time?
This procedure also has an ever so slight drift error depending upon which order you process the CDF; the higher order statistics will be slightly skewed.
Then why not alternate order?
Brent Meeker
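A toy illustration of the blockchain aside in the previous message: a bare hash chain, with no proof-of-work or networking, is already enough to pin a discrete linear order onto a set of transactions. Everything here is a simplification for illustration, not how Bitcoin is actually implemented.

import hashlib
import json

def make_block(prev_hash, transactions):
    """Each block commits to its predecessor's hash, which is what imposes the order."""
    body = {"prev": prev_hash, "txs": transactions}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [make_block("0" * 64, ["genesis"])]
for batch in (["tx-a", "tx-b"], ["tx-c"], ["tx-d", "tx-e"]):
    chain.append(make_block(chain[-1]["hash"], batch))

# The linear order is exactly the chain order; altering any earlier block
# changes every later hash, so the ordering is tamper-evident.
assert all(chain[i + 1]["prev"] == chain[i]["hash"] for i in range(len(chain) - 1))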
If I understand the transactional interpretation correctly, albeit in a very limited fashion, the transaction is "agreed to" "at the instant" the transaction takes place. In other words, and loosely, the atom that emits the photon has agreed with the atom that is to absorb the photon, at which point the transaction takes place. Since only a single agreement is made, energy is conserved.

The parallelism takes place, in a sense, across time and space (space-time), not just across space at the time of arrival of the photon's wave function at a set of possible absorbers. Both emitters and absorbers were emitting waves -- emitters emitting retarded (forward-in-time) waves, absorbers emitting advanced (backward-in-time) waves -- with some sort of constructive/destructive interference determining "the winner" for each quantum of energy. (A toy sketch of this handshake follows the quoted message below.)

This structure avoids any need for a faster-than-light decoherence mechanism, since there is no group of absorbers "transacting to make a decision" at the point in time of the photon's arrival. The agreement was made at the instant of the photon's emission. Given that photons have no frame of reference and thus experience no flow of time during flight, i.e. a zero-length time interval so far as the photon is concerned, there is no opportunity for the transaction to be spoiled, so no backout/reapply mechanisms are needed. That is, the transactions are atomic. (A pun!)

On Tue, Jul 1, 2014 at 12:24 AM, Henry Baker <hbaker1@pipeline.com> wrote:
Cramer's "transactional" interpretation of QM is very computer sciencey, at least to my eyes.
Consider a thought experiment in which you're going to construct a parallel computer simulation of the double slit experiment, and you compute the probability density of photons at a uniform sampling of points in space.
The problem comes when you have to flip a probability-weighted coin at each point of the photon absorber in order to decide if a photon will be absorbed at that point. The problem is that you can't decide _locally_ and _independently_ at each absorbing point whether to absorb a photon at that point, because you have to guarantee the conservation of energy. The problem is that energy isn't just conserved _probabilistically_, but _exactly_, so if you choose badly, you can have more photons being absorbed than were emitted -- a clear violation of the conservation of energy.
What a computer simulation program would do would be to utilize _transactions_, which would guarantee that energy was exactly conserved on a global basis; such transactions would certainly slow down the parallel simulation program due to contention of the locking mechanisms on the shared resource which keeps the energy in exact balance. For a computer scientist, who expects the universe to be "embarrassingly parallel", this transaction/locking mechanism is extremely inelegant. Yet the possibility of _entanglement_ virtually guarantees that some sort of transaction mechanism will be required to guarantee consistency of the simulation.
There are several ways to look at this problem. One is to assume that it is a wart, and attempt to remove it with Bohm-like and pilot wave-like models. The other is to turn this bug into a feature, and attempt to utilize QM itself to do the dirty work in computer simulations by having QM entanglement handle whatever transactional guarantees are required (I'm not sure exactly how this might be done, but there are lots of computer scientists looking into the usefulness of quantum computers).
BTW, there are several different flavors of computer transaction implementations, which are also mirrored in discussions about the philosophy of QM. "Conservative" transaction implementations refuse to do any work until a process has _exclusive_ access to a shared resource, while "speculative" transaction implementations are willing to do quite a lot of work, so long as they are also willing to throw away work that proves not to be consistent when the transaction is "closed". QM seems to be quite content to do its thing obliviously until a "measurement" is made, at which point all transactions must be closed, so that the observer sees a consistent picture.
At 02:12 PM 6/30/2014, Jeff Caldwell wrote:
I hope to better understand the curious case of the photon.
It has no frame of reference and, were it conscious, would perceive itself to be emitted and absorbed simultaneously, i.e. no time passing between the events.
With zero time between emission and absorption, whimsy allows me to think of emitter and absorber as in some sense touching, albeit one is an ancient star and the other a cone in my living eye.
Zero time means zero distance, in my book, although applying that rule to the no-frame-of-reference photon is probably a category error.
Cramer's transactional interpretation, inspired by Wheeler-Feynman time-symmetric theory, has both forward and backward-in-time waves between emitter and absorber, agreeing upon the transaction before (as? timey-wimey words ...) it takes place, which leaves precious little room for the free will electrons have if you or I do, John Conway and Simon Kochen say, and leaving no room at all for deciding whether or not to slide a detector into a photon's path after the photon has been emitted.
Emitters, detectors and photons have united, and their agreements will be kept!
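A deliberately naive sketch of the kind of handshake described above: the emitter's offer wave reaches every candidate absorber, and exactly one handshake completes per emitted photon, so conservation holds by construction rather than by a separate locking step. The geometry, amplitudes, and selection rule are invented for illustration and are not Cramer's actual formalism.

import numpy as np

rng = np.random.default_rng(1)

def emit_one_photon(screen, slits, wavelength=500e-9, L=1.0):
    """Return the index of the single absorber that 'wins' this photon."""
    # Offer wave: sum of unit amplitudes propagated from each slit to each point.
    paths = np.sqrt(L**2 + (screen[:, None] - slits[None, :])**2)
    offer = np.exp(2j * np.pi * paths / wavelength).sum(axis=1)
    # Completed-handshake probability taken as |offer|^2 (confirmation ~ offer*).
    weights = np.abs(offer)**2
    weights /= weights.sum()
    return rng.choice(screen.size, p=weights)       # exactly one absorber per photon

screen = np.linspace(-0.02, 0.02, 2001)             # points on the absorbing screen (m)
slits = np.array([-25e-6, 25e-6])                   # two slit positions (m)
hits = np.bincount([emit_one_photon(screen, slits) for _ in range(5_000)],
                   minlength=screen.size)           # fringes build up one photon at a time

Since each emission resolves to exactly one absorption, sum(hits) equals the number of emitted photons by construction -- the "transaction" is never left half-open.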
Participants (3): Henry Baker, Jeff Caldwell, meekerdb