I was listening to some of Leonard Susskind's lectures on black holes & information & the fight with Hawking yesterday, and they seemed almost quaint. The reason is that information theory & computer science are 100% classical. Both fields are founded on the principle that information can be copied, while the "no-cloning" theorem of quantum mechanics states that quantum states cannot be copied. These two distinct views of the world cannot be reconciled, except at scales large enough that a quantum system can simulate (with great effort & inefficiency) a classical computer circuit.

Szilard was making significant progress on Maxwell's Demon and information in the 1920s, when quantum mechanics knocked the foundation out from under Maxwell's entire enterprise. Shannon's theorems re information-carrying bandwidth are entirely classical, and completely ignore the quantum nature of radio/light waves. Planck's relation (E = hc/lambda) tells us that, for a fixed amount of energy, long wavelengths contain many more quanta than short wavelengths. (In fact, the number of quanta per unit energy grows in direct proportion to the wavelength.) But Shannon tells us that shorter wavelengths can carry _more_ information than longer wavelengths! I call this the "ultraviolet catastrophe" of information theory.

The Bekenstein argument talks about dropping information into a black hole & calculating the change in the black hole's size that results. But you can't drop "information", per se, into a black hole; you can only drop actual quantum mechanical particles, and those particles may be entangled with particles outside the black hole. Thus, the Bekenstein argument re how much classical Shannon "information" can be stored in a black hole is interesting, but ultimately irrelevant. Shannon information theory ultimately rests on probability theory, which is itself quintessentially classical.
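The no-cloning obstruction is just linearity. Here is a minimal sketch in plain Python (the `clone` map and test states are my own illustration, not anything from the text): any physical quantum evolution is linear, but a map that copies arbitrary states cannot be.

```python
def clone(psi):
    # Hypothetical cloning map: psi -> psi (x) psi, as a flat list
    # (tensor product of a state vector with itself).
    return [a * b for a in psi for b in psi]

# Computational basis states and an equal superposition:
zero = [1.0, 0.0]
one = [0.0, 1.0]
s = 2 ** -0.5
plus = [s * (z + o) for z, o in zip(zero, one)]  # (|0> + |1>)/sqrt(2)

# If clone() were linear -- as every quantum operation must be -- then
# clone(plus) would equal (clone(zero) + clone(one))/sqrt(2). It does not:
cloned_plus = clone(plus)                  # ~[0.5, 0.5, 0.5, 0.5]
linear_guess = [s * (a + b) for a, b in zip(clone(zero), clone(one))]
print(cloned_plus != linear_guess)  # the two disagree, so cloning is nonlinear
```

The contradiction is exactly the textbook no-cloning argument: copying the basis states forces one output on superpositions by linearity, but actual cloning demands a different one.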
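The "ultraviolet catastrophe" arithmetic can be made concrete. A rough sketch, where the wavelengths, bandwidths, and the SNR of 100 are illustrative assumptions of mine rather than figures from the text:

```python
import math

h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light, m/s

def photons_per_joule(wavelength_m):
    # E = hc/lambda per photon, so quanta per joule = lambda/(hc):
    return wavelength_m / (h * c)

radio = photons_per_joule(1.0)      # 1 m radio wave
uv = photons_per_joule(100e-9)      # 100 nm ultraviolet
print(f"radio: {radio:.2e} photons/J, UV: {uv:.2e} photons/J")
# The longer wavelength carries ~10^7 times more quanta per joule...

def shannon_capacity(bandwidth_hz, snr):
    # Shannon-Hartley: C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * math.log2(1 + snr)

# ...yet a channel at the higher carrier frequency can support far more
# bandwidth, hence more classical bits per second (SNR = 100 assumed):
print(f"radio-band capacity: {shannon_capacity(1e8, 100):.2e} bit/s")
print(f"UV-band capacity:    {shannon_capacity(1e14, 100):.2e} bit/s")
```

So the classical Shannon picture rewards exactly the regime where the quanta are fewest, which is the tension being pointed at.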
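The classicality of ordinary probability theory shows up immediately in the 2-slit experiment: quantum mechanics adds complex amplitudes first and squares afterwards, so the "probabilities" don't add. A minimal sketch, with illustrative amplitudes and phases of my own choosing:

```python
import cmath
import math

def intensity(a1, a2):
    # Quantum rule: add the complex amplitudes, then square the magnitude.
    return abs(a1 + a2) ** 2

a = 2 ** -0.5  # each slit alone gives relative intensity |a|^2 = 1/2

for k in range(5):
    phi = k * math.pi / 2  # relative phase between the two paths
    p_classical = abs(a) ** 2 + abs(a) ** 2          # always 1.0
    p_quantum = intensity(a, a * cmath.exp(1j * phi))
    print(f"phase {phi:.2f}: classical {p_classical:.2f}, quantum {p_quantum:.2f}")
# The quantum intensity swings between ~0 (destructive) and ~2
# (constructive), while the classical sum of probabilities is constant.
```

The cross term 2*Re(a1 * conj(a2)) is exactly what classical probability theory has no slot for, and what a quantum probability theory would have to build in from the start.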
A proper modern _quantum_ information theory would rest instead on a _quantum_ probability theory in which the standard 2-slit results fall out naturally. The two articles below re evading the "diffraction limit" with quantum entangled photons demonstrate once again how far from reality our standard classical viewpoint really is.

The recent discussion here re P=NP and quantum systems misses an important point: P=NP (a classical question posed in a classical framework) may be completely irrelevant in the real quantum world. Modern computer science is fundamentally built on Turing machines that _copy information_ from one place to another. This is the basis of simulation and of universal computers. But the entire enterprise fails when we can't even copy one quantum bit. We even require a new type of logic to deal with this problem: so-called "linear logic", which deals with things that can't be copied -- e.g., resources like time, space, energy, etc.

It will probably take the rest of the 21st century to re-invent probability, information theory and computer science, but we need to start soon in order to meet even this long deadline.

---

http://www.technologyreview.com/view/524521/worlds-first-entanglement-enhanc...

February 10, 2014

World's First Entanglement-Enhanced Microscope

Physicists have long known that entangled photons can make more precise measurements than independent ones. Now Japanese physicists have built a microscope that proves it.

One of the exciting possibilities of quantum mechanics is the ability to measure the world far more precisely than with classical tools. Today, Takafumi Ono and pals at Hokkaido University in Japan say they've exploited this to create the world's first entanglement-enhanced microscope. Their new toy produces images with entangled photons that are significantly sharper than those possible with ordinary light alone.
Entanglement is the strange quantum property in which two particles share the same existence, even though they may be far apart.

Ono and co say this is particularly useful for a type of imaging known as differential interference contrast microscopy. This works by focusing two beams of photons into spots next to each other on a flat sample and measuring the interference pattern they create after they have been reflected. When both spots hit a flat part of the sample, they travel the same path length and create a corresponding interference pattern. But when the spots hit areas of different heights, the interference pattern changes. It is then possible to work out the shape of the surface by analysing the change in the interference pattern as the spots move across it.

The difference in phase of photons can be measured with huge accuracy, but even this has a limit, known as the standard quantum limit. However, physicists have known for some time that it's possible to improve on this by using entangled photons rather than independent ones. That's because a measurement on one entangled photon gives you information about the other, so together they provide more information than independent photons.

Ono and co demonstrate this using entangled photons to image a flat glass plate with a Q-shaped pattern carved in relief on the surface. This pattern is just 17 nanometres higher than the rest of the plate and so tricky to resolve with ordinary optical techniques. Entangled photons significantly improve on this. Ono and co say the signal-to-noise ratio using their technique is 1.35 times better than the standard quantum limit. And the resulting image is noticeably improved, simply by visual inspection (the image with entangled photons is on the left in the above figure).

"An image of a Q shape carved in relief on the glass surface is obtained with better visibility than with a classical light source," they say.
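The numbers in the article can be put in context with a little arithmetic. A sketch, where the 17 nm step and the 1.35x figure come from the article but the wavelength and photon number are illustrative assumptions of mine:

```python
import math

wavelength = 405e-9  # assumed illumination wavelength, m (not from the article)
step = 17e-9         # height of the Q-shaped relief, m (from the article)

# In reflection the height difference is traversed twice, so the phase
# shift between the two spots is 4*pi*h/lambda:
delta_phi = 4 * math.pi * step / wavelength
print(f"phase shift from the step: {delta_phi:.3f} rad")

# With N independent photons, phase uncertainty scales as 1/sqrt(N)
# (the standard quantum limit); maximally entangled photons can in
# principle reach 1/N (the Heisenberg limit):
N = 1000
sql = 1 / math.sqrt(N)
heisenberg = 1 / N
print(f"SQL: {sql:.4f} rad, Heisenberg limit: {heisenberg:.4f} rad")
# For entangled pairs (N = 2) the ideal gain over the SQL is sqrt(2)
# ~ 1.41, so the reported 1.35x is close to that ideal.
```

The point of the scaling is that the entangled advantage is not a fixed trick: it grows with the number of photons you can entangle.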
That should be useful in a number of different applications -- when samples might be damaged by intense light, for example.

Enhanced microscopy is just one of many applications for quantum metrology. It should also help improve the resolution of the interferometers used in gravitational wave astronomy, for example. So it's good to see a success like this in another area.

Ref: arxiv.org/abs/1401.8075: An Entanglement-Enhanced Microscope

https://medium.com/p/5f473cf5a4bc

How To Build A Quantum Telescope

Quantum optics has revolutionised microscopy. Now astronomers are planning to jump on the quantum bandwagon.

The Physics arXiv Blog

The diffraction limit is astronomy's greatest enemy. The resolution of all telescopes is limited by factors such as imperfections in the optics and turbulence in the atmosphere. But these can be overcome using better equipment, adaptive optics and by getting above the atmosphere. There is one limit, though, that astronomers cannot overcome because it is set by the laws of physics -- the diffraction limit. Every telescope on the planet, and all those orbiting it, are limited in this way. And until recently there was no known way to beat it.

Now physicists have begun to develop various quantum techniques that can overcome the diffraction limit, at least in the lab. These techniques have begun to revolutionise microscopy, where the light source can be carefully controlled. But they have yet to be considered for astronomy, because astronomers have little, if any, control over the sources of the light they are interested in.

Today, however, Aglae Kellerer at the University of Durham explains how to build a quantum telescope. She says quantum techniques could dramatically improve the resolution of telescopes by beating the diffraction limit for the first time.

When light from a point source enters a lens, it bends. This bending causes the light to spread out so that it interacts with itself, generating an interference pattern.
The result is that a lens always resolves a point source as a bright disc surrounded by concentric circles of light, a pattern known as an Airy disc. The size of this disc, which is determined by the wavelength of the light and the size of the lens, sets the ultimate resolution of the instrument.

In practice, most telescopes are limited by other factors, in particular turbulence in the several kilometres of atmosphere above them. But techniques such as adaptive optics, which iron out the effects of turbulence, are allowing telescopes to get much closer to the diffraction limit. So how to go further?

Kellerer's idea is to exploit the strange effects of quantum mechanics to improve things. Entanglement, for example. Entangled photons are so deeply linked that they share the same existence. Measure one and you automatically influence the other. That gives you information about the other photon, regardless of its distance from you.

Last month, physicists used this idea to build the world's first entanglement-enhanced microscope, which dramatically increases its resolution over purely classical instruments. They created entangled photons and used one to illuminate the object. The second photon can then give them information about the first, which they use to increase the resolution of the resulting image.

There's an obvious problem in employing these kinds of techniques in astronomy -- the photons of interest aren't under your control, having travelled many light years from their astrophysical source. But Kellerer says there is a way round this. Her idea is to use the astrophysical photons to stimulate the production of an entangled pair inside a telescope. The first of this pair then hits the detector, generating an image. But the other can be used to increase the information known about the first, thereby increasing the resolution and beating the diffraction limit.
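The diffraction limit the article keeps invoking is a one-line formula: the angular size of the Airy disc is roughly 1.22*lambda/D. A quick sketch with aperture sizes I've chosen for illustration:

```python
import math

def diffraction_limit_rad(wavelength_m, aperture_m):
    # Rayleigh criterion for a circular aperture: the first dark ring of
    # the Airy pattern sits at angle ~1.22*lambda/D radians.
    return 1.22 * wavelength_m / aperture_m

wavelength = 550e-9  # visible light, ~green

# Illustrative apertures: a small amateur scope, a VLT-class mirror,
# and an ELT-class mirror:
for d in (0.1, 8.2, 39.0):
    theta = diffraction_limit_rad(wavelength, d)
    arcsec = math.degrees(theta) * 3600
    print(f"D = {d:5.1f} m -> {arcsec:.4f} arcsec")
# Bigger apertures shrink the Airy disc; Kellerer's proposal aims to get
# below this limit for a *fixed* aperture, which classical optics cannot do.
```

This is why the limit is "set by the laws of physics": for a given wavelength, the only classical lever is a larger mirror.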
That's an interesting idea that has the potential to significantly increase the resolution of conventional telescopes. In fact, Kellerer has simulated the effect of this process on a computer to enhance the resolution of a conventional astronomical image by a factor of six. But building an instrument that works in this way will be hard, and the devil is in the detail.

The first problem is producing entangled photons in the first place. Kellerer's idea is to use a crystal of excited atoms that emit entangled photons when stimulated by passing astrophysical ones. The problem is the efficiency of such a process. With so few astrophysical photons to play with, any that are lost to inefficiencies are a serious problem.

Then there is the problem of spontaneous emission. Excited atoms have a nasty habit of emitting photons even when they are not stimulated by passing photons. That's noise, which could end up overwhelming the signal from the photons astronomers are interested in.

Both of these problems can be minimised, but they cannot be removed completely. The question is whether the advantages of this technique can be made to outweigh the disadvantages in a real device. There's only one way to find out, of course.

Today, the technology required to test this idea is in its infancy. But there's good reason to think that significant advances will be made in the near future. And that means that future telescopes could be very different from the ones we have today.

It's a sobering thought that if the pioneers of telescopic imaging were around today, they would find that the world's best telescopes work in more or less exactly the same way as their own did 400 years ago. But Kellerer's approach would be entirely alien to them, and could finally take astronomy into a new era of quantum imaging.

Ref: arxiv.org/abs/1403.6681: Quantum Telescopes