Monday 8 September 2014

Overview of Quantum Entanglement - Einstein Versus Bohr





Quantum Entanglement



It's a popular myth that identical twins can sometimes sense when one of the pair is in danger, even if they're oceans apart. Tales of telepathy abound. Scientists cast a skeptical eye over such claims, largely because it isn't clear how these weird connections could possibly work. Yet they've had to come to terms with something that's no less strange in the world of physics: an instantaneous link between particles that remains strong, secure, and undiluted no matter how far apart the particles may be – even if they're on opposite sides of the universe. It's a link that Einstein went to his grave denying, yet its existence is now beyond dispute. This quantum equivalent of telepathy is demonstrated daily in laboratories around the world. It holds the key to future hyperspeed computing and underpins the science of teleportation. Its name is entanglement.


The discovery of entanglement



The concept, though not the name, of entanglement was first put under the scientific spotlight on May 15, 1935, when a paper by Einstein and two younger associates, Boris Podolsky and Nathan Rosen, appeared in the journal Physical Review.[1]


Its title – "Can a Quantum-Mechanical Description of Physical Reality Be Considered Complete?" – leaves no doubt that the paper was a challenge to Niels Bohr and his vision of the subatomic world. On June 7, Erwin Schrödinger, himself no lover of quantum weirdness, wrote to Einstein, congratulating him on the paper and using in his letter the word entanglement – or, rather, its German equivalent, Verschränkung – for the first time. This new term soon found its way into print in an article, sent to the Cambridge Philosophical Society on August 14, that was published a couple of months later.[2]

In it he wrote:

When two systems ... enter into temporary physical interaction ... and when after a time of mutual influence the systems separate again, then they can no longer be described in the same way as before, viz. by endowing each of them with a representative of its own. I would not call that one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought. By the interaction the two representatives [the quantum states] have become entangled.
The characteristic trait of quantum mechanics ... the one that enforces its entire departure from classical lines of thought – here was an early sign of the importance attached to this remarkable effect.

Entanglement lay at the very heart of quantum reality – its most startling and defining feature – and Einstein would have none of it.

For the best part of a decade, the man who revealed the particle nature of light (see Einstein and the photoelectric effect) had been trying to undermine Bohr's interpretation of quantum theory. Einstein couldn't stomach the notion that particles didn't have properties, such as momentum and position, with real, determinable (if only we knew how), preexisting values. Yet that notion was spelled out in a relationship discovered in 1927 by Werner Heisenberg. 

Known as the uncertainty principle, it stems from the rule that the result of multiplying together two matrices representing certain pairs of quantum properties, such as position and momentum, depends on the order of multiplication. The same oddball math that says X times Y doesn't have to equal Y times X implies that we can never know simultaneously the exact values of both position and momentum. Heisenberg proved that the product of the uncertainties in position and momentum can never be smaller than a particular number that involves Planck's constant.
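In modern textbook notation (with ħ = h/2π), the non-commutation rule and the bound it forces on the uncertainties read:

\[
[\hat{x},\hat{p}] \;=\; \hat{x}\hat{p}-\hat{p}\hat{x} \;=\; i\hbar
\qquad\Longrightarrow\qquad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}.
\]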



In one sense, this relationship quantifies wave-particle duality. Momentum is a property that waves can have (related to their wavelength); position is a particlelike property because it refers to a localization in space. Heisenberg's formula reveals the extent to which one of these aspects fades out as the other becomes the focus of attention.



In a different but related sense, the uncertainty principle tells how much the complementary descriptions of a quantum object overlap. Position and momentum are complementary properties because to pin down one is to lose track of the other; they coexist but are mutually exclusive, like the opposite sides of the same object. Heisenberg's formula quantifies the extent to which knowledge of one limits knowledge of the other. (For more, see my article on the life and work of Werner Heisenberg, which gives a more detailed description of the uncertainty principle and the context of wave-particle duality.)

Einstein didn't buy this. He believed that a particle does have a definite position and momentum all the time, whether we're watching it or not, despite what quantum theory says. From his point of view, the Heisenberg uncertainty principle isn't a basic rule of nature; it's just an artifact of our inadequate understanding of the subatomic realm. In the same way, he thought, wave-particle duality isn't grounded in reality but instead arises from a statistical description of how large numbers of particles behave. Given a better theory, there'd be no wave-particle duality or uncertainty principle to worry about. The problem, as Einstein saw it, was that quantum mechanics wasn't telling the whole story: it was incomplete.


Einstein versus Bohr



Intent on exposing this fact to the world and championing a return to a more classical pragmatic view of nature, Einstein devised several thought experiments between the late 1920s and the mid-1930s. Targeted specifically at the idea of complementarity, these experiments were designed to point out ways to simultaneously measure a particle's position and momentum, or its precise energy at a precise time (another complementary pair), thus pulling the rug from under the uncertainty principle and wave-particle duality.

The first of these experiments was talked about informally in 1927, in hallway discussions at the fifth Solvay Conference in Brussels. Einstein put to Bohr a modified version of the famous double-slit experiment in which quantum objects – electrons, say – emerging from the twin slits are observed by shining light onto them. Photons bouncing off a particle would have their momenta changed by an amount that would reveal the particle's trajectory and, therefore, which slit it had passed through. The particle would then go on to strike the detector screen and contribute to the buildup of an interference pattern. Wave-particle duality would be circumvented, Einstein argued, because we would have simultaneously measured particlelike behavior (the trajectory the particle took) and wavelike behavior (the interference pattern on the screen).

But Bohr spotted something about this thought experiment that Einstein had overlooked. To be able to tell which slit a particle went through, you'd have to fix its position with an accuracy better than the distance between the slits. Bohr then applied Heisenberg's uncertainty principle, which demands that if you pin down the particle's position to such and such a precision, you have to give up a corresponding amount of knowledge of its momentum. Bohr said that this happens because the photons deliver random kicks as they bounce off the particle. The result of these kicks is to inject uncertainty into the whereabouts of the particle when it strikes the screen. And here's the crucial caveat: the uncertainty turns out to be roughly as large as the spacing between the interference bands. The pattern is smeared out and lost as the quantum mechanical wavefunction becomes decoherent, and with it disappears Einstein's hoped-for contradiction.
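A back-of-the-envelope version of Bohr's counter-argument runs as follows, with d the slit separation, λ the de Broglie wavelength of the particle, and L the distance from the slits to the screen. Telling which slit the particle went through requires knowing its position to better than Δx ≈ d/2, so the photon kicks must blur the particle's transverse momentum by at least

\[
\Delta p \;\gtrsim\; \frac{\hbar}{d},
\qquad\text{giving an angular blur}\qquad
\delta\theta \;\approx\; \frac{\Delta p}{p} \;\approx\; \frac{\hbar/d}{2\pi\hbar/\lambda} \;=\; \frac{\lambda}{2\pi d}.
\]

The resulting smear on the screen, roughly λL/(2πd), is of the same order as the fringe spacing λL/d – just enough, as Bohr pointed out, to wash out the interference pattern.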

On several other occasions, Einstein confronted Bohr with thought experiments cunningly contrived to blow duality out of the water. Each time, Bohr used the uncertainty principle to exploit a loophole and win the day against his arch rival (and, incidentally, good friend). In the battle for the future of quantum physics, Bohr defeated Einstein and, in the process, showed just how important Heisenberg's little formula was in the quantum scheme of things.


These arguments between Bohr and Einstein were never truly resolved and grew ever more technical. At the sixth Solvay Conference, in 1930, Einstein's target of criticism was the indeterminacy relation between energy and time. His idea involved an experimental apparatus, which Bohr subsequently sketched in a way that emphasized the essential elements and the key points he would use in his response.

In this argument Einstein considers a box, sometimes called "Einstein's Box" or "Einstein's Box of Light". With this thought experiment, the apparatus for which was later sketched by Bohr, Einstein set out to prove a violation of the indeterminacy relation between time and energy. The schematic of Einstein and Bohr's apparatus is shown below:

                 "Einstein's Box of Light" - Einstein's secret weapon to destroy quantum mechanics?

Einstein described a box full of light and said that it was possible to measure both the energy 'E' of a single photon and the time 't' when it was emitted. This was not allowed by a variant on Heisenberg's uncertainty principle, namely the energy-time relation ΔE·Δt ≥ ħ/2.

Einstein said that the box could be weighed at first and then a single photon be allowed to escape through a shutter controlled by a clock inside the box. The box would then be weighed again and the mass difference 'm' determined. The energy of the photon 'E' is simply E = mc^2.

It appeared that both the photon's energy and its time of emission could be determined! This came as a shock to Bohr when he first saw it; he genuinely did not see the solution at once, and Einstein seemed at first sight to have won the battle this time, meaning that the uncertainty of quantum mechanics was finally going to be wiped out!

Bohr, after sleeping on the problem, finally realized that there was a flaw in Einstein's reasoning. When the photon is released, the box will recoil (to conserve momentum) and the position of the box in the earth's gravitational field will be uncertain. Einstein's very own general theory of relativity said that this would cause a corresponding uncertainty in the time recorded.
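In rough quantitative terms, Bohr's reply goes like this (with T the time allowed for the weighing, g the acceleration due to gravity, Δx the uncertainty in the box's vertical position, and Δm the accuracy of the weighing). Reading the balance pointer to within Δx leaves the box with a momentum uncertainty Δp ≳ ħ/Δx, and for the weighing to be meaningful the gravitational impulse must exceed this: T·g·Δm ≳ Δp. General relativity then says that a clock displaced by Δx in the gravitational field runs off by ΔT/T = gΔx/c², so

\[
\Delta E\,\Delta T \;=\; (\Delta m\,c^{2})\cdot T\,\frac{g\,\Delta x}{c^{2}}
\;=\; \Delta m\,g\,T\,\Delta x \;\gtrsim\; \hbar ,
\]

which is exactly the energy-time uncertainty relation Einstein had set out to beat.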







An illustration of this, in the context of Einstein's light box, is shown on the left. Critically, the argument depends on the presence of a clock in the device recording the time at which the photon escapes, while the box is weighed at precise time intervals via its recoil; clocks are affected by gravity in general relativity and run slower in a gravitational field than in zero gravity. Hence gravitation affects the measurement, and thus induces an uncertainty in the weighing of the escaping quantum of light – just as quantum mechanics predicts. *

* (Although the light box may not be a successful tool for disproving quantum mechanics, as Einstein intended, it may nevertheless be an important way to measure certain weakly interacting effects of gravity in the context of the predictions of general relativity – such as local gravitational waves and instances of spatial warping.)


So Bohr had been saved from defeat by Einstein's forgetting his own general theory of relativity!

This was to be the last serious assault. Roughly three decades after its inception at the hands of Planck, the foundations of quantum mechanics seemed complete, and they rested wholly on this uncertainty business – which Einstein never accepted, and which he always saw as a kind of magician's curtain hiding the true mechanism behind what appears to be a trick of nature itself.


Such is the version of this clash of 20th-century titans that's been dutifully repeated in textbooks and spoon-fed to physics students for many years. But evidence has recently come to light that Bohr had unwittingly hoodwinked Einstein with arguments that were fundamentally unsound. This disclosure doesn't throw quantum mechanics back into the melting pot, but it does mean that the record needs setting straight, and that the effect that really invalidates Einstein's position should be given proper credit.

The revisionist picture of the Bohr-Einstein debates stems partly from a suggestion made in 1991 by Marlan Scully, Berthold-Georg Englert, and Herbert Walther of the Max Planck Institute for Quantum Optics in Garching, Germany.[3] These researchers proposed using atoms as quantum objects in a version of Young's two-slit experiment.



Atoms have an important advantage over simpler particles, such as photons or electrons: they have a variety of internal states, including a coherent ground state (lowest energy state) and a series of decoherent excited states. These different states, the German team reckoned, could be used to track the atom's path.

This interplay of coherence and decoherence between two internal states is what made possible their formulation of a quantum version of the famous two-slit experiment.


We can label the probability-amplitude wave function passing through the left-hand slit in the figure ψleft and the wave function passing through the right-hand slit ψright. These are coherent and produce the characteristic quantum interference fringes on the detector screen (a photographic plate or CCD array). This is the case even if the intensity of particles is so low that only one particle at a time arrives at the screen.
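In symbols, the probability of an atom landing at position y on the screen is

\[
P(y) \;=\; \bigl|\psi_{\mathrm{left}}(y)+\psi_{\mathrm{right}}(y)\bigr|^{2}
\;=\; |\psi_{\mathrm{left}}|^{2}+|\psi_{\mathrm{right}}|^{2}
+2\,\mathrm{Re}\!\left[\psi_{\mathrm{left}}^{*}\,\psi_{\mathrm{right}}\right],
\]

and it is the cross term on the right that produces the interference fringes; anything that suppresses it wipes the fringes out.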

In a dramatic experimental proof of decoherence, the physicist Gerhard Rempe sent matter waves of heavy rubidium atoms through two slits. He then irradiated the left slit with microwaves that could excite a hyperfine transition in Rb atoms passing through that slit. As he turned up the intensity, the interference fringes diminished in proportion to the number of photons falling on the left slit. The photons decohere the otherwise coherent wave functions.[4]



The crucial factor in this version of the double-slit experiment is that the microwaves have hardly any momentum of their own, so they can cause virtually no change to the atom's momentum – nowhere near enough to smear out the interference pattern.

Heisenberg's uncertainty principle can't possibly play a significant hand in the outcome. Yet with the microwaves turned on so that we can tell which way the atoms went, the interference pattern suddenly vanishes. Bohr had argued that when such a pattern is lost, it happens because a measuring device gives random kicks to the particles. But there aren't any random kicks to speak of in the rubidium atom experiment; at most, the microwaves deliver momentum taps ten thousand times too small to destroy the interference bands. Yet, destroyed the bands are. It isn't that the uncertainty principle is proved wrong, but there's no way it can account for the results.

The only reason momentum kicks seemed to explain the classic double-slit experiment discussed by Bohr and Einstein turns out to be a fortunate conspiracy of numbers. There's a mechanism at work far deeper than random jolts and uncertainty. What destroys the interference pattern is the very act of trying to get information about which path is followed. The effect at work is entanglement.
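A toy numerical sketch makes the point. The model below (all the numbers are invented for illustration, and it is not the Rempe group's actual analysis) entangles the atom's path with an internal "marker" state; the interference term on the screen is weighted by the overlap of the two marker states, so the fringes fade as which-path information becomes available, with no momentum kicks anywhere in the model.

```python
# Toy model: which-path marking kills interference without momentum kicks.
# The atom's path (left/right slit) is entangled with an internal "marker"
# state; the screen pattern contains an interference term weighted by the
# overlap <m_left|m_right> of the two marker states.
# All numbers below are invented for illustration only.

import numpy as np

wavelength = 1e-6      # de Broglie wavelength of the atom (m), illustrative
d = 10e-6              # slit separation (m), illustrative
L = 1.0                # distance from slits to screen (m), illustrative
y = np.linspace(-0.3, 0.3, 2001)   # positions on the screen (m)

k = 2 * np.pi / wavelength
# Amplitudes from the two slits at the screen (equal magnitude, path-dependent phase)
r_left = np.sqrt(L**2 + (y - d / 2) ** 2)
r_right = np.sqrt(L**2 + (y + d / 2) ** 2)
psi_left = np.exp(1j * k * r_left)
psi_right = np.exp(1j * k * r_right)

def screen_pattern(marker_overlap):
    """Detection probability when the which-path marker states have the given
    overlap: 1 = identical markers (no path info), 0 = orthogonal markers
    (full path info)."""
    return (np.abs(psi_left) ** 2 + np.abs(psi_right) ** 2
            + 2 * marker_overlap * np.real(np.conj(psi_left) * psi_right))

for overlap in (1.0, 0.5, 0.0):
    p = screen_pattern(overlap)
    visibility = (p.max() - p.min()) / (p.max() + p.min())
    print(f"marker overlap {overlap:.1f} -> fringe visibility {visibility:.2f}")
```

Running it shows the fringe visibility essentially equal to the marker overlap: full fringes when no path information exists, none at all when the paths are fully tagged.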


Nonlocality



Ordinarily, we think of separate objects as being independent of one another. They live on their own terms, and anything tying them together has to be forged by some tangible link. But consider two particles, A and B, which have come into contact, interacted for a brief while, and then flown apart. Each particle is described by (among other properties) its own position and momentum. The uncertainty principle insists that one of these can't be measured precisely without destroying knowledge of the other. However, because A and B have interacted and, in the eyes of quantum physics, have effectively merged to become one interconnected system, it turns out that the momentum of both particles taken together and the distance between them can be measured as precisely as we like. Suppose we measure the momentum of A, which we'll assume has remained behind in the lab where we can keep an eye on it. We can then immediately deduce the momentum of B without having to do any measurement on it at all. Alternatively, if we choose to observe the position of A, we would know, again without having to measure it, the position of B. This is true whether B is in the same room or a great distance away.
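The mathematical nub of this is that, although x̂ and p̂ for a single particle refuse to commute, the relative position of the pair and their total momentum commute perfectly well, and so can be pinned down together:

\[
[\hat{x}_A-\hat{x}_B,\ \hat{p}_A+\hat{p}_B]
\;=\;[\hat{x}_A,\hat{p}_A]-[\hat{x}_B,\hat{p}_B]
\;=\; i\hbar-i\hbar \;=\; 0 .
\]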

From Heisenberg's relationship, we know that measuring the position of, say, A will lead to an uncertainty in its momentum. Einstein, Podolsky, and Rosen pointed out, however, that by measuring the position of A, we gain precise knowledge of the position of B. Therefore, if we take quantum mechanics at face value, by our gaining precise knowledge of B's position, an uncertainty has been introduced into B's momentum. In other words, the state of B depends on what we choose to do with A in our lab. And, again, this is true whatever the separation distance may be. EPR considered such a result patently absurd. How could B possibly know whether it should have a precisely defined position or momentum? The fact that quantum mechanics led to such an unreasonable conclusion, they argued, showed that it was flawed – or, at best, that it was only a halfway house toward some more complete theory.

At the core of EPR's challenge is the notion of locality: the common sense idea that things can only be affected directly if they're nearby. To change something that's far away, there's a simple choice: you can either go there yourself or send some kind of signal. Either way, information or energy has to pass through the intervening space to the remote site in order to affect it. The fastest this can happen, according to Einstein's special theory of relativity, is the speed of light.

The trouble with entanglement is that it seems to ride roughshod over this important principle. It's fundamentally nonlocal. A measurement of particle A affects its entangled partner B instantaneously, whatever the separation distance, and without signal or influence passing between the two locations. This bizarre quantum connection isn't mediated by fields of force, like gravity or electromagnetism. It doesn't weaken as the particles move apart, because it doesn't actually stretch across space. As far as entanglement is concerned, it's as if the particles were right next to one another: the effect is as potent at a million light-years as it is at a millimeter. And because the link operates outside space, it also operates outside time. What happens at A is immediately known at B. No wonder Einstein used words such as "spook" and "telepathic" to describe – and deride – it. No wonder that as the author of relativity he argued that the tie that binds entangled particles is a physical absurdity. Any claim that an effect could work at faster-than-light speeds, that it could somehow serve to connect otherwise causally isolated objects, was to Einstein an intellectual outrage.

A close look at the EPR scenario reveals that it doesn't actually violate causality, because no information passes between the entangled particles. The information is already, as it were, built into the combined system, and no measurement can add to it. But entanglement certainly does throw locality out the window, and that development is powerfully counterintuitive. It was far too much for Einstein and his colleagues to accept, and they were firmly convinced that quantum mechanics, as it stood, couldn't be the final word. It was, they suggested, a mere approximation of some as yet undiscovered description of nature. This description would involve variables that contain missing information about a system that quantum mechanics doesn't reveal, and that tell particles how to behave before a measurement is carried out. A theory along these lines – a theory of so-called local hidden variables – would restore determinism and mark a return to the principle of locality.

The shock waves from the EPR paper quickly reached the shores of Europe. In Copenhagen, Bohr was once again cast into a fever of excitement and concern as he always was by Einstein's attacks on his beloved quantum worldview. He suspended all other work in order to prepare a counterstrike. Three months later, Bohr's rebuttal was published in the same American journal that had run the EPR paper. Basically, it argued that the nonlocality objection to the standard interpretation of quantum theory didn't represent a practical challenge. It wasn't yet possible to test it, and so physicists should just get on with using the mathematics of the subject, which worked so well, and not fret about the more obscure implications.


David Bohm's View on the EPR Paradox

                                                       David Joseph Bohm, FRS, London

Most scientists, whose interest was simply in using quantum tools to probe the structure of atoms and molecules, were happy to follow Bohr's advice. But a few theorists continued to dig away at the philosophical roots. In 1952, the American physicist David Bohm – who had been hounded out of his homeland during the McCarthy "Red Scare" inquisitions and later settled at Birkbeck College, London – came up with a variation on the EPR experiment that paved the way for further progress in the matter.[5] Instead of using two properties, position and momentum, as in the original version, Bohm focused on just one: the property known as spin.

The spin of subatomic particles, such as electrons, is analogous to spin in the everyday world but with a few important differences. Crudely speaking, an electron can be thought of as spinning around the way a basketball does on top of an athlete's finger. But whereas spinning basketballs eventually slow down, all electrons in the universe, whatever their circumstances, spin all the time and at exactly the same rate. What's more, they can only spin in one of two directions, clockwise or counterclockwise, referred to as spin-up and spin-down.

Bohm's revised EPR thought experiment starts with the creation, in a single event, of two particles with opposite spin. This means that if we measure particle A and find that it's spin-up, then, from that point on, B must be spin-down. The only other possible result is that A is measured to be spin-down, which forces B to be spin-up. Taking this second case as an example, we're not to infer, says quantum mechanics, that A was spin-down before we measured it and therefore that B was spin-up, in a manner similar to a coin being heads or tails. Quantum interactions always produce superpositions. The combined state of the pair in Bohm's revised EPR scenario is a mixed superposition that we can write as: psi = (A spin-up and B spin-down) + (A spin-down and B spin-up). A measurement to determine A's spin causes this wave function to collapse and a random choice to be made of spin-up or spin-down. At that very same moment, B also ceases to be in a superposition of states and assumes the opposite spin.
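In the more compact Dirac notation this combined state is the familiar spin singlet (written here in its conventional form, which carries a relative minus sign that the informal expression above glosses over):

\[
|\psi\rangle \;=\; \frac{1}{\sqrt{2}}\Bigl(|{\uparrow}\rangle_{A}|{\downarrow}\rangle_{B}\;-\;|{\downarrow}\rangle_{A}|{\uparrow}\rangle_{B}\Bigr),
\]

so a measurement that finds A spin-up collapses the pair onto the first term, instantly fixing B as spin-down, and vice versa.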

This is the standard quantum mechanical view of the situation, and it leads to the same kind of weird conclusion that troubled Einstein and friends. No matter how widely separated the spinning pair of particles may be, measuring the spin of one causes the wave function of the combined system to collapse instantaneously so that the unmeasured twin assumes a definite (opposite) spin state, too. The mixed superposition of states, which is the hallmark of entanglement, ensures nonlocality. Set against this is the Einsteinian view that "spooky action at a distance" stems not from limitations on what the universe is able to tell us but instead from limitations in our current knowledge of science. At a deeper, more basic level than that of wave functions and complementary properties are hidden variables that will restore determinism and locality to physics.


John Bell's inequality

John Stewart Bell at CERN


Bohm's new version of the EPR paradox didn't in itself offer a way to test these radically different worldviews, but it set the scene for another conceptual breakthrough that did eventually lead to a practical experiment. This breakthrough came in 1964 from the Northern Irish physicist John S. Bell, who worked at CERN, the European center for high-energy particle research, in Switzerland. Colleagues considered Bell to be the only physicist of his generation to rank with the pioneers of quantum mechanics, such as Niels Bohr and Max Born, in the depth of his philosophical understanding of the implications of the theory. What Bell found is that it makes an experimentally observable difference whether the particles described in the EPR experiment have definite properties before measurement, or whether they're entangled in a ghostlike hybrid reality that transcends normal ideas of space and time.

Bell's test hinges on the fact that a particle's spin can be measured independently in three directions, conventionally called x, y, and z, at right angles to one another. If you measure the spin of particle A along the x direction, for example, this measurement also affects the spin of entangled particle B in the x direction, but not in the y and z directions. In the same way, you can measure the spin of B in, say, the y direction without affecting A's spin along x or z. Because of these independent readings, it's possible to build up a picture of the complementary spin states of both particles. Because the effect is statistical, lots of measurements are needed in order to reach a definite conclusion. What Bell showed is that measurements of the spin states in the x, y, and z directions on large numbers of real particles could in principle distinguish between the local hidden variable hypothesis championed by the Einstein-Bohm camp and the standard nonlocal interpretation of quantum mechanics.

If Einstein was right and particles really did always have a predetermined spin, then, said Bell, a Bohm-type EPR experiment ought to produce a certain result. If the experiment were carried out on many pairs of particles, the number of pairs in which one particle is measured spin-up along x and its partner spin-up along y ("xy up") can never exceed the combined total of measurements showing xz up and yz up. This statement became known as Bell's inequality. Standard quantum theory, on the other hand, in which entanglement and nonlocality are facts of life, would be upheld if the inequality worked the other way around. The decisive factor is the degree of correlation between the particles, which is significantly higher if quantum mechanics rules.
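A toy Monte Carlo sketch of this logic is shown below. It compares a local hidden variable model, in which every pair carries predetermined, perfectly anti-correlated answers for each setting, with the standard singlet-state prediction P(up, up) = ½ sin²(θ/2). The three measurement directions used (coplanar, at 0°, 45°, and 90°) are an illustrative choice made here to expose the violation of a Wigner-type inequality; they are not the settings of any particular real experiment.

```python
# Toy comparison of a local-hidden-variable (LHV) model with quantum mechanics
# for a Wigner-type Bell inequality:
#     N(a+, b+) <= N(a+, c+) + N(c+, b+)
# Settings and sample sizes are illustrative choices, not from any real experiment.

import numpy as np

rng = np.random.default_rng(0)
n_pairs = 200_000
settings = {"a": 0.0, "b": np.pi / 2, "c": np.pi / 4}   # measurement angles (radians)


def lhv_counts(first, second):
    """LHV model: each pair carries predetermined +/-1 answers for a, b, c on
    particle A, with particle B giving exactly the opposite answers."""
    hidden = rng.choice([1, -1], size=(n_pairs, 3))       # columns: a, b, c for A
    idx = {"a": 0, "b": 1, "c": 2}
    A = hidden[:, idx[first]]
    B = -hidden[:, idx[second]]                           # perfect anti-correlation
    return np.count_nonzero((A == 1) & (B == 1))


def quantum_counts(first, second):
    """Singlet prediction: P(A=+ along alpha, B=+ along beta) = 1/2 sin^2((alpha-beta)/2)."""
    theta = settings[first] - settings[second]
    p_plus_plus = 0.5 * np.sin(theta / 2) ** 2
    return rng.binomial(n_pairs, p_plus_plus)


for name, counts in (("LHV", lhv_counts), ("quantum", quantum_counts)):
    ab = counts("a", "b")
    ac = counts("a", "c")
    cb = counts("c", "b")
    verdict = "satisfied" if ab <= ac + cb else "VIOLATED"
    print(f"{name:8s}: N(a+,b+)={ab}  N(a+,c+)+N(c+,b+)={ac + cb}  -> inequality {verdict}")
```

With those settings the hidden-variable counts always respect the inequality, while the singlet correlations give roughly 50,000 "ab up" pairs against only about 29,000 for the other two combinations combined – a clear violation.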

This was big news. Bell's inequality, although the subject of a modest little paper and hardly a popular rival to the first Beatles tour of America going on at the same time, provided a way to tell by actual experiment which of the two major, opposing visions of subatomic reality was closer to the truth.[6] Bell made no bones about what his analysis revealed: Einstein's ideas about locality and determinism were incompatible with the predictions of orthodox quantum mechanics. Bell's paper offered a clear experimental choice between the EPR/Bohmian local hidden variables viewpoint and Bohrian, nonlocal weirdness. The way that Bell's inequality was set up, its violation would mean that the universe was inherently nonlocal*, allowing particles to form and maintain mysterious connections with each other no matter how far apart they were. All that was needed now was for someone to come along and set up an experiment to see for whom Bell's inequality tolled.

But that was easier said than done. Creating, maintaining, and measuring individual entangled particles is a delicate craft, and any imperfection in the laboratory setup masks the subtle statistical correlations being sought. Several attempts were made in the 1970s to test Bell's inequality, but none was completely successful. Then a young French graduate student, Alain Aspect, at the Institute of Optics in Orsay, took up the challenge for his doctoral research.


*  If we want to study the applicability of classical probability theory, i.e. Bayes' theorem, to all of quantum mechanics, beyond the Copenhagen interpretation, we have to recognize that quantum entangled states, such as basic singlet states, are nonfactorizable and therefore will not follow the simple factorizable relations used in proving the consistency of Bayes' theorem with quantum probability functions.

This is again a result of the paradox that entanglement is a non-local effect, and therefore there isn't any possibility of decomposing or factorizing the density of states of such an entangled quantum state locally.

Even by applying a locality condition to the Bell inequalities – in the stochastic Clauser-Horne model, say – it can be shown that this (local) model, as applied to the singlet state and without using the quantum mechanical formalism, is not completely stochastic (i.e. there are possible configurations for which the model is deterministic). However, as soon as you apply the quantum mechanical formalism it becomes non-local again.

Even in experiment, the so-called Clauser-Horne inequalities – which correspond to fixed conditional probabilities for the orientations of the light polarisers applied to ensembles of entangled photons – show that unless the photons were exactly identical, their different conditional probability values could not be factorized.

So the paradox is based on non-locality, which is in direct contradiction to the purely local cause-and-effect framework of special relativity. Any theory combined with quantum mechanics becomes non-local; it has to, whenever any form of the quantum formalism is used. EPR is purely a paradox of relativity, not of quantum mechanics.


First Indirect Evidence of EPR Entanglement 



The first efforts to relate theory and thought experiment to actual experiment came from the pioneering work of the British-Australian physicist John Clive Ward, working with the British physicist Maurice Pryce, along with the work of the Chinese-American physicist Chien-Shiung Wu, one of the greatest experimental physicists of the 20th century.


Their work on formulating, and experimentally verifying, the probability amplitude for quantum entanglement was the first attempt to develop a way to detect such "spooky actions" in an EPR-type apparatus.

In a 1947 paper, published in Nature[7], Ward and Pryce were the first to calculate, and use, the probability amplitudes for the polarisation of two entangled photons moving in opposite directions.

For polarisations x and y, Ward derived this probability amplitude to be

ψ ∝ x1y2 − y1x2

which, once normalised, can be expressed as

ψ = (1/√2)(x1y2 − y1x2)

where the subscripts 1 and 2 refer to the two quanta propagating in different directions. Ward's probability amplitude is then applied to derive the correlation of the quantum polarisations of the two photons propagating in opposite directions.

This prediction was experimentally confirmed by Wu and Shaknov in 1950.[8] In current terminology this result corresponds to a pair of entangled photons and is directly relevant to a typical Einstein-Podolsky-Rosen (EPR) paradox.


Chien-Shiung Wu – often referred to as Madame Wu or the First Lady of Physics – of Columbia University was the first to give indirect evidence of entanglement in the laboratory.

She showed an Einstein-type correlation between the polarisation of two well-separated photons, which are tiny localised particles of light.

                                Chien-Shiung Wu at her Columbia University Physics Lab, 1963.

Madame Wu's work led to the confirmation of the Pryce and Ward calculations on the correlation of the quantum polarizations of two photons propagating in opposite directions. This was the first experimental confirmation of quantum results relevant to a pair of entangled photons as applicable to the Einstein-Podolsky-Rosen (EPR) paradox.

However, direct evidence of the EPR paradox – evidence that would completely isolate local effects and test the nonlocality of quantum phenomena – would require a few more decades and the invention of the laser, which allowed the French physicist Alain Aspect to build an experiment that we would recognize today as a quantum entanglement circuit.





Alain Aspect's experiment

                                                        Experimental Physicist Alain Aspect


Aspect was set upon his way by his supervising professor, Bernard d'Espagnat, whose career centered around gathering experimental evidence to uncover the deep nature of reality. "I had the luck," said d'Espagnat, "to discover in my university a young French physicist, Alain Aspect, who was looking for a thesis subject and I suggested that testing the Bell inequalities might be a good idea. I also suggested that he go and talk to Bell, who convinced him it was a good idea and the outcome of this was that quantum mechanics won.”

Aspect's experiment used particles of light – photons – rather than material particles such as electrons or protons. Then, as now, photons are by far the easiest quantum objects from which to produce entangled pairs. There is, however, a minor complication concerning the property that is actually recorded in photon-measuring experiments such as Aspect's or those of other researchers we'll be talking about later. Both Bell and Bohm presented their theoretical arguments in terms of particle spin. Photons do have a spin (they're technically spin-1 particles), but because they travel at the speed of light, their spin axes always lie exactly along their direction of motion, like that of a spinning bullet shot from a rifle barrel. You can imagine photons to be right-handed or left-handed depending on which way they rotate as you look along their path of approach. What's actually measured in the lab isn't spin, however, but the very closely related property of polarization.

Effectively, polarization is the wavelike property of light that corresponds to the particlelike property of spin. Think of polarization in terms of Maxwell's equations, which tell us that the electric and magnetic fields of a light wave oscillate at right angles to each other and also to the direction in which the light is traveling. The polarization of a photon is the direction of the oscillation of its electric field: up and down, side to side, or any orientation in between. Ordinarily, light consists of photons polarized every which way. But if light is passed through a polarizing filter, like that used in Polaroid sunglasses, only photons with a particular polarization – the one that matches the slant of the filter – can get through. (The same happens if two people make waves by flicking one end of a rope held between them. If they do this through a gap between iron railings, only waves that vibrate in the direction of the railings can slip through to the other side.)

Aspect designed his experiment to examine correlations in the polarization of photons produced by calcium atoms – a technique that had already been used by other researchers. He shone laser light onto the calcium atoms, which caused the electrons to jump from the ground state to a higher energy level. As the electrons tumbled back down to the ground state, they cascaded through two different energy states, like a two-step waterfall, emitting a pair of entangled photons – one photon per step – in the process.

                                            
                                                  Illustration of the Aspect Experiment

The photons passed through a slit, known as a collimator, designed to reduce and guide the light beam. Then they fell into an automatic switching device that randomly sent them in one of two directions before arriving, in each case, at a polarization analyzer – a device that recorded their polarization state.

An important consideration in Aspect's setup was the possibility, however small, that information might leak from one photon to its partner. It was important to rule out a scenario in which a photon arrived at a polarization analyzer, found that polarization was being measured along, say, the vertical direction, and then somehow communicated this information to the other photon. (How this might happen doesn't matter: the important thing was to exclude it as an option.) By carefully setting up the distances through which the photons traveled and randomly assigning the direction in which the polarization would be measured while the photons were in flight, Aspect ensured that under no circumstances could such a communicating signal be sent between photons. The switches operated within 10 nanoseconds, while the photons took about 20 nanoseconds to travel the 6.6 meters to the analyzers. Any signal crossing from one analyzer to the other at the speed of light would have taken more than 40 nanoseconds to complete the journey – much too long to have any effect on the measurement.
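A quick arithmetic check of that timing argument, taking the distances quoted above at face value and using c ≈ 3 × 10⁸ m/s:

```python
# Back-of-the-envelope check of the timing in Aspect's locality argument.
# Distances are the rounded values quoted in the text above.
c = 3.0e8                 # speed of light, m/s
source_to_analyzer = 6.6  # m, each arm (as quoted above)
switch_time_ns = 10       # switching period of the optical switches, ns

photon_flight_ns = source_to_analyzer / c * 1e9
signal_between_analyzers_ns = 2 * source_to_analyzer / c * 1e9

print(f"photon flight time per arm        : ~{photon_flight_ns:.0f} ns")
print(f"light-speed signal analyzer-to-analyzer: ~{signal_between_analyzers_ns:.0f} ns")
print(f"switch period: {switch_time_ns} ns -> settings change while the photons are in flight")
```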

In a series of these experiments in the 1980s, Aspect's team showed what most quantum theorists had expected all along: Bell's inequality was violated.[9] The result agreed completely with the predictions of standard quantum mechanics and discredited any theories based on local hidden variables. More recent work has backed up this conclusion. What's more, these newer experiments have included additional refinements designed to plug any remaining loopholes in the test. For example, special crystals have enabled experimenters to produce entangled photons that are indistinguishable, because each member of the pair has the same wavelength. Such improvements have allowed more accurate measurements of the correlation between the photons. In all cases, however, the outcomes have upheld Aspect's original discovery. Entanglement and nonlocality are indisputable facts of the world in which we live, even if we may find them uncomfortable or "spooky", as Einstein himself did.

Einstein found it "spooky" because the experiment presents a paradox for special relativity, which is based on the idea of local causality: for any given events in space-time, cause and effect must be separated by a time lapse that depends on their distance and the speed of light. In quantum entanglement a measurement, of spin say, on one particle allows you to know the spin of the other particle without disturbing it, such that the measurement of one particle imposes an equal, but opposite, property – spin, in this case – on the other.

This is a paradox for special relativity because no event can affect another in a non-local way; influences must be mediated by signals. However, the entangled particles, though separated in space, are not separated in time. They both share a common time, although mathematically the time of one particle is real and that of the other is complex (i.e. described using complex or imaginary numbers). So in unifying quantum mechanics with special relativity we get scenarios where, instead of the states being predetermined, one state measures time in a negative frame relative to the other, and the two cancel each other out.

This is why, in drawing Feynman diagrams in space-time, it appears that antimatter particles move backwards in time relative to the matter particles – in pair production events, for example. The particles do not travel backwards in time; this is just a consequence of unifying a local theory, special relativity, with a non-local theory, quantum mechanics. The paradox lies within this: why should the local nature vanish? We know "how" to interpret it, but we cannot really know the "why".

Niels Bohr felt that we have no real right to know the "why", which displeased Einstein, since all of his theories relied on a local space-time, and accepting quantum correlations seemed to mean violating the cosmic speed limit. The setting up of a quantum system, however, is determined by local events: the particles must, after all, be carried around under local cause-and-effect processes limited by the speed of light. So that part of the paradox can be ironed out by whatever sets up the system – me bringing one entangled particle to the Moon and leaving its partner on Earth, say. But as for the determination of the states of the particles themselves, in special relativity alone the paradox remains, so we must interpret it using quantum mechanical non-locality.

It's kind of like saying "we see the world is flat around us, in a local frame, but by travelling around it, in a non-local frame, we know it is round". Experimental determination of the entanglement effect, which has now been going on for almost 30 years, is our equivalent of circumnavigating the globe: it tells us the world is round despite its locally appearing flat.

Applications of Quantum Entanglement


The phenomenon of entanglement has already begun to be exploited for practical purposes. In the late 1980s, theoreticians started to see entanglement not just as a puzzle and a way to penetrate more deeply into the mysteries of the quantum world, but also as a resource. Entanglement could be exploited to yield new forms of communication and computing. It was a vital missing link between quantum mechanics and another field of explosive growth: information theory. The proof of nonlocality and the quickly evolving ability to work with entangled particles in the laboratory were important factors in the birth of a new science. Out of the union of quantum mechanics and information theory sprang quantum information science – the fast-developing field whose most important branches are quantum cryptography, quantum teleportation, and quantum computing.

To see some of the applications of quantum entanglement in the context of quantum computer technology, see my article on quantum computer physics and architecture.

To see a nice example of merging quantum information theory with game theory, see my short article here.

References


[1] Einstein, A., B. Podolsky, and N. Rosen. "Can a quantum-mechanical description of physical reality be considered complete?" Physical Review 47 (1935): 777-80.

[2] Schrödinger, E. "Discussion of probability relations between separated systems." Proceedings of the Cambridge Philosophical Society 31 (1935): 555-63.

[3] Scully, M. O., B. G. Englert, and H. Walther. "Quantum optical tests of complementarity." Nature 351 (1991): 111-16.

[4] Dürr, S., T. Nonn, and G. Rempe. "Origin of quantum-mechanical complementarity probed by a 'which-way' experiment in an atom interferometer." Nature 395 (1998): 33.

[5] Bohm, D. "A suggested interpretation of the quantum theory in terms of 'hidden' variables." Physical Review 85 (1952): 166-93.

[6] Bell, J. S. "On the Einstein-Podolsky-Rosen paradox." Physics 1 (1964): 195-200.

[7] Pryce, M. H. L., and J. C. Ward. "Angular correlation effects with annihilation radiation." Nature 160 (1947): 435.

[8] Wu, C. S., and I. Shaknov. "The angular correlation of scattered annihilation radiation." Physical Review 77 (1950): 136.

[9] Aspect, A., P. Grangier, and G. Roger. "Experimental tests of realistic local theories via Bell's theorem." Physical Review Letters 47 (1981): 460.




Quantum Entanglement Documentary Film, on which this article is based:



