The Future of Quantum Entanglement

We have all heard stories about strange, unexplained connections between kindred people or objects, and we know they are irrational. Twins separated at birth and reunited after decades apart find that they have given their children the same names, hold the same jobs, and suffer coinciding misfortunes: one is struck by lightning, and then the other mysteriously has a heart attack. Despite the fantastical nature of these tales, recent research in quantum mechanics has shown that just this sort of eerie relationship, albeit not on the macroscopic scale, may have a physical explanation. This invisible fabric may be sewn by the threads of quantum entanglement.

If the reader has no idea what quantum entanglement is, there are some basic facts worth knowing. Quantum entanglement is not a force, not energy, and not matter. It is a physical resource that arises from the mathematical formalism of quantum mechanics, and it can be exploited only in conjunction with classical resources (force, matter, or energy). On the other hand, if the reader has heard the phrase and expects the phenomenon to be demystified in this article, read no further. Simply put, quantum entanglement remains largely uncharted territory. At most, we can hope to grasp the point at which quantum entanglement fundamentally diverges from classical thinking, and to look down the new pathways of technology it can reveal.

Quantum computing would use qubits instead of bits, and computing potential could be enormously multiplied once the proper quantum algorithms are engineered.


Ever since the Golden Age of physics, the theory of special relativity has put a speed limit on the universe: the speed of light, roughly 300,000,000 meters per second. This upper limit dictates how fast our computers load a webpage, how much energy we can derive from uranium, and which stars we currently see in the night sky, some of which may have already exploded in supernovae. But unlike the speed limits on our roads, which we all know are more like guidelines, the speed of light cannot be exceeded by anything. For example, we would be certain to see the light from a supernova long before the blast itself reached us. This paradigm gave rise to the principle of locality, which, together with the authority of the speed of light, was thought to govern all things, including electricity, radiation, and the particles in the Large Hadron Collider.

Now, imagine you have two tiny particles that are generated at some point in space and shoot off in opposite directions toward detectors on either side of their origin. At each side there are two different detectors, say one square and one round. Based on some inherent quantum properties of the particles, the detectors flash either red or blue. It is important to note that although the square and round detectors are different, they both yield the same probabilities of flashing red and blue. According to quantum mechanics, the particles are neither “red” nor “blue” until you measure them, and you cannot predict which color the detector will flash when you do. On any given measurement, either detector could show red (R) or blue (B), giving a total of four equally likely possibilities (RR, BB, RB, BR) that cycle randomly. Now imagine we cast a spell on these particles by putting them in close proximity and exposing them to a common force field in a particular way. From now until the end of time, these two particles are strangely married: even though they may end up light-years apart and each detector is free to flash either red or blue, they must always flash the same color when measured with the same type of detector. We still have four possibilities, but the chances of the detectors flashing the same color are now somehow greater than the chances of them flashing different colors, because they must match whenever both detectors are round or both are square. At this point, if the reader has followed closely, he or she should be quite confused. Impossible though it may seem, this kind of experiment has been reproduced conclusively in many academic laboratories.
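For readers who like to see the numbers, here is a small Python sketch that simply tallies the statistics described above. It is pure bookkeeping, not a physical model, and names like `entangled_pair` are invented for illustration.

```python
import random

TRIALS = 100_000

def entangled_pair(shape_a, shape_b):
    """Toy bookkeeping for the rule described above: each flash is a fair
    coin flip, but identical detector shapes must yield identical colors."""
    color = random.choice("RB")
    if shape_a == shape_b:
        return color, color                # same shape: always the same color
    return color, random.choice("RB")      # different shapes: independent

same = 0
for _ in range(TRIALS):
    shape_a = random.choice(["square", "round"])
    shape_b = random.choice(["square", "round"])
    a, b = entangled_pair(shape_a, shape_b)
    same += (a == b)

print(f"same color in {same / TRIALS:.1%} of trials")
# Roughly 75%: always matching when the shapes agree (half the time), and
# 50/50 when they differ, versus a flat 50% for unentangled pairs.
```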

This paradox, which Einstein vehemently and famously rejected as incomplete, violates the concept of “locality.” Ironically, it was Einstein himself who, in trying to expose quantum entanglement as smoke and mirrors (see the Einstein-Podolsky-Rosen (EPR) thought experiment), supplied the ingredients for the first proof that there were no local “hidden variables,” and that entanglement was, as far as classical (Boolean) logic is concerned, magic (1).

The final verdict on the EPR paradox was delivered by Bell’s Theorem, which essentially showed that the correlations in the results were too strong to be simply random, yet not strong enough to be deterministic. In other words, if there were some shared instruction set undetectable to humans (as Einstein believed), and the hypothetical particles already knew which colors both detectors would flash before they arrived, then each new iteration using the same detector set-up would yield the same results, which was simply not the case. If the outcomes were merely probabilistic, however, there would be no explanation for the same-detector-same-color correlation (1).
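Bell’s insight can be made quantitative. In the standard CHSH form of his theorem (a detail not spelled out above), any “shared instruction set” keeps a certain combination of correlations, S, between -2 and 2, while quantum mechanics predicts values as large as 2√2. A quick check, assuming the textbook quantum correlation E(a, b) = -cos(a - b) for an entangled pair measured along angles a and b:

```python
import math

# Quantum prediction for the correlation between measurements along
# angles a and b on an entangled pair.
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH measurement angles (radians).
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # ~2.828 = 2*sqrt(2)
# Any local hidden-variable ("shared instruction set") theory obeys
# |S| <= 2, so the quantum value violates Bell's bound.
```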

So what do supernovae, computers, and nuclear reactors have to do with quantum entanglement? They could all potentially be affected (and indeed are increasingly affected every year) by what Einstein sarcastically called “spooky action at a distance,” or non-locality. Consider this: if our universe truly was produced in a massive explosion from a singularity in the “Big Bang,” is it not possible that everything in our universe is still entangled somehow, and physical distance is merely an illusion?

The area in which quantum entanglement has garnered the most success in recent years is quantum computing. The main difference between a quantum computer and a regular computer is the way each stores information. A regular computer stores its information in bits that can be toggled from a “0” (off) to a “1” (on) by energizing a transistor with an electrical current. A quantum computer uses a different unit of information called the quantum bit, or qubit. Unlike a regular bit, a qubit is not limited to two states (0 or 1); it can occupy any combination of those states simultaneously. Basically, instead of always answering a question with “yes” or “no,” it can answer with “maybe.” And rather than depending solely on electrons cycling through a circuit over and over, the state of a qubit is linked to the states of any qubits entangled with it, so that measuring one instantly constrains the others.
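A minimal sketch of what this looks like in the standard linear-algebra picture of qubits may help. This is textbook formalism, not any particular machine: a Hadamard gate creates the “maybe,” and a CNOT gate entangles two qubits into a Bell state.

```python
import numpy as np

# One qubit is a 2-vector of complex amplitudes; two qubits form a
# 4-vector over the basis |00>, |01>, |10>, |11>.
zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard puts the first qubit into an equal "maybe" of 0 and 1 ...
plus = H @ zero
# ... and CNOT entangles it with the second qubit, giving a Bell state.
bell = CNOT @ np.kron(plus, zero)

print(np.round(bell, 3))    # [0.707 0 0 0.707] = (|00> + |11>)/sqrt(2)
probs = np.abs(bell) ** 2
print(np.round(probs, 3))   # 50% |00>, 50% |11>, never |01> or |10>
# Measuring either qubit immediately tells you the other's outcome.
```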

Unfortunately, quantum computing is still in its infancy, and it has many limitations. The most daunting of these is the fact that although quantum entanglement can provide correlation, it cannot provide causation, because qubits cannot be predicted or forced into a particular state. This means quantum computing must still employ some classical information channels, and the connections among the many bits and qubits inside a computer must be orchestrated with very clever “quantum algorithms.” Controlling these disobedient qubits is often costly, and so far only certain types of algorithms, such as those for breaking encryption, have been proven significantly faster than their counterparts on regular bits. Even so, the advancements have already begun to stir up controversy: secure connections and bank accounts, which today can be breached only by running all the computers in the world for years, could potentially be cracked in minutes by a quantum computer.
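To put rough numbers on the “years versus minutes” claim, here is a back-of-the-envelope comparison for a 2048-bit key, with constants and lower-order terms deliberately ignored, between the best known classical factoring method and Shor’s quantum factoring algorithm:

```python
import math

# Rough operation counts for factoring an n-bit number N.
# Order-of-magnitude sketch only; constants and o(1) terms are dropped.
n_bits = 2048
lnN = n_bits * math.log(2)

# Best known classical method (general number field sieve):
# exp( (64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3) )
gnfs = math.exp((64 / 9) ** (1 / 3) * lnN ** (1 / 3)
                * math.log(lnN) ** (2 / 3))

# Shor's quantum algorithm: roughly (log N)^3 gate operations.
shor = n_bits ** 3

print(f"classical ~ 10^{math.log10(gnfs):.0f} operations")  # ~10^35
print(f"quantum   ~ 10^{math.log10(shor):.0f} operations")  # ~10^10
```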

The Crab Nebula is actually the remnant of a supernova that became visible from Earth in 1054 AD. Quantum entanglement may have dictated how the event took place.


Another problem is simply one of practicality. Just as regular transistors were once the size of light bulbs, quantum computers are cantankerous and weak at this early stage in their development. The nature of quantum mechanics dictates that the environment of the particles that carry the information must be nearly perfect (most designs must be super-cooled in a vacuum), or else the waveforms will collapse, causing errors that greatly diminish computational efficiency (2).
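What this “collapse” looks like can be illustrated in the standard density-matrix formalism. The sketch below applies a textbook dephasing (phase-flip) channel, one common model of quantum noise, and watches a qubit’s coherence leak away:

```python
import numpy as np

# A qubit in an equal superposition, written as a density matrix.
# The off-diagonal terms are the "quantumness" that noise destroys.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

def dephase(rho, p):
    """Phase-flip (dephasing) channel: with probability p the environment
    randomly flips the qubit's phase, shrinking the off-diagonal terms."""
    Z = np.diag([1, -1]).astype(complex)
    return (1 - p) * rho + p * Z @ rho @ Z

for step in range(4):
    print(f"step {step}: coherence = {abs(rho[0, 1]):.3f}")
    rho = dephase(rho, 0.25)
# Coherence decays 0.500 -> 0.250 -> 0.125 -> ... while the 0/1
# probabilities on the diagonal stay fixed: the quantum information
# leaks away without any outright "bit flip".
```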

Recently, Lorenza Viola, a Dartmouth professor in the Department of Physics and Astronomy, has engineered a method to significantly reduce this noise, or “decoherence.” Her work is part of the advanced research that will be required to streamline and control the physical resource of quantum entanglement in the future.

“Our approach does not identify new, potentially even more efficient quantum algorithms. However it does provide concrete schemes for substantially reducing the effect of ‘quantum noise’ (so-called decoherence) in physical implementations of quantum bits and quantum logic gates.  As such, we believe that our results might help to bring reliable quantum computers a bit closer to a technological reality than they are today,” said Viola when asked how her research has impacted the overall sphere of quantum computer technology.
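Viola’s specific schemes are beyond the scope of this article, but the general flavor of fighting decoherence can be illustrated with the classic spin-echo trick. The toy example below is our own illustration, not her method: a qubit that dephases under random static frequency errors can be refocused by flipping it halfway through the evolution.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
T = 1.0
# Each run of the experiment sees a random (but static) frequency error.
offsets = rng.normal(0.0, 5.0, size=100_000)

# Free evolution: every qubit's phase drifts by offset * T, and averaging
# over the ensemble washes the coherence out to nearly zero.
free = abs(np.mean(np.exp(1j * offsets * T)))

# Echo: flipping the qubit at T/2 reverses the drift, so the second half
# of the evolution exactly cancels the first for any static offset.
echo_phases = offsets * (T / 2) - offsets * (T / 2)
echo = abs(np.mean(np.exp(1j * echo_phases)))

print(f"coherence without echo: {free:.4f}")  # ~0.0000 (dephased)
print(f"coherence with echo:    {echo:.4f}")  # 1.0000 (refocused)
```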

We were raised to think that if Colonel Mustard was in the game room with a wrench while Professor Plum met his demise in the bedroom, there was no way Colonel Mustard could be guilty of (or related to) the crime. However, when Erwin Schrödinger showed up on the scene with his imaginary numbers and non-existent cat, everything changed. Our idea of “common sense” continues to evolve before our eyes. Results from a recent experiment called the “delayed choice quantum eraser” suggest that quantum entanglement can be manipulated in such a way that one event can influence another event in the past (3). Even if we cannot fully understand how this is possible, we can still treat it as a resource to be exploited. To our bafflement, as Dartmouth physics professor Barrett Rogers once said, “You can’t ask the equations why; they just work.”

References

1. N. D. Mermin, Boojums All the Way Through: Communicating Science in a Prosaic Age (Cambridge University Press, 1990).
2. D. P. DiVincenzo, Science 270, 255–261 (1995).
3. Y. Kim, R. Yu, S. P. Kulik, Y. Shih, M. O. Scully, Phys. Rev. Lett. 84, 1–5 (2000).
