It’s one of technology’s hottest (and most elusive) goals for the 21st century: quantum computing. You’ve probably heard talk of these powerful machines, which have the potential to completely transform our computing capabilities and upend modern data security. Although the foundations of this technology have already been laid down in research labs, we haven’t yet been able to develop quantum computers that can overtake their classical counterparts. However, some recent research from a group at the University of Maryland and NIST’s Joint Quantum Institute may bring us one step closer.
Still speaking classically, we’re able to perform computations by running raw data (encoded in a series of bits) through gates, physical devices that each perform some function on the data. A gate might invert the state of a single bit (this is known as a NOT gate), or it might compare two separate pieces of information and adjust its output based on the relationship between them. For example, an AND gate produces a 1 if both of its inputs are 1s, and a 0 otherwise. An OR gate yields a 1 if either of its inputs is a 1, and a 0 if both are 0s. These basic functions are at the heart of modern computers: by setting up many, many such gates to check and modify enormous numbers of bits, we can construct systems that do almost anything we need them to. Almost, but not quite.
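If you prefer to see things in code, here’s a minimal sketch of those three gates in Python, with each bit modeled as a plain 0 or 1. (The function names and the truth-table printout are just illustration; real gates are physical circuitry, not function calls.)

```python
def NOT(a):
    """Invert a single bit: 0 -> 1, 1 -> 0."""
    return 1 - a

def AND(a, b):
    """Output 1 only if both inputs are 1."""
    return a & b

def OR(a, b):
    """Output 1 if either input is 1."""
    return a | b

# Print the truth tables for every input combination.
for a in (0, 1):
    print(f"NOT({a}) = {NOT(a)}")
for a in (0, 1):
    for b in (0, 1):
        print(f"AND({a},{b}) = {AND(a, b)}   OR({a},{b}) = {OR(a, b)}")
```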
You see, certain processes—including complex physics simulations and some artificial intelligence applications—require a different type of computer. For all their power, classical computers are fundamentally constrained by that notion of a bit. To see why, let’s look at a simple example.
Imagine we have a system with just two bits (in real life, most have billions). Since each of those bits has two possible states, 0 and 1, the system collectively has four: 00, 01, 10, 11. If we want to run all possible combinations of this two-bit system through a gate, it’ll take four times the computing power (which usually translates to four times as long) compared to performing the calculation on just one configuration. That’s not major news to anyone. But the count doubles with every bit we add: an n-bit system has 2^n possible configurations. So when we start looking at more complex pieces of data with many bits, it’s actually quite easy to run into computations that can’t be completed within the next millennium, which is a long time to wait.
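To see the blow-up in action, here’s a quick Python sketch (my own illustration, not anyone’s benchmark). It enumerates every configuration of an n-bit system, and the count grows as 2^n:

```python
from itertools import product

def all_configurations(n_bits):
    """Every possible setting of an n-bit system: 2**n of them."""
    return list(product((0, 1), repeat=n_bits))

# Two bits give the four configurations listed above...
print(all_configurations(2))   # [(0,0), (0,1), (1,0), (1,1)]

# ...but the count doubles with every added bit.
for n in (2, 10, 50, 300):
    print(f"{n} bits -> {2 ** n} configurations")
# At 300 bits there are already more configurations than atoms
# in the observable universe; no classical machine can try them
# one at a time.
```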
Enter quantum mechanics. In the quantum world, things act funny: certain particles, like electrons and photons, can take on indeterminate states, meaning their exact configuration isn’t settled until we measure it. For example, an electron has two possible spin states (sound familiar?), called spin-up and spin-down. Any measurement will find the electron in one of those two states, but under the right conditions it can be in both at once! If that doesn’t sound remarkable, consider how you would react if the result of your coin toss was both heads and tails.
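Here’s a toy sketch of that idea in Python (my own cartoon, not real quantum simulation software). The electron’s two spin states each get an amplitude, and every measurement forces a single definite outcome, with probabilities given by the squared amplitudes:

```python
import random

# Toy model of a spin qubit in an equal superposition:
# amplitude 1/sqrt(2) on "up" and 1/sqrt(2) on "down".
amp_up = amp_down = 2 ** -0.5

# The squared amplitudes give the measurement probabilities.
p_up = amp_up ** 2        # 0.5
p_down = amp_down ** 2    # 0.5

# Before measurement the electron is in both states at once;
# measuring it forces one definite answer each time.
outcomes = ["up" if random.random() < p_up else "down"
            for _ in range(10)]
print(outcomes)   # e.g. ['up', 'down', 'down', 'up', ...]
```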
As it turns out, this could be a boon for solving those complicated problems we can’t complete on current computers. If a quantum bit, or qubit, can be in both state 1 and state 0 at once, we can use clever manipulation of this fact to save time on computations. Going back to our two-bit system: if we use qubits rather than classical bits, we could theoretically test the calculation on every possible combination at once! By performing just one calculation rather than four, we’ve cut down the time to a quarter of what it was before. And as the number of bits in the system increases, so does the time we can potentially save by switching to quantum computers. Computer scientists have already developed a host of algorithms that are ready to run on quantum computing systems.
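One rough way to see where the savings come from is to simulate a tiny state vector, as in the NumPy sketch below (again my own illustration; real quantum hardware doesn’t run NumPy, and reading out a useful answer is where the real algorithmic cleverness lives). A single gate application updates the amplitude of every configuration in one step:

```python
import numpy as np

# Amplitudes for the four configurations 00, 01, 10, 11.
# Start with the first qubit in superposition: an equal mix
# of 00 and 10.
state = np.array([2 ** -0.5, 0.0, 2 ** -0.5, 0.0])

# A gate on n qubits is a 2**n x 2**n unitary matrix. This one
# applies NOT to the second qubit (swapping 00<->01 and 10<->11).
not_on_second = np.array([[0., 1., 0., 0.],
                          [1., 0., 0., 0.],
                          [0., 0., 0., 1.],
                          [0., 0., 1., 0.]])

# One matrix product updates both branches of the superposition
# in a single step: 00 -> 01 and 10 -> 11 simultaneously.
state = not_on_second @ state
print(state)              # amplitude 1/sqrt(2) on 01 and on 11

# Measurement then yields just one configuration, with
# probability |amplitude|**2 for each.
print(np.abs(state) ** 2)  # [0. , 0.5, 0. , 0.5]
```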
The only problem is that practical quantum computers don’t exist yet. That’s partly because quantum states are notoriously delicate and tend to fall apart, or decohere, before we’re able to perform operations on them. And it’s partly because designing quantum gates to perform those operations is really difficult in the first place.
Just like classical bits, qubits come in a variety of physical implementations, from electron spins to trapped ions. Photons, however, present an ideal basis for a qubit, in large part because they are quite stable as quantum systems go. “They barely interact with the environment and therefore are immune to many of the decoherence mechanisms,” explains Dr. Shuo Sun of the University of Maryland. Furthermore, photons can travel long distances without alteration; given the right instruments, we can see galaxies millions of light years away with stunning resolution. This property makes them ideal for telecommunications and quantum networking. However, the same reluctance to interact that makes photons so stable also makes them hard to manipulate, which throws a wrench into the whole computing mechanism: when two photons cross paths, they travel right on through one another unperturbed, unless they’re in a special medium.
The work reported by the University of Maryland/NIST team in the July 6 issue of Science outlines a mechanism by which the presence of a single photon can in fact change the future behavior of other photons. Their solid-state transistor is essentially a semiconductor etched with a honeycomb-like pattern of holes that bounces an incoming photon around, trapping it. At the transistor’s center lies a quantum dot, which can register the photon’s presence and record it: a simple form of quantum memory.
The quantum dot itself is a nanocrystal embedded in a gallium arsenide substrate, and it traps a single electron. That electron has a particular spin, but when a photon approaches the quantum dot, the spin is flipped. Since the quantum dot is modified by an incoming photon, it can serve as a record of photons passing through, much like the NOT gate mentioned above! The transistor can then use this record to control the transmission of future particles.
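As a cartoon of that mechanism (emphatically not the team’s actual physics, and with every name below hypothetical), you can picture the device as a tiny state machine: a stored spin flips when a “gate” photon arrives, and that stored bit then decides whether later photons get through:

```python
class ToyPhotonTransistor:
    """Cartoon model of the single-photon transistor: a trapped
    electron's spin acts as one bit of memory. (A toy classical
    sketch; the real device works with quantum states and
    superpositions, not Python booleans.)"""

    def __init__(self):
        self.spin_up = True  # the stored electron spin

    def gate_photon(self):
        """A single 'gate' photon flips the stored spin --
        the NOT-gate-like write described above."""
        self.spin_up = not self.spin_up

    def signal_photon(self):
        """Later 'signal' photons are transmitted or blocked
        depending on the recorded spin."""
        return "transmitted" if self.spin_up else "reflected"

t = ToyPhotonTransistor()
print(t.signal_photon())   # transmitted
t.gate_photon()            # one photon flips the memory...
print(t.signal_photon())   # ...and now later photons are reflected
```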
The transistor developed by the team is tiny and fast: about one million would fit into a grain of salt, and it can theoretically process 10 billion qubits per second. This makes it an ideal candidate for compact quantum systems, which until now have required unwieldy and complicated setups like laser traps. Although the transistor is still a “proof of concept,” as Sun cautions, it certainly seems to be a promising lead that may push forward the field of photonic quantum computing!
—Eleanor Hook