A good way to understand the history of quantum computing (QC) is to look at the progress made by the National Institute of Standards and Technology (NIST). In the 1990s, NIST researchers created qubits using technology drawn from so-called "atomic clocks." One of the most accurate atomic clocks, NIST-F2, keeps time to within a millionth of a billionth of a second. That may seem like overkill, but GPS satellites, which run on similar technology, depend on such precision to send time-stamped signals that help pinpoint locations anywhere on the planet's surface to within a few meters.
QC makes use of the same principles as atomic clocks. Today, researchers at NIST's "Workshop On Quantum Computing & Communication" continue to make progress in this arcane area, building on technology that, in its earliest incarnations, was simply called the "atomic clock." The catch is that this is about far more than keeping accurate time; it is about exploiting the strange properties of the subatomic world to perform valuable computer processing (and data culling).
The idea is to capitalize on ions' wave-particle duality. In its wave form, an ion exhibits what might be thought of as peaks and valleys. (The peaks represent where the ion is most likely to be found when measured.) These peaks and valleys combine, or "interfere," to form a new wave pattern. By manipulating this process, scientists create patterns that can be converted back into qubit values, thereby yielding useful information.
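To make interference concrete, here is a minimal sketch in plain Python (not how real quantum hardware is programmed): a qubit is modeled as two amplitudes, and applying the standard Hadamard gate twice shows one "path" canceling out, leaving a definite answer.

```python
# A toy model of quantum interference using plain Python numbers.
# A qubit is a pair of amplitudes [a0, a1]; the probability of
# measuring 0 or 1 is the squared magnitude of each amplitude.
from math import sqrt

def hadamard(state):
    """Hadamard gate: splits each amplitude into two interfering paths."""
    a0, a1 = state
    return [(a0 + a1) / sqrt(2), (a0 - a1) / sqrt(2)]

state = [1.0, 0.0]          # start in state 0
state = hadamard(state)     # equal superposition of 0 and 1
state = hadamard(state)     # the two paths interfere

probs = [abs(a) ** 2 for a in state]
print(probs)  # the "1" outcome cancels out: approximately [1.0, 0.0]
```

The cancellation in the second step is the same wave interference the paragraph above describes, just written as arithmetic on amplitudes.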
When it comes to QC, the building blocks for processing information are quantum logic gates. By arranging such gates in a circuit, data scientists can create a kind of flowchart that enables a computer to carry out a wide variety of logical operations, including highly complex calculations. Lasers can be used to put an ion's internal energy state into a superposition of 0 and 1, allowing the gate to process multiple possibilities simultaneously (unlike an ordinary logic gate, whose inputs are only ever 0 or 1).
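A small simulation can illustrate gates acting on a superposition. This sketch (my own illustration, not NIST code) tracks a two-qubit register as four amplitudes and runs a tiny circuit: a Hadamard gate followed by a CNOT gate, so the second gate acts on both branches of the superposition at once.

```python
# Sketch: a tiny gate-based circuit on a 2-qubit state vector.
# The state is four amplitudes [a00, a01, a10, a11].
from math import sqrt

def h_on_first(state):
    """Hadamard on the first qubit: puts it in a superposition of 0 and 1."""
    a00, a01, a10, a11 = state
    h = 1 / sqrt(2)
    return [h * (a00 + a10), h * (a01 + a11),
            h * (a00 - a10), h * (a01 - a11)]

def cnot(state):
    """CNOT gate: flip the second qubit wherever the first qubit is 1."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

state = [1.0, 0.0, 0.0, 0.0]        # both qubits start at 0
state = cnot(h_on_first(state))     # the CNOT processes both branches at once
print(state)  # approximately [0.707, 0, 0, 0.707]
```

The final state has weight only on "00" and "11": the single CNOT operated on both possibilities created by the Hadamard, which is exactly the parallelism the paragraph describes.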
Each additional qubit doubles the number of combinations the gates can process at the same time, so capacity grows exponentially with each new qubit. Because qubit states are fragile, though, a measurement can extract only a small amount of information about the result of a computation. One way to work with such fragile states is to build microscopic "quantum drums" whose vibrations come in tiny packets of energy known as phonons.
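The gap between exponential capacity and limited measurement can be shown in a few lines. In this sketch (a toy model, with the register size chosen arbitrarily), an n-qubit register holds 2**n amplitudes, yet a simulated measurement collapses all of them to a single n-bit outcome.

```python
# Sketch: 2**n amplitudes in, but only n bits out per measurement.
import random
from math import sqrt

n = 3
dim = 2 ** n                      # 8 amplitudes for 3 qubits
state = [1 / sqrt(dim)] * dim     # equal superposition over all 8 outcomes

def measure(state):
    """Collapse the state: draw one outcome according to the probabilities."""
    probs = [abs(a) ** 2 for a in state]
    return random.choices(range(len(state)), weights=probs)[0]

outcome = measure(state)
print(f"{dim} amplitudes tracked, but one measurement yields: {outcome:0{n}b}")
```

Doubling n to 6 would mean 64 amplitudes but still only a 6-bit answer per run, which is why quantum algorithms must use interference to steer probability toward the useful outcomes before measuring.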
In Japan, researchers built a quantum neural network (QNN) using photons in optical fiber: a specialized quantum computer suited to highly complex tasks such as the notorious "traveling salesman" problem.
Also worth noting: JILA (the Joint Institute for Laboratory Astrophysics) developed a 3-D quantum gas atomic clock, an early example of the second quantum revolution. The device controlled the interactions between strontium atoms to an exquisite degree, enabling atomic clocks more accurate than any before. JILA physicists also used the quantum properties of a strontium lattice clock to simulate long-sought magnetic properties of solid materials.
Here's how it works: the atoms are confined in an optical lattice, and a probe laser couples their spins to their motion. The two atomic spin states enable superposition, which is required for processing quantum information (QI).
The first quantum revolution gave us the laser and the transistor, the basic building block of computers. Now the aim is to make enough progress to inaugurate a second quantum revolution: controlling individual quantum systems, such as single ions, to a far greater extent than before. Doing so would permit the full use of QI, and the possible applications are mind-boggling.