# Quantum Physics: Applications to Espionage, Intelligence, and Security Issues

K. Lee Lerner / Larry Gilman

Quantum physics, which has been called "the science of the very small," is essential to the design of modern microelectronics. Without quantum physics it would not be possible to design the microscopic structures that make today's digital circuits possible. Such circuits, in turn, are essential to the conduct of all kinds of modern espionage, warfare, and security operations. The further application of quantum physics to computing and communications is at present being systematically researched by many groups, including the U.S. Quantum Information Science and Technology (QuIST) Program of the Defense Advanced Research Projects Agency (DARPA). DARPA has historically funded the development of such fundamental advances in electronics as the microchip.

## Limitations of Conventional Electronics

Ordinary circuit components obey the laws of classical physics; that is, their behavior is predictable and single-valued. An integrated-circuit memory cell is either ON or OFF, never both at once. There is no upper limit on how large a device exhibiting such behavior can be; one could build a computer out of stars and planets, if one had the means to move them about. However, the laws of quantum physics place strict limits on how small a device can be and still behave classically. Quantum physics tells chip designers how small they can make their transistors and other circuit components and still obtain classical, causal behavior from them.

However, quantum effects (those physical laws that dominate the behavior of matter at the subatomic level) are not only an obstacle to infinite miniaturization; they can also be exploited to produce devices that have no parallel in the macroscopic world, the world of large objects. An early example of such a device is the tunnel diode (invented in 1958), an electronic device that takes advantage of the fact that an individual subatomic particle can appear randomly on the far side of an otherwise insurmountable barrier (i.e., "tunnel" through the barrier). Despite a few oddities such as the tunnel diode, however, quantum phenomena have for decades been perceived by designers of computers and communications systems more as a limiting factor on conventional device size than as an invitation to build novel devices.

Since the 1990s, physicists have realized that quantum phenomena open the door to powerful new techniques in computing and communications. In particular, they are hoping to exploit the phenomenon of quantum "entanglement" to produce superfast computers, unbreakable cryptographic systems, and error-free transmission of information. All of these advances, when they become available in working devices, will have many uses in both the civil and military sectors.

The concepts of quantum entanglement and quantum information are basic to the development of new quantum computing, cryptography, and communications technologies, and are reviewed separately below.

## Entanglement

Individual subatomic particles, such as photons, do not exist in single, well-defined states like on-off light switches. Rather, they exist as a superposition of states. Experiments show, for example, that prior to observation (i.e., definitive interaction with a large-scale system) a photon can actually have more than one polarization at once and be in more than one place at once.
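The difference between a classical bit and a superposed state can be illustrated with a toy simulation (a classical sketch with hypothetical names, not a model of real quantum hardware): the object holds two amplitudes at once, and only measurement forces a single outcome, with probabilities given by the squared amplitudes (the Born rule).

```python
import math
import random

# A qubit-like object: a superposition a|0> + b|1> with |a|^2 + |b|^2 = 1.
# Here we use an equal superposition, a = b = 1/sqrt(2).
a = 1 / math.sqrt(2)
b = 1 / math.sqrt(2)

def measure(amp0, amp1, rng=random):
    """Collapse the superposition: 0 with probability |amp0|^2, else 1."""
    return 0 if rng.random() < abs(amp0) ** 2 else 1

# Unlike a classical memory cell, the state before measurement is not
# secretly 0 or 1; repeated measurements of identically prepared states
# yield both outcomes, in Born-rule proportions.
counts = [0, 0]
for _ in range(10_000):
    counts[measure(a, b)] += 1
```

Over many trials the two outcomes appear in roughly equal numbers, even though each individual preparation was identical.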

Not only can individual particles exist in superposed or ambiguous states prior to observation, but the superposed states of pairs, triplets, or larger groups of particles can be related to each other by means of entanglement. Entanglement arises because the superposed states of particles that have interacted directly retain a definite, permanent relationship even after the particles have separated. Two entangled photons, for example, may be sent to two different detectors, A and B. Individually the photons do not, while in transit, have definite polarizations. When the polarization of one of the photons is collapsed to a definite value by measurement at detector A, however, the other photon, bound for detector B, instantly takes on the opposite polarization. There is no delay; the effect is truly instantaneous.

Despite appearances, this does not offer a means of faster-than-light communication (which would contradict the special theory of relativity); there is, in principle, no way to control what polarization detector A observes. The observed value at detector A is random, and the value that is instantly imposed on the photon bound for detector B is also random. There is thus no way for A to signal to B by using the instantaneous relationship between the entangled photons. However, entanglement is still useful. Transmission of an identical string of random bits to two receivers is important in cryptography, and using a stream of entangled photons for transmitting that bitstream has the valuable property that it cannot be eavesdropped upon, as quantum physics declares that any effort to interfere with (i.e., measure) either entangled photon en route will be detectable by the intended receivers. Nonquantum or classical communications links cannot give this absolute privacy guarantee. Transmission of entangled photon pairs over tens of kilometers of optical fiber has recently been demonstrated, bringing quantum cryptography closer to practical realization.
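The anti-correlation just described can be mimicked for illustration (a classical sketch with hypothetical names; a classical program cannot reproduce the deeper quantum correlations, only this perfect anti-correlation). It shows why the link is useless for signaling yet valuable for key distribution: each outcome at A is random, but A and B always obtain opposite values.

```python
import random

def measure_entangled_pair(rng=random):
    """Mimic measuring one photon of a polarization-entangled pair.

    The outcome at detector A is irreducibly random (no sender can
    choose it), but the photon bound for detector B then always
    shows the opposite polarization.
    """
    at_a = rng.choice(["horizontal", "vertical"])
    at_b = "vertical" if at_a == "horizontal" else "horizontal"
    return at_a, at_b

# Each party alone sees pure noise -- hence no faster-than-light
# signaling -- yet the two streams are perfectly anti-correlated,
# which is exactly what shared cryptographic key material requires.
results = [measure_entangled_pair() for _ in range(1000)]
```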

Entangled photons can also be used to achieve what is termed superdense coding or quantum dense coding—the transmission of multiple bits of classical information through the transmission of a smaller number of qubits (entangled photons). Another application of entanglement is quantum teleportation, discussed further below.
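Superdense coding can be sketched as a small two-qubit state-vector simulation (an illustrative toy with hypothetical helper names): by applying one of four operations to her half of a shared entangled pair and then sending only that single qubit, the sender conveys two full classical bits.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0.0, 1], [1, 0]])
Z = np.array([[1.0, 0], [0, -1]])
I2 = np.eye(2)
# CNOT with qubit 0 (the left tensor factor) as control.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def on_q0(gate):
    """Lift a one-qubit gate to act on qubit 0 of a two-qubit state."""
    return np.kron(gate, I2)

# Shared entangled (Bell) pair: (|00> + |11>) / sqrt(2).
bell = CNOT @ on_q0(H) @ np.array([1.0, 0, 0, 0])

def encode(bits, state):
    """Sender encodes 2 classical bits by touching ONLY her qubit (q0)."""
    b0, b1 = bits
    if b1: state = on_q0(X) @ state
    if b0: state = on_q0(Z) @ state
    return state

def decode(state):
    """Receiver, holding both qubits, rotates out of the Bell basis."""
    state = on_q0(H) @ (CNOT @ state)
    idx = int(np.argmax(np.abs(state)))  # outcome is deterministic here
    return (idx >> 1) & 1, idx & 1

# One transmitted qubit, two recovered classical bits, for all four messages.
recovered = [decode(encode(bits, bell))
             for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```

The four encodings map the shared pair onto the four mutually distinguishable Bell states, which is why both bits can be read out with certainty.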

## Quantum Information and its Implications for Communications and Computing

The quantum phenomena of superposition and entanglement have important implications for computing and communications, even apart from cryptography. In classical information theory, the minimum unit of information is a bit (short for "binary digit," since a bit is usually, though arbitrarily, symbolized as a 1 or 0); in quantum information theory, the minimum unit of information is the quantum bit or qubit (pronounced CUE-bit). One qubit is the amount of quantum information stored by a microscopic system (e.g., a photon) that exists in a superposed pair of states. Quantum computation applies logical operations to qubits, much as classical computation applies Boolean logical operations to bits. The advantage of quantum computation arises from the superposition property of quantum systems: *L* qubits (e.g., isolated atoms) can, through superposition, contain the equivalent of 2^*L* bits, and quantum-logical operations can be performed simultaneously on all those bits. The result, potentially, is massive parallelism with a corresponding speedup of certain calculations. One important class of calculations that would be greatly speeded by a quantum computer is the factorization of a large integer *N*. Factorization is the discovery, given *N*, of two numbers *x* and *y* such that *x* × *y* = *N*. For large *N* this is a time-consuming calculation, and it is on this difficulty that many cryptosystems (e.g., public-key cryptography) depend. When quantum computers are built, such cryptosystems will quickly become worthless, for the factorization problem will have become manageable even for very large *N*.
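The asymmetry on which such cryptosystems rest can be seen in the naive classical factoring method, trial division: multiplying *x* and *y* is fast, but recovering them from *N* takes work that grows exponentially in the number of digits of *N*. A minimal sketch:

```python
def trial_division(n):
    """Factor n by trial division, the naive classical method.

    The loop runs up to sqrt(n), so the worst-case work grows
    exponentially in the number of DIGITS of n; the hundreds-of-digit
    moduli used by public-key systems are hopelessly out of reach,
    which is precisely the difficulty a quantum factoring algorithm
    would remove.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f   # smallest factor and its cofactor
        f += 1
    return n, 1                # n is prime

# Multiplying x and y is cheap; recovering them from N is not.
```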

Many nuts-and-bolts obstacles remain, however, in the construction of a full-scale quantum computer. One challenge is the accurate transmission of quantum information (qubits, held in superposed quantum states) from one place to another within a quantum computer or between one quantum computer and another. Quantum teleportation may provide a practical answer to this problem. Quantum teleportation allows the perfect recreation of a quantum system at the far end of a transmission channel. In this technique, one member of an entangled photon pair is combined at a transmitter with the quantum system to be teleported (a photon, other particle, or even a collection of particles) in such a way that bits of classical information (1s and 0s) are produced that characterize the system to be teleported. Both the entangled photon and the system to be teleported are destroyed by this process; that is, they no longer exist as a superposition of quantum states, but are measured as having definite, unique values. The classical information (bits) derived by the transmitter from its measurements is sent in conjunction with the remaining member of the entangled photon pair to a distant receiver. The receiver can re-create or "resurrect" the original quantum system with all its superpositional ambiguity (qubit content) intact, just as if it had never been measured (destroyed) by the transmitting system. Because quantum physics declares that systems with identical quantum-mechanical descriptions are not only *similar* but *the same* (they have no individuality and cannot be distinguished from each other), the "resurrected" system in effect *is* the original system: that is, the original system has been teleported from the transmitter to the receiver, including whatever quantum information it contains. (Strictly speaking, every quantum system, e.g., a photon, contains an infinite amount of information; most of this, however, is in principle unextractable.) Quantum teleportation has already been demonstrated at kilometer distances for single-particle systems, and may eventually be used to communicate quantum information without error from one part of a quantum computer to another. However, it will never be practical to teleport large systems of particles such as human beings; the number of bits to be transmitted would be prohibitively large.
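The protocol described above can be sketched as a small state-vector simulation (an illustrative toy with hypothetical helper names, not real hardware): qubit 0 holds the unknown state, qubits 1 and 2 form the entangled pair, the transmitter's measurement destroys the original and yields two classical bits, and the receiver's corrections resurrect the state on qubit 2.

```python
import numpy as np

rng = np.random.default_rng(0)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0.0, 1], [1, 0]])
Z = np.array([[1.0, 0], [0, -1]])

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of a 3-qubit state vector."""
    t = np.moveaxis(state.reshape(2, 2, 2), q, 0)
    t = np.tensordot(gate, t, axes=([1], [0]))
    return np.moveaxis(t, 0, q).reshape(8)

def apply_cnot(state, control, target):
    """Flip `target` wherever `control` is 1."""
    t = np.moveaxis(state.reshape(2, 2, 2), [control, target], [0, 1]).copy()
    t[1] = t[1][::-1].copy()
    return np.moveaxis(t, [0, 1], [control, target]).reshape(8)

def measure(state, q):
    """Projectively measure qubit q; return (outcome, collapsed state)."""
    t = np.moveaxis(state.reshape(2, 2, 2), q, 0).copy()
    p0 = float(np.sum(np.abs(t[0]) ** 2))
    outcome = 0 if rng.random() < p0 else 1
    t[1 - outcome] = 0.0                 # collapse...
    t /= np.linalg.norm(t)               # ...and renormalize
    return outcome, np.moveaxis(t, 0, q).reshape(8)

# Qubit 0 holds the unknown state (amplitudes 0.6, 0.8); qubits 1 and 2
# are the entangled pair shared by transmitter and receiver.
state = np.zeros(8); state[0] = 1.0
state = apply_1q(state, np.array([[0.6, -0.8], [0.8, 0.6]]), 0)  # prepare
state = apply_1q(state, H, 1)
state = apply_cnot(state, 1, 2)          # create the entangled pair

state = apply_cnot(state, 0, 1)          # transmitter's measurement...
state = apply_1q(state, H, 0)
m0, state = measure(state, 0)            # ...destroys the original and
m1, state = measure(state, 1)            # yields two classical bits

if m1: state = apply_1q(state, X, 2)     # receiver's corrections, chosen
if m0: state = apply_1q(state, Z, 2)     # by the two classical bits

received = state.reshape(2, 2, 2)[m0, m1, :]  # amplitudes now on qubit 2
```

Whatever the two random measurement outcomes are, the receiver's qubit ends up with exactly the amplitudes (0.6, 0.8) that qubit 0 started with, and qubit 0 no longer carries them.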

## FURTHER READING:

### PERIODICALS:

Bennett, Charles H., and Peter W. Shor. "Privacy in a Quantum World." *Science* 284 (April 30, 1999): 747–748.

Bennett, Charles H., and David P. DiVincenzo. "Quantum Information and Computation." *Nature* 404 (March 16, 2000): 247–255.

Taubes, Gary. "Quantum Mechanics: To Send Data, Physicists Resort to Quantum Voodoo." *Science* 274 (Oct. 25, 1996): 504–505.

Beam me up Scotty