Jul 24, 2015 | 09:01 GMT

Approaching a Quantum Leap in Computing

A D-Wave Systems chip designed to operate as a 128-qubit superconducting adiabatic quantum optimization processor.
(D-Wave Systems, Inc.)
Forecast Bullets
  • The widespread use of quantum computers in industry is likely only a decade or two away.
  • The United States will probably maintain its lead in the field, though China will be competitive.
  • The countries and companies that first access quantum computers will enjoy a powerful advantage over their peers in areas that stand to gain from the technology.

Quantum computers, or computers based on the principles of quantum mechanics, stand to exponentially increase computing power within the next two decades. Though the scientific community is still fiercely debating the very nature of quantum mechanics itself, and numerous technical obstacles stand in the way of applying the principles of quantum mechanics to machines, the field is rapidly developing.

Now, the widespread use of quantum computers in industry is likely only a decade or two away. Such devices will be far more powerful than even the most powerful supercomputers seen today, carrying significant implications for national security, cyberwarfare and intelligence operations, among many other things. Just how powerful quantum computers can be, and how their adoption could lead to another revolution in computer-related technologies, becomes clear when we consider how they approach a problem: Using a quantum computer can loosely be thought of as trying all possible solutions at once, whereas a classical computer must try them in sequence. The gain in computing power from incorporating quantum mechanics into computing could prove as revolutionary to computer science as research in physics and electromagnetism has proved to modern electronics.

Quantum Mechanics: A Primer

The field of quantum mechanics arose from German physicist Max Planck's attempts to describe the spectrum of light emitted by hot bodies. Specifically, he wondered what accounted for the shift in color from red to yellow to blue as the temperature of a flame increased. Planck devised an equation explaining what he had observed, based on the assumption that matter behaved differently at the atomic and subatomic levels.

Though even the great German physicist questioned this assumption, his research kicked off 30 years of scientific inquiry that yielded the theories and discoveries that form the basis of today's understanding of physics and chemistry. Albert Einstein introduced one of quantum mechanics' most famous and perplexing concepts just five years or so after Planck devised his equation, extending the latter's assumption by asserting that a quantum of light, or a photon, behaves as both a wave and a particle. This duality, along with the many other dualities embedded in quantum mechanics, became the bedrock of the field.

Today, scientists still debate how to interpret quantum mechanics. Perhaps the most widely held approach is the Copenhagen interpretation, which holds that a quantum particle exists in all of its possible states at once until it is measured; only when it is observed does the particle settle into a single state. This concept has become known as the principle of superposition.

The superposition principle is one of the fundamental features of "quantum bits," or "qubits," the quantum computer's equivalent of the bits of classical computers. Classical computing relies on data comprising numerous individual bits that can exist in only one of two states, 0 or 1, and computers process that data as long, ordered strings of 0s and 1s. Today's computer chips are made up of billions of transistors and capacitors that can only register a 0 or a 1; while switching between these states now takes mere nanoseconds (a period that is still shrinking), there are natural limits to how fast data can be processed and how small transistors and capacitors can be made.

A qubit has the advantage of being able to be a 0, a 1, or a superposition of both 0 and 1 at once. This allows a register of qubits to represent all possible states simultaneously, whereas a classical computer could only step through them one at a time. A quantum computer can therefore perform vast numbers of calculations at the same time, and its power increases exponentially as the number of qubits increases.
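To make this state-counting concrete, the following minimal sketch (an illustrative NumPy simulation, not code from any quantum computing product) represents a qubit as a two-element vector of amplitudes, uses the standard Hadamard gate to put it into an equal superposition, and shows why describing n qubits requires 2^n numbers:

```python
import numpy as np

# A qubit is a 2-element vector of complex amplitudes over the states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)           # the definite state |0>

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ ket0                            # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(superposed) ** 2)                   # [0.5 0.5]

# A register of n qubits is described by 2**n amplitudes, which is why
# quantum state space (and the cost of simulating it classically) grows
# exponentially with the number of qubits.
for n in (1, 10, 50):
    print(n, "qubits ->", 2 ** n, "amplitudes")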

An additional boost to the potential power of quantum computers comes from the concept of "quantum entanglement," which Einstein famously described as "spooky action at a distance." Quantum entanglement is the principle that some quantum systems' states cannot be described by the states of their individual elements alone because those elements may be "entangled"; in other words, different elements' states are related to one another in some way, meaning that what happens to one will affect the other, no matter how vast the distance separating the two. Among other things, quantum entanglement can be used to create "super-dense" coding in which two classical bits can be encoded and transmitted via one qubit.
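Super-dense coding can be sketched with the same state-vector approach. The snippet below is a simplified simulation, not an implementation on real hardware: given a shared entangled (Bell) pair, the sender encodes two classical bits by applying one of four standard operations to her single qubit, and the receiver decodes them by measuring in the Bell basis.

```python
import numpy as np

# Single-qubit identity and Pauli operators.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit flip
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # sign (phase) flip

# Shared entangled Bell pair: (|00> + |11>) / sqrt(2).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def encode(bits, state):
    """Sender applies I, X, Z, or ZX to her qubit alone to encode two bits."""
    op = I
    if bits[1]:
        op = X @ op
    if bits[0]:
        op = Z @ op
    return np.kron(op, I) @ state   # acts only on the first qubit

# The four Bell states form the basis the receiver measures in.
bell_basis = {
    (0, 0): np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2),
    (0, 1): np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2),
    (1, 0): np.array([1, 0, 0, -1], dtype=complex) / np.sqrt(2),
    (1, 1): np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2),
}

def decode(state):
    """Receiver identifies which Bell state arrived, recovering both bits."""
    for bits, basis_state in bell_basis.items():
        if abs(np.vdot(basis_state, state)) > 0.99:
            return bits

for message in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert decode(encode(message, bell)) == message
print("two classical bits recovered from one transmitted qubit each time")
```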

Physicist Erwin Schrödinger famously illustrated such a dual quantum state by imagining a cat sealed in a box with a bottle of poison. From outside the box, the viewer cannot determine whether the cat is dead or alive and can thus think of it as both at once, a superposition of the two possible states of the cat's life.

Potential Applications

Though quantum computers will have a broad impact on society, the most obvious areas that stand to benefit are the ones that supercomputers dominate today: cryptography, research and military applications. The most well-known capability quantum computers could unlock is running what is known as Shor's algorithm, something classical computers cannot do and a tool of significant interest to the National Security Agency, the CIA and the Chinese government.

In short, Shor's algorithm would enable the breaking of complex codes by speeding up the search for a given number's prime factors; the difficulty of that search is the backbone of modern-day encryption methods. The gains a quantum computer would offer over a classical one in code breaking are gigantic: A quantum computer could do in minutes or hours what a classical computer would need years or far longer to do. Of course, the floodgates of stored data will not suddenly open once Shor's algorithm comes into play; quantum computers could also be used to encrypt information far more securely than is possible with classical computers, something already under intense study.
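To illustrate why period finding breaks factoring-based encryption, here is a hedged sketch of the classical scaffolding of Shor's algorithm in Python. The period-finding step is done by brute force below; that is precisely the step a quantum computer would accelerate exponentially via the quantum Fourier transform, while the rest is cheap classical arithmetic.

```python
from math import gcd
from random import randrange

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1. Brute-forced here; this is the
    step Shor's algorithm performs efficiently on a quantum computer."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n):
    """Classical outline of Shor's algorithm for an odd composite n."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d                     # lucky guess already shares a factor
        r = find_period(a, n)
        if r % 2 == 1:
            continue                     # need an even period; pick a new a
        x = pow(a, r // 2, n)
        if x == n - 1:
            continue                     # trivial square root; pick a new a
        return gcd(x - 1, n)             # a nontrivial factor of n

print(shor_factor(15))   # prints 3 or 5
print(shor_factor(21))   # prints 3 or 7
```

For the toy numbers above, brute force is instant; for the 1,024-bit (or larger) numbers used in real encryption, the classical period search becomes astronomically slow, which is exactly the gap a quantum computer would close.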

Outside of the military and intelligence spheres, quantum computers would greatly expand data processing and permit the simulation of almost every natural phenomenon. They would also lead to outcomes such as the faster development of new drugs and more accurate weather forecasting, as well as those as exotic as the search for extraterrestrial life and the development of artificial intelligence.

Quantum computing would have important implications for the development of artificial intelligence because it would expand the reach of machine-learning algorithms. Today's algorithms rely on pattern recognition; with quantum computing, machines could adapt to anomalous situations. A highly refined machine-learning algorithm would help automated systems handle non-routine tasks, a persistent weak point in the automation and digitization of jobs, and would improve upon current research on autonomous cars, robots and drones.

Developing Quantum Computers

Building a quantum computer is no easy task. Still, the past five years have seen significant progress toward the development of an economical quantum computing machine and its components, though the industry remains in its infancy. The problem of preserving and storing qubits lies at the heart of the challenge: A qubit in a superposition state is quite fragile. Its interaction with other particles (whether qubits or otherwise) essentially forces it to collapse into one state or the other (e.g., a 0 or a 1).

Physicists have tried to preserve qubits by supercooling their environment to temperatures just above absolute zero (-273.15 degrees Celsius) and operating them in a vacuum. But for nearly all practical purposes outside of research environments and possibly a few government agencies, quantum computers would need to work at ambient temperatures. The record for storing quantum data at room temperature, set in 2013, is a mere 39 minutes (an improvement on the previous record of 2 seconds). Even with a quantum computer's prodigious speed, meaningful calculations require qubits that survive longer than that. Of course, classical computers once faced similar challenges.

Like today's quantum computers, the classical computers of the 1950s filled rooms, and the idea of shrinking them down to the size of the device you are using to read this article was a distant prospect.

All challenges aside, there has been no shortage of interest in researching technologies for quantum computers. Established technology firms, defense contractors, intelligence agencies and startups, among many others, are pursuing them. In fact, Canadian startup D-Wave Systems, Inc. has already begun selling the first commercial quantum computer, unveiling a 1,000-qubit version of the D-Wave Two in June. The company is collaborating with Google, Lockheed Martin and NASA to develop quantum computers further.

Naturally, D-Wave has found the technical challenges it faces daunting. Its computers have come under heavy criticism for their inflexibility: The D-Wave processor is designed to perform optimization tasks and little else. More fundamentally, some have questioned whether the D-Wave system actually relies on quantum mechanics. Some physicists, along with IBM, have argued that classical computers can perform the same functions and tasks that D-Wave's system does.

For its part, IBM has made its own recent breakthroughs in quantum computer development. In April, IBM researchers published a paper describing a method of simultaneously detecting both an error common to all computers and an error unique to quantum computers. The first is a "bit-flip error," in which a 0 accidentally flips to a 1 or vice versa; the second is a "sign-flip error," in which the phase relationship between the 0 and 1 components of a superposition flips. Previous attempts could not detect both errors at the same time.
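The two error types correspond to the standard Pauli X and Z operations on a qubit's state vector. The brief illustration below (again a plain NumPy simulation, not IBM's published error-detection scheme) shows why a sign flip is invisible to a simple 0/1 measurement and therefore needs its own detection mechanism:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip error
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # sign-flip error

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # superposition (|0>+|1>)/sqrt(2)

# A bit flip swaps which basis state carries which amplitude...
print(X @ np.array([1, 0], dtype=complex))  # |0> becomes |1>

# ...while a sign flip leaves 0/1 measurement statistics unchanged:
# the probabilities |amplitude|^2 are identical before and after,
# even though the state has changed to (|0>-|1>)/sqrt(2).
print(np.abs(plus) ** 2)        # [0.5 0.5]
print(np.abs(Z @ plus) ** 2)    # [0.5 0.5]
```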

Though other countries share the United States' keen interest in developing quantum computers, none appears able to supplant the United States as the global leader in the field. China is likely the only other country with the financial power as well as the military and national security motivations to explore quantum computers and their properties, and Beijing has dedicated significant funds to such research. The Chinese have already shown the ability to perform as well as their American counterparts in developing supercomputers, though China has lagged behind the United States in the commercialization of domestically developed and designed classical computers and their components. But with commercially available quantum computers still decades away, China's interest in the technology, at this point, is mainly strategic. With its successes in supercomputing, China could conceivably develop a fully functioning quantum computer before the United States, though whether it could develop a commercially viable model before U.S. companies do is more doubtful.

Timetable

Despite the extensive interest quantum computers are generating worldwide, they will not replace classical computers anytime soon; even their adoption among niche customers remains at least a decade away, if not two. Instead, the spread of quantum computers will likely occur in the same sort of slow, methodical manner seen with the adoption of classical computers.

Today's quantum computers, the D-Wave One and the D-Wave Two, are highly refined machines that have been designed to perform one task only: optimization. The development of similarly specialized machines that focus on solving a single problem, whether factorization, simulations or moving traffic efficiently, will continue. These quantum computers will compete with the supercomputers that are currently being developed and optimized to perform similar specific tasks. Government agencies, as well as companies involved in relevant security-related applications such as cryptography, will be satisfied with quantum computers that can perform only one task, just as they are satisfied with supercomputers that are likewise specialized.
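The optimization tasks these machines target are typically posed as quadratic unconstrained binary optimization (QUBO) problems: choosing 0-or-1 values for a set of variables to minimize an energy function. The sketch below is a toy brute-force solver with arbitrary illustrative coefficients, standing in for the annealing hardware; a quantum annealer such as D-Wave's is built to sample low-energy solutions of exactly this kind of function.

```python
from itertools import product

# A tiny QUBO: minimize the energy x'Qx over binary vectors x.
# Diagonal entries are per-variable biases; off-diagonal entries are
# couplings between variables. These coefficients are made up for illustration.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0,   # biases
    (0, 1): 2.0, (1, 2): -1.5,                 # couplings
}

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force over all 2**n assignments; an annealer instead relaxes
# physical qubits toward low-energy configurations of the same function.
best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))   # (0, 1, 0) -1.0
```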

However, the development and possible commercialization of a more practical, universal quantum computer remains a distant goal, even though companies like Google are aiming for it now. The adoption of single-task (and, later, universal) quantum computers will be linear rather than exponential, and some industries will adopt them more quickly than others. But just as oil supermajors will use the technology's simulation power to unlock more oil reserves, the countries and companies that are first to access quantum computers will enjoy a powerful advantage over their competitors in areas that can make use of the technology.
