Quantum Computing - Next-Generation


Quantum computing, a new paradigm in computing, harnesses quantum mechanics to go beyond what conventional machines can do. Rather than relying on the binary digits (0 and 1) that computers have depended on since their earliest years, quantum computers use quantum bits that can be in a superposition of states.

Quantum computing

Quantum computing is computing that uses quantum-mechanical phenomena, such as superposition and entanglement. A quantum computer is a device that performs quantum computing, and it differs from the binary digital electronic computers built from transistors. Whereas common digital computing requires that data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits (qubits), which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer, and is also known as the universal quantum computer.
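
The superposition idea above can be sketched numerically. The snippet below is a minimal classical simulation (not real quantum hardware): it represents a qubit as a two-component complex vector and applies a Hadamard gate, the standard gate that puts a definite 0 into an equal superposition of 0 and 1.

```python
import numpy as np

# A qubit's state is a unit vector in C^2: |psi> = a|0> + b|1>, with
# |a|^2 + |b|^2 = 1. The basis state |0> corresponds to a classical 0.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate maps |0> to an equal superposition (|0> + |1>) / sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0

# Measuring collapses the state; the probabilities of reading 0 or 1
# are the squared magnitudes of the amplitudes (the Born rule).
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]: an equal chance of observing 0 or 1
```

Simulating qubits this way is exactly why classical machines fall behind: the state vector doubles in size with every added qubit, so 50 qubits already require 2^50 complex amplitudes.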

The field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in 1980, Richard Feynman in 1982, and David Deutsch in 1985. A quantum computer with spins as quantum bits was also formulated for use as a quantum spacetime in 1968.

Background

Physicists trace the history of quantum theory back to 1927, when the German physicist Werner Heisenberg showed that classical methods did not work for very small objects – those roughly the size of individual atoms. When someone throws a ball, for instance, it is easy to determine exactly where the ball is and how fast it is moving.

But as Heisenberg showed, that is not true for atoms and subatomic particles. Instead, an observer can measure either where a particle is or how fast it is moving – but not both at the exact same time. This was an uncomfortable realization: from the moment Heisenberg explained his idea, Albert Einstein (among others) was uneasy with it. It is important to realize that this "quantum uncertainty" is not a shortcoming of measurement equipment or engineering, but rather a reflection of how the human brain works. We have evolved to be so used to how the "classical world" works that the actual physical mechanisms of the "quantum world" are simply beyond our ability to fully grasp.
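
Heisenberg's trade-off has a precise mathematical form: the uncertainties in a particle's position and momentum are jointly bounded from below, so sharpening one necessarily blurs the other.

```latex
% Heisenberg's uncertainty principle: the product of the standard
% deviations of position (x) and momentum (p) can never fall below
% half the reduced Planck constant.
\Delta x \, \Delta p \ge \frac{\hbar}{2}
```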

Over the past five decades, standard computer processors have gotten increasingly faster. In recent years, however, the limits to that technology have become clear: Chip components can only get so small, and be packed only so closely together, before they overlap or short-circuit. If companies are to continue building ever-faster computers, something will need to change.

One key hope for the future of increasingly fast computing is quantum physics. Quantum computers are expected to be much faster than anything the information age has developed so far. But recent research has revealed that quantum computers will have limits of their own.

Both practical and theoretical research continues, and many national governments and military agencies are funding quantum computing research in an effort to develop quantum computers for civilian, business, trade, environmental, and national-security purposes, such as cryptanalysis.

A small 16-qubit quantum computer exists and is available for hobbyists to experiment with via the IBM Quantum Experience project. Alongside IBM, a company called D-Wave has been developing its own version of a quantum computer that uses a process called quantum annealing.
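
The annealing idea can be illustrated with its classical cousin, simulated annealing. The sketch below is only a loose analogy to D-Wave's quantum annealing (the toy `energy` landscape and all parameters are invented for illustration): the system starts "hot", explores widely, and is slowly cooled so it settles into a low-energy configuration.

```python
import math
import random

random.seed(0)

def energy(x):
    """Toy 1-D energy landscape with several local minima near x = 2."""
    return (x - 2.0) ** 2 + math.sin(5 * x)

x = random.uniform(-4, 4)   # start at a random configuration
temperature = 2.0

while temperature > 1e-3:
    candidate = x + random.gauss(0, 0.3)
    delta = energy(candidate) - energy(x)
    # Always accept downhill moves; accept uphill moves with Boltzmann
    # probability, which lets the search escape shallow local minima.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.99     # gradual cooling schedule

print(round(x, 2), round(energy(x), 3))
```

A quantum annealer follows the same "cool slowly toward the minimum" strategy, but uses quantum tunnelling rather than thermal jumps to cross barriers in the landscape.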

Analysis

As progress in the field accelerates, 2018 should see an avalanche of breakthroughs. It is a race for so-called “quantum supremacy”: the point at which a quantum computer demonstrably and markedly outperforms a classical supercomputer on some class of problems.

Both Google and IBM, two leaders in quantum computing, have laid out plans to achieve this goal. Intel also has a horse in the race, announcing a new 49-qubit test chip designed for quantum computing research at the annual Consumer Electronics Show in Las Vegas last week.

The stakes are enormous. Quantum computers promise to set a new paradigm for solving some of the hardest math and computing problems today – problems such as analysing the interactions of multiple genes in health outcomes, modelling the energy states of chemicals, and predicting the behaviour of atomic particles. They also might make the Internet inherently insecure by quickly cracking the modern cryptography used to lock our IT infrastructure and the web.
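
The cryptographic threat comes down to factoring. RSA security rests on the fact that recovering the two primes behind the public modulus is classically infeasible at realistic key sizes, while Shor's algorithm on a quantum computer would make that step fast. The toy below (using the textbook example key n = 3233) brute-forces the factoring step to show that once the factors are known, the private key follows immediately.

```python
# Toy RSA break: factor the public modulus, then derive the private key.
# At 2048 bits the factoring step is classically infeasible; Shor's
# algorithm on a quantum computer would perform it efficiently.
n, e = 3233, 17  # textbook public key: n = 53 * 61

# Step 1: factor n (trivial at this size, hopeless at real key sizes).
p = next(d for d in range(2, n) if n % d == 0)
q = n // p

# Step 2: derive the private exponent d from the factors.
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)  # modular inverse (Python 3.8+)

# Step 3: decrypt a ciphertext produced with the public key alone.
message = 65
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d, n)
print(recovered == message)  # True: the private key is fully recovered
```

This is why "quickly cracking modern cryptography" follows directly from fast factoring: everything after step 1 is cheap arithmetic.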

Google, IBM, and a number of start-ups are working on quantum computers that promise to be more flexible, and likely more powerful, because they will work on a wider variety of problems. A few years ago, flexible machines of two or four qubits were the norm. During the past year, company after company has announced more powerful quantum computers. In November 2017, IBM announced that it had built a quantum machine that uses 50 qubits, breaking the critical barrier beyond which scientists believe quantum computers will shoot past traditional supercomputers.

The downside? The IBM machine can maintain its quantum state for only 90 microseconds at a time. This instability is the general bane of quantum computing: the machines must be super-cooled to work, and a separate set of calculations must be run to correct for errors caused by the general instability of these early systems. That said, scientists are making rapid improvements on the instability problem and hope to have a working quantum computer running at room temperature within five years.
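
The "separate set of calculations" for error correction can be illustrated with the simplest classical analogue: a 3-bit repetition code, where each logical bit is stored three times and a majority vote repairs any single flip. Real quantum error correction (e.g. surface codes) is far more involved, since qubits cannot simply be copied, but the redundancy-plus-correction idea is the same. The flip probability and trial count below are arbitrary illustration values.

```python
import random
from collections import Counter

random.seed(1)

def encode(bit):
    """Store one logical bit as three physical copies."""
    return [bit] * 3

def noisy_channel(bits, flip_prob=0.1):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote: correct any single bit-flip error."""
    return Counter(bits).most_common(1)[0][0]

trials = 10_000
raw_errors = sum(noisy_channel([0])[0] != 0 for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0))) != 0 for _ in range(trials))
print(raw_errors > coded_errors)  # True: redundancy cuts the error rate
```

The cost is visible too: three physical bits per logical bit. Quantum codes pay an even steeper overhead, which is one reason qubit counts must grow far beyond 50 before error-corrected machines are practical.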

Assessment

Our assessment is that the quantum era will likely usher in seismic changes in the way society functions. Just as we are seeing the first major impacts of wide-scale artificial intelligence, we are also realizing that classic semiconductor-based computing limits our ability to solve the biggest problems we had hoped artificial intelligence could tackle.
