Denis Pepin

Quantum Computing: Past, Present, and Future

Updated: Mar 31


The field of quantum computing, a branch of computer science, leverages the principles of quantum physics to perform computations beyond the reach of classical computers. Quantum computers use quantum bits, or qubits, which can exist in a superposition of the states 0 and 1 and can become entangled with other qubits, providing a potent resource for processing information in ways no classical machine can. Quantum computing has the potential to tackle problems that classical computers find insurmountable, such as factoring large numbers, simulating quantum systems, and optimizing complex functions.
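The superposition and entanglement described above can be sketched in a few lines of ordinary Python, representing qubits as lists of amplitudes. This is only a toy statevector simulation for illustration, not a real quantum device:

```python
import math

# Toy statevector sketch: a single qubit is a pair of amplitudes (a, b) for
# |0> and |1>; measurement yields 0 with probability |a|^2 and 1 with |b|^2.
def hadamard(state):
    """Map |0> to an equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard((1.0, 0.0))                       # start in |0>, apply H
probs = [round(abs(amp) ** 2, 2) for amp in plus]
print(probs)                                      # [0.5, 0.5]: a superposition

# Entanglement: on two qubits (amplitudes for |00>, |01>, |10>, |11>),
# H on the first qubit followed by CNOT produces the Bell state
# (|00> + |11>)/sqrt(2), whose measurement outcomes are perfectly correlated.
def h_on_first(state):
    a00, a01, a10, a11 = state
    s = 1 / math.sqrt(2)
    return [s * (a00 + a10), s * (a01 + a11), s * (a00 - a10), s * (a01 - a11)]

def cnot(state):
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]                   # flip target when control is 1

bell = cnot(h_on_first([1.0, 0.0, 0.0, 0.0]))
print([round(abs(a) ** 2, 2) for a in bell])      # [0.5, 0.0, 0.0, 0.5]
```

Measuring the Bell state yields 00 or 11 with equal probability, and never 01 or 10: the two qubits' outcomes are perfectly correlated no matter how far apart they are.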


The origins of quantum computing reach back to the early 20th century, when luminaries such as Albert Einstein, Niels Bohr, and Erwin Schrödinger laid the foundations of quantum mechanics. In 1935, Einstein, together with Boris Podolsky and Nathan Rosen, drew attention to the enigmatic phenomenon of quantum entanglement, in which distant particles share a quantum state and exhibit correlations that no classical theory can explain. In 1932, John von Neumann had already formulated a rigorous mathematical framework for quantum mechanics; he later went on to shape the architecture of modern digital computers.



In 1981, Nobel laureate Richard Feynman postulated that quantum computers could more efficiently simulate quantum systems than classical computers. In 1985, David Deutsch introduced the first universal quantum computer model, capable of performing any computation achievable by a classical computer. In 1994, Peter Shor devised a quantum algorithm that could factor large numbers exponentially faster than any known classical algorithm, posing a substantial challenge to existing encryption methods. In 1996, Lov Grover designed a quantum algorithm that could search an unsorted database quadratically faster than any classical algorithm.
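Grover's quadratic speedup can be illustrated with a toy statevector simulation in plain Python. The problem size and the marked index below are arbitrary illustrative choices; a real implementation would run on quantum hardware rather than simulate the amplitudes directly:

```python
import math

# Grover's search over N = 8 items (3 qubits), simulated classically.
N = 8
target = 5                                  # arbitrary marked item for the demo
amps = [1 / math.sqrt(N)] * N               # uniform superposition over all items

iterations = math.floor(math.pi / 4 * math.sqrt(N))   # ~sqrt(N) steps, here 2
for _ in range(iterations):
    amps[target] = -amps[target]            # oracle: flip the marked item's sign
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]     # diffusion: inversion about the mean

p_success = amps[target] ** 2
print(round(p_success, 3))                  # 0.945: the marked item dominates
```

A classical search over an unsorted list of N items needs on the order of N lookups; Grover's algorithm finds the marked item with high probability after only about the square root of N iterations.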


During the late 1990s and early 2000s, researchers began building and operating small-scale quantum computers on various physical platforms, including superconducting circuits, trapped ions, photons, and nuclear magnetic resonance. In 2001, IBM demonstrated the first implementation of Shor's algorithm, factoring the number 15 on a 7-qubit nuclear magnetic resonance quantum computer. In 2007, D-Wave Systems claimed to have built the first commercial quantum computer using a technique called quantum annealing, although the validity and utility of D-Wave's devices remain subjects of controversy and debate.


In recent years, quantum computing has made significant strides in both theory and practice. Researchers have developed novel quantum algorithms for various applications, encompassing machine learning, cryptography, optimization, and chemistry. They have also enhanced the quality and scalability of quantum hardware through methods like error correction, fault tolerance, and modularity. Numerous companies and organizations have undertaken ambitious projects aimed at demonstrating quantum supremacy or advantage, showcasing the capability of a quantum computer to tackle tasks that are unfeasible or impractical for classical computers.


Some noteworthy milestones in the recent progress of quantum computing include:

In 2019, Google announced that it had achieved quantum supremacy using a 53-qubit superconducting processor called Sycamore, which completed a random-sampling task in about 200 seconds that Google estimated would take a state-of-the-art supercomputer about 10,000 years; IBM disputed that estimate, arguing a classical machine could finish the task in days.

In 2020, IBM unveiled its quantum hardware roadmap, with the goal of building a fault-tolerant quantum computer with a million or more qubits by around 2030. IBM also continued to grow its IBM Quantum Network (launched in 2017), which gives researchers and developers cloud access to its quantum processors.


In 2020, a team at the University of Science and Technology of China reported achieving quantum supremacy with a photonic processor called Jiuzhang, which performed a boson-sampling task in about 200 seconds that was estimated to take a classical supercomputer about 2.5 billion years.


In 2021, IonQ announced what it described as the world's most powerful quantum computer, built from 32 trapped-ion qubits with high fidelity and connectivity, and became the first publicly traded pure-play quantum computing company.


Microsoft, meanwhile, continued to expand its Quantum Development Kit (first released in 2017), which provides tools and libraries, including the Q# programming language, for writing and running quantum programs on various platforms. Microsoft has also collaborated with several universities and companies to pursue its vision of topological quantum computing.

In 2023, quantum computing is moving out of university physics laboratories and into industrial research and development, a shift fueled by substantial financial backing from multinational corporations and venture capitalists.



Today's quantum computers, built by organizations such as IBM, Google, IonQ, and Rigetti, are far from perfect. They remain small in qubit count and prone to errors, placing them in the "noisy intermediate-scale quantum" (NISQ) phase of development. Qubits are inherently fragile and vulnerable to many sources of error, and correcting those errors poses a formidable technical challenge.


The ultimate goal is to create a large-scale quantum computer that can autonomously correct its errors. A diverse array of research groups and commercial enterprises are diligently working toward this objective, employing various technological approaches.
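The redundancy idea behind error correction can be conveyed with a classical toy: the three-bit repetition code with majority voting. Real quantum codes, such as the surface code, must protect superpositions without directly measuring the data qubits, so this is only a classical analogue; the flip probability and trial count below are arbitrary illustrative choices:

```python
import random

# Classical toy analogue of error correction: encode one logical bit as three
# physical copies, flip each copy independently with probability p, then
# decode by majority vote. Redundancy turns a per-copy error rate p into a
# logical error rate of roughly 3p^2, a big win when p is small.
def encode(bit):
    return [bit] * 3

def noisy(copies, p, rng):
    return [b ^ (rng.random() < p) for b in copies]   # independent bit flips

def decode(copies):
    return int(sum(copies) >= 2)                      # majority vote

rng = random.Random(0)                                # fixed seed: repeatable demo
p = 0.05                                              # per-copy flip probability
trials = 100_000
failures = sum(decode(noisy(encode(0), p, rng)) != 0 for _ in range(trials))
print(failures / trials)   # ~0.007, well below the raw error rate of 0.05
```

Decoding fails only when two or three copies flip at once, which is far rarer than a single flip. A fault-tolerant quantum computer applies the same principle, spreading each logical qubit across many physical qubits and continually detecting and correcting errors.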


The predominant approach today stores and manipulates information in loops of electric current within superconducting circuits, the method used by Google, IBM, and Rigetti. "Trapped ion" technology instead works with electrically charged atoms held in electromagnetic fields, exploiting their natural stability to reduce errors; this is the approach of IonQ and Honeywell (whose quantum business is now part of Quantinuum). A third avenue confines single electrons within tiny semiconductor structures that can be integrated with the well-established silicon technology of classical computing, a path pursued by Silicon Quantum Computing. Yet another strategy manipulates individual particles of light, or photons, with a high degree of precision; companies such as PsiQuantum are building intricate "guided light" circuits for quantum computation.


At present, there is no clear victor among these technologies, leaving open the possibility that a hybrid approach may ultimately prove to be the most successful.


Quantum computing, although still in its infancy, has already displayed substantial potential and promise in addressing some of the most formidable challenges in the realms of science and technology. With an increasing number of researchers and developers entering the field, we can anticipate further breakthroughs and innovations in the near future.




