What Is Quantum Computing?
A quantum computer exploits quantum mechanical phenomena — superposition and entanglement — to process information in ways impossible for classical computers. While a classical bit takes exclusively the value 0 or 1, a qubit can exist in a superposition of both states simultaneously.
Practically, this means a quantum computer with just 100 qubits handles a state space of 2¹⁰⁰ ≈ 1.3 × 10³⁰ dimensions — far more amplitudes than any classical memory could ever store explicitly. As physicist Richard Feynman argued in 1982, a quantum computer could simulate quantum systems without the exponential overhead faced by classical computers.
Qubit vs Bit: The Fundamental Difference
A classical bit can only be 0 or 1. A qubit can exist in the state α|0⟩ + β|1⟩ with |α|² + |β|² = 1 — a superposition of both. Measurement collapses the superposition to a single outcome: 0 with probability |α|² and 1 with probability |β|². During computation, however, quantum parallelism (interference among all the superposed amplitudes) enables exponential speedup on certain problems. Entanglement links two or more qubits so that their measurement outcomes are correlated regardless of the distance between them: the joint state cannot be described qubit by qubit.
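To make the algebra concrete, here is a minimal NumPy sketch (illustrative only; real hardware manipulates physical qubits, not arrays) of a superposition and an entangled Bell pair:

```python
# Illustrative sketch only: qubit states as state vectors in plain NumPy.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Hadamard gate: sends |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0
print(np.abs(psi) ** 2)    # [0.5 0.5]: measurement gives 0 or 1 with equal odds

# CNOT applied to (H|0>) tensor |0> produces the Bell state (|00> + |11>)/sqrt(2).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(psi, ket0)
print(np.abs(bell) ** 2)   # [0.5 0 0 0.5]: outcomes 00 and 11 only, never 01 or 10
```

The Bell-state probabilities are entanglement in numbers: the two qubits always agree when measured, yet neither outcome is determined in advance.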
Historical Overview: From Feynman to Willow
The idea of using quantum mechanics for computation was born in the 1980s. Paul Benioff proposed the quantum Turing machine in 1980, while Yuri Manin and Richard Feynman independently suggested that quantum-based hardware would be more efficient for simulating quantum systems.
Critical milestones followed: in 1994 Peter Shor published his algorithm that factors large integers in polynomial time, exponentially faster than the best known classical methods and enough to break RSA encryption; in 1996 Lov Grover presented his search algorithm offering a quadratic speedup on unstructured search; and in 1996 Seth Lloyd proved that quantum computers can simulate quantum systems without exponential cost, confirming Feynman's groundbreaking conjecture.
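Grover's quadratic speedup is easy to see in miniature. The sketch below simulates the algorithm's two steps (oracle sign-flip, then reflection about the mean) on a toy 8-entry search space with plain NumPy; the marked index and problem size are arbitrary choices for illustration:

```python
# Illustrative sketch only: Grover's search on a tiny 3-qubit state space.
import numpy as np

n = 3                      # qubits
N = 2 ** n                 # 8 database entries
marked = 5                 # the "needle" we are searching for (arbitrary choice)

state = np.full(N, 1 / np.sqrt(N))      # uniform superposition over all entries

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # optimal count ~ (pi/4) * sqrt(N)
for _ in range(iterations):
    state[marked] *= -1                 # oracle: flip the sign of the marked amplitude
    state = 2 * state.mean() - state    # diffusion: reflect every amplitude about the mean

print(np.argmax(np.abs(state) ** 2))         # 5: the marked entry now dominates
print(round(np.abs(state[marked]) ** 2, 3))  # ~0.945 success probability after 2 iterations
```

Two iterations suffice here; a classical search over 8 unsorted entries needs 4 probes on average, and the gap grows as the square root of the database size.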
Google Willow: The Error-Correction Era
In December 2024, Google unveiled Willow, a quantum computing chip with 105 superconducting qubits. The real revolution was not the qubit count, but the error correction: for the first time, adding more qubits reduced the error rate rather than increasing it — “below-threshold” error correction, the long-sought holy grail of the field.
In a computational benchmark, Willow completed in under 5 minutes a task that the fastest classical supercomputer would need an estimated 10²⁵ years to finish. Admittedly, this benchmark does not correspond to a practical problem, but it demonstrates the technology is moving in the right direction. Google aims for a fault-tolerant quantum computer within the decade.
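The "below-threshold" idea can be illustrated with the simplest error-correcting code. Willow actually uses the surface code, but a distance-d repetition code with majority voting shows the same qualitative behavior: when the physical error rate is below the code's threshold, adding qubits suppresses logical errors exponentially; above it, more qubits make things worse.

```python
# Illustrative sketch only: why "below threshold" matters, shown with a
# distance-d repetition code and majority voting (not Willow's surface code).
from math import comb

def logical_error(p, d):
    """Probability that more than half of d copies flip, defeating majority vote."""
    return sum(comb(d, k) * p**k * (1 - p)**(d - k) for k in range(d // 2 + 1, d + 1))

for p in (0.01, 0.6):   # below vs above the repetition-code threshold (50%)
    print(f"physical error {p}:",
          ", ".join(f"d={d}: {logical_error(p, d):.1e}" for d in (3, 5, 7)))
# At p=0.01, each added pair of qubits cuts the logical error by ~100x;
# at p=0.6, adding qubits makes the logical error worse.
```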
IBM: Condor, Heron, and the Roadmap
IBM has remained at the forefront of the quantum race. In December 2023, it unveiled IBM Condor with 1,121 superconducting qubits — the first quantum processor to exceed 1,000 qubits. Alongside it, IBM presented IBM Heron with 133 qubits but significantly lower error rates, designed for practical computations.
IBM's roadmap envisions a 100,000-qubit system by 2033, using a modular architecture that connects multiple quantum chips. In June 2023, IBM researchers published evidence that a quantum processor can produce accurate results on a physics problem beyond the reach of brute-force classical simulation — a significant step toward practical utility.
Other Players: D-Wave, IonQ, Microsoft, China
The quantum race involves dozens of companies and universities worldwide:
- D-Wave: Pioneer in quantum annealing with over 5,000 qubits, specialized in optimization problems. Uses adiabatic quantum computation rather than logic gates (a sketch of its native problem format follows this list).
- IonQ: Uses trapped ions with exceptional accuracy. The technology exhibits the lowest error rates in the industry but faces challenges in scaling.
- Microsoft: Investing in topological qubits based on anyons — particles that “remember” how they were manipulated, offering inherent fault tolerance. In 2025, the company announced progress on Majorana qubits.
- Rigetti Computing: Superconducting qubit startup with cloud-based quantum access via Amazon Braket.
- China (USTC): In December 2020, the Jiuzhang team demonstrated quantum advantage on a 76-photon photonic system. In 200 seconds it completed a sampling task that would take a classical supercomputer an estimated 600 million years.
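As an illustration of the optimization problems annealers target, the following sketch encodes a tiny MaxCut instance as a QUBO (quadratic unconstrained binary optimization, the native input format of annealing hardware) and solves it with classical simulated annealing standing in for the quantum machine; the graph and schedule are arbitrary toy choices:

```python
# Illustrative sketch only: a QUBO problem solved by classical simulated
# annealing, a stand-in for what quantum annealers do in hardware.
import numpy as np

rng = np.random.default_rng(0)

# MaxCut on a 4-node ring, encoded as a QUBO matrix Q (minimize x^T Q x).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
Q = np.zeros((4, 4))
for i, j in edges:
    Q[i, i] -= 1; Q[j, j] -= 1   # cutting edge (i, j) lowers the energy...
    Q[i, j] += 2                 # ...unless both endpoints land on the same side

def energy(x):
    return x @ Q @ x

x = rng.integers(0, 2, size=4)   # random initial assignment of nodes to sides
for step in range(2000):
    T = max(0.01, 1.0 - step / 2000)     # slowly decreasing temperature
    flip = rng.integers(4)
    x_new = x.copy(); x_new[flip] ^= 1   # propose a single bit flip
    dE = energy(x_new) - energy(x)
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        x = x_new                        # accept downhill moves, sometimes uphill

print(x, "cut value:", -energy(x))   # optimal ring cut is 4 (alternating sides)
```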
Quantum Machine Learning: The AI + Quantum Convergence
Quantum machine learning (QML) is one of the most promising fields at the intersection of quantum computing and AI. The core idea: since machine learning relies on linear algebra — matrices, vectors, transformations — and quantum computing inherently operates through linear algebra, the synergy is natural.
Key QML algorithms:
- HHL Algorithm (Harrow-Hassidim-Lloyd): Solves sparse, well-conditioned systems of linear equations exponentially faster — a fundamental tool for regression and optimization, albeit with significant caveats on loading the data and reading out the solution.
- Quantum Neural Networks: Hybrid circuits combining quantum gates with classical parameters for pattern recognition (a minimal training loop follows this list).
- Quantum Boltzmann Machines: Using quantum annealing (D-Wave) to train deep neural networks — already in an experimental phase.
- Quantum GANs: Generative adversarial networks leveraging quantum correlations for molecular structure generation in drug discovery.
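To show what the hybrid quantum-classical training loop looks like, here is a deliberately minimal one-qubit "quantum neural network": a classical optimizer tunes the angle of a single rotation gate, with gradients computed by the parameter-shift rule used on real hardware. The target value and learning rate are arbitrary choices, and the circuit is simulated exactly:

```python
# Illustrative sketch only: a one-parameter hybrid quantum-classical loop.
import numpy as np

def expectation_z(theta):
    """<Z> of the state RY(theta)|0> = [cos(t/2), sin(t/2)], computed exactly."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2   # P(0) - P(1) = cos(theta)

target = -1.0        # train the circuit to output |1>, i.e. <Z> = -1
theta, lr = 0.1, 0.4
for step in range(50):
    # Parameter-shift rule: d<Z>/dtheta = (<Z>(t + pi/2) - <Z>(t - pi/2)) / 2
    grad_z = (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2)) / 2
    loss_grad = 2 * (expectation_z(theta) - target) * grad_z   # chain rule, squared loss
    theta -= lr * loss_grad

print(theta, expectation_z(theta))   # theta converges toward pi, so <Z> approaches -1
```

On real devices the expectation values come from repeated circuit executions, but the outer optimization loop is exactly this classical gradient descent.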
In 2023, researchers at Gero published a hybrid quantum-classical model based on a restricted Boltzmann machine, implemented on a commercial quantum annealing platform, for generating novel pharmaceutical molecules with physicochemical properties comparable to known medicinal compounds.
Cryptography: The Quantum Threat and the Solution
Shor's algorithm is perhaps the greatest threat: a sufficiently powerful quantum computer could break the RSA, Diffie-Hellman, and elliptic-curve cryptographic protocols that today protect email, banking transactions, and government networks. Estimates vary, but breaking RSA-2048 in a reasonable timeframe is generally put at millions of physical qubits.
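The number-theoretic core of Shor's algorithm fits in a few lines. In the sketch below the period is found by brute force; that brute-force search is exactly the step a quantum computer performs exponentially faster, while everything else is classical pre- and post-processing:

```python
# Illustrative sketch only: Shor's factoring reduction, with the period
# search done classically (the quantum speedup lives entirely in that search).
from math import gcd

N, a = 15, 7                 # factor N = 15 using the coprime base a = 7

# Find the period r of f(x) = a^x mod N.
r = 1
while pow(a, r, N) != 1:
    r += 1
print("period r =", r)       # r = 4

# If r is even and a^(r/2) != -1 mod N, the factors fall out via gcd.
assert r % 2 == 0
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(p, "*", q, "=", N)     # 3 * 5 = 15
```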
The answer: post-quantum cryptography. In August 2024, NIST published the first post-quantum encryption standards: ML-KEM (formerly CRYSTALS-Kyber) for key exchange and ML-DSA (formerly CRYSTALS-Dilithium) for digital signatures, both based on lattice-based cryptography. Meanwhile, quantum cryptography (Quantum Key Distribution, QKD) uses quantum states that are unavoidably disturbed by measurement, so any eavesdropping on a key exchange can be detected.
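The logic of QKD is easiest to see in the BB84 protocol. This classical simulation (no actual photons, and no eavesdropper modeled) shows how random basis choices let two parties sift a shared secret key:

```python
# Illustrative sketch only: the BB84 key-distribution logic, simulated classically.
import secrets

n = 32
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# Bob measures each photon: the matching basis returns Alice's bit,
# the wrong basis returns a 50/50 coin flip (quantum randomness).
bob_bits = [bit if ab == bb else secrets.randbelow(2)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: bases are compared publicly, mismatched rounds are discarded.
alice_key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
bob_key   = [bit for bit, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
assert alice_key == bob_key   # ~n/2 shared secret bits
print(alice_key)              # an eavesdropper's measurements would show up as errors
```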
“Harvest Now, Decrypt Later” — The Hidden Threat
State actors and hackers are already storing encrypted data with the intent to decrypt it later, once quantum computers become powerful enough. This is why governments are rushing to adopt post-quantum standards now, before the quantum threat materializes. The U.S. government has already directed agencies, including those running national security systems, to transition to quantum-resistant algorithms by 2035.
Practical Applications: Where We Will See Results First
Pharmaceuticals and Drug Discovery
Simulating molecular interactions is an inherently quantum problem. Quantum computers can model proteins and molecular structures with accuracy impossible for classical computers. About 2% of global energy production goes to nitrogen fixation for ammonia (Haber process) — quantum simulations could dramatically increase the energy efficiency of this process.
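The algorithm most often proposed for such chemistry workloads is the variational quantum eigensolver (VQE). The toy below uses a made-up 2×2 Hamiltonian and an exactly simulated one-parameter trial state, but the structure (a quantum trial state inside a classical minimization loop) is the real pattern:

```python
# Illustrative sketch only: a toy VQE on a hypothetical two-level Hamiltonian.
import numpy as np

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])              # stand-in for a molecular Hamiltonian

def ansatz(theta):
    """Trial state RY(theta)|0>, the 'quantum' half of the hybrid loop."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    psi = ansatz(theta)
    return psi @ H @ psi                 # expectation value <psi|H|psi>

thetas = np.linspace(0, 2 * np.pi, 1000)  # classical outer loop: sweep the parameter
best = min(thetas, key=energy)
print("VQE estimate:", energy(best))                    # ~ -1.118
print("exact ground state:", np.linalg.eigvalsh(H)[0])  # -sqrt(1.25) ~ -1.118
```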
Finance and Optimization
Portfolio optimization, risk analysis, and fraud detection are ideal problems for quantum annealing. Goldman Sachs and JPMorgan Chase are already experimenting with quantum algorithms for derivatives pricing and Monte Carlo simulations.
Materials and Chemistry
Designing new materials, superconductors, batteries, and catalysts can benefit significantly. BMW and Mercedes are experimenting with quantum computers for next-generation battery development.
Logistics and Supply Chain
Route optimization problems (traveling salesman, vehicle routing) are classic NP-hard problems where quantum annealing shows promise at smaller scales.
Challenges: Why We Don't Have a Practical Quantum Computer Yet
Despite impressive announcements, significant obstacles remain:
- Decoherence: Qubits lose their quantum state within nanoseconds to seconds. Superconducting quantum computers require temperatures of 20 millikelvin — colder than outer space.
- Error Rates: Modern quantum gates exhibit error rates of ~0.1-1%, while useful fault-tolerant computation demands logical error rates many orders of magnitude lower. Each “logical” qubit therefore needs hundreds or thousands of “physical” qubits for error correction (see the estimate after this list).
- Scaling: Connecting many qubits requires extraordinarily complex electrical signals with precise timing. Superconducting systems need specialized superconducting cables made only by Japan's Coax Co. and helium-3 — a rare nuclear research byproduct.
- Algorithmic maturity: Many promising quantum algorithms have been “dequantized” — classical versions with similar complexity were found, reducing the expected advantage.
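A rough sense of the error-correction overhead in the list above comes from a widely used rule-of-thumb model for the surface code: logical error per round ≈ 0.1·(p/p_th)^((d+1)/2) at code distance d, with roughly 2d² physical qubits per logical qubit. The specific rates and target below are illustrative assumptions:

```python
# Illustrative sketch only: surface-code overhead from a rule-of-thumb model.
p, p_th = 1e-3, 1e-2   # assumed physical error rate vs the ~1% surface-code threshold
target = 1e-12         # assumed logical error rate needed for long algorithms

d = 3
while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
    d += 2             # surface-code distances are odd
print(f"code distance d = {d}: ~{2 * d * d} physical qubits per logical qubit")
# -> d = 21: ~882 physical qubits per logical qubit, the "hundreds or thousands"
#    overhead mentioned above, and why attacking RSA-2048 needs millions of qubits.
```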
NISQ vs Fault-Tolerant: Where We Stand
Physicist John Preskill introduced the term NISQ (Noisy Intermediate-Scale Quantum) in 2018 to describe the current era: quantum computers with dozens to a few hundred qubits, too noisy for fully reliable computations but interesting enough for experiments.
The transition to Fault-Tolerant Quantum Computing (FTQC) requires quantum error correction that “consumes” many physical qubits per logical qubit. Google's Willow chip is a significant step: it demonstrated that scaling can indeed reduce errors, paving the way for fault-tolerant architectures.
Researchers at Harvard, in collaboration with MIT, QuEra Computing, Caltech, and Princeton, achieved in 2023 logical quantum computations on reconfigurable atom arrays, funded by DARPA's ONISQ program — showing that error-corrected logical circuits are becoming feasible in practice.
Skepticism and Realistic Assessment
Not everyone is enthusiastic. A Nature spotlight article (2023) characterized today's quantum computers as “For now, [good for] absolutely nothing,” noting they still have no practical advantage on any real-world problem. A Communications of the ACM article (2023) concluded that only “small-data problems” in chemistry and materials have realistic chances of quantum advantage, while machine learning applications “will not achieve quantum advantage in the foreseeable future.”
Key reasons for skepticism:
- GPU accelerators improve so rapidly that they keep raising the bar quantum hardware must clear.
- “Quantum supremacy” claims are based on artificial benchmarks with no practical value.
- Loading large volumes of classical (non-quantum) data into quantum states remains a fundamental bottleneck.
- Some algorithms have been “dequantized” — equivalent classical solutions were found.
What Changes with Quantum AI by 2030
Despite the doubts, progress is accelerating:
- 2025-2026: First hybrid quantum-classical AI models in the cloud (IBM Qiskit, Amazon Braket, Google Cirq). Variational quantum eigensolvers for chemistry. First quantum-enhanced optimization applications in logistics.
- 2027-2028: Fault-tolerant logical qubits at small scale. Drug discovery pipelines with quantum advantage in molecular simulation. Microsoft topological qubits in pilot phase.
- 2029-2030: Scaling toward IBM's modular 100,000-qubit target (slated for 2033). Quantum ML models for materials science, climate modeling, genomics. Post-quantum cryptography in universal deployment. First quantum internet prototypes.
The convergence of quantum computing and AI will not replace classical computers — it will complement them. The most likely scenarios involve hybrid architectures where classical computers handle the main workload and quantum computers take on specialized tasks requiring exponential computational space — molecular simulations, optimization, cryptanalysis.
The Quantum AI era is just beginning. While it will not change everything tomorrow, the technology being developed today will become the foundation of a new computational era — one where the boundaries of possibility are redefined by the physics of the quantum world.
