Introduction
Quantum computing – once a theoretical dream – is rapidly becoming a reality, promising to solve problems that stump even today’s supercomputers. By leveraging the bizarre properties of quantum physics, quantum computers can process information in fundamentally new ways, potentially tackling tasks in minutes that might take classical machines millennia (What Is Quantum Computing? | IBM) (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post). This immense computational power could transform how we secure data, develop medicines, manage finances, route logistics, and design energy systems. Researchers and governments worldwide are investing heavily in this technology, recognizing that quantum computing may confer huge economic and security advantages to whoever masters it first (Chinese scientists are at the forefront of the quantum revolution – The Washington Post). At the same time, experts caution that we are in the early innings of a long journey – quantum computers will complement, not replace, classical computers, and practical benefits are just beginning to emerge (Quantum Computing Explained | NIST) (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum).
In this article, we delve into how quantum computing works and why it represents a paradigm shift. We will explore the current state of quantum hardware and research, and examine real-world use cases across various sectors – from the threat it poses to cybersecurity, to its potential in revolutionizing drug discovery, finance, logistics, energy, and more. We will also address the formidable challenges that remain (such as qubit instability and error correction) and consider what the future might hold. The goal is an accessible yet accurate overview of how quantum computing is reshaping industries, balancing excitement with a clear-eyed view of the road ahead.
Background
How quantum computing works: Classical computers use bits that represent either 0 or 1. Quantum computers, by contrast, use quantum bits or “qubits,” which harness quantum-mechanical phenomena to encode information as 0, 1, or a superposition of both at the same time (What Is Quantum Computing? | IBM). A qubit can exist in multiple states simultaneously – a counterintuitive property known as superposition, where the qubit effectively represents a combination of possibilities (What Is Quantum Computing? | IBM). For example, two classical bits can encode one of four possible states at a time, but two qubits in superposition can encode all four states at once (What Is Quantum Computing? | IBM). In general, N qubits can represent $2^N$ states simultaneously, giving quantum computers an exponential parallelism that grows rapidly with more qubits (What Is Quantum Computing? | IBM). This is one source of their potential power.
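For readers who like to see the state-vector picture concretely, here is a minimal NumPy sketch (purely illustrative, not how real hardware is programmed) of a qubit put into superposition and of how the number of amplitudes needed to describe n qubits grows as $2^N$:

```python
import numpy as np

# |0> basis state as a length-2 complex vector.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ ket0                      # (|0> + |1>) / sqrt(2)

# Two qubits in superposition: the joint state is a tensor (Kronecker) product,
# a vector of 2**2 = 4 amplitudes covering 00, 01, 10 and 11 at once.
two_qubits = np.kron(plus, plus)
print(two_qubits)                    # four equal amplitudes of 0.5

# The classical cost of tracking n qubits exactly grows as 2**n.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2**n:,} amplitudes")
```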
Another crucial phenomenon is entanglement, which correlates qubits in ways not possible classically. When qubits become entangled, their states are linked such that measuring one instantly affects the state of the other, no matter how far apart they are (What Is Quantum Computing? | IBM). Entangled qubits behave as a unified system – changing or measuring one will “collapse” the state of the others in a predictable way (What Is Quantum Computing? | IBM). This allows quantum algorithms to coordinate operations on many qubits at once. Quantum computers perform calculations by manipulating qubits with quantum gates – analogous to logic gates in classical circuits – to induce superposition, entanglement, and interference. Through sequences of such gates (a quantum circuit), an algorithm steers the qubits’ state toward a solution, with the final measurement yielding the result. Importantly, quantum algorithms leverage interference between probability amplitudes: certain paths reinforce the correct answers while cancelling out incorrect ones (What Is Quantum Computing? | IBM) (What Is Quantum Computing? | IBM). In effect, a quantum computer can explore a vast solution space in parallel and use interference to extract the most likely solutions.
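Continuing the toy state-vector picture, the sketch below builds the classic entangled Bell state with a Hadamard followed by a CNOT gate; sampled measurement outcomes then come out perfectly correlated (always 00 or 11, never 01 or 10). This simulates the mathematics on a laptop and is not code for a real quantum device:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],       # control = first qubit, target = second
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                       # start in |00>
state = np.kron(H, I) @ state        # superpose the first qubit
state = CNOT @ state                 # entangle the two qubits
print(np.round(state, 3))            # amplitude ~0.707 on |00> and |11>, zero elsewhere

# Sampling "measurements": outcomes are always 00 or 11.
probs = np.abs(state) ** 2
print(np.random.choice(["00", "01", "10", "11"], size=10, p=probs))
```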
To illustrate, Google’s Hartmut Neven likened classical versus quantum problem-solving to exploring a maze: a classical computer must try one path at a time, backtracking when it hits a dead end, whereas a quantum computer can “derive a bird’s-eye view of the maze” and test multiple paths simultaneously via superposition and entanglement (What Is Quantum Computing? | IBM). When designed correctly, the overlapping “quantum waves” of possible solutions interfere so that wrong paths cancel out and the correct path emerges strongly (What Is Quantum Computing? | IBM). This capability isn’t magic – it only works for specific problem types and requires carefully crafted algorithms – but where applicable it offers a radically new approach to computation.
A brief history: The concept of quantum computing was first proposed in the 1980s by physicists like Richard Feynman, who realized that simulating quantum systems might require quantum computers. In 1994, Peter Shor discovered a quantum algorithm to factor large numbers exponentially faster than any known classical method, implying that a sufficiently advanced quantum computer could break RSA encryption. This result galvanized interest in quantum computing, as it showed that purpose-built algorithms could harness quantum effects for real-world problems. Over the past two decades, scientists have developed other algorithms (for example, Grover’s algorithm for searching unsorted databases faster than classically possible) and steadily improved experimental qubits in the lab.
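Shor’s algorithm needs quantum hardware, but the number theory that turns its output into factors is ordinary classical arithmetic. The sketch below finds the period of $a^x \bmod N$ by brute force (the step a quantum computer would accelerate exponentially) and then applies the standard post-processing; the numbers are a textbook toy example:

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a**r mod N == 1 (the quantum subroutine's job)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7                  # toy example: factor 15 using base a = 7
r = find_period(a, N)         # r = 4
assert r % 2 == 0             # Shor retries with another base if r is odd
p = gcd(a ** (r // 2) - 1, N)
q = gcd(a ** (r // 2) + 1, N)
print(f"{N} = {p} x {q}")     # 15 = 3 x 5
```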
Today’s quantum computers are still fragile and relatively small in scale – typically tens or hundreds of qubits – an era often called NISQ (Noisy Intermediate-Scale Quantum) (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum). Qubits are implemented in various physical forms: superconducting circuits (used by IBM, Google), trapped ions (IonQ, Honeywell), neutral atoms (ColdQuanta, Pasqal), photons (used in quantum communication and some experimental processors), and more (What Is Quantum Computing? | IBM) (What Is Quantum Computing? | IBM). Each approach has pros and cons in terms of speed, stability, and scalability. Crucially, qubits must be isolated from external interference and kept extremely cold (in the case of superconducting qubits, a fraction of a degree above absolute zero) to preserve their quantum behavior (Quantum Computing is Real. It Will Simulate the… | Flagship Pioneering). Even then, qubits tend to decohere – lose their quantum state – within microseconds or milliseconds, introducing errors. This is why building a large, reliable quantum computer is so hard: any noise or slight disturbance can corrupt the computation. Researchers are actively developing quantum error correction techniques that spread information across multiple physical qubits to form one logical (error-resistant) qubit. However, error correction demands significant qubit overhead (often dozens or hundreds of physical qubits per logical qubit) and remains an enormous engineering challenge.
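The intuition behind spreading one logical qubit across many physical qubits can be seen in the far simpler classical repetition code sketched below: redundancy plus majority voting turns a 5% physical error rate into roughly a 0.7% logical one. Real quantum codes such as the surface code are much more involved, since they must also correct phase errors without directly measuring the data, but the basic trade of overhead for reliability is similar:

```python
import random

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]                     # one logical bit -> three physical bits

def noisy_channel(bits: list[int], p_flip: float) -> list[int]:
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits: list[int]) -> int:
    return int(sum(bits) >= 2)                 # majority vote

p, trials = 0.05, 100_000
logical_errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
# A logical error needs two or more flips, so its rate is ~3*p**2, well below p.
print("physical error rate:", p, "  logical error rate:", logical_errors / trials)
```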
In summary, quantum computing harnesses “the oddities of quantum mechanics” – superposition and entanglement – to process information in ways impossible for classical binary computers (Quantum Computing is Real. It Will Simulate the… | Flagship Pioneering). A small number of qubits can encode an astronomical amount of information in parallel, and quantum gates allow computations that effectively examine many possibilities at once (Quantum Computing is Real. It Will Simulate the… | Flagship Pioneering). But qubits are also exotic: maintaining their quantum state requires extreme conditions, and reading out a result causes the delicate superpositions to collapse to classical 0s and 1s. The field sits at the intersection of computer science, physics, and engineering, and it is “only now becoming real machines” after decades of research (Quantum Computing is Real. It Will Simulate the… | Flagship Pioneering). With that background, let’s look at where this technology stands today and how it’s starting to be applied across industries.
Current State of Quantum Computing
In recent years, quantum computing has moved from the laboratory to early practical demonstrations, though truly useful quantum computers are still on the horizon. Researchers refer to achieving a computation that no classical supercomputer could feasibly replicate as reaching quantum supremacy (or “quantum advantage”). In 2019, Google claimed the first such milestone: its 53-qubit Sycamore processor performed a carefully chosen random-number sampling calculation in 200 seconds, which Google estimated would take 10,000 years on the world’s fastest classical supercomputer (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum) (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post). This achievement was heralded as a “Wright brothers moment” – akin to the first powered flight, proving the principle even though the task itself had no practical use (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post). “Things that were only theoretical in the past are now becoming reality,” said quantum scientist Ashley Montanaro about Google’s result (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post). Caltech’s John Preskill (who coined “quantum supremacy”) called it “a remarkable achievement…a testament to the brisk pace of progress in quantum computing hardware.” (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post)
However, the Google result also came with caveats. IBM researchers argued that with clever optimizations, a classical supercomputer could actually do the same task in a couple of days, not millennia, calling the “supremacy” claim into question (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum). More importantly, that random sampling problem isn’t directly useful – it was a scientific benchmark rather than a breakthrough for industry. As a Chinese quantum physicist, Chao-Yang Lu, bluntly put it: “The current state of the art is that no experiments have demonstrated quantum advantage for practical tasks yet.” (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum) In other words, we know quantum computers can outperform classical ones on contrived problems; the race now is to achieve an advantage on practical applications that matter in the real world.
Despite the challenges, the hardware is progressing quickly. In late 2023, IBM unveiled “Condor,” a quantum processor with 1,121 superconducting qubits, the first to surpass the 1,000-qubit milestone (IBM Unveils Condor: 1,121‑Qubit Quantum Processor). This was an order-of-magnitude leap beyond previous devices like IBM’s own 433-qubit Osprey (2022) and Google’s 50+ qubit Sycamore (IBM Unveils Condor: 1,121‑Qubit Quantum Processor). Condor’s debut showed that engineering efforts to scale up qubit count are bearing fruit – IBM had to solve myriad wiring and cryogenic packaging issues to integrate over a thousand qubits in one machine (IBM Unveils Condor: 1,121‑Qubit Quantum Processor) (IBM Unveils Condor: 1,121‑Qubit Quantum Processor). A 1,121-qubit quantum computer has a theoretical state space of $2^{1121}$, an astronomically large number that no classical supercomputer could ever fully simulate (IBM Unveils Condor: 1,121‑Qubit Quantum Processor). This puts devices like Condor firmly into regimes impossible to model with brute-force classical computing, opening the door to exploring new quantum algorithms on a scale never tried before.
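A quick back-of-the-envelope calculation (assuming 16 bytes per complex amplitude) shows why such a state space defies brute-force simulation:

```python
# How much memory would a full 1,121-qubit state vector need?
amplitudes = 2 ** 1121
bytes_needed = amplitudes * 16        # assuming double-precision complex numbers
print(f"amplitudes: about 10^{len(str(amplitudes)) - 1}")
print(f"bytes:      about 10^{len(str(bytes_needed)) - 1}")
# Roughly 10^337 amplitudes and 10^338 bytes; for comparison, the observable
# universe contains on the order of 10^80 atoms.
```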
That said, qubit quality matters as much as quantity. Today’s qubits are noisy; adding more of them doesn’t help unless error rates are low. Leading platforms report qubit coherence times (how long they stay quantum) on the order of tens to hundreds of microseconds, and gate fidelities (accuracy of operations) often in the 99% range. These sound high, but complex algorithms require thousands or millions of gate operations, so even 1% error per gate is far too high – errors compound quickly. Recognizing this, companies like IBM and Google track holistic metrics like “quantum volume” (which factors qubit count and fidelity) and focus on improving error rates alongside scaling up qubits. In late 2024, Google made a notable step toward fault-tolerant computing: using its new 105-qubit “Willow” chip, the team demonstrated that increasing the size of an error-correcting surface code (from 17 to 49 to 97 physical qubits encoding one logical qubit) actually lowered the logical error rate, showing for the first time that adding qubits can beat errors rather than worsen them (Quantum processor enters unprecedented territory for error correction – Physics World) (Quantum processor enters unprecedented territory for error correction – Physics World). “This is the first time we have seen convincing, exponential error suppression as we increase the number of physical qubits,” said Google’s Quantum AI team, calling it a key proof-of-concept for scalable quantum error correction (Quantum processor enters unprecedented territory for error correction – Physics World). Although still far from the large, flawless qubit arrays needed for general-purpose algorithms, it was a crucial validation that quantum error correction “really is going to work” in the long run (Quantum processor enters unprecedented territory for error correction – Physics World).
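To see why error rates dominate the conversation, consider a crude estimate (assuming independent gate errors) of how quickly a 1% per-gate error erodes an uncorrected computation:

```python
# Success probability of an uncorrected circuit, assuming independent gate errors.
fidelity = 0.99
for gates in (100, 1_000, 10_000):
    print(f"{gates:>6} gates -> ~{fidelity ** gates:.1%} chance of running error-free")
# Roughly 37% at 100 gates, 0.004% at 1,000 and effectively zero at 10,000,
# which is why error correction matters as much as raw qubit count.
```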
Major players and global efforts: The quest for quantum computing is a worldwide effort spanning tech companies, startups, and government labs. In the United States, IBM and Google lead in superconducting qubit technology. IBM has operated a public quantum cloud service since 2016 and released a series of increasingly powerful processors (50 qubits in 2017, 127 qubits in 2021, 433 in 2022, and now 1,121) (IBM Unveils Condor: 1,121‑Qubit Quantum Processor), with a roadmap aiming for thousands of qubits in the next couple of years and eventually a million-qubit fault-tolerant machine within the decade (What Is Quantum Computing? | IBM). Google, after its 2019 supremacy experiment, has focused on reducing errors and developing better qubit architectures (like the new Willow chip). Other U.S. players include startup IonQ (using trapped ion qubits noted for stability), Rigetti Computing (also superconducting), and quantum annealing company D-Wave Systems (which has built specialized 5000+ qubit machines for optimization problems, though annealers are a different paradigm from gate-based quantum computers). Tech giants Microsoft, Intel, and Amazon are also in the mix: Microsoft is pursuing exotic topological qubits (still unproven), while Amazon Web Services offers Amazon Braket, a cloud platform giving researchers on-demand access to various companies’ quantum hardware.
Europe and the UK are heavily invested via the EU’s Quantum Technologies Flagship program (a €1 billion initiative) and national efforts. In 2021, Germany’s Fraunhofer Institute received Europe’s first IBM Quantum System One (a 27-qubit commercial quantum computer installed on German soil) as part of a partnership to spur local R&D (Quantum Use Cases in Pharma & Biotech). France, the Netherlands, and others are funding startups focusing on alternative approaches like photonic and neutral-atom qubits. The UK, for example, has companies like Oxford Quantum Circuits and a National Quantum Computing Centre under development. Canada (home of D-Wave) and Australia (home to leading quantum silicon chip research) are notable players as well.
Image: Technicians installing IBM’s Quantum System One at Germany’s Fraunhofer Institute – Europe’s first on-site commercial quantum computer (2021). Such deployments reflect growing global investment in quantum technology. (Wikimedia Commons)
Perhaps the most intense activity is in China, which has poured billions into quantum research as a national priority (Chinese scientists are at the forefront of the quantum revolution – The Washington Post). In 2020, Chinese researchers at USTC demonstrated a photonic quantum computer (Jiuzhang) that performed a specialized computation (boson sampling) far beyond the reach of classical supercomputers, with an upgraded 2021 version claiming a speedup of roughly $10^{24}$ over classical methods (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum) – another form of quantum supremacy, though for a niche problem. China has also developed superconducting processors in the Zuchongzhi series (66 qubits and beyond) achieving quantum advantage on certain random circuit tasks (IBM Unveils Condor: 1,121‑Qubit Quantum Processor). Beyond computing, China leads in quantum communication (famous for launching the Micius quantum satellite and building a nationwide fiber network for quantum key distribution). This broad quantum push has raised concern in the West about falling behind; one report noted China was outranking the U.S. in quantum-related patent filings (especially in communications), although the U.S. still led in quantum computing patents thanks to IBM, Google, and others (Chinese scientists are at the forefront of the quantum revolution – The Washington Post). In response, U.S. funding for quantum science has increased (e.g. the National Quantum Initiative Act of 2018, and ongoing investments through DOE and NSF), and allied countries are coordinating on quantum R&D and setting up talent programs.
In summary, the state of quantum computing in 2025 is one of rapid progress with significant caveats. We now have machines with 50-1000+ qubits available on the cloud, and early demonstrations of quantum advantage in controlled settings (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post). Industry involvement is growing: IBM’s Quantum Network has over 200 member organizations experimenting with quantum solutions, and a whole ecosystem of software frameworks (like IBM’s Qiskit or Google’s Cirq) and error-mitigation techniques has emerged. Yet, these devices are still error-prone and limited in what they can practically do. No one has yet cracked a drug discovery problem or defeated encryption with a quantum computer. John Preskill has dubbed the current era “NISQ” – valuable for research and certain heuristics, but not powerful enough for revolutionary applications like breaking RSA or simulating complex chemistry exactly. Achieving those will require fault-tolerant quantum computers with many thousands of high-quality qubits and full error correction, which experts believe may be a decade or more away.
What keeps interest high is the tantalizing evidence that quantum computers will eventually work as hoped. As NIST physicist Andrew Wilson explains, entangled quantum systems “have no independent existence” – a connected quantum computer essentially functions as one giant organism tackling a problem (Quantum Computing Explained | NIST). This holistic computational power, once harnessed at scale, could enable feats like simulating molecular interactions for new drugs or breaking currently unbreakable codes. In the next sections, we look at how various industries are already exploring this potential, even as the technology remains nascent.
Use Cases Across Sectors
Realizing practical quantum computing will be a gradual process – we won’t wake up one day to find all our computers replaced by quantum ones. Instead, as the hardware improves, we expect to see quantum co-processors tackling specialized tasks that are intractable for classical computers. Many industries have begun experimenting with quantum algorithms to assess potential advantages in their most complex problems. Below, we examine use cases (existing pilots and future prospects) in several key sectors: cybersecurity, pharmaceuticals, finance, logistics, and energy. In each case, we highlight what quantum computing promises, what has been achieved so far, and how organizations are preparing for the quantum era.
Cybersecurity
Perhaps the most widely discussed impact of quantum computing is on cybersecurity – specifically, on the encryption that secures our digital world. Modern public-key cryptographic schemes (like RSA and ECC) rely on mathematical problems like integer factorization and discrete logarithms, which are effectively impossible for classical computers to solve if large key sizes are used. However, quantum algorithms threaten to break these schemes. Peter Shor’s groundbreaking 1990s algorithm showed that a quantum computer could factor large numbers exponentially faster than any known classical method. In theory, a full-scale quantum computer running Shor’s algorithm could crack RSA-2048 encryption (commonly used for secure websites, emails, etc.) in a matter of hours or days, defeating the security of much of today’s internet communications.
This looming threat is taken so seriously that governments and standards bodies are already acting. “Future quantum computers may have the ability to break some of today’s most common forms of encryption,” warned a 2022 U.S. White House memorandum, noting that even though such a quantum computer does not exist yet, we must prepare and mitigate the risks now (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House). Data that is confidential for long periods (think state secrets, health records, financial data) could be intercepted today and decrypted years later once quantum capabilities are available – a strategy known as “harvest now, decrypt later.” To counter this, the U.S. National Institute of Standards and Technology (NIST) has been running a multi-year program to develop post-quantum cryptography (PQC) – encryption algorithms that can run on classical computers but are designed to resist quantum attacks. In 2022, NIST announced the first PQC algorithms selected for standardization (for key exchange and digital signatures) (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House) (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House). Governments are mandating a transition: the U.S. government has ordered its agencies to inventory and upgrade their cryptography to PQC over the next few years (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House), and allied countries are following suit. This is a massive undertaking (an estimate put U.S. federal agency costs at $7+ billion to transition) but crucial for long-term data security.
On the flip side, quantum technology also offers new tools to enhance security. One example is quantum key distribution (QKD) – a method to share encryption keys with security guaranteed by quantum physics. QKD uses quantum particles (typically photons) transmitted over fiber optics or even satellites; if an eavesdropper tries to intercept the key, the quantum state is disturbed and the intrusion is detected. QKD networks are operational in some regions (banks in Switzerland have used QKD for secure links, and China built a 2,000-km QKD network between Beijing and Shanghai). However, QKD requires special hardware and is distance-limited, so it’s currently used in niche scenarios. A more near-term benefit is quantum random number generators, which use quantum processes to produce truly unpredictable numbers for cryptographic keys, improving on pseudo-random algorithms.
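The logic of QKD can be illustrated with a toy BB84-style simulation. The sketch below ignores noise, eavesdroppers, and the error-estimation and privacy-amplification steps a real system needs; it only shows how matching measurement bases yield a shared key:

```python
import random

n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # random preparation bases
bob_bases   = [random.choice("+x") for _ in range(n)]   # random measurement bases

# If Bob's basis matches Alice's he reads her bit; otherwise his outcome is random.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Alice and Bob publicly compare bases (never the bits) and keep matching positions.
key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print("shared key bits:", key)
# An eavesdropper measuring in the wrong basis disturbs the states, so comparing a
# sample of the key reveals interception - the core security argument of QKD.
```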
The net effect on cybersecurity is that quantum computing is both a threat and an enabler. In the coming 5-10 years, it’s unlikely a quantum computer will be large enough to break modern encryption – estimates vary, but some experts project that a quantum computer with a few thousand logical (error-corrected) qubits could factor RSA-2048, which might be achievable by the 2030s. Governments aren’t waiting to find out; they’re already starting the migration to quantum-resistant crypto (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House) (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House). Companies are advised to do the same, especially those handling sensitive data. This has spawned an industry of PQC solutions and hybrid encryption methods to ease the transition.
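When organizations weigh how urgently to migrate, a rule of thumb often cited in the PQC community (Mosca's inequality) compares three timelines: how long data must stay secret, how long migration takes, and how far away a cryptographically relevant quantum computer might be. A minimal sketch, with made-up planning numbers purely for illustration:

```python
def harvest_now_decrypt_later_risk(shelf_life_years: float,
                                   migration_years: float,
                                   years_to_quantum: float) -> bool:
    """True if data encrypted today could still need secrecy by the time quantum
    decryption becomes feasible (Mosca's inequality: x + y > z)."""
    return shelf_life_years + migration_years > years_to_quantum

# Hypothetical numbers, for illustration only.
print(harvest_now_decrypt_later_risk(15, 5, 12))   # True  -> at risk, migrate now
print(harvest_now_decrypt_later_risk(2, 3, 12))    # False -> lower urgency
```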
Meanwhile, in areas like network security, authentication, and blockchain, researchers are examining both vulnerabilities and opportunities. For instance, Bitcoin’s elliptic curve signatures could theoretically be forged by a quantum attacker, which has led to discussions about upgrading blockchain protocols in the future. Cybersecurity firms and cryptographers are working on quantum-safe VPNs, secure boot protocols, and other measures to ensure that when quantum computing arrives at scale, our critical systems remain secure.
In summary, quantum computing’s biggest impact on cybersecurity will be the necessity to evolve our cryptographic foundations. This is often described as the Y2K of encryption – except we know it’s coming and must act years in advance. The upside is that by transitioning to post-quantum cryptography (and possibly leveraging quantum tech like QKD where appropriate), we can stay ahead of adversaries. The U.S. administration stated it plainly: “Though a quantum computer powerful enough to break current cryptography does not yet exist, [we are] preparing for and mitigating the risks…posed by a potential future quantum computer.” (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House) Organizations that delay may find, a decade from now, that their data vaults can be opened like tin cans. The wise course is to upgrade defenses before the quantum hackers arrive.
Pharmaceuticals (Drug Discovery)
The pharmaceutical and biotechnology sector is one of the most excited about quantum computing’s potential, because so many challenges in drug discovery and materials science are fundamentally quantum-mechanical problems. Molecules and chemical reactions are quantum systems – electrons and atoms following the rules of quantum physics – yet today we design drugs and materials using imperfect approximations on classical computers. Despite huge advances in computational chemistry, classical simulation of complex molecules (like proteins or new drug compounds) is extraordinarily demanding. Exact simulation of a molecule’s behavior would require computing the quantum wavefunction of perhaps thousands of electrons – a task that blows up exponentially and becomes unmanageable even for molecules much smaller than a simple protein.
Quantum computing offers a way to simulate chemistry and materials with much higher accuracy, by natively processing quantum states. In principle, a quantum computer with enough qubits and proper algorithms could model the electronic structure of molecules exactly, enabling “in silico” experiments for drug discovery that are impossible today. As one industry analysis noted, “medicine is an information science dealing with inherently quantum systems” – molecules – “and thus a quantum computer can, in principle, model these with far greater fidelity.” (Quantum Use Cases in Pharma & Biotech) This could dramatically accelerate R&D: researchers might predict a drug’s efficacy or toxicity before synthesizing it, design optimal drug candidates for a given target, or discover new materials for medical use (e.g. better MRI contrast agents or more efficient biotech processes). Quantum computers could also analyze huge biological datasets (genomics, proteomics) more efficiently via quantum machine learning, potentially finding patterns that lead to personalized medicine (Quantum Use Cases in Pharma & Biotech).
While this vision is still years out, early progress has been encouraging. Even today’s small quantum processors have been used to simulate tiny molecules and simple reactions as proofs of concept. In 2020, Google used a quantum processor to simulate aspects of a hydrogen-based chemical reaction, an achievement that “spurred a wave of investment and experimentation in pharma-focused quantum computing” (Quantum Use Cases in Pharma & Biotech). IBM similarly demonstrated the simulation of modest molecules (like the energy surface of BeH₂) on its quantum hardware. These initial quantum chemistry experiments are rudimentary – they handle only a few electrons with noticeable error – but they verify that quantum computers can encode molecular problems. With each generation of hardware, the size of chemical systems simulable by quantum methods increases. One metric is “quantum volume” which IBM has steadily increased, indicating more complex computations (like larger molecular orbitals) are becoming feasible on quantum hardware.
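Most of these early chemistry demonstrations rest on variational algorithms such as VQE: a parameterized trial state is prepared, the molecule's energy is estimated, and a classical optimizer adjusts the parameters to push the energy down. The toy sketch below captures only that variational loop, using a made-up 2x2 Hamiltonian and a classical vector in place of real qubits:

```python
import numpy as np

H = np.array([[-1.05,  0.39],
              [ 0.39, -0.65]])        # stand-in "molecular" Hamiltonian (made up)

def energy(theta: float) -> float:
    psi = np.array([np.cos(theta), np.sin(theta)])   # one-parameter trial state
    return float(psi @ H @ psi)                      # expectation value <psi|H|psi>

# In VQE a quantum processor prepares psi and estimates the energy; the parameter
# search stays classical. Here a simple scan stands in for the optimizer.
thetas = np.linspace(0, np.pi, 1001)
best = min(thetas, key=energy)
print("variational ground-state energy:", round(energy(best), 4))
print("exact ground-state energy:      ", round(float(np.linalg.eigvalsh(H)[0]), 4))
```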
Pharma companies and research labs are taking notice and investing now. Major drug companies have formed partnerships with quantum computing firms or started in-house teams to explore use cases. For example, in 2021 Boehringer Ingelheim (a German pharma giant) partnered with Google Quantum AI to investigate how quantum computers could speed up drug design and molecular dynamics simulations (Quantum Use Cases in Pharma & Biotech). Merck KGaA in Germany joined a UK consortium with Oxford University and startup SEEQC to build a quantum computer tailored for drug development (Quantum Use Cases in Pharma & Biotech). Roche, Johnson & Johnson, GSK, and others have active quantum chemistry research programs or collaborations. A consortium called QuPharm was formed by major biopharma firms to jointly identify and develop pre-competitive quantum use cases in pharma, from target identification to clinical trial optimization (Quantum Use Cases in Pharma & Biotech). On the tech side, IBM launched a 10-year Quantum Accelerator with Cleveland Clinic, aiming to apply quantum computing (and AI) to biomedical problems like drug discovery and genomics (Quantum Use Cases in Pharma & Biotech). These collaborations give pharmaceutical researchers access to early quantum hardware via cloud platforms (IBM Quantum, Azure Quantum, etc.) and help train a new generation of quantum-aware computational chemists.
Concrete examples of quantum use cases in pharma/biotech include:
- Drug candidate screening: Quantum algorithms (like Variational Quantum Eigensolver, VQE) can compute molecular binding energies or reaction rates more accurately for candidate drug molecules binding to a target protein. This could narrow down leads faster than classical docking simulations. Even a small improvement in simulation fidelity can save significant time and cost in drug development.
- Protein folding and dynamics: The problem of how a protein folds into its 3D shape (critical for function) is computationally intense. Quantum computers might eventually tackle aspects of protein folding or simulate protein-ligand interactions at atomic detail, aiding structure-based drug design.
- Optimization in clinical trials: Beyond chemistry, quantum and quantum-inspired optimization algorithms could optimize complex trial parameters or supply chains for pharma. For instance, determining optimal trial site allocations or patient cohorts might be framed as large combinatorial optimizations where quantum algorithms (or annealers like D-Wave’s) provide an advantage (Quantum Use Cases in Pharma & Biotech).
- Design of new materials for healthcare: e.g. better polymers for drug delivery, improved catalysts for drug synthesis, or quantum simulation of novel biomaterials.
Notably, quantum annealers have already been tested on certain biomedical problems. D-Wave’s quantum annealing machines, which specialize in solving optimization problems, have shown promising results in molecular similarity analysis and protein design in collaboration with biotech firms (Quantum Use Cases in Pharma & Biotech). While annealers don’t run gate-based algorithms, they can sometimes handle specific tasks like identifying low-energy configurations of molecules or aligning genetic sequences.
For now, limitations of hardware mean that any quantum advantage in pharma is likely hybrid – a quantum computer working in tandem with classical HPC. A common approach is to have a classical computer handle parts of a simulation and offload the quantum-hard piece (like solving the quantum chemistry of a drug binding pocket) to a quantum processor. In fact, a recent breakthrough by Microsoft and Quantinuum used a hybrid workflow (quantum + classical) to simulate a chemical catalyst, hinting at how the first practical quantum chemistry applications will be achieved (Microsoft, Quantinuum Use Hybrid Workflow to Simulate Catalyst).
Pharma executives are tempering expectations to avoid hype, but optimism is high. Industry estimates suggest that life sciences and chemistry could realize over $1.3 trillion in value by 2035 from quantum technologies (Quantum Use Cases in Pharma & Biotech) if breakthroughs occur. This figure reflects speeding up R&D timelines, reducing failure rates of drug candidates, and discovering therapies that would otherwise be missed. As a concrete sign of interest, leading pharma companies have filed dozens of patents on quantum computing methods for drug design in just the last two years (Quantum Use Cases in Pharma & Biotech).
It’s important to stress that we are still in early days – no disease has been cured via a quantum computer, nor has any drug been discovered purely by quantum simulation. But the groundwork is being laid. A senior researcher at a pharma-quantum partnership noted that even getting a slight edge in molecular simulation today and validating it against lab results can justify the effort, as it paves the way for bigger gains later. The CEO of a quantum computing startup working on drug discovery quipped that we should think of current quantum computers like the first microscopes: initially very crude, but opening a window to a new level of insight. As hardware improves, that window will widen. If and when useful quantum advantage is achieved for chemistry – for example, accurately predicting a complex reaction or binding affinity that classical methods get wrong – it will mark a paradigm shift in how drugs and materials are developed. Companies that have built expertise early will be ready to reap the benefits, while those that ignore the technology risk being left behind in the next decade.
Finance
The finance sector, with its complex mathematical models and optimization problems, is another arena that stands to be reshaped by quantum computing. Banks, hedge funds, and insurance firms deal with vast computational challenges: pricing complex derivatives, optimizing investment portfolios, detecting fraud patterns in transactions, and managing risk through extensive simulations. Many of these tasks push classical computing to its limits, especially when real-time decisions or very high accuracy is needed. That’s why big financial players have been among the early adopters of quantum computing research, keen to seize any computational edge that could translate into profit or reduced risk.
Several promising use cases in finance have emerged for quantum algorithms:
- Portfolio optimization: Determining the optimal allocation of assets in a portfolio under constraints (risk, return, regulatory limits) is a difficult optimization (often NP-hard), especially as the number of assets grows. Quantum computers (or quantum-inspired algorithms) can potentially explore the enormous solution space more effectively (a toy formulation of this kind of problem is sketched after this list). For example, JPMorgan Chase has been examining how quantum algorithms could solve portfolio optimization problems faster or find better optima than classical heuristics (Solving quantum linear systems on hardware for portfolio optimization). They’ve developed quantum implementations of techniques like Monte Carlo linear systems solvers and the Quantum Approximate Optimization Algorithm (QAOA) for this purpose (Accelerating Quantum Optimization Research by Algorithm-Specific …). Early results show that even current quantum processors or simulators can handle small portfolio instances and confirm the approach, with the expectation that larger quantum machines will tackle real portfolios with hundreds of assets.
- Risk analysis and Monte Carlo simulations: Financial institutions rely on Monte Carlo simulations for tasks like calculating the Value-at-Risk of portfolios, option pricing, and forecasting market scenarios. These involve simulating thousands or millions of random paths for asset prices. Quantum algorithms can accelerate Monte Carlo by using amplitude amplification (a quantum technique related to Grover’s algorithm) to quadratically speed up the convergence of simulations. In 2021, Goldman Sachs in collaboration with quantum startup QC Ware demonstrated a proof-of-concept of a quantum accelerated Monte Carlo simulation on actual hardware (Goldman Sachs, QC Ware and IonQ Demonstrate Quantum Algorithms Proof-of-Concept That Could Revolutionize Financial Services, Other Industries) (Goldman Sachs, QC Ware and IonQ Demonstrate Quantum Algorithms Proof-of-Concept That Could Revolutionize Financial Services, Other Industries). Using IonQ’s trapped-ion quantum computer, they implemented a simplified version of their algorithm for option pricing – it showed the expected speedup in principle, although the hardware scale was tiny. Goldman’s head of quantum research, William Zeng, noted this as a step toward “evaluating risk and simulating prices for financial instruments at far greater speeds than today” (Goldman Sachs, QC Ware and IonQ Demonstrate Quantum Algorithms Proof-of-Concept That Could Revolutionize Financial Services, Other Industries) (Goldman Sachs, QC Ware and IonQ Demonstrate Quantum Algorithms Proof-of-Concept That Could Revolutionize Financial Services, Other Industries). The demonstration was modest (pricing a very simple derivative), but it proved that as quantum hardware improves, such algorithms could materially speed up risk calculations that currently take extensive computing time.
- Cryptography and security: Banks are also concerned about the flip side of quantum – the threat to the cryptography underlying secure transactions (as discussed in the cybersecurity section). Many banks are already experimenting with quantum key distribution (QKD) for ultra-secure communication between data centers. For instance, HSBC has integrated QKD to secure some financial data transmissions (JPMorgan leads quantum computing arms race). Additionally, banks want to ensure their encryption and customer data remain safe in a post-quantum world; thus, many are starting to implement post-quantum cryptography for critical systems, often in partnership with government cybersecurity initiatives.
- Trading optimization and machine learning: Some firms are exploring quantum algorithms for market prediction and trading strategies. Quantum machine learning algorithms might be applied to analyze market data for subtle patterns or to optimize trading paths (though this is highly experimental). Another angle is optimizing operational processes, like settlement networks or ATM cash management, which involve complex logistics similar to supply chain problems (where quantum optimization might help).
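As a concrete illustration of the portfolio item above, the sketch below casts a tiny asset-selection problem in the binary (QUBO-style) form that quantum annealers and QAOA target: pick exactly k assets, rewarding expected return and penalizing covariance risk. All numbers are invented, and brute force stands in for the quantum solver, which only becomes interesting at scales brute force cannot reach:

```python
import itertools

returns = [0.08, 0.12, 0.10, 0.07]            # expected returns (made up)
cov = [[0.10, 0.02, 0.01, 0.00],              # covariance matrix (made up)
       [0.02, 0.12, 0.03, 0.01],
       [0.01, 0.03, 0.09, 0.02],
       [0.00, 0.01, 0.02, 0.08]]
k, risk_aversion, penalty = 2, 0.5, 10.0      # hold exactly k assets

def qubo_objective(x):
    ret    = sum(r * xi for r, xi in zip(returns, x))
    risk   = sum(cov[i][j] * x[i] * x[j] for i in range(4) for j in range(4))
    budget = (sum(x) - k) ** 2                # soft constraint on portfolio size
    return -ret + risk_aversion * risk + penalty * budget   # minimize

best = min(itertools.product([0, 1], repeat=4), key=qubo_objective)
print("selected assets:", [i for i, xi in enumerate(best) if xi])
```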
Notably, a benchmark report in 2025 found that nearly 80% of major global banks are now engaged in quantum computing research or partnerships (JPMorgan leads quantum computing arms race). JPMorgan Chase stands out as a leader – it reportedly accounts for two-thirds of all job postings in quantum computing among banks and over half of the research publications from the banking sector (JPMorgan leads quantum computing arms race). JPMorgan has a dedicated quantum team that’s already yielding results, using “quantum-inspired” algorithms (i.e. algorithms inspired by quantum techniques but runnable on classical hardware) to gain improvements in areas like portfolio optimization and fraud detection today (JPMorgan leads quantum computing arms race). European banks are also prominent: for example, Spain’s BBVA and UK’s Barclays have run pilots with D-Wave’s annealers on trading optimizations, and Italy’s Intesa Sanpaolo is exploring quantum methods for credit scoring and option pricing (JPMorgan leads quantum computing arms race). The consultancy McKinsey estimates that by 2035, use cases of quantum computing in finance could generate up to $600+ billion in value (through improved revenues and cost savings) (JPMorgan leads quantum computing arms race).
Financial leaders see quantum as a more focused tool than, say, AI. “Unlike AI, which is being embedded across every part of a bank – quantum will transform a specific subset of use cases. But where it applies, the impact will be staggering,” says Alexandra Mousavizadeh, co-CEO of Evident, a firm tracking bank adoption of quantum (JPMorgan leads quantum computing arms race). “It will completely transform areas such as portfolio optimization, credit scoring and fraud detection far beyond today’s computing capabilities.” (JPMorgan leads quantum computing arms race) In practical terms, a successful quantum algorithm could mean better investment returns with lower risk, or detecting a fraudulent transaction that might have slipped through conventional systems – outcomes with direct financial and reputational implications.
Looking ahead, most banks acknowledge that quantum advantage in finance is a few years away – but they are investing now to be ready. They’re training quantum teams (the number of quantum professionals in banking jumped 10% in a recent 6-month period (JPMorgan leads quantum computing arms race)) and collaborating with quantum startups and academic groups. JPMorgan’s CEO Jamie Dimon has mentioned quantum computing in annual letters as a key long-term technology. Governments are encouraging public-private partnerships, recognizing that a quantum-equipped financial sector is a national competitive advantage.
One interesting intersection is quantum computing and AI in finance. As AI models (like deep neural networks for market prediction) grow more complex, quantum computers might eventually aid in training or executing these models faster via quantum machine learning techniques. We’re not there yet, but research is ongoing into quantum algorithms for clustering financial data or improving reinforcement learning for trading strategies.
In summary, finance sees quantum computing as a strategic future tool – likely invisible to consumers, but working behind the scenes to make financial services more efficient and secure. The first toeholds will probably be in speeding up computations that are already done (risk, pricing, optimization), turning overnight batch jobs into real-time analytics, for example. As hardware matures, we may then see qualitatively new capabilities, like simulations that account for market uncertainties in ways classical computers simply cannot, giving rise to more robust financial models. The competitive nature of finance means that banks are racing each other – and the clock – to master this technology. Those who adopt quantum solutions early could gain an edge in profitability or risk management, while laggards might find themselves at a disadvantage. As one report concluded, “While quantum computing is still years away from full-scale deployment, banks that fail to prepare could face both security risks and missed opportunities.” (JPMorgan leads quantum computing arms race)
Logistics (Supply Chain & Transportation)
Global logistics – the art of moving goods and people efficiently – presents a labyrinth of optimization problems that could benefit tremendously from quantum computing. From routing delivery trucks through busy cities, to scheduling flights and shipping containers, to optimizing warehouse operations, the logistics sector struggles with so-called “NP-hard” problems (like the famous Traveling Salesman Problem) that become unwieldy as their size grows. Even the best classical algorithms often resort to heuristics and approximations for these complex combinatorial puzzles. Quantum computing, with its ability to explore many possibilities in parallel, holds promise for finding better solutions or speeding up computation for logistics challenges.
Route optimization is a prime example. Consider a delivery company like DHL or FedEx trying to compute the most efficient routes for thousands of packages each day – factoring in traffic, time windows for deliveries, and vehicle capacities. This is an enormous computation. A quantum computer could, in theory, evaluate many route combinations simultaneously and exploit interference to highlight the optimal ones. In practice, researchers have already tested quantum approaches on smaller routing problems. One landmark demonstration was by Volkswagen in 2019: they used a D-Wave quantum annealer to optimize the routes of buses in Lisbon, Portugal, in a pilot project (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things). Nine buses serving 26 stops were routed in near-real-time using a quantum algorithm, with the goal of minimizing congestion for commuters (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things). The system successfully calculated the fastest routes for each bus, adjusting to traffic conditions. While nine buses is a modest number, the world’s first live quantum traffic optimization proved out the concept. Volkswagen noted that if fully scaled, the approach could be applied to any city with fleets of any size, improving public transport efficiency and, by extension, last-mile logistics for deliveries (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things).
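The combinatorics behind these routing problems are easy to demonstrate. The sketch below counts the distinct round trips for a growing number of stops and then brute-forces a tiny six-stop instance; real fleets operate at scales where exhaustive search is hopeless, which is exactly the regime quantum and quantum-inspired optimizers aim at. Coordinates are invented:

```python
import itertools
from math import dist, factorial

# The number of distinct round trips through n stops grows factorially.
for n in (5, 10, 15, 20):
    print(f"{n:>2} stops -> {factorial(n - 1) // 2:,} possible tours")

stops = [(0, 0), (2, 1), (5, 2), (6, 6), (1, 5), (3, 3)]   # depot listed first

def tour_length(order):
    path = [stops[0]] + [stops[i] for i in order] + [stops[0]]
    return sum(dist(a, b) for a, b in zip(path, path[1:]))

best = min(itertools.permutations(range(1, len(stops))), key=tour_length)
print("best tour:", (0, *best, 0), "length:", round(tour_length(best), 2))
```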
Another optimization is supply chain network design – deciding how to allocate inventories, which distribution center should serve which store, etc. Companies like UPS and DHL are exploring quantum algorithms for supply chain resilience. Quantum computers could help dynamically re-route shipments when disruptions occur (like a port closure or natural disaster) by quickly recalculating optimal shipping lanes or reallocating inventory. DHL publicly stated that quantum computing is an “exciting development for the logistics industry” because it allows solving the perennial “most efficient route between multiple nodes” problem in complex environments (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things). DHL has experimented with quantum-inspired algorithms in warehouse packing (optimizing how to pack parcels to maximize cargo space) and found improvements over classical methods (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things). A quantum algorithm can treat the packing of objects as an energy minimization problem, akin to how particles settle in a low-energy configuration, and potentially find arrangements that use space more efficiently than human-devised rules.
Vehicle routing extends beyond delivery vans to trucks, cargo ships, and airplanes. Airlines could use quantum computing to optimize flight schedules and crew assignments – problems that involve juggling thousands of variables (crews, aircraft, maintenance windows, airport slots). Likewise, ride-sharing companies (Uber, Lyft) have real-time routing puzzles that could benefit from quantum speedups for immediate decisions.
Quantum annealers are particularly suited to these optimization tasks. D-Wave’s quantum annealer has been used in a project with FedEx to explore more efficient ways to load cargo planes and route packages, according to media reports. While specifics are often kept confidential, it’s known that logistics giant UPS is an investor in D-Wave and has been testing its tech. Mercedes-Benz and BMW have also looked at quantum computing to optimize production logistics and supply chains for car manufacturing.
In the maritime shipping domain, IBM and ExxonMobil researchers collaborated on using quantum algorithms to optimize maritime route planning for liquefied natural gas (LNG) carriers (ExxonMobil | IBM). They treated the scheduling of shipments and routing of tankers as a complex optimization and managed to model it on a quantum device. “By partnering with IBM Quantum, our aim is to level-up our ability to tackle more complex optimizations,” explained Vijay Swarup of ExxonMobil regarding this effort (ExxonMobil | IBM). The energy industry’s shipping logistics can thus overlap with the kind of supply chain problems seen in retail or manufacturing.
Despite progress, no one is yet using quantum computers in day-to-day logistics operations – current machines are simply not powerful enough to outperform classical optimization software on large instances. But the potential gains from even a small improvement are huge in this low-margin industry. A few percentage points better route efficiency can save millions in fuel and labor costs and reduce emissions (important as companies strive for greener logistics). For example, if a quantum solution improved route efficiency such that delivery trucks drove 10% fewer miles, that’s a 10% fuel saving – a big win for both profitability and sustainability.
Recognizing this, companies are preparing. Airbus held a global Quantum Computing Challenge a couple of years ago, soliciting academic solutions for airline route optimization and aircraft design problems using quantum algorithms. Maersk, the world’s largest shipping line, has a quantum research agreement with IBM to examine port optimization and vessel scheduling. And logistics software providers (the companies whose software optimizes supply chains for retailers, etc.) are partnering with quantum startups to incorporate quantum solvers into their toolkits as soon as the hardware is capable.
It’s also worth mentioning that improved optimization isn’t the only benefit – quantum-secure communication can be important in logistics too. DHL pointed out that while quantum methods can create “unhackable” communications within a supply chain (using QKD to secure data between logistics hubs), there’s also the risk of malicious actors using quantum computers to attack supply chain systems (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things). So logistics companies, like banks, have to be quantum-aware on security.
Ultimately, logistics might see a two-phase adoption: first quantum-inspired algorithms running on classical hardware (some of which are already in use, applying quantum principles to improve current solutions), and later true quantum optimizations once hardware is ready. The roadmap could be similar to what happened with machine learning – initial skepticism, followed by pilot projects, and eventually widespread use when the value became undeniable. A DHL study suggested that within the next decade, quantum computing could transform logistics, but also cautioned that “widespread usage is still years away, as questions remain over the highly touted capabilities.” (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things) That seems apt: quantum logistics is coming, but it will require more mature tech.
The takeaway is that global commerce runs on solving giant puzzles – and quantum computing is a new kind of puzzle-solving engine. Whether it’s mapping city traffic or orchestrating a global supply web, quantum techniques promise more optimal answers. We’ve already seen glimpses: Volkswagen’s buses avoiding jams with quantum routing (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things), or a DHL lab packing more boxes per crate via a quantum algorithm. As those lab tests turn into real deployments, we may enjoy faster deliveries, less traffic, and leaner supply chains that can adapt on the fly. The packages on your doorstep could, indirectly, have a quantum computer to thank.
Energy
The energy sector encompasses everything from electricity grids and renewable energy integration to oil and gas exploration and new material development for batteries and solar panels. It’s a sector both highly computational and critically important to society’s future – making it a prime beneficiary of advances in computing like quantum. There are several fronts on which quantum computing could reshape energy industries:
1. Power grid optimization: Modern electrical grids are extremely complex to operate. Grid operators must constantly balance supply and demand, routing power from plants (or solar/wind farms) to consumers while minimizing losses and avoiding overloads. The rise of renewable energy, which is intermittent, plus new loads like electric vehicles, has made grid management even more dynamic and complex. Quantum algorithms (like quantum annealing or QAOA) can potentially solve optimization problems in grid operations more efficiently. For example, determining the optimal configuration of capacitor banks or flow on transmission lines to minimize power loss is a combinatorial optimization that some studies have mapped to quantum formulations (Quantum Algorithms Will Optimize Power Grid Efficiency). E.ON, one of Europe’s largest energy utilities, announced in 2024 that it’s working with IBM to apply quantum computing to manage the complexity of its energy grid as renewable inputs grow (EON Taps Quantum Computing To Optimize Energy Grid Complexity). They developed a prototype quantum algorithm for managing weather-related risk in energy pricing and grid balancing, which in simulations could outperform classical methods when run on a sufficiently advanced quantum computer (EON Taps Quantum Computing To Optimize Energy Grid Complexity) (EON Taps Quantum Computing To Optimize Energy Grid Complexity). While hardware isn’t there yet, E.ON expects that in a few years, quantum computations could help deliver more stable energy prices and efficient grid management as they incorporate solar and wind power (EON Taps Quantum Computing To Optimize Energy Grid Complexity). The U.S. Department of Energy is similarly exploring quantum approaches for grid optimization through programs like ARPA-E’s QC3 initiative, targeting improvements in areas like novel superconductors for power transmission (which involves materials design, another quantum use case) (Quantum Computing Use Cases in Materials & Chemicals).
2. Renewable energy and storage materials: The transition to clean energy depends on discovering new materials – for better batteries, better solar cells, catalysts for efficient fuel production, etc. Quantum computing can significantly aid materials science research. For instance, finding a catalyst that can speed up hydrogen production or carbon capture without using rare expensive metals is a huge challenge. Quantum simulations could allow scientists to evaluate many candidate catalyst materials at the atomic level. A partnership in the UK between quantum software company Riverlane and chemicals firm Johnson Matthey is working on quantum methods to design improved catalysts (Johnson Matthey is a leader in catalytic converters, fuel cells, etc.) (Quantum Computing Use Cases in Materials & Chemicals). Their work encodes complex chemical interactions into quantum circuits to predict catalyst performance, with the vision of discovering new catalysts (like ones that enable ammonia production at low pressure or non-platinum fuel cell catalysts) once quantum hardware is mature (Quantum Computing Use Cases in Materials & Chemicals) (Quantum Computing Use Cases in Materials & Chemicals). Similarly, quantum computers could help identify new battery electrode materials with higher capacity or faster charging by simulating different compounds’ electrochemical properties more precisely than classical models.
A concrete example is the case of ammonia synthesis (for fertilizer). The Haber-Bosch process used today is very energy-intensive. Researchers are investigating biological catalysts (enzymes like nitrogenase) that produce ammonia at ambient conditions. Simulating the enzyme’s active site – a complex metal cluster – is a daunting quantum chemistry problem. In 2022, a team used a quantum-inspired approach to start elucidating that mechanism (Potential Applications of Quantum Computing), and future larger quantum computers could simulate it exactly, potentially leading to a catalyst that mimics nature and saves vast energy in fertilizer production. “By simulating catalyst behavior with quantum precision, researchers aim to develop catalysts that allow ammonia production at lower temperatures and pressures,” noted one materials science review (Quantum Computing Use Cases in Materials & Chemicals) (Quantum Computing Use Cases in Materials & Chemicals). This principle extends to designing superconductors (for lossless power lines), photovoltaic materials with better light absorption, or lightweight alloys for wind turbines – all critical for energy technology.
3. Oil & gas and resource exploration: Companies in the oil and gas industry, like ExxonMobil, BP, and Total, are also exploring quantum computing. These companies face massive computational tasks in seismic data processing (to locate oil reservoirs), fluid dynamics simulations for reservoir modeling, and optimizing their refinery operations. ExxonMobil, for example, joined IBM’s Quantum Network and has been researching quantum approaches to optimize shipping routes for LNG tankers and manage industrial supply chains (ExxonMobil and World’s Leading Research Labs Collaborate with …). The problem of scheduling and routing ships that deliver natural gas around the world is akin to a giant traveling salesman problem with time windows – something quantum optimization could potentially tackle more efficiently. Beyond logistics, quantum machine learning might one day help interpret seismic survey data to better identify underground resources by detecting subtle patterns.
Refineries and petrochemical plants also present huge optimization problems (maximizing output, minimizing energy usage, scheduling maintenance, etc.). Quantum algorithms could be applied to these “smart manufacturing” problems in chemicals and energy, finding better solutions that save fuel or increase yield.
4. Quantum sensing for energy: While not the focus of this article, it’s worth mentioning that quantum sensors (which use quantum effects for ultra-sensitive measurements) could also benefit the energy sector. For example, quantum sensors might more precisely detect pipeline leaks or monitor power grid health. These devices rely on quantum technology rather than quantum computing per se, but they are part of the broader quantum revolution affecting industry.
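To make the “daunting quantum chemistry problem” from the ammonia example above a bit more concrete, here is a minimal sketch of the underlying task: finding the lowest-energy (ground) state of a Hamiltonian written as a weighted sum of Pauli terms. The coefficients below are illustrative placeholders, not parameters of nitrogenase or any real catalyst, and the exact diagonalization used here is precisely the step that becomes intractable on classical machines as the number of qubits grows.

```python
# A minimal, illustrative sketch of the ground-state problem at the heart of
# quantum chemistry. The Hamiltonian coefficients are made-up placeholders,
# not values for any real catalyst or enzyme.
import numpy as np

# Single-qubit Pauli matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    """Tensor product of a sequence of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Toy two-qubit Hamiltonian written as a sum of Pauli strings -- the same
# general form electronic-structure Hamiltonians take after qubit mapping.
H = (-1.05 * kron(I, I)
     + 0.39 * kron(Z, I)
     + 0.39 * kron(I, Z)
     - 0.01 * kron(Z, Z)
     + 0.18 * kron(X, X))

# Exact diagonalization: trivial for a 4x4 matrix, hopeless for ~50+ qubits,
# where the matrix has 2^N x 2^N entries.
energies = np.linalg.eigvalsh(H)
print("Ground-state energy (toy units):", energies[0])
```

A variational quantum eigensolver would attack the same problem by preparing trial states on qubits and measuring each Pauli term’s expectation value directly, avoiding the need to ever store the full 2^N-dimensional matrix.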
Several energy companies have publicly expressed optimism about quantum computing. A representative of Italy’s Enel (a major utility) said quantum computing could significantly improve grid reliability and integration of renewables once available. And Saudi Aramco’s tech foresight unit is studying quantum algorithms for optimizing its hydrocarbon production network. On the renewable front, startups are emerging that aim to use quantum computing for things like more efficient solar cell design or optimizing placement of wind farms (a layout optimization problem that could be attacked with quantum annealing).
Challenges specific to energy applications: A common challenge in energy problems is that they often involve large-scale continuous variables (like power flow levels) in addition to discrete decisions (on/off, yes/no). Quantum computing excels at discrete combinatorial optimization and certain algebraic problems, but coupling those with continuous variables often requires clever formulation (like discretizing the continuous parts). Hybrid quantum-classical algorithms are being developed, where a quantum computer solves the discrete subproblem (e.g., which grid lines to activate or which routes to pick) and a classical optimizer then handles the continuous adjustment (e.g., how much power to send along each active line). This hybrid approach is likely how early quantum advantages will manifest in energy management.
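Here is a minimal sketch of the hybrid decomposition just described, using made-up line capacities, loss coefficients, and activation costs, and assuming NumPy and SciPy are available. The exhaustive loop over on/off patterns stands in for the discrete step that would be handed to a quantum annealer or a QAOA routine; the continuous power split is handled by an ordinary classical optimizer.

```python
# Hybrid discrete/continuous optimization sketch for a toy grid problem.
# All numbers are hypothetical; the exhaustive search over activation patterns
# is a stand-in for the quantum (annealing/QAOA) step described in the text.
from itertools import product
import numpy as np
from scipy.optimize import minimize

demand = 100.0                                        # MW to be delivered
capacity = np.array([60.0, 50.0, 40.0, 30.0])         # per-line limits (MW)
loss_coeff = np.array([0.020, 0.015, 0.030, 0.010])   # quadratic loss factors
activation_cost = np.array([5.0, 4.0, 6.0, 3.0])      # fixed cost to energize a line

def continuous_subproblem(active):
    """Given a set of active lines, split `demand` across them to minimize losses."""
    idx = np.where(active)[0]
    if capacity[idx].sum() < demand:
        return None, np.inf                           # infeasible activation pattern

    def losses(f):
        return float(np.sum(loss_coeff[idx] * f ** 2))

    cons = [{"type": "eq", "fun": lambda f: float(f.sum() - demand)}]
    bounds = [(0.0, capacity[i]) for i in idx]
    x0 = np.full(len(idx), demand / len(idx))
    res = minimize(losses, x0, method="SLSQP", bounds=bounds, constraints=cons)
    return res.x, res.fun

best = None
# Discrete step: in practice this loop is the piece one would hand to a quantum
# optimizer; here we simply enumerate all 2^4 on/off patterns.
for bits in product([0, 1], repeat=4):
    active = np.array(bits, dtype=bool)
    flows, loss = continuous_subproblem(active)
    total = loss + activation_cost[active].sum()
    if best is None or total < best[0]:
        best = (total, bits, flows)

print("best activation pattern:", best[1])
print("total cost (losses + activation):", round(best[0], 2))
```

Real grid problems involve thousands of such binary decisions plus network constraints and uncertainty, which is exactly why utilities are watching whether quantum hardware can eventually search the discrete space better than classical heuristics.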
Another challenge is verification and trust: energy infrastructure is mission-critical, so operators will be cautious in trusting decisions to an algorithm (quantum or otherwise) unless it’s well-validated. Early usage might be as decision support – the quantum system offers a recommendation that human engineers review.
Despite these challenges, the potential economic and environmental gains ensure continued interest. The energy sector may not generate as much direct revenue from quantum computing as, say, finance (no high-frequency trading here), but the societal impact of more efficient energy could be enormous – lower costs for consumers, reduced greenhouse gas emissions, and faster innovation in clean energy tech.
One energy executive summarized the outlook this way: if classical computing was a key enabler of the 20th-century energy industry, quantum computing could be a cornerstone of the 21st-century sustainable energy transition. It might help us optimize a highly complex green energy grid and invent the materials that make renewable energy abundant and affordable. In the coming years, as quantum hardware matures, we can expect pilot projects in which utilities adjust power flows with quantum algorithms during peak demand, or battery researchers report that a quantum model predicted a new chemistry that testing later confirmed. Each such step will reinforce the transformative potential of quantum computing in the energy domain.
Challenges and Limitations
For all its promise, quantum computing today remains a frontier technology facing significant challenges and limitations. It’s important to temper the hype with a dose of reality about what’s holding quantum computers back and the hurdles that must be overcome for them to reach their transformative potential. Below we outline some of the major challenges:
- Decoherence and error rates: Qubits are extremely sensitive to their environment. The slightest interaction with stray electromagnetic fields, vibrations, or thermal energy can cause a qubit’s delicate superposition state to collapse (decohere) into a normal classical state (What Is Quantum Computing? | IBM). Currently, superconducting qubits must be kept in dilution refrigerators colder than outer space to reduce thermal noise, and even then they typically maintain coherence for only tens to hundreds of microseconds. Trapped-ion qubits have longer coherence times (up to seconds), but operations on them are slower. Every quantum gate operation has a probability of error, and those errors compound quickly in a computation. By contrast, classical transistors can perform billions of operations essentially error-free. The high error rates in today’s quantum hardware mean that algorithms must be very shallow (few operations) or else results become meaningless. Error mitigation strategies can alleviate noise to some extent (for example, by extrapolating what the result would be with zero noise), but they are not a full solution. Truly tackling this issue requires quantum error correction – encoding a single logical qubit into many physical qubits so that if some qubits flip erroneously, the error can be detected and fixed via redundancy. Error correction is theoretically possible (the threshold theorem says that if physical error rates are below a certain level, logical errors can be suppressed indefinitely by using enough qubits). However, implementing it is enormously resource-intensive. For example, estimates suggest breaking RSA-2048 via Shor’s algorithm might require on the order of 20 million physical qubits if using surface-code error correction with current error rates – a number far beyond the few hundred or thousand physical qubits we have now (a back-of-the-envelope sketch of these error budgets follows this list). The recent Google result showing improved logical-qubit fidelity with 50+ qubits was a big milestone (Quantum processor enters unprecedented territory for error correction – Physics World), but fully error-corrected qubits are likely 5-10 years away at least. Until error rates are tamed, most quantum computers operate in the “NISQ” regime, where they cannot run deep, complex algorithms reliably.
- Scalability: Even putting error correction aside, scaling the number of physical qubits is a huge engineering challenge. Superconducting qubits, for instance, require control wiring for microwave pulses to each qubit. In IBM’s 127-qubit chip, and even more so in the 1,121-qubit Condor, the tangle of wires and coaxial lines feeding into the cryostat is formidable (IBM Unveils Condor: 1,121‑Qubit Quantum Processor). At some point, simply fitting more wires and maintaining uniform conditions across a larger chip becomes impractical. Companies are exploring modular approaches (linking multiple chips with quantum interconnects) to scale beyond a few thousand qubits. But connecting qubits between chips (or between distant nodes) quantum-mechanically is another research frontier (related to building a quantum internet using entanglement swapping). Trapped-ion systems face different scaling issues: they can trap, say, 20-50 ions in a single chain with high fidelity, but scaling to hundreds might require transporting ions between trap zones or using photonic links between ion traps. All of these add complexity and potential for error. In short, going from prototype to a large-scale quantum computer is not just a matter of “more chips” like in classical computing; it requires new breakthroughs in architecture and materials. Companies like IBM have unveiled roadmaps involving modular quantum systems to reach millions of qubits by stacking and linking processing units (Discover the World’s Largest Quantum Computer in 2025 – SpinQ) (IBM’s Quantum Computing: Roadmap to 4000 Qubit System by 2025), but executing those plans will be an enormous task.
- Algorithm maturity and software: On the software side, quantum computing is still in its infancy. Only a handful of quantum algorithms are known that offer clear, provable speedups – exponential in the case of Shor’s algorithm and quantum simulation, quadratic for Grover’s search. Many potential use cases (optimization, machine learning) rely on heuristic quantum algorithms that may or may not outperform classical ones in practice. There’s a need for more quantum algorithms research to expand the repertoire of problems that quantum computers can tackle better. Moreover, programming a quantum computer is not like programming a classical one; it often requires fairly deep knowledge of quantum physics and linear algebra (a minimal circuit example appears after this list). The field is working on higher-level abstractions and compilers to make quantum programming more accessible (analogous to how early computers eventually got Fortran and C instead of assembly). Projects like Qiskit, Cirq, and various domain-specific quantum languages are steps in that direction. But the “stack” is immature – for example, there’s no robust quantum error-correcting compiler in common use yet, and concepts like “debugging” a quantum program are non-trivial (because measuring a qubit collapses its state). So software and developer tools for quantum lag behind the hardware progress. Without them, it’s harder to unleash the creativity of a wide developer community the way classical computing did.
- Decoherence times vs. computation times: A related limitation is that quantum computers currently cannot run algorithms requiring a large number of sequential steps, because the qubits lose coherence before the algorithm finishes. This is why many NISQ-era algorithms are designed to be shallow (few time steps) and why quantum error correction (which in principle allows arbitrarily long computations) is so crucial. If an algorithm needs 1,000 sequential gate layers and your qubits decohere after 100, you simply can’t run it successfully today (the back-of-the-envelope sketch after this list makes this arithmetic concrete). Techniques like dynamic circuits (where mid-circuit measurement results can feed forward into later operations) might squeeze more out of short coherence times by allowing a bit of real-time adaptation (IBM roadmap to quantum-centric supercomputers (Updated 2024) | IBM Quantum Computing Blog). But ultimately, either coherence times need to improve, or error correction needs to fill the gap.
- Cryogenics and infrastructure: Most high-performance quantum computers (superconducting, spin qubits, etc.) require sub-Kelvin temperatures. This means dilution refrigerators, which are themselves expensive, power-hungry pieces of equipment. Running a quantum datacenter will carry non-trivial overhead cost and power usage (there is some irony in spending lots of energy to keep hardware near absolute zero in order to, say, solve energy optimization problems). Photonic quantum computers and certain atomic ones can operate at room temperature, which could be a future advantage, but those are currently less advanced for general-purpose computing tasks. As quantum machines scale, engineering the cryogenic environment (or the optical tables in the case of photonics and ion traps) to handle thousands of qubits is an active area of R&D. IBM’s Quantum System Two is an example of a next-generation system designed to house modular quantum processors and their cryogenics.
- Talent and interdisciplinary knowledge: Building and operating quantum computers requires a unique blend of skills – quantum physics, electrical engineering, computer science, error-correcting code theory, etc. There is a talent shortage in the field, which could slow progress if not addressed through education and training. Companies often have to hire physics PhDs and then turn them into hardware engineers or software developers for quantum, as there aren’t yet large pools of experienced “quantum engineers.” This is changing as universities create quantum computing programs and companies sponsor workshops to cross-train classical computer scientists in quantum, but it remains a challenge.
- Near-term usefulness and hype: A more conceptual limitation is that until quantum computers definitively solve a practical problem better or faster than classical machines at full problem scale, skepticism will persist, along with the risk of a “quantum winter” if expectations aren’t managed. The community is wary of over-promising. As of 2025, we have demonstrations of quantum advantage on contrived tasks (e.g., Google’s random-circuit sampling, USTC’s photonic experiment), but the challenge is to achieve quantum advantage on a useful task – a distinction John Preskill himself emphasizes. If progress is slower than expected, there could be disillusionment among investors or the public. So far, investment remains strong, but it hinges on steady advances.
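To make the error arithmetic in the bullets above tangible, here is the back-of-the-envelope sketch referred to earlier. The gate time, coherence time, and per-gate error are representative placeholders, and the surface-code relations used (a logical error rate of roughly 0.1·(p/p_th)^((d+1)/2), with about 2d² physical qubits per logical qubit) are commonly quoted rules of thumb rather than exact figures for any particular device.

```python
# Back-of-the-envelope error budget for a noisy (NISQ) device, plus a rough
# surface-code overhead estimate. Every input number is an illustrative placeholder.

gate_time_s = 100e-9    # ~100 ns per two-qubit gate layer (order of magnitude)
coherence_s = 10e-6     # ~10 microseconds of useful coherence
p_gate      = 1e-3      # 0.1% error per two-qubit gate
n_qubits    = 50
depth       = 1_000     # sequential gate layers the algorithm would need

# 1) How many sequential layers fit inside the coherence window?
max_layers = int(coherence_s / gate_time_s)
print(f"layers before decoherence: ~{max_layers} (the algorithm needs {depth})")

# 2) Probability that the whole circuit runs without a single gate error.
p_success = (1 - p_gate) ** (n_qubits * depth)
print(f"chance of an error-free run: {p_success:.1e}")

# 3) Surface-code rule of thumb: logical error ~ 0.1 * (p/p_th)^((d+1)/2),
#    with threshold p_th ~ 1% and roughly 2*d^2 physical qubits per logical qubit.
p_th, target_logical = 1e-2, 1e-12
d = 3
while 0.1 * (p_gate / p_th) ** ((d + 1) / 2) > target_logical:
    d += 2                                  # surface-code distances are odd
print(f"code distance d={d}, ~{2 * d * d} physical qubits per logical qubit")
```

With these illustrative numbers, only about 100 gate layers fit inside the coherence window, an error-free 1,000-layer run on 50 qubits is essentially impossible, and pushing logical errors down to very low rates costs on the order of a thousand physical qubits per logical qubit – exactly the overhead discussed above.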
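And to show what gate-level quantum programming looks like in practice (the circuit example promised in the software bullet), here is a minimal program assuming a recent Qiskit installation; the equivalent in Cirq looks very similar. It prepares a two-qubit entangled state and inspects the ideal measurement probabilities.

```python
# A minimal example of today's gate-level quantum programming, using Qiskit's
# circuit-building API: prepare a two-qubit Bell state and inspect the ideal
# statevector (no real hardware or noise involved).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

bell = QuantumCircuit(2)
bell.h(0)       # put qubit 0 into an equal superposition of |0> and |1>
bell.cx(0, 1)   # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(bell)
print(bell)                        # ASCII drawing of the circuit
print(state.probabilities_dict())  # ideally {'00': 0.5, '11': 0.5}
```

Even at this toy scale, the programmer is thinking in terms of gates and amplitudes rather than the domain problem – the abstraction gap that higher-level quantum software aims to close.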
Given these challenges, why do experts remain optimistic? Because none of the fundamental roadblocks have been shown to be impossible – just very hard. Decoherence can be countered by error correction; scaling issues can be solved by modularity and improved fabrication; new algorithms are being discovered by a growing research community. Each year, milestones are met: more qubits, higher fidelities, first logical qubits, first small quantum chemistry solutions, etc. It’s a marathon, not a sprint.
It’s instructive to draw a parallel to the early days of classical computing. In the 1940s and 50s, vacuum-tube computers crashed frequently and filled entire rooms, and people questioned what they were good for beyond math tables. It took decades to go from those fragile behemoths to reliable, ubiquitous silicon microchips – along the way many engineering barriers were overcome (transistors, ICs, etc.). Quantum computing may follow a similar trajectory: we’re in the vacuum tube era of quantum, inching toward the transistor era (some speak of an upcoming “quantum Moore’s Law” once we find the right scalable qubit tech). There could be unforeseen breakthroughs – for instance, the development of topological qubits (pursued by Microsoft), which theoretically have built-in error resilience, would be game-changing if achieved. Or new materials might prolong coherence significantly.
In summary, the current limitations of quantum computing are real and significant: noisy qubits, limited algorithms, scaling pain, and high complexity. But none appear insurmountable given enough time and research. As one analyst quipped, “Quantum computing is the hardest thing we’ve ever tried to build, but also potentially the most revolutionary.” The effort is justified by the potential payoff – and progress, while sometimes slower than the hype cycle suggests, has been steady and accelerating. The next section looks at how these challenges might be addressed and what the future outlook is for quantum computing as the field strives to move from laboratory curiosity to mainstream tool.
Future Outlook
What does the future hold for quantum computing? While exact timelines are uncertain (and experts sometimes disagree on whether we’re 5 years or 20 years away from certain milestones), there is broad consensus that quantum computing will gradually mature into a practical technology that complements classical computing. Here we outline some expectations and possibilities for the coming years and decade:
1. Fault-tolerant quantum computers: Achieving a fully fault-tolerant quantum computer – one that uses error correction to run arbitrary-length algorithms reliably – is the holy grail. Many in the field predict this will happen in stages by the late 2020s or early 2030s. IBM’s roadmap, for example, aims to demonstrate simple error-corrected logical qubits and then scale up via modular systems toward thousands of logical qubits in the early 2030s (IBM’s Quantum Computing: Roadmap to 4000 Qubit System by 2025). Google similarly has a goal of building a useful error-corrected quantum machine by the end of the decade. The first fault-tolerant devices might still be modest in size (e.g., on the order of 100 logical qubits, which with error-correction overhead could correspond to hundreds of thousands or even millions of physical qubits), but enough to perform some groundbreaking calculations, like factoring large numbers or simulating medium-sized molecules with chemical accuracy. Once fault tolerance is achieved, a quantum computer’s effective power can be increased by simply adding more qubit modules – much like how classical computing scaled by adding more transistors, but here with qubits and error-correction overhead.
2. Quantum advantage in useful tasks: In the nearer term (the next 3-5 years), we’re likely to witness a few instances of “quantum advantage” on commercially relevant problems. This might come in the form of a quantum computer outperforming the best classical algorithm on a specific optimization or machine learning task – even if by a small margin – demonstrating value to industry. Candidates could be something like a quantum option-pricing calculation that outpaces a classical Monte Carlo simulation (a small worked example of that classical baseline appears after this list), or a quantum optimization that yields a better solution to a logistics routing problem than any classical solver can. These achievements will probably be domain-specific and done in collaboration with end users (banks, logistics firms, etc.), showing tangible benefit. As Alexandra Mousavizadeh noted about banking, quantum will likely transform specific use cases with staggering impact (JPMorgan leads quantum computing arms race), rather than everything at once. So we may see quantum computing quietly enter the enterprise in niche applications first, somewhat analogous to how GPUs (graphics processing units) were initially used only for graphics, then found wider use in HPC and AI. In fact, many predict quantum processors will live in the cloud or in data centers as specialized accelerators that users call for certain subroutines – similar to how we call an AI accelerator for neural network tasks today.
3. Integration with classical computing and HPC: The future is hybrid computing – quantum and classical working together. Classical computers are not going anywhere; in fact, as NIST aptly stated, “quantum computers will not replace classical, but work together to solve problems classical computers can’t solve fast enough.” (Quantum Computing Explained | NIST) We can expect tighter integration of quantum processors in high-performance computing (HPC) environments. For example, supercomputing centers are already installing small quantum machines or simulators to experiment with hybrid algorithms. In the future, a researcher might write a program where most of the work runs on classical CPUs/GPUs, but the hardest part calls a quantum subroutine (perhaps via a cloud API) to crunch through a chemistry calculation or an optimization. Cloud platforms like AWS, Azure, and IBM Cloud are likely to be the main way users access quantum computing for quite some time, as only large organizations will have the resources to maintain their own quantum hardware on-site. By accessing quantum via cloud, even small startups or university groups can leverage the power without owning a dilution fridge.
4. Advances in hardware approaches: We will probably see multiple hardware technologies co-existing, each improving. Superconducting qubits and trapped ions are the current frontrunners, so second-generation versions of these (with better coherence and higher qubit counts) will continue to roll out regularly. But alternative qubit types may catch up: photonic quantum computers (like those pursued by Xanadu and PsiQuantum) could achieve large numbers of qubits by leveraging photonic chips and may be easier to scale, though they currently struggle with photon loss and deterministic two-qubit operations. Neutral-atom qubits (from companies like QuEra and Pasqal) offer potentially huge scalability (thousands of optically trapped atoms) and are making strides in gate fidelity and connectivity. Spin qubits in silicon (pursued by Intel, HRL, and the University of New South Wales, among others) are attractive for their compatibility with semiconductor fabrication – if technical issues (like uniformity and two-qubit gate errors) are solved, they could leverage existing chip infrastructure to scale massively. Each of these could see breakthroughs: for instance, a photonic system might demonstrate a prototype with a million entangled modes (qubits) for a specific algorithm, or a neutral-atom system might implement a small error-corrected logical qubit using hundreds of atoms.
And then there’s the dark horse: topological qubits (using exotic Majorana fermions), which in theory could be naturally immune to certain errors. Microsoft has pursued this for years and only recently achieved some progress in creating the required topological states. If they succeed, topological qubits might allow error rates so low that far fewer of them are needed – a potential shortcut to fault tolerance. It’s high-risk, high-reward research; we might know in a few years whether it will pan out or if the field doubles down on the other approaches.
5. Software and algorithms bloom: On the software front, as hardware improves, more developers and domain experts will be interested in writing quantum code. We’ll likely see more high-level frameworks and perhaps domain-specific languages for quantum computing. For example, chemists might use software that lets them input a molecular structure and choose “solve ground-state energy with quantum backend,” without needing to understand the quantum circuit details. Likewise, in machine learning, libraries could offer quantum-accelerated versions of certain algorithms under the hood. As more people get their hands on real quantum hardware through cloud platforms, we can expect a bloom of novel algorithms and optimizations. Many will be hybrid algorithms, taking inspiration from classical techniques (like iterative refinement, gradient descent, etc.) but adding a quantum twist. The community might also discover new uses for quantum computers that we haven’t anticipated yet – much as early general-purpose computers ended up being used for things their inventors didn’t foresee (like video games or AI).
One possibility is quantum-assisted AI: using quantum computers to enhance machine learning tasks. There is already research into quantum kernel methods and quantum neurons (a toy illustration of a quantum kernel appears after this list). If quantum computers can efficiently handle certain high-dimensional linear algebra tasks, they might accelerate the training of AI models or improve their quality. With the current boom in AI, any synergy with quantum will be eagerly explored.
6. Democratization and education: Over the next decade, quantum computing education will become more mainstream. Universities are rapidly adding quantum information science courses, and even high school curricula are beginning to include basic quantum concepts in some forward-thinking regions. As a result, the pool of people who can contribute to the field will grow. We may see the first “quantum-native” startups founded by students who learned quantum programming as undergrads, analogous to how many of today’s AI startups are led by people who played with machine learning in college. This democratization is important because it brings fresh ideas and perspectives, and ensures that when the hardware is ready, there’s a workforce ready to use it.
7. Policy and funding environment: Governments will likely increase support as quantum is seen as a strategic technology. We can expect continued or expanded national quantum initiatives. International cooperation as well as competition will shape development – e.g., the EU will invest to avoid falling behind the US and China, and allied countries might coordinate on standards for quantum communication and share research findings. There is also the consideration of ethical and security implications: if quantum computing enables breaking encryption, how do nations manage that transition responsibly? Already, moves like the U.S. encryption transition roadmap indicate forward-looking policy (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House). We might see new international agreements or treaties related to quantum tech usage (for instance, norms around using quantum computing in warfare or espionage – though that might be wishful thinking given the secrecy typically involved).
8. Unexpected breakthroughs: It’s worth noting that in any rapidly evolving field, unexpected breakthroughs can occur. For example, someone might invent a better error-correcting code that drastically reduces overhead, or a novel qubit design that is both stable and easy to couple, or a theoretical algorithm that reduces the required qubit count for a problem by an order of magnitude. These could accelerate timelines. Conversely, we might hit unforeseen snags (like some noise source that is hard to eliminate beyond a certain qubit count) that slow progress. The future isn’t linear or guaranteed, but the multiple parallel approaches being pursued provide resilience – if one path faces a dead end, another might succeed.
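Picking up the option-pricing example from item 2 above: the sketch below prices a European call option with a plain classical Monte Carlo estimator under textbook Black-Scholes dynamics, using made-up market parameters. Its standard error shrinks only like 1/√N in the number of samples, whereas quantum amplitude estimation can, in theory, reach comparable accuracy with roughly 1/N scaling in oracle queries – the quadratic speedup behind the Goldman Sachs/QC Ware/IonQ proof-of-concept cited in the sources.

```python
# Classical Monte Carlo pricing of a European call under Black-Scholes dynamics.
# This is the baseline that quantum amplitude estimation aims to beat: the
# standard error below shrinks ~1/sqrt(N), versus ~1/N for amplitude estimation.
# All market parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(seed=7)

S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0   # spot, strike, rate, vol, maturity

def mc_call_price(n_samples: int):
    """Return (price estimate, standard error) from n_samples simulated paths."""
    z = rng.standard_normal(n_samples)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)
    return payoff.mean(), payoff.std(ddof=1) / np.sqrt(n_samples)

for n in (1_000, 100_000, 10_000_000):
    price, stderr = mc_call_price(n)
    print(f"N={n:>10,}  price~{price:.3f}  std.err~{stderr:.4f}")
# Note how 100x more samples buys only ~10x less error -- the 1/sqrt(N) wall
# that amplitude estimation promises to soften to roughly 1/N.
```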
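And as a toy illustration of the “quantum kernel” idea mentioned under software and algorithms: the sketch below encodes each (made-up) data point as rotation angles on qubits, builds the resulting product states in NumPy, and computes the fidelity kernel K(x, y) = |⟨ψ(x)|ψ(y)⟩|². On real hardware the states would be prepared physically and the overlaps estimated by sampling; this simple single-qubit-rotation encoding is only a stand-in for the richer feature maps studied in the literature.

```python
# Toy "quantum kernel": embed classical feature vectors as product states of
# single-qubit RY rotations and compute the fidelity kernel between them.
# Everything runs classically in NumPy; the dataset and encoding are illustrative.
import numpy as np

def embed(x):
    """Map a feature vector x to a product state, one qubit per feature."""
    state = np.array([1.0 + 0j])
    for angle in x:
        qubit = np.array([np.cos(angle / 2), np.sin(angle / 2)], dtype=complex)  # RY(angle)|0>
        state = np.kron(state, qubit)
    return state

def quantum_kernel(X):
    """Gram matrix of fidelities K[i, j] = |<psi(x_i)|psi(x_j)>|^2."""
    states = [embed(x) for x in X]
    n = len(states)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = abs(np.vdot(states[i], states[j])) ** 2
    return K

X = np.array([[0.1, 0.5], [0.2, 0.4], [2.5, 1.9]])   # made-up 2-feature data points
K = quantum_kernel(X)
print(np.round(K, 3))   # similar points give values near 1; dissimilar points, much smaller
```

The resulting Gram matrix can be plugged into a standard classifier such as a support vector machine with a precomputed kernel; whether hardware-prepared feature maps ever beat well-tuned classical kernels on real data is exactly the open question noted above.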
In a hopeful scenario, by 2035 we have quantum computers with perhaps 100+ logical qubits (or equivalently, tens of thousands of physical qubits with error correction) in regular use. At that point, they might routinely solve problems in chemistry, optimization, and cryptography that are out of reach classically. The impact would be felt across industries: pharmaceutical companies could be designing drugs with far fewer lab trials, material scientists could rapidly prototype new compounds for carbon capture or superconductivity, logistics networks might be globally optimized in real time, and secure quantum communication links could protect sensitive data. We might even see consumer-level impacts – for instance, quantum-computed optimization in Google Maps providing slightly faster routes, or quantum-secured transactions in your banking app (without you ever knowing that, under the hood, the bank’s servers used quantum-generated keys).
However, experts also warn to avoid hyperbole in the near term. Quantum computing is not magic; it won’t solve all problems or immediately outperform classical computers at everything. Conventional computers will also improve (plus new technologies like neuromorphic chips, more sophisticated AI, etc., will advance alongside). It’s likely that for many tasks, classical or classical+AI approaches will remain superior or sufficient. Quantum computers will be deployed where they truly add value – they’ll become another powerful tool in the toolbox. In the end, the transformative potential of quantum computing will be realized through careful integration with existing systems and by solving previously unsolvable challenges rather than replacing everyday computing tasks.
To conclude the outlook: the next decade of quantum computing will be about turning promise into reality – scaling up hardware, proving quantum advantage on real problems, and integrating quantum capabilities into the computing landscape. It will require overcoming formidable scientific and engineering challenges, but the momentum and talent entering the field give reason for optimism. The result, if successful, won’t be a single “big bang” invention but a gradual inflection point in computing history – one that in hindsight may be seen as on par with the microprocessor or the internet in terms of its impact on technology and society.
Conclusion
Quantum computing is often described with superlatives – revolutionary, game-changing, unprecedented. As we have seen, there is good reason for such enthusiasm: by exploiting quirks of nature that defy classical intuition, quantum computers promise to tackle problems that were previously thought intractable. From breaking the cryptography that underpins our digital security to accurately simulating the molecules of life for drug discovery, the potential applications span a remarkable range of fields. Major industries – cybersecurity, pharmaceuticals, finance, logistics, energy, and beyond – stand to be reshaped by this new form of computation. Already, collaborative efforts are underway to explore these applications: banks running quantum algorithms for portfolio optimization, drug companies partnering with quantum startups to model proteins, utilities testing quantum methods for grid management, and governments investing billions to not get left behind in the quantum race (Chinese scientists are at the forefront of the quantum revolution – The Washington Post) (JPMorgan leads quantum computing arms race).
Yet, it’s equally clear that quantum computing is not a magic wand, at least not in its current form. The road to practical, widespread use is challenging. Qubits are fragile and fickle; scaling them while keeping error rates low is an immense technical hurdle. In 2025, we find ourselves at an interesting juncture: quantum computers exist and work, but they are still mostly limited to demos and prototypes. No world-changing drug has been discovered by a quantum computer (not yet, anyway), no power grid is dispatching electricity solely on qubit calculations, and your personal data is still safe from quantum codebreakers for the moment. The progress is real but often behind the scenes – incremental improvements in coherence time, new error-correction records, a prototype algorithm beating a classical one on a toy problem. It’s the kind of foundational progress that doesn’t make headlines every day but is absolutely necessary for the end goal.
One might ask, when will quantum computing truly arrive? History suggests that transformative technologies often take longer to materialize than first hoped, but then their impact exceeds expectations in the long run. It took decades from the first integrated circuit to the smartphone, or from the discovery of DNA’s structure to modern genetic engineering. For quantum computing, many experts envision the 2030s as the decade when it transitions from a brilliant lab experiment to a practical industry tool. By then, if current trends hold, we should have medium-sized fault-tolerant quantum computers performing tasks that no classical supercomputer can match (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum). Industries will have gradually adopted hybrid workflows that incorporate quantum routines where beneficial. End-users might not even realize when an application is leveraging quantum – much like most people don’t know which parts of their software run on a GPU or in the cloud – they will simply enjoy better results, be it safer online communication, faster delivery of goods, cheaper drug development, or more reliable clean energy.
Meanwhile, the preparation for that future is happening now. Companies are “quantum-proofing” their encryption in anticipation of future quantum attacks (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House). Universities are churning out students who know quantum algorithms. A global community of scientists and engineers is meticulously solving problem after problem to make quantum computing viable. International standards will likely emerge (for instance, in how quantum devices report their capabilities, or interfaces to integrate them with classical IT infrastructure). There’s also a growing ecosystem of quantum startups focusing on everything from specialized software to novel cooling systems, indicating a healthy maturation of the field.
It is important throughout to avoid the hype pitfalls. Quantum computing is sometimes portrayed in almost mystical terms, which can lead to misconceptions. It’s not true that a quantum computer can solve every problem exponentially faster – in fact, for many ordinary computing tasks (word processing, email, etc.), a quantum computer offers no advantage. The power of quantum is specific: it excels at problems with certain structures (like factoring, unstructured search, simulating quantum systems, optimization landscapes with many local minima, etc.). For other problems, classical computers or AI might always be more practical. Therefore, the narrative has shifted away from “quantum computers will replace classical computers” to a more nuanced understanding: they will augment what we can do, unlocking new possibilities in domains that matter most.
One must also consider that as quantum computers develop, classical algorithms are not standing still. A salient example was IBM’s response to Google’s supremacy claim, showing a better classical simulation approach (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum). In fields like chemistry and optimization, classical methods and even AI are improving too. This “moving target” means quantum needs to leapfrog what classical methods can do at the time of deployment. It’s a high bar, but not unreachable – nature gives quantum computers a form of parallelism and interference that is unique.
In closing, the story of quantum computing is one of human ingenuity at its finest. We took a set of phenomena that Einstein once called “spooky” (quantum entanglement) and learned how to wield them for computation. We’ve built devices that operate on principles utterly foreign to the classical world – a feat of science and engineering few would have believed 50 years ago. As Dr. John Preskill reflected, the progress in quantum hardware has been brisk and remarkable (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post), and it continues to accelerate. How this story unfolds in the next chapters will be exciting to watch.
The industries that stand to gain are right to invest and explore now, so they can be ready as quantum capabilities come online. It’s somewhat reminiscent of the early days of computing in business – companies that embraced computers early gained a competitive edge, those that hesitated fell behind. The difference this time is that governments are deeply involved too, mindful of national competitiveness and security. It’s a rare convergence of scientific intrigue, commercial interest, and geopolitical significance.
Ultimately, the measure of success will be when quantum computing delivers real-world value that improves lives or creates new knowledge. That might be the discovery of a new drug for a currently incurable disease, enabled by quantum simulations. It might be a major step towards net-zero emissions because quantum-designed catalysts made carbon capture economically viable. It might be that your personal data remains secure in the coming decades because we transitioned to quantum-resistant cryptography in time. These outcomes, if they come to pass, will justify the decades of work by thousands of researchers.
Standing in 2025, we can say: quantum computing is no longer science fiction, but it’s also not yet commonplace reality. It is a budding revolution, carefully and methodically reshaping industries one step at a time. As with any revolution, predicting the exact timeline is tricky – but the direction is clear. The consensus of the tech and science community is that quantum computing will be transformative, and the prudent course is to engage with it proactively. The quantum future is being built now, and its impact will unfold over the next many years, quietly at first and then, as the saying goes, all at once.
In the words of a recent White House fact sheet: “Quantum information science holds the potential to drive innovations across the economy…while future quantum computers may break encryption, we are preparing now so that quantum will be a benefit to society and not a threat.” (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House) The dual nature of quantum computing – its power and its risks – will be managed through responsible development and deployment. If we get it right, quantum computing will indeed reshape industries and usher in solutions to problems previously deemed unsolvable, marking a new era in technology as impactful as the digital revolution that preceded it.
Sources:
- IBM – What is Quantum Computing? (IBM Think Blog) (What Is Quantum Computing? | IBM)
- NIST – Quantum Computing Explained (NIST.gov) (Quantum Computing Explained | NIST)
- The Washington Post – Google’s Quantum Supremacy Breakthrough (2019) (Google scientists say they’ve achieved ‘quantum supremacy’ breakthrough over classical computers – The Washington Post)
- IEEE Spectrum – China’s Quantum Computers and Primacy (2021/2024) (Two of World’s Biggest Quantum Computers Made in China – IEEE Spectrum)
- PostQuantum (Marin Ivezic) – IBM Unveils 1,121-Qubit Condor Processor (2023) (IBM Unveils Condor: 1,121‑Qubit Quantum Processor)
- Physics World – Quantum error correction milestone by Google (2023) (Quantum processor enters unprecedented territory for error correction – Physics World)
- Finextra – JPMorgan leads quantum computing arms race (banks) (2025) (JPMorgan leads quantum computing arms race)
- BusinessWire – Goldman Sachs, QC Ware, IonQ Monte Carlo demo (2021) (Goldman Sachs, QC Ware and IonQ Demonstrate Quantum Algorithms Proof-of-Concept That Could Revolutionize Financial Services, Other Industries)
- White House OSTP – Fact Sheet on Post-Quantum Cryptography (2024) (FACT SHEET: Biden-Harris Administration Continues Work to Secure a Post-Quantum Cryptography Future | OSTP | The White House)
- DHL – Quantum computing in logistics (DHL report) (2020) (Quantum computing could transform the logistics industry within the next decade | DHL Logistics of Things)
- PostQuantum – Quantum Use Cases in Pharma & Biotech (2023) (Quantum Use Cases in Pharma & Biotech)
- Quantum Zeitgeist – E.ON taps quantum for energy grid (2024) (EON Taps Quantum Computing To Optimize Energy Grid Complexity)
- Flagship Pioneering – Quantum Computing is Real (Explainer) (2022) (Quantum Computing is Real. It Will Simulate the… | Flagship Pioneering)
- Washington Post – Chinese scientists at forefront of quantum (2019) (Chinese scientists are at the forefront of the quantum revolution – The Washington Post)
- PostQuantum – Quantum Use Cases in Materials & Chemicals (2023) (Quantum Computing Use Cases in Materials & Chemicals)