Quantum computers don't speed up everything. They help with structured problems where quantum interference can cancel wrong answers: molecular simulation, certain optimization structures, cryptographic problems with periodic structure, and unstructured search (with a modest quadratic speedup). Most business software problems are not quantum-relevant.

Chapter 2 of 7 · 13 min read

What Makes a Problem Quantum

When quantum computing helps and when it doesn't. Complexity theory made accessible for technical leaders assessing quantum relevance for their problems.

The Assumption That Costs Money

Most conversations about quantum computing start from a wrong assumption: that quantum computers are simply faster classical computers. That they do the same thing, but with more parallelism, like adding more cores to a server.

This assumption is expensive because it leads to a predictable sequence: a team identifies their slowest computation, ships it to a quantum cloud service, gets worse results than their laptop, and concludes quantum computing is hype. The problem wasn’t the quantum hardware. It was the question.

Quantum computers are not faster classical computers. They are a different kind of computer that exploits different physics to solve different problems. The word “different” is doing all the work in that sentence.

How Quantum Computing Actually Works (Without the Physics)

A classical computer explores solutions one at a time (or several at a time with parallelism). A quantum computer does something structurally different: it sets up a computation where all possible answers exist simultaneously as quantum amplitudes, then uses interference to amplify the amplitudes of correct answers and suppress the amplitudes of wrong ones.

If that sounds like it should work for everything, consider: the interference has to be designed to amplify the right answer. That design is the quantum algorithm. And not all problems have a mathematical structure that allows you to construct such an interference pattern.

A useful analogy from signal processing: think of a quantum algorithm as a matched filter. If your signal (the correct answer) has a specific frequency signature (mathematical structure), you can build a filter that amplifies it and attenuates everything else. If your signal has no distinguishable structure, no filter will help. You’re back to examining every possibility.
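To make “interference” concrete, here is a minimal sketch in plain Python (no quantum SDK): amplitudes from two computational paths add before probabilities are taken, so in-phase paths reinforce an outcome while out-of-phase paths cancel it.

```python
import cmath

# Two computational paths arriving at the same measurement outcome:
# quantum mechanics adds their amplitudes first, then squares the sum.
in_phase = cmath.exp(0j) + cmath.exp(0j)              # same phase
out_phase = cmath.exp(0j) + cmath.exp(1j * cmath.pi)  # opposite phase

p_amplified = abs(in_phase) ** 2   # constructive: probability boosted
p_cancelled = abs(out_phase) ** 2  # destructive: probability near zero
```

A quantum algorithm is an arrangement of gates that routes wrong answers onto out-of-phase paths and the right answer onto in-phase ones.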

The Complexity Zoo, Translated

Computer science classifies problems by how hard they are. You don’t need to memorize the classification, but you need to understand three boxes to make sense of quantum claims.

P (Polynomial time): Problems a classical computer can solve efficiently. Sorting a list. Finding a shortest path. Multiplying matrices. These problems scale reasonably as input grows. Quantum computers can solve these too, but there’s no point. Your classical hardware already handles them.

NP (Nondeterministic Polynomial time): Problems where a proposed solution can be verified quickly, but finding the solution might take impractically long. The traveling salesman problem. Scheduling. Protein folding (in its decision form). Many real-world optimization problems live here.

BQP (Bounded-error Quantum Polynomial time): Problems a quantum computer can solve efficiently, with bounded probability of error. This is quantum computing’s home territory.

The relationship that matters: P is inside BQP (quantum can do everything classical can). BQP contains some problems believed to be outside P (quantum can solve some problems classical computers can’t solve efficiently). But BQP almost certainly does not contain all of NP. This last point is critical and routinely ignored in vendor presentations.

It means: quantum computers cannot efficiently solve all hard problems. They can efficiently solve hard problems that have the right mathematical structure.

BQP Is Not All of NP

Quantum computers cannot efficiently solve every hard problem. BQP (quantum’s home territory) contains some problems believed to be outside P, but almost certainly does not contain all of NP. Structure matters more than difficulty.
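The verify-fast, find-slow asymmetry that defines NP fits in a few lines. This sketch uses subset sum as the hard problem; the function names are illustrative:

```python
from itertools import combinations

def verify(nums, subset, target):
    # Checking a proposed certificate: polynomial time.
    return sum(subset) == target and all(x in nums for x in subset)

def search(nums, target):
    # Finding a certificate: may try exponentially many subsets.
    for r in range(1, len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return list(combo)
    return None
```

Quantum computers do not collapse this asymmetry for arbitrary NP problems; they help only when the instance has exploitable structure.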

The Four Families of Quantum-Relevant Problems

After decades of research, quantum algorithms cluster around four families of problems where quantum computing offers genuine advantages, either proven or strongly conjectured.

1. Simulating Quantum Systems

This is the one place where quantum advantage is almost tautological. Simulating quantum mechanics on a classical computer is exponentially expensive: the state space doubles with every particle you add. A system of 50 interacting quantum particles requires 2^50 (about a quadrillion) complex numbers to describe classically.

A quantum computer simulates quantum systems naturally because it is a quantum system. Richard Feynman’s 1981 observation that you need a quantum machine to efficiently simulate quantum mechanics remains the strongest foundational argument for quantum computing.

Practical relevance: molecular simulation for drug discovery and materials science. Simulating the electronic structure of a molecule like caffeine (24 heavy atoms) is beyond classical exact methods. Battery electrolyte chemistry, catalyst design, and high-temperature superconductor modeling all involve quantum systems that are classically intractable at useful scales.

Current status: small molecules (up to about 12-20 qubits of meaningful simulation) have been demonstrated. Useful-scale molecular simulation likely requires thousands of error-corrected logical qubits. The advantage is real. The hardware isn’t there yet.

  • Classical cost: 2^50, about a quadrillion complex numbers for 50 particles
  • Qubits demonstrated: 12-20, for meaningful molecular simulation
  • Logical qubits needed: 1,000s, for useful-scale simulation
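The exponential classical cost is easy to reproduce: a state vector needs one complex number per basis state, and the basis doubles with every qubit. A back-of-envelope sketch, assuming 16 bytes per amplitude (two 64-bit floats):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    # One complex amplitude (two 8-byte floats) per basis state.
    return (2 ** n_qubits) * bytes_per_amplitude

memory_50 = statevector_bytes(50)    # ~18 petabytes
memory_300 = statevector_bytes(300)  # exceeds the atom count of the universe
```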

2. Problems with Periodic or Algebraic Structure

Shor’s algorithm factors large integers exponentially faster than any known classical method. This isn’t because factoring is generically hard. It’s because factoring has a hidden periodic structure that quantum interference can detect.

The quantum Fourier transform, which underlies Shor’s algorithm, finds periodicities in functions exponentially faster than classical approaches. This is a precise, well-understood mathematical capability, not a vague “speedup.”

Practical relevance: breaking RSA and elliptic curve cryptography. This requires a fault-tolerant quantum computer with several thousand logical qubits, which translates to millions of physical qubits at current error rates. No such machine exists today. The timeline is uncertain but most estimates place it between 2030 and 2040. The consequence, however, is already present: data encrypted today with classical methods can be stored and decrypted later (“harvest now, decrypt later”), which is why post-quantum cryptography migration is already urgent.

Current status: the largest numbers factored by Shor’s algorithm on real hardware are trivially small (two-digit numbers such as 15 and 21, not the hundreds-of-digits numbers used in cryptography). The theoretical result is ironclad. The engineering gap is enormous.
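The structure Shor’s algorithm exploits can be shown classically at toy scale: find the period of a^x mod N by brute force (the exponentially expensive step that the quantum Fourier transform replaces), then recover factors from it. The helper names below are illustrative:

```python
from math import gcd

def classical_period(a, n):
    # Smallest r > 0 with a**r = 1 (mod n); assumes gcd(a, n) == 1.
    # Brute force here; Shor's algorithm finds r exponentially faster.
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_period(a, n):
    r = classical_period(a, n)
    if r % 2:
        return None              # odd period: retry with another a
    y = pow(a, r // 2, n)
    return gcd(y - 1, n), gcd(y + 1, n)

factors = factor_from_period(7, 15)  # period of 7 mod 15 is 4
```

Everything except the period-finding loop is cheap classical arithmetic, which is why fast period finding breaks RSA.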

  • 2024: NIST finalizes post-quantum cryptography standards
  • 2025: Governments mandate PQC migration
  • 2030-2040: Cryptographically relevant quantum factoring

3. Optimization with Specific Structure

This is where most of the hype concentrates and where the honest answer is most nuanced.

Grover’s algorithm provides a quadratic speedup for unstructured search: finding a marked item in an unsorted database of N items takes roughly the square root of N quantum steps instead of N classical steps. Quadratic, not exponential. For a billion items, that’s roughly 32,000 quantum steps versus a billion classical ones. It sounds impressive until you account for the overhead: each quantum step is much slower and noisier than a classical step.
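Grover’s amplification can be simulated exactly in plain Python for small N. This is the idealized, noiseless model, not real hardware: each iteration flips the sign of the marked amplitude (the oracle) and reflects all amplitudes about their mean (diffusion).

```python
import math

def grover_success_probability(n_items, marked, iterations):
    # Start in the uniform superposition over all n_items states.
    amp = [1 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        amp[marked] = -amp[marked]       # oracle: flip marked phase
        mean = sum(amp) / n_items        # diffusion: reflect about the mean
        amp = [2 * mean - a for a in amp]
    return amp[marked] ** 2

N = 1024
k = round(math.pi / 4 * math.sqrt(N))  # ~25 iterations vs ~1024 classical checks
p = grover_success_probability(N, marked=3, iterations=k)  # close to 1
```

The square-root scaling is visible in k: quadrupling N only doubles the iteration count.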

QAOA (Quantum Approximate Optimization Algorithm) and related variational approaches attack combinatorial optimization. The honest assessment: for the problem sizes runnable on today’s hardware (tens to low hundreds of variables), classical solvers, including simulated annealing, branch-and-bound, and modern SAT solvers, generally match or beat quantum results. Whether quantum gains an advantage at larger scales remains an open and actively debated research question.

There are specific problem structures where quantum speedups beyond Grover’s quadratic are conjectured but not proven: certain constraint satisfaction problems, some semidefinite programming relaxations, and specific classes of sampling problems.

Practical relevance: portfolio optimization, vehicle routing, logistics scheduling, job shop scheduling. These are real problems that real companies spend real money solving. The question isn’t whether they’re important. It’s whether quantum approaches will beat the classical algorithms that have been optimized for decades and continue to improve.

Current status: quantum advantage for practical optimization has not been demonstrated. It may exist for specific problem structures at sufficient scale. Or it may not. Honest researchers and vendors acknowledge this uncertainty.

4. Sampling and Machine Learning (Conjectured)

Some problems involve sampling from complex probability distributions: certain financial models, training some types of machine learning models, and generating synthetic data with specific statistical properties.

Quantum computers may have advantages for specific sampling tasks. Google’s 2019 “quantum supremacy” result and subsequent random circuit sampling experiments demonstrated that quantum processors can sample from distributions that are classically intractable to simulate exactly. Whether this capability translates to practical, useful sampling problems is an active research area.

Quantum machine learning has received enormous attention and investment. The honest state: for classical data processed by quantum machine learning algorithms, no convincing advantage has been demonstrated at practical scale. The cases where quantum machine learning helps most naturally involve quantum data, which is a niche that grows as quantum sensors and quantum communication advance.

The Overhead Question

Even when a quantum algorithm provides a theoretical speedup, the practical question is whether it actually runs faster on real or near-term hardware.

Consider Grover’s search. The quadratic speedup means searching a trillion items takes about a million quantum steps instead of a trillion classical ones. But each quantum step involves: preparing qubits, executing gates with 99-99.9% fidelity (meaning 0.1-1% error per gate), potentially error-correcting (at 1000:1 physical-to-logical qubit ratios), and measuring results. If a quantum step takes a microsecond with all overhead, the million quantum steps take one second. A classical computer iterating through a trillion simple comparisons at a few billion per second finishes in a few hundred seconds. The quantum advantage is real but modest, maybe 100x at a problem scale of a trillion.

For this overhead to be justified, you need problems large enough that the quantum scaling advantage overcomes the per-step penalty. For quadratic speedups like Grover’s, the crossover point is very large. For exponential speedups like Shor’s or quantum simulation, the crossover comes much sooner (at smaller problem sizes) once the hardware is capable.
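The crossover argument can be made explicit. The step times below are assumptions for illustration (the one-microsecond quantum step and few-billion-comparisons-per-second classical rate from the text, not measured hardware figures); the break-even N for a quadratic speedup follows from sqrt(N) * tq < N * tc.

```python
def grover_breakeven(quantum_step_s, classical_step_s):
    # sqrt(N) * tq < N * tc  =>  sqrt(N) > tq / tc  =>  N > (tq / tc) ** 2
    return (quantum_step_s / classical_step_s) ** 2

tq = 1e-6     # assumed: 1 microsecond per quantum step, overhead included
tc = 1 / 3e9  # assumed: ~3 billion simple comparisons per second
n_star = grover_breakeven(tq, tc)  # ~9 million items before break-even
```

Raise the quantum step time to a millisecond, as error-correction overhead plausibly does, and the break-even jumps to roughly nine trillion items.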

How to Assess Your Own Problems

A practical checklist, not a guarantee.

Step 1: Is the problem actually hard? Many “optimization problems” in business are solved perfectly well by classical heuristics. If your supply chain optimization runs in minutes on a workstation, quantum will not make it meaningfully faster. Check whether your bottleneck is computational hardness or simply data pipeline latency, bad implementation, or insufficient classical hardware.

Step 2: Does the problem have quantum-exploitable structure? Does it involve simulating quantum-mechanical systems? Does it rely on hard mathematical problems with algebraic structure (like factoring)? Does it involve sampling from complex distributions? Is it a combinatorial optimization where the constraint structure creates quantum interference opportunities? If none of these apply, the problem is unlikely to be quantum-relevant.

Step 3: What is the problem scale? Quantum advantages generally require problems above a minimum scale to overcome the per-operation overhead. For quadratic speedups, this scale is very large. For exponential speedups, it’s smaller but still requires hardware that may not exist yet.

Step 4: What is the classical baseline? What is the best known classical algorithm for this problem? How aggressively has it been optimized? Classical algorithms for combinatorial optimization, simulation, and machine learning improve every year. Your quantum approach needs to beat not today’s classical methods, but the classical methods that will exist when the quantum hardware matures.

Step 5: What is the value of better solutions? A 1% improvement in a trillion-dollar portfolio allocation is worth $10 billion. A 1% improvement in scheduling the night shift at a warehouse is worth very little. Quantum computing’s real near-term value is in domains where small improvements in solution quality at large scale translate to enormous value. These domains include financial derivatives pricing, pharmaceutical molecule screening, materials discovery, and cryptographic security assessment.
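The five steps can be condensed into a toy triage sketch. Everything here, from the question wording to the function name, is illustrative rather than a standard assessment tool:

```python
QUESTIONS = [
    "actually hard for tuned classical methods",
    "has quantum-exploitable structure",
    "scale large enough to beat per-step overhead",
    "plausibly beats the still-improving classical baseline",
    "better solutions carry large value",
]

def triage(answers):
    # Every gate must pass before quantum investment merits serious study.
    failed = [q for q, ok in zip(QUESTIONS, answers) if not ok]
    if failed:
        return "not yet: fails on '" + failed[0] + "'"
    return "investigate further"
```

A single failed gate, most often the structure question, is enough to defer investment.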

What This Means for You

Most software problems are not quantum problems. The vast majority of what keeps engineering teams busy (building APIs, processing data, training neural networks on classical data, rendering interfaces, managing infrastructure) has no quantum angle. That’s not a failure of quantum computing. It’s a reflection of the fact that quantum mechanics helps with a specific class of computational problems.

The useful question is not “can quantum computing help my business?” It’s “does my business have a specific computational problem with the right mathematical structure, at sufficient scale, where the value of better solutions justifies the investment in a technology that is still maturing?”

For most organizations, the honest answer today is: not yet, but it’s worth understanding the landscape so you recognize the moment when it changes.

Key Takeaways

  • Quantum computers solve specific problems with specific mathematical structure, not everything faster.
  • Four families: quantum simulation (strongest case), algebraic/periodic problems (Shor’s), structured optimization (uncertain), and sampling/ML (conjectured).
  • Quadratic speedups (Grover’s) require very large problem sizes to justify quantum overhead. Exponential speedups (Shor’s, simulation) pay off sooner once hardware is ready.
  • Ask five questions before investing: Is the problem actually hard? Does it have quantum-exploitable structure? Is the scale large enough? What is the classical baseline? What is the value of better solutions?