Quantum information obeys strict constraints: it cannot be copied (no-cloning), cannot be deleted without a trace (no-deleting), and leaks into the environment through decoherence. These constraints make error correction harder but also make quantum cryptography possible.

Chapter 5 of 7 · 13 min read

The Rules Quantum Must Obey

No-cloning, no-deleting, the uncertainty principle, and decoherence. The constraints on quantum information are not bugs. They enable quantum cryptography and define what quantum computers can do.


Every technology has constraints. Classical computers cannot solve certain problems in polynomial time. Combustion engines cannot exceed Carnot efficiency. Bridges cannot ignore gravity. These constraints are not failures of engineering. They are features of the physics the technology operates within.

Quantum information has its own constraints, and they are strange ones. You cannot copy a quantum state. You cannot delete one without leaving a trace. You cannot simultaneously know certain pairs of properties with arbitrary precision. And quantum information constantly leaks away through decoherence.

At first glance, this looks like a list of problems. Every constraint sounds like something that makes quantum computing harder. And they do make it harder. But here is the part that deserves careful attention: the same constraints that frustrate quantum engineers are the ones that enable quantum cryptography, guarantee the security of quantum key distribution, and make certain quantum protocols provably impossible to cheat. The rules are not bugs. They are the operating conditions of a fundamentally different kind of information.

Classical information: can be copied freely, can be read without changing it, error correction uses redundant copies, deletion is straightforward.

Quantum information: cannot be copied (no-cloning), reading changes the state, error correction requires entanglement rather than copies, deletion leaves traces (no-deleting).

The no-cloning theorem: you cannot copy a quantum state

In classical computing, copying is trivial. You copy files, clone repositories, replicate databases, back up servers. The ability to copy information is so basic that you do not think of it as a capability. It is like breathing.

In quantum computing, copying is impossible. Not difficult. Not expensive. Physically impossible. The no-cloning theorem, proved by William Wootters and Wojciech Zurek in 1982, states that no physical process can take an arbitrary unknown quantum state and produce a perfect copy of it while leaving the original intact.

The proof is remarkably simple; it follows directly from the linearity of quantum mechanics. Suppose a device perfectly copies state 0 and perfectly copies state 1. Linearity then fixes what that device must do to any superposition of 0 and 1: the output is a superposition of the two copied outcomes, which is an entangled state, not two independent copies of the input. A device that perfectly copies 0 and 1 therefore cannot perfectly copy a superposition of them. The math does not permit it.
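To make the linearity argument concrete, here is a minimal numpy sketch (an illustration, not part of the original argument) that uses a CNOT gate as the would-be copier. It duplicates both basis states perfectly, yet on a superposition it produces an entangled state rather than two independent copies:

```python
import numpy as np

# A CNOT gate copies the basis states: |0>|0> -> |0>|0>, |1>|0> -> |1>|1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)  # the superposition (|0> + |1>)/sqrt(2)

def try_to_clone(psi):
    """Feed psi plus a blank |0> target into the would-be copier."""
    return CNOT @ np.kron(psi, ket0)

# Basis states copy perfectly:
print(np.allclose(try_to_clone(ket0), np.kron(ket0, ket0)))  # True
print(np.allclose(try_to_clone(ket1), np.kron(ket1, ket1)))  # True

# The superposition does not: linearity forces the entangled output
# (|00> + |11>)/sqrt(2), not the product state |+>|+>.
print(np.allclose(try_to_clone(plus), np.kron(plus, plus)))  # False
```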

Note the qualifiers: “arbitrary” and “unknown.” If you know the state you want to copy, you can prepare another qubit in the same state. If the qubit is known to be in state 0, you can prepare a second qubit in state 0. That is not cloning. That is preparation. Cloning means taking a qubit in an unknown state and replicating it. That is what is forbidden.

The consequences ripple through all of quantum information:

Error correction becomes harder. In classical computing, the standard approach to error correction is redundancy: make three copies of each bit and take a majority vote. If one copy flips, the other two outvote it. You cannot do this with qubits because you cannot copy the qubit. Quantum error correction exists, but it works through entanglement rather than copying, and it requires many physical qubits to protect a single logical qubit. Current estimates for fault-tolerant quantum computing suggest ratios of roughly 1,000 physical qubits per logical qubit, though this ratio depends heavily on the error rate and the error-correction code used.

Eavesdropping becomes detectable. If Eve intercepts a qubit being sent from Alice to Bob and tries to copy it (to read it while forwarding the original), she cannot. She can measure the qubit, but measurement changes the state (Chapter 2). She can then prepare a new qubit in the measured state and send it to Bob, but this qubit will not be in the same state as the original unless Eve guessed the measurement basis correctly. This disturbance is detectable by Alice and Bob through statistical tests. The no-cloning theorem is the physical foundation of quantum key distribution security.

Quantum information is scarce. You cannot back up a quantum state. You cannot make a safety copy before a risky operation. If a qubit decoheres, the information it carried is gone. This scarcity is not just an engineering challenge. It is a fundamental feature. Quantum information is a non-fungible resource in a way that classical information is not.

No-Cloning as Feature

The same theorem that makes quantum error correction hard also makes quantum cryptography possible. An eavesdropper cannot copy a qubit without measuring it, and measurement changes the state in a detectable way. The constraint is simultaneously a problem and a security guarantee.

The no-deleting theorem: destruction leaves traces

The no-deleting theorem, proved by Arun Pati and Samuel Braunstein in 2000, is the complement of no-cloning. It states that you cannot take a quantum state and destroy it while leaving the rest of the universe exactly as it was. If you erase quantum information from one place, it must show up somewhere else.

This may sound abstract, but it connects to a deep physical principle: quantum mechanics conserves information. The total quantum information in a closed system does not increase or decrease. It can move around. It can become scrambled across many particles to the point where it is practically inaccessible. But it does not vanish.

The black hole information paradox, one of the major open problems in theoretical physics, is essentially a debate about this principle. Stephen Hawking argued that black holes destroy quantum information. Most physicists now believe information is preserved, even in black holes, though the mechanism remains controversial. The no-deleting theorem is the information-theoretic side of this physical principle.

For practical quantum computing, no-deleting means that discarding qubits is not free. When you measure a qubit (and thus project it into a definite state), you have not deleted the information; you have converted it into a classical measurement result plus entanglement with the environment. The bookkeeping of quantum information must account for where it went, not just where it is.

The uncertainty principle as an information constraint

The Heisenberg uncertainty principle is one of the most misunderstood concepts in physics. The common version says: you cannot measure position and momentum simultaneously with arbitrary precision. This is correct but misleading, because it sounds like a limitation of measurement equipment. Build a better instrument and maybe you can measure both.

That understanding is wrong. The uncertainty principle is not about measurement limitations. It is about the quantum state itself. A quantum state that has a very precise position genuinely does not have a precise momentum. The state does not possess both properties simultaneously. Asking for both is like asking a wave to have a definite location: the more tightly localized the wave, the more spread out its frequency content, and vice versa. This is a mathematical property of waves, not a limitation of wave detectors.
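A short numpy sketch of that wave tradeoff, assuming Gaussian wave packets (chosen because they make the tradeoff exact): squeezing the packet in position inflates its spread in frequency, and the product of the two spreads never drops below a fixed floor:

```python
import numpy as np

x = np.linspace(-60, 60, 8192)
dx = x[1] - x[0]

for width in (0.5, 2.0, 8.0):
    # Gaussian wave packet localized to `width` in position
    psi = np.exp(-x**2 / (4 * width**2))
    prob_x = np.abs(psi)**2
    prob_x /= prob_x.sum()

    # Frequency (wavenumber) content via FFT
    k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(len(x), dx))
    prob_k = np.abs(np.fft.fftshift(np.fft.fft(psi)))**2
    prob_k /= prob_k.sum()

    # Spreads (standard deviations) in position and wavenumber
    dx_spread = np.sqrt(np.sum(prob_x * x**2))
    dk_spread = np.sqrt(np.sum(prob_k * k**2))
    print(f"width {width}: spread product = {dx_spread * dk_spread:.3f}")
# Prints ~0.500 every time: squeezing one spread inflates the other.
```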

For quantum information, the uncertainty principle is better understood as an information-theoretic constraint: certain pairs of measurements are incompatible, meaning no quantum state can have definite outcomes for both. If you prepare a qubit with a definite value in the standard basis (0 or 1), it has maximum uncertainty in the Hadamard basis (equal probability of plus or minus). If you prepare it with a definite value in the Hadamard basis, the standard basis measurement becomes completely random.
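As a minimal sketch of that incompatibility, the Born rule gives the numbers directly: a qubit prepared as a definite 0 in the standard basis is a perfect coin flip in the Hadamard basis:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                 # definite outcome in the standard basis
plus = np.array([1.0, 1.0]) / np.sqrt(2)    # Hadamard-basis states
minus = np.array([1.0, -1.0]) / np.sqrt(2)

# Born rule: outcome probability = |<basis state | psi>|^2
print(abs(plus @ ket0) ** 2, abs(minus @ ket0) ** 2)  # 0.5 0.5 (maximally random)
```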

This incompatibility is what makes quantum key distribution work. Alice sends qubits prepared in randomly chosen bases. Eve does not know which basis Alice used. If Eve measures in the wrong basis, she gets a random result and disturbs the state. When Alice and Bob compare bases over a public channel (discarding measurements where they used different bases), they can detect Eve’s interference through increased error rates.
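Here is a simulation sketch of that detection mechanism, assuming the textbook intercept-resend attack on a BB84-style protocol (the variable names are illustrative; the roughly 25% error rate is the standard result for this attack):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000  # rounds

# Random bits and random basis choices (0 = standard, 1 = Hadamard)
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)
eve_bases = rng.integers(0, 2, n)
bob_bases = rng.integers(0, 2, n)

# Eve measures each qubit: right basis -> Alice's bit, wrong basis -> coin flip.
eve_bits = np.where(eve_bases == alice_bases, alice_bits, rng.integers(0, 2, n))
# She resends in her own basis; Bob's result is random whenever his basis differs.
bob_bits = np.where(bob_bases == eve_bases, eve_bits, rng.integers(0, 2, n))

# Sifting: Alice and Bob keep only rounds where their bases matched.
kept = alice_bases == bob_bases
qber = np.mean(alice_bits[kept] != bob_bits[kept])
print(f"error rate under intercept-resend: {qber:.3f}")  # ~0.25; ~0 without Eve
```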

The uncertainty principle, in this context, is not an obstacle. It is the mechanism by which privacy is enforced. Nature itself prevents simultaneous knowledge of incompatible properties, and that limitation becomes a cryptographic guarantee.

Decoherence: the leak in every quantum system

Decoherence is the process by which a quantum system loses its quantum properties through interaction with its environment. It is the practical enemy of quantum computing, and understanding it correctly is essential for evaluating any quantum technology claim.

Here is what happens. A qubit in superposition has definite amplitude relationships between its 0 and 1 components. Those relationships (the phases, the coherences) are what enable interference and entanglement. When the qubit interacts with its environment, even weakly, information about the qubit’s state leaks into the environment. The environment effectively “measures” the qubit, bit by bit, destroying the superposition.

Consider a superconducting qubit on a chip. The qubit is a tiny circuit cooled to near absolute zero. Despite the extreme cooling, thermal vibrations, electromagnetic noise, and material imperfections all interact with the qubit. Each interaction leaks a small amount of quantum information into the environment. Over time (measured in microseconds to milliseconds for current hardware), the qubit’s quantum state degrades into a classical mixture: no superposition, no entanglement, no interference. Just a regular bit, and not a very reliable one at that.

Decoherence is not the same as a random bit flip. A bit flip is a simple error: the state changed from 0 to 1 or vice versa. Decoherence is more subtle. It erodes the phase relationships between amplitudes, turning quantum superpositions into classical probability distributions. The information does not just change. It disperses into the environment.
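A density-matrix sketch of that distinction, assuming a simple exponential dephasing model: the diagonal populations of a superposition survive, while the off-diagonal coherences, the quantum part, decay away:

```python
import numpy as np

# Density matrix of |+> = (|0> + |1>)/sqrt(2): equal populations on the
# diagonal, coherences of 0.5 off the diagonal.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho0 = np.outer(plus, plus)

for t_over_T2 in (0.0, 1.0, 5.0):
    decay = np.exp(-t_over_T2)      # pure-dephasing model
    rho = rho0.copy()
    rho[0, 1] *= decay              # only the off-diagonal terms decay;
    rho[1, 0] *= decay              # the populations never change
    print(f"t/T2 = {t_over_T2}: coherence = {rho[0, 1]:.3f}, "
          f"populations = {rho[0, 0]:.1f}/{rho[1, 1]:.1f}")
# At t >> T2 the matrix approaches diag(0.5, 0.5): a classical 50/50
# mixture, with no phase information left to interfere.
```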

There are two main types of decoherence that quantum engineers track:

T1 (energy relaxation). The qubit loses energy and decays from the excited state (1) to the ground state (0). This is analogous to a ball rolling downhill. T1 times for current superconducting qubits range from tens to hundreds of microseconds.

T2 (dephasing). The qubit loses its phase information without necessarily losing energy, and the superposition degrades into a classical mixture. T2 is always at most twice T1, and in practice T2 is often the more restrictive constraint.

The ratio between gate operation time and decoherence time determines how many gates you can execute before the quantum state degrades beyond usefulness. If a single gate takes 20 nanoseconds and T2 is 100 microseconds, you can execute roughly 5,000 gates before decoherence becomes dominant. This is why circuit depth (Chapter 4) is the critical metric: deeper circuits need longer coherence times.
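A quick sketch of that budget arithmetic, using the illustrative numbers above and a simple exp(-t/T2) decay model:

```python
import math

gate_time_ns = 20     # illustrative single-gate duration
T2_us = 100           # illustrative dephasing time

max_gates = T2_us * 1_000 / gate_time_ns  # convert T2 to nanoseconds
print(f"~{max_gates:,.0f} gates before decoherence dominates")  # ~5,000

# Coherence remaining after a circuit of depth d, with exp(-t/T2) decay:
for depth in (100, 1_000, 5_000):
    t_us = depth * gate_time_ns / 1_000
    print(f"depth {depth:>5}: coherence ~ {math.exp(-t_us / T2_us):.2f}")
# depth 100 -> 0.98, depth 1,000 -> 0.82, depth 5,000 -> 0.37
```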

Max gates before decoherence: ~5,000 (at 20 ns gate time, 100 μs T2)
Physical-to-logical qubit ratio: ~1,000:1 (current error-correction estimates)

The error correction imperative

The combination of no-cloning and decoherence creates a uniquely difficult error correction problem.

In classical computing, error correction is straightforward in principle: copy the bit, check the copies, fix any that disagree. This works because classical bits can be copied and read without disturbance.

In quantum computing, you cannot copy the qubit (no-cloning), you cannot read it to check for errors (measurement destroys the state), and errors accumulate continuously rather than in discrete flips (decoherence). Every aspect of the classical approach fails.

Quantum error correction solves this through an approach that is genuinely clever. Instead of copying a qubit, you encode it across many physical qubits using entanglement. The encoded state has redundancy, but the redundancy is distributed across the entanglement structure, not across copies. You can check for errors by measuring certain properties of the multi-qubit system that reveal whether an error occurred and what type, without measuring (and thus destroying) the encoded information itself. These measurements, called syndrome measurements, detect the error’s fingerprint without revealing the data.
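As an illustration, here is a numpy sketch of the simplest such scheme, the three-qubit bit-flip code (a toy model that catches only X errors; production codes such as the surface code are far more elaborate). The parity checks pinpoint the flipped qubit without ever reading out the encoded amplitudes:

```python
import numpy as np

# Encode a|0> + b|1> as a|000> + b|111> (indices are 3-bit strings).
a, b = 0.6, 0.8
state = np.zeros(8)
state[0b000], state[0b111] = a, b

def flip(state, qubit):
    """Apply an X (bit-flip) error to one qubit; qubit 0 is the leftmost bit."""
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << (2 - qubit))] = amp
    return out

state = flip(state, 1)  # an error strikes the middle qubit

# Syndrome measurement: the parities qubit0^qubit1 and qubit1^qubit2.
# Every nonzero amplitude shares the same parities, so measuring them
# reveals where the error is without touching the values of a and b.
idx = int(np.flatnonzero(state)[0])
bits = [(idx >> (2 - q)) & 1 for q in range(3)]
syndrome = (bits[0] ^ bits[1], bits[1] ^ bits[2])

# Decode the syndrome to the offending qubit and undo the flip.
error_location = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome]
if error_location is not None:
    state = flip(state, error_location)

print(state[0b000], state[0b111])  # 0.6 0.8 (amplitudes recovered intact)
```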

The overhead is substantial. Protecting a single logical qubit against realistic error rates requires on the order of 1,000 physical qubits with current codes and expected near-term error rates. This is why the jump from "number of physical qubits" to "number of logical qubits" is so significant; the numbers in the callout below are illustrative rather than precise, but the ratio gives you a sense of the engineering challenge.

Scale Check

A 1,000-qubit processor might support only 1 error-corrected logical qubit. A million-qubit processor might support 1,000 logical qubits. The jump from physical qubit count to logical qubit count is where most quantum computing timelines get sobering.

Constraints as features

There is a pattern worth noticing. Every constraint in this chapter has a dual nature.

No-cloning frustrates error correction. It also enables quantum cryptography. No-deleting complicates qubit disposal. It also guarantees information conservation, which is the basis for reversible quantum computation. The uncertainty principle limits what can be simultaneously known. It also enforces the privacy of conjugate measurements. Decoherence destroys quantum states. It also explains why the classical world looks classical (decoherence is how macroscopic objects avoid quantum weirdness).

This duality is not a coincidence. The constraints of quantum information define the boundary between quantum and classical. A world without these constraints would be a world where quantum mechanics reduces to classical mechanics. The constraints are what make quantum information genuinely different, and therefore genuinely useful for things classical information cannot do.

When evaluating quantum technology claims, these constraints are your reality-check toolkit. Anyone claiming to clone quantum states, suppress decoherence entirely, or bypass the uncertainty principle is either confused or misleading you. Anyone building systems that work within the constraints (using error correction to manage decoherence, no-cloning to guarantee security, and incompatible measurements for cryptographic protocols) is working with the physics rather than against it.

Key Takeaways

  • No-cloning: you cannot copy an arbitrary unknown quantum state (proved by Wootters and Zurek in 1982)
  • No-deleting: destroying quantum information leaves traces somewhere in the environment
  • Uncertainty principle: not a measurement limitation but a fundamental property of quantum states
  • Decoherence: quantum information leaks into the environment over time (T1 and T2 timescales)
  • Every constraint has a dual nature: what frustrates engineering also enables cryptography and security

The next chapter will show three protocols that demonstrate these principles in action: teleportation, superdense coding, and the CHSH game. Each one uses the capabilities from Chapters 2 through 4 while respecting the constraints from this chapter, and each one achieves something that is provably impossible by classical means.