What Classical Computers Actually Do
Before you can understand why quantum computers are different, you need to understand precisely what classical computers are. Not vaguely — precisely. The answer is surprisingly simple, and surprisingly profound: it is switches. All the way down.
What Is a Computer, Really?
A modern laptop can run a 3D game, play music, send emails, and run an AI model simultaneously. What is the single most fundamental operation it is performing in all of these cases?
The question "what does a computer do?" has many valid answers depending on what level you look at it. At the level of software, it follows instructions. At the level of hardware, it moves electrons. But at the most fundamental level — the level that actually determines what a computer can and cannot do — the answer is this: it manipulates bits.
And a bit is the simplest possible thing: a distinction between two states. On or off. True or false. 1 or 0. Everything else is built from this.
Everything Is Lego
Imagine you are handed a single Lego brick. It is small, simple, and by itself unremarkable. But give someone enough Lego bricks — millions of them — and the instructions for how to assemble them, and they can build a model of the Eiffel Tower. Or a working mechanical clock. Or a full-scale X-wing fighter.
The Lego brick is not interesting in itself. What is interesting is that complexity emerges from simple pieces combined according to simple rules.
This is the insight that makes computers possible: you do not need a different kind of hardware for every different kind of task. You need exactly one kind of hardware — something that can represent 0 or 1, and switch between them on command — combined in the right patterns, enough times, fast enough.
The history of computing is essentially the history of making that one operation — flip a switch — faster, cheaper, and smaller.
Bits, Switches, and Transistors
A transistor is an electronic switch. It has three terminals: one that lets current flow in, one that lets current flow out, and one that controls whether the switch is open or closed. When the control terminal receives a small voltage, the switch closes and current flows — the bit is 1. When the control terminal receives no voltage, the switch opens and no current flows — the bit is 0.
That is the complete description of the physical mechanism underlying all digital computing. Everything else — CPUs, memory, graphics cards, neural networks — is transistors arranged in patterns.
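As a toy sketch (plain Python with hypothetical names, not a hardware simulation), a transistor can be modelled as current gated by a control voltage:

```python
def transistor(current_in: int, control: int) -> int:
    """Toy model of a transistor: current flows from the input
    terminal to the output terminal only while the control
    terminal receives a voltage (control = 1)."""
    return current_in if control == 1 else 0

# Control voltage on: switch closed, current flows, the bit is 1
assert transistor(1, 1) == 1
# Control voltage off: switch open, no current, the bit is 0
assert transistor(1, 0) == 0
```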
The scale is what makes it astonishing
A single transistor switch is trivial. But the Intel 4004 of 1971 contained 2,300 of them; a modern CPU packs roughly 100 billion into an area the size of a thumbnail, each switching billions of times per second.
Logic gates: operations on bits
A transistor on its own is just a switch. Combining transistors in specific wiring patterns creates logic gates — devices that take one or two bits as input and produce one bit as output. The three fundamental gates:
NOT gate: takes 1 input, flips it. NOT(0) = 1. NOT(1) = 0. In CMOS, built from two transistors wired as an inverter.
AND gate: takes 2 inputs, outputs 1 only if both inputs are 1. AND(1,1) = 1. AND(1,0) = 0. Everything else = 0. In CMOS, six transistors: a four-transistor NAND followed by an inverter.
OR gate: takes 2 inputs, outputs 1 if at least one input is 1. OR(0,0) = 0. OR(1,0) = 1. OR(0,1) = 1. OR(1,1) = 1. In CMOS, six transistors: a four-transistor NOR followed by an inverter.
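The three truth tables above translate directly into code. A minimal Python sketch:

```python
def NOT(a: int) -> int:
    """Flip a single bit."""
    return 1 - a

def AND(a: int, b: int) -> int:
    """1 only if both inputs are 1."""
    return a & b

def OR(a: int, b: int) -> int:
    """1 if at least one input is 1."""
    return a | b

# The truth tables from the text:
assert NOT(0) == 1 and NOT(1) == 0
assert AND(1, 1) == 1 and AND(1, 0) == 0 and AND(0, 0) == 0
assert OR(0, 0) == 0 and OR(1, 0) == 1 and OR(0, 1) == 1 and OR(1, 1) == 1
```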
Building Numbers from Bits
You know the decimal (base-10) number system from everyday life. It has ten digits: 0 through 9. Each position in a number represents a power of 10: the rightmost digit is the ones place ($10^0 = 1$), the next is tens ($10^1 = 10$), then hundreds ($10^2 = 100$), and so on.
Computers use binary (base-2), which has only two digits: 0 and 1. Each position represents a power of 2. With 4 bits, you can represent any whole number from 0 to 15. Here is how:
1011 = $1\times8 + 0\times4 + 1\times2 + 1\times1 = 8+0+2+1 = 11$

The interactive section below lets you feel this directly. Toggle each bit switch and watch the decimal number update in real time. Try to build every number from 0 to 15.
Bit Toggler — Build Every Number 0–15
Toggle each switch to turn it on (1) or off (0). The decimal, binary, and hex values update live. Try to reach every number from 0 to 15 — the number line at the bottom tracks your progress.
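What the toggler computes can be sketched in a few lines (a minimal Python sketch, not the widget's actual code):

```python
def bits_to_decimal(bits):
    """Sum each bit times its power of 2; the leftmost bit holds
    the highest place value, exactly as in the 1011 example."""
    total = 0
    for position, bit in enumerate(reversed(bits)):
        total += bit * 2 ** position
    return total

# The worked example from the text: 1011 is 8 + 0 + 2 + 1 = 11
assert bits_to_decimal([1, 0, 1, 1]) == 11

# Every 4-bit pattern reaches exactly one number from 0 to 15
for n in range(16):
    bits = [(n >> i) & 1 for i in (3, 2, 1, 0)]  # extract bits, MSB first
    assert bits_to_decimal(bits) == n
```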
What a Classical Computer Actually Is
- A bit is the simplest unit of information: on or off, 1 or 0. Every piece of digital information — text, image, sound, video, code — is ultimately encoded as a sequence of bits. There is no smaller unit of classical information than this single binary distinction.
- A transistor is a physical switch that represents one bit. Modern CPUs contain around 100 billion transistors in an area the size of a thumbnail. Each can switch states billions of times per second. The power of a computer is not in the sophistication of its individual parts — it is in the scale and speed of their combined operation.
- Logic gates (NOT, AND, OR) are universal: any computation reduces to them. Shannon's 1937 proof: every logical and mathematical operation can be expressed as a combination of NOT, AND, and OR gates on bits. The entire software stack — operating systems, databases, AI models — rests on this foundation.
- Binary encodes numbers: each bit doubles the representable range. 4 bits represent 0–15 (16 values = $2^4$). 8 bits represent 0–255 (256 values = $2^8$). 64 bits represent over 18 quintillion values. Computers use binary rather than decimal because two voltage levels are far more robust against noise than ten.
- Complexity emerges from simple rules applied at scale. A single transistor is not interesting. 100 billion transistors, wired in the right patterns and running at 3 GHz, make up the most complex artefact humans have ever built. The Lego principle: everything complex is assembled from something simple.
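Shannon's universality result can be felt in miniature: composing the three gates yields XOR, and XOR plus AND gives a half-adder, the first step of binary addition. A sketch using a standard construction (not taken from the lesson itself):

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # XOR built only from the three universal gates:
    # (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# 1 + 1 = binary 10: sum bit 0, carry bit 1
assert half_adder(1, 1) == (0, 1)
assert half_adder(1, 0) == (1, 0)
assert half_adder(0, 0) == (0, 0)
```

Chain half-adders together and you get addition on numbers of any width; keep composing and, as Shannon showed, every computation follows.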
You now know what a classical computer is.
It is very good at what it does.
So why do we need something different?
- Shannon, C. E. (1937). "A Symbolic Analysis of Relay and Switching Circuits." — Proved that Boolean algebra maps directly to electrical circuits; the theoretical foundation of all digital computing. Shannon was 21 years old.
- Nielsen, M. A. & Chuang, I. L. — Quantum Computation and Quantum Information, Cambridge, 2000. §1.2 "Quantum bits" — opens with classical bits as the baseline, which is exactly this lesson's role in the curriculum.
- Feynman, R. P. — Feynman Lectures on Computation, Addison-Wesley, 1996. Chapter 1 — Feynman's treatment of the bit and classical logic, written for a general audience with characteristic clarity.
- Intel — "Transistor Count History." A public record showing transistor counts in CPUs from 1971 (2,300 transistors in the Intel 4004) to 2024 (~100 billion in Apple M3 Ultra). Every doubling is Moore's Law in action.