In Bits, we learned the basic ideas of binary: how computers represent information using bits - sequences of 1s and 0s. But knowing how computers store information only gets us so far. The real magic is what they do with that information, and how. You can add in your head pretty quickly, but how does a computer - which only knows 1s and 0s - perform that same calculation? The answer lies in a beautiful branch of mathematics that’s over 170 years old, created by a man who never dreamed of a computer.

Mechanical Reasoning

In the 17th century, the German polymath Gottfried Wilhelm Leibniz (1646-1716) had a wild idea. Leibniz - who co-invented calculus alongside Newton and made contributions to physics, philosophy, and just about every other field - wondered if human reasoning itself could be reduced to calculation. Could we create a system where logical arguments could be resolved not through rhetoric or debate, but through mechanical computation? He called this vision a “calculus of reasoning”.

It was an audacious dream, to say the least, but Leibniz lacked the mathematical tools to realize it. The idea would have to wait nearly two centuries - for the work of George Boole.

Logic as Algebra

A self-taught English mathematician, George Boole (1815-1864) was fascinated by the idea of formalizing logic. In 1854, he published a book with an ambitious title: An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities.

In it, he showed that logical statements could be manipulated using algebraic rules, just like ordinary numbers. Instead of dealing with quantities, his algebra dealt with truth values. Statements could be either true or false, and they could be combined using logical operators like AND, OR, and NOT. For example, in Boole’s system:

  • “It is raining AND I have an umbrella” is true only if both parts are true.
  • “It is raining OR it is snowing” is true if at least one part is true.
  • “It is NOT raining” is true only if it is not raining.

These logical operators might seem obvious now, especially given how we use them in everyday language, but Boole’s insight was to give them a precise mathematical treatment. Logic had been studied for millennia, with some of its earliest stewards being Aristotle and the Stoics, but Boole’s algebraic approach gave it the precision and power of mathematical notation. He turned philosophy into algebra.

The catch? It was purely abstract. Boole wasn’t thinking about machines or circuits. This was just beautiful, theoretical mathematics. No one had any idea what to do with it.

Shannon’s Electric Epiphany

Fast forward to 1937. A 21-year-old MIT master’s student named Claude Shannon was working in the electrical engineering lab, tinkering with telephone switching circuits. The circuits were complex, hard to design, and nobody had a systematic way to analyze them.

Then, Shannon had an epiphany.

He realized that electrical switches behave exactly like Boolean algebra. A switch could be ON or OFF, just like Boole’s true and false. You could wire switches in series (both must be ON for current to flow) and that’s an AND operation. Wire them in parallel (either can be ON for current to flow) and that’s an OR operation.

Shannon’s master’s thesis, A Symbolic Analysis of Relay and Switching Circuits, showed that any logical operation could be implemented using electrical circuits. It’s often called the most important master’s thesis ever written, because it laid the foundation for all digital circuit design. That abstract algebra Boole invented nearly a century earlier? It would become the language of computation.

Suddenly, Leibniz’s dream didn’t seem so far-fetched. If logic was algebra, and algebra was circuits, then circuits could think. So then, what exactly was this algebra that Boole invented and Shannon brought to life in copper and wire? Let’s dive into the building blocks of binary logic - the operations that power every calculation your computer makes.

Logic Gates: The Atoms of Computation

We call a physical device that performs Boolean operations a logic gate. In Shannon’s time, these were mechanical switches that clicked open and closed. Today, they’re microscopic transistors etched into silicon, billions of them packed into a chip smaller than your fingernail. But the principle remains the same: take binary inputs (1 or 0) and produce a binary output based on a logical rule.

There are three fundamental gates that form the basis of all digital logic: AND, OR, and NOT. Every computation your computer performs - from rendering this text to training a neural network - ultimately breaks down into some combination of these three operations.

The AND Gate

The AND gate outputs 1 only when all of its inputs are 1. Think of it like a security system that requires two keys to unlock: if either key is missing, the system stays closed.

A Quick Aside: Truth Tables

We can capture all possible behaviors of a logic gate using what we call truth tables. They are a compact, systematic way of showing every combination of inputs and the resulting output. The gates themselves are defined by their truth tables, much as a graph is defined by its function: the function maps every possible input to an output, and that mapping is the graph. Below is the truth table for the AND gate:

| Input A | Input B | A AND B |
|---------|---------|---------|
| 0       | 0       | 0       |
| 0       | 1       | 0       |
| 1       | 0       | 0       |
| 1       | 1       | 1       |

Notice how only the last row outputs 1 - both inputs must be 1 for the output to be 1, or in other words, for AND to be true. This works exactly as we expect in everyday language: “It is raining AND I have an umbrella” is only true if both parts are true.
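To see this in action, here is a minimal sketch in Python (the capitalized function name `AND` and the use of Python’s bitwise `&` operator are my own choices for illustration, not anything standard):

```python
from itertools import product

def AND(a, b):
    # Outputs 1 only when both inputs are 1
    return a & b

# Generate the truth table by trying every combination of inputs
for a, b in product([0, 1], repeat=2):
    print(a, b, AND(a, b))
```

Running this prints the same four rows as the table above.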

The OR Gate

The OR gate outputs 1 if at least one of its inputs is 1. It’s the inclusive OR, meaning it includes the case where both inputs are 1, rather than “A or B but not both”. Think of it like a light that can be turned on by either of two switches: if either switch is flipped, the light comes on.

Here is the truth table for the OR gate:

| Input A | Input B | A OR B |
|---------|---------|--------|
| 0       | 0       | 0      |
| 0       | 1       | 1      |
| 1       | 0       | 1      |
| 1       | 1       | 1      |

Any 1 in the input produces a 1 in the output. This aligns with our linguistic intuition: “It is raining OR it is snowing” is true if at least one part is true, while it is false only if both parts are false.

The NOT gate is left as an exercise for the reader. See if you can write out its truth table and figure out how it works! (Hint: it only has one input.)

Building Complexity

These three operations might seem almost too simple to be powerful. But here’s the remarkable thing: every computation can be built from just these three gates. Your computer’s ability to multiply million-digit numbers, render 3D graphics, or run machine learning models all comes down to clever combinations of AND, OR, and NOT gates.

Let’s see how we can combine these gates to create something more interesting. Recall how in our exploration of the OR gate, we mentioned that it is the inclusive OR, meaning it outputs 1 if either input or both inputs are 1. What if we wanted an exclusive OR - one that outputs 1 only if exactly one input is 1? We can build this using a combination of AND, OR, and NOT gates, and we call it XOR (pronounced “ex-or”, or “zor”). Here’s its truth table:

| Input A | Input B | A XOR B |
|---------|---------|---------|
| 0       | 0       | 0       |
| 0       | 1       | 1       |
| 1       | 0       | 1       |
| 1       | 1       | 0       |

Notice the last row - when both inputs are 1, the output is 0. This is different from the inclusive OR gate, which would output 1 in that case. But we said earlier that we can build any logical operation using just AND, OR, and NOT. So how do we build XOR? Here’s one way to think about it: “A XOR B” is true when “A OR B” is true, but “A AND B” is false. If we want to be a little more formal, we can write this as:

A XOR B = (A OR B) AND NOT (A AND B)

For the sake of completeness - and because I feel I would be doing you a disservice if I did not include it - let’s take a quick detour to formalize this notation. In the field of formal (mathematical) logic, we have special symbols to denote these logical operations. So the above expression would instead be written as:

A ⊕ B = (A ∨ B) ∧ ¬(A ∧ B)

where ⊕ denotes XOR, ∨ denotes OR, ∧ denotes AND, and ¬ denotes NOT.
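If you’d like to verify the XOR construction yourself, here is a minimal Python sketch (the capitalized function names are hypothetical, chosen to mirror the gate names; each gate is one bitwise operation):

```python
def NOT(a):
    # Flip a single bit: 1 becomes 0, 0 becomes 1
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def XOR(a, b):
    # A XOR B = (A OR B) AND NOT (A AND B)
    return AND(OR(a, b), NOT(AND(a, b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, XOR(a, b))
```

The printed rows match the XOR truth table above, including the 1, 1 case producing 0.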

This complexity arising from the combination of simple rules shows us something profound: we can create new operations by chaining together simpler ones. And this is exactly how computers work - complex operations built from the smallest building blocks, weaving layers upon layers of abstraction.

Binary Addition: Putting It All Together

We’re finally ready to answer our opening question: how does a computer sum two numbers? Let’s start simple and add two single-bit binary numbers.

When we add in decimal, we learned rules like “7 + 5 = 12: write down the 2 and carry the 1”. In binary, we have similar rules, but they’re simpler:

0 + 0 = 0
0 + 1 = 1
1 + 0 = 1
1 + 1 = 10

The last result, 10, is read “one-zero” in binary, which is 2 in decimal. And that last case is key - when we add 1 + 1, we get 0 in the current position and carry a 1 to the next position. Sound familiar? The result bit is determined by XOR, while the carry bit is determined by AND! Let’s look at the truth table for adding two bits, A and B:

| Input A | Input B | Sum (A XOR B) | Carry (A AND B) |
|---------|---------|---------------|-----------------|
| 0       | 0       | 0             | 0               |
| 0       | 1       | 1             | 0               |
| 1       | 0       | 1             | 0               |
| 1       | 1       | 0             | 1               |

Compare this to our earlier tables - the Sum column is exactly A XOR B, and the Carry column is exactly A AND B. We’ve just built an adder using logic gates!

This simple circuit is called a half-adder, because it doesn’t handle a carry input from a previous addition, or, in other words, it can only add two bits. To add multi-bit numbers, we need full-adders, where each one takes two input bits plus a carry from the previous position.
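Here is one way to sketch both circuits in Python (the names `half_adder` and `full_adder` mirror the terminology above; the bitwise operators `^`, `&`, and `|` stand in for XOR, AND, and OR gates):

```python
def half_adder(a, b):
    # Sum bit from XOR, carry bit from AND - exactly the table above
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    # Two half-adders chained: first add a + b, then add the incoming carry.
    # A carry can come out of either stage, so we OR the two carries together.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

print(full_adder(1, 1, 1))  # → (1, 1): 1 + 1 + 1 = 11 in binary
```

Each call returns a (sum, carry) pair, so the carry output of one position can feed the carry input of the next.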

Scaling Up

Here’s where it gets beautiful. Your computer’s processor has an Arithmetic Logic Unit (ALU) that performs addition on 64-bit numbers. That’s just 64 of these circuits chained together, each one passing its carry to the next. The same basic AND, OR, and XOR gates that add 1 + 1 are also adding 18,446,744,073,709,551,615 + 18,446,744,073,709,551,615. Subtraction? That’s addition with some clever use of NOT gates (look up two’s complement from Chapter 1). Multiplication? That’s repeated addition with some shifting. Division? Repeated subtraction. Every arithmetic operation breaks down into these fundamental logical operations. And it all happens in billionths of a second, billions of times per second.
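As a rough software analogy - a real ALU does this in parallel hardware, not in a loop, and my function names here are my own - a 64-bit ripple-carry adder can be sketched like this:

```python
def full_adder(a, b, cin):
    s = (a ^ b) ^ cin                 # sum bit: XOR of all three inputs
    cout = (a & b) | (cin & (a ^ b))  # carry out of this position
    return s, cout

def ripple_add(x, y, width=64):
    # Chain `width` full-adders, each passing its carry to the next,
    # extracting one bit of each operand per position.
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # any final carry is dropped: fixed-width wrap-around

print(ripple_add(5, 7))  # → 12
```

Note that, just like real 64-bit hardware, a sum too large for 64 bits wraps around because the final carry has nowhere to go.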

From Boole’s abstract algebra to Shannon’s circuits to the silicon chip in your computer - it’s all the same elegant idea. Logic is computation, and computation is logic. In the next chapter, we’ll see how these logic gates are physically organized into the components that make up a computer: the ALU, registers, memory, and the CPU that orchestrates it all. But for now, appreciate this: every time your computer does anything - sends an email, plays a video, runs a calculation - it’s using the same logical principles Boole formalized 170 years ago, brought to life in a way he could never have imagined.