What is a computer?

You use one every day but probably can't explain how it works. Here's why understanding the basics changes how you think about AI.

Christian Genco

You use a computer for hours every day. You make million-dollar decisions based on what one tells you. But if someone asked you how it actually works (not "it has a processor" but what is actually happening inside that thing), could you explain it?

Most executives can't. That's fine. You don't need to know how an engine works to drive a car. But we're not in the "driving a car" era of computing anymore. We're in the era where the car is starting to drive itself, and it's worth understanding what's under the hood.

I'm going to explain how a computer works starting from the absolute basics. It'll take about ten minutes. By the end, you'll have a much better intuition for the fundamentals of how a computer actually works, which will make it way easier to build up to understanding how AI works.

It starts with a faucet

Turn a faucet on: water flows. Turn it off: water stops.

That's it. That's the fundamental thing a computer does. Not with water, but with electricity. Everything else (every spreadsheet, every video call, every AI chatbot) builds on top of this one tiny building block: valves turning on and off.

Water is actually a great way to think about it. Electricity works a lot like plumbing: current flows through wires like water flows through pipes. A valve controls whether the flow passes through or gets blocked. Open the valve: flow. Close it: no flow. Two states: on and off. 1 and 0.

One valve isn't very useful. But what if you had two valves controlling the same pipe?

Transistors: tiny automatic valves

The faucet above requires you to turn it. But what if the water pressure from one pipe could open or close a valve on a different pipe? That's a transistor.

[Diagram: transistor]

In the 1940s, engineers figured out that certain materials (silicon, mostly) could act as an electrical valve controlled by another electrical signal instead of your finger. Flow in one wire controls whether flow is allowed through a different wire. No human needed.


Because they're controlled by electricity instead of fingers, they can switch really fast. Billions of times per second. And because they're made of silicon, you can make them really small. The chip in your phone has about 15 billion transistors, each one smaller than a virus.

But a transistor by itself is still just an on/off valve. The interesting part is what happens when you start combining them.

Logic gates: switches controlling switches

When you wire a few transistors together in specific patterns, you get something called a logic gate. A logic gate takes one or two inputs (each either on or off) and produces one output (on or off) based on a simple rule.

There are just a few kinds that matter:

  • AND gate: the output is on only if both inputs are on. Think of two switches wired in a line. Both need to be on for the light to work.
  • OR gate: the output is on if either input is on. Think of two switches wired side by side. Either one can light it up.
  • NOT gate: flips the input. On becomes off. Off becomes on.

That's basically it. Three rules. AND, OR, and NOT.

[Interactive: AND, OR, and NOT gate demos]

This might feel too simple to be useful. But every single computation a computer has ever done is built from just these three building blocks, which are in turn built from the smaller, simpler transistor. AND, OR, NOT, billions of times per second.
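If you're code-curious, the three rules fit in a few lines of Python (my own sketch, not anything from real chip design), with 1 and 0 standing in for on and off:

```python
def AND(a, b):
    return a & b   # on only if both inputs are on

def OR(a, b):
    return a | b   # on if either input is on

def NOT(a):
    return 1 - a   # flips the input: on becomes off, off becomes on

# Every computation a computer does reduces to combinations of these.
print(AND(1, 1), OR(0, 1), NOT(1))  # 1 1 0
```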

Doing math with switches

This is where it gets cool. You can wire logic gates together to do addition. Let's build a calculator to add two single-digit binary numbers using only the AND, OR, and NOT gates we talked about above.

This is a pretty simple calculator because there are only four possible cases we have to think about. We could be adding 0 + 0, which is just 0. We could be adding 0 + 1, which is just 1. We could be adding 1 + 0, which is also just 1. Or the tricky one: 1 + 1, which is 2, which in binary is 10. So we write 0 and carry the 1.

Let's focus on the carry digit first. Look at our four cases: 0+0, 0+1, 1+0, 1+1. The carry digit is only 1 when both inputs are 1. That sounds a lot like an AND gate.

[Interactive: AND gate producing the carry digit, with truth table]

That handles the carry digit. Now what about the sum digit? Look at the pattern: 0, 1, 1, 0. That almost looks like OR (0, 1, 1, 1) except for the last case. When both inputs are 1, OR gives us 1 but we want 0.

So we need: "OR, but not when both are on." We can build that. Take OR(A, B). Then take AND(A, B) and NOT it. Then AND those two results together. That gives us exactly the pattern we want. (This combination is so useful it has its own name: XOR, "exclusive or.")
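That recipe translates directly into code. Here's a Python sketch (the names and structure are mine) of XOR built from only the OR, AND, and NOT steps described above:

```python
def XOR(a, b):
    either = a | b            # OR(A, B)
    not_both = 1 - (a & b)    # NOT(AND(A, B))
    return either & not_both  # AND the two results together

# The four cases give 0, 1, 1, 0: exactly the sum-digit pattern we wanted.
for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "-> sum digit:", XOR(a, b))
```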

[Interactive: XOR gate producing the sum digit, with truth table]

If you bundle these two gates together (the AND for the carry and the XOR for the sum), you get a nice little package at a new level of abstraction: two inputs, two outputs. It's called a half adder.

[Interactive: half adder showing carry and sum outputs]

Chain a bunch of these together (each stage needs one extra input for the carry coming in from the stage before it; that upgraded unit is called a full adder) and you can add numbers of any size. More logic gates can do subtraction, multiplication, and division. This collection of gates is called an ALU, an Arithmetic Logic Unit. It's the part of a computer that does math.
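Here's a rough Python sketch of that chaining (my own illustration, not production hardware design). A full adder, made of two half adders plus an OR, handles the carry arriving from the previous stage:

```python
def half_adder(a, b):
    carry = a & b                    # AND gives the carry digit
    total = (a | b) & (1 - (a & b))  # XOR gives the sum digit
    return total, carry

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)        # add the two input bits
    s2, c2 = half_adder(s1, carry_in)  # then add the incoming carry
    return s2, c1 | c2               # carry out if either stage carried

def add(a_bits, b_bits):
    """Add two equal-length binary numbers, least significant bit first."""
    carry, result = 0, []
    for a, b in zip(a_bits, b_bits):
        digit, carry = full_adder(a, b, carry)
        result.append(digit)
    result.append(carry)
    return result

# 3 (binary 11) + 1 (binary 01), written lowest bit first:
print(add([1, 1], [1, 0]))  # [0, 0, 1], i.e. binary 100 = 4
```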

We've taken our tiny, simple transistor building block, built it up into a slightly bigger logic-gate building block, built those into a half adder building block that can do math, then built those into an ALU that can handle any arithmetic operation. See the pattern?

Binary: counting with switches

If all you have is on and off (1 and 0), how do you represent a number like 42?

Think about it this way: if you have one switch, you can represent two states (off and on, 0 and 1). Two switches? Four states (00, 01, 10, 11). Three switches give you eight states, four give you 16, five give you 32, six give you 64, and so on.


Every switch you add doubles the number of combinations. With eight switches (one byte), you can represent 256 different values. That's enough for every letter, digit, and symbol on your keyboard.

This is the binary number system. A shortcut to counting in binary is to count only using numbers that are entirely made of zeros and ones. Start with 0 (that's 0). Then 1 (that's 1). The next number would be 2, but that has a digit other than 0 or 1, so skip it. Skip 3, 4, 5... all the way up to 10. So 10 = 2. What's the next number with only zeros and ones? 11. So 11 = 3. Now we have to go all the way up to 100 (= 4), then 101 (= 5), 110 (= 6), 111 (= 7), 1000 (= 8), 1001 (= 9), and so on.
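Another way to find a number's binary form is to repeatedly ask "is it odd?" and then halve. A quick Python sketch (my illustration of the standard divide-by-two method):

```python
def to_binary(n):
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # the lowest switch: is n odd?
        n = n // 2                # halve and move to the next switch
    return bits

print(to_binary(42))  # 101010: six switches for the number 42
```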

Fun aside: you have 10 fingers, and each one can be up (1) or down (0). That means you can count to 1,023 on your hands using binary. Hold up your right pinky: that's 1. Put it down and raise your right ring finger: that's 2. Both up: 3. Just your right middle finger: 4. Try it.

[Interactive: counting in binary on two hands]

(I would not recommend using this system to convey the numbers 4, 128, or 132 to someone.)

Everything in your computer (every number, every letter, every pixel of every photo) is stored as a pattern of 1s and 0s. On and off. Switches.

Memory: switches that remember

There's one more trick. If you wire logic gates in a loop (so the output feeds back into the input) you can make a circuit that remembers. Set it to 1, and it stays at 1 even after you remove the input. Set it to 0, and it stays at 0.

[Interactive: a latch built from NOT, AND, and OR gates, with SET and RESET inputs]

This is a latch, and it's how your computer stores information. Each one holds a single bit: one 1 or 0.
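One way to see the trick: simulate a single tick of the feedback loop as a Python function (my sketch of a simple SET/RESET latch) and watch the stored bit survive after the inputs go away:

```python
def latch_step(stored, set_signal, reset_signal):
    # OR the stored bit with SET (so SET can switch it on),
    # then AND with NOT RESET (so RESET can force it off).
    return (stored | set_signal) & (1 - reset_signal)

state = 0
state = latch_step(state, 1, 0)  # pulse SET: the bit turns on
state = latch_step(state, 0, 0)  # inputs removed: the bit stays on
print(state)                     # 1 -- it remembers
state = latch_step(state, 0, 1)  # pulse RESET: the bit turns off
print(state)                     # 0
```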

One bit by itself isn't very useful. But group eight of them together and you get a byte, which can represent 256 different values: enough for any letter, digit, or symbol on your keyboard. From there it's just a matter of scale:

Unit         Bits (switches)        Roughly…
1 bit        1                      a single yes/no
1 byte       8                      one letter or number
1 kilobyte   8,192                  a couple paragraphs of text
1 megabyte   8,388,608              a photo or a short song
1 gigabyte   8,589,934,592          a movie
1 terabyte   8,796,093,022,208      ~500 hours of video

That last row is worth pausing on. A terabyte is nearly nine trillion switches. If each one were a physical light switch (about 1.5 by 3 inches), the switches in a single terabyte would cover about 9,000 square miles. That's roughly the size of New Jersey. But because transistors are smaller than viruses, you can fit a terabyte on a micro SD card small enough to swallow.
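You can sanity-check that with a few lines of arithmetic, using the table's figure for bits per terabyte (my back-of-envelope version lands a little under 10,000 square miles, in the same ballpark):

```python
bits = 8 * 1024**4                        # switches in a terabyte (per the table)
switch_area = 1.5 * 3                     # square inches per light switch
square_inches_per_mile = (5280 * 12) ** 2

square_miles = bits * switch_area / square_inches_per_mile
print(round(square_miles))                # about 9,900 square miles
```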

The CPU: a very fast instruction follower

Now put it all together. A CPU (Central Processing Unit) is just:

  1. An ALU (Arithmetic Logic Unit) that can do math and logic (thousands of half adder circuits like the one above, wired together)
  2. Some memory to hold data and instructions (billions of latch circuits like the one above, wired together), along with a handful of registers: tiny, ultra-fast slots of working memory that hold whatever values the CPU is actively using right now
  3. A control unit that runs a loop: fetch the next instruction from memory, decode what it means, then execute it by telling the ALU what to do, over and over, billions of times per second
[Interactive: a CPU stepping through instructions with its control unit, ALU, and registers]

That's a computer. The instructions are dead simple. Things like "add these two numbers," "store this result here," "if this value is zero, skip to instruction #47." Each instruction is a pattern of 1s and 0s (machine code).

The CPU reads an instruction, does it, reads the next one, does it. Billions of times per second. That's all it does. There is no magic.
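To make that loop concrete, here's a toy CPU in Python. The instruction names and tuple format are invented for illustration (real machine code is patterns of bits, not words), but the fetch-decode-execute loop is the real shape. The program multiplies 2 × 3 by repeated addition, using a "jump if zero" just like "if this value is zero, skip to instruction #47":

```python
program = [
    ("LOAD", "COUNT", 3),  # 0: do the addition 3 times
    ("LOAD", "OUT", 0),    # 1: running total starts at 0
    ("JZ", "COUNT", 6),    # 2: if COUNT is zero, skip to instruction 6
    ("ADD", "OUT", 2),     # 3: OUT = OUT + 2
    ("DEC", "COUNT"),      # 4: COUNT = COUNT - 1
    ("JMP", 2),            # 5: go back to the zero test
    ("HALT",),             # 6: stop
]

registers = {}
pc = 0                                 # which instruction we're on
while True:
    instruction = program[pc]          # 1. fetch
    op = instruction[0]                # 2. decode
    if op == "LOAD":                   # 3. execute
        registers[instruction[1]] = instruction[2]
    elif op == "ADD":
        registers[instruction[1]] += instruction[2]
    elif op == "DEC":
        registers[instruction[1]] -= 1
    elif op == "JZ" and registers[instruction[1]] == 0:
        pc = instruction[2]            # jump: skip to another instruction
        continue
    elif op == "JMP":
        pc = instruction[1]
        continue
    elif op == "HALT":
        break
    pc += 1                            # otherwise, move to the next instruction

print(registers["OUT"])  # 6
```

A real CPU does exactly this, just with billions of bit-pattern instructions per second instead of seven readable tuples.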

Every app on your phone, every website, every "intelligent" AI assistant: it's all just a long list of these tiny instructions being executed one after another, unimaginably fast.

So what?

Why does any of this matter if you're running a company, not designing chips?

Because the entire history of computing is the story of building increasingly complex layers on top of these absurdly simple foundations. Light switches all the way down.

And AI is just the latest layer. When you hear "neural network" or "large language model," the computer underneath is still doing the same thing it's always done: executing simple instructions, billions of times per second, with no understanding of what any of it means.

That has some big implications, and it's what we'll get into next:

What is AI?
coming soon
A plain-English explanation for people who run companies, not computer science departments.