The Unix Way · Episode 8
In 1969, a developer at Bell Labs wanted to play a video game. The computer he needed cost $75 per session. So he found an old machine in a corner and built an operating system to run it.
Its descendants run most of the world. Rather good return on a video game.
Multics: The Warning
To understand Unix, one must first understand what it replaced.
In 1964, MIT, General Electric, and Bell Labs embarked on Multics, the Multiplexed Information and Computing Service. The vision was magnificent: a time-sharing operating system for hundreds of simultaneous users, with hierarchical filesystems, dynamic linking, and security rings. Computing as a utility, like electricity or running water.
The project ran on a GE-645 mainframe. It required hundreds of engineers. It was, by every account, behind schedule, over budget, and growing in every direction at once. Sam Morgan, the head of Bell Labs' computing research, described it as "climbing too many trees at once."
Bell Labs pulled out in April 1969. Dennis Ritchie, who had worked on the project, later wrote that Multics had become too late and too expensive a version of what it set out to be.
Multics eventually shipped. It ran until the last installation was decommissioned in 2000, at the Canadian Department of National Defence. Thirty-one years of continuous service. Not a failure, then. But a warning. The lesson was clear to everyone who left the project: complexity is a one-way street. Once you are on it, you do not come back.
The Game
Ken Thompson had written Space Travel, a solar system simulation, on the GE-645 while working on Multics. Planets orbited. Gravity pulled. The player navigated between them. It was, by the standards of 1969, rather ambitious. It was also rather expensive: $75 per session on the mainframe.
With Multics gone, Thompson needed somewhere to run his game. He found a PDP-7 in a neighbouring department. A Digital Equipment Corporation minicomputer from 1964. 18-bit words. 9 kilobytes of user memory. Barely used. No operating system.
To run Space Travel on this machine, Thompson needed a filesystem to store code, a shell to launch programmes, an editor to write them, and an assembler to translate them into machine instructions. The PDP-7 had none of these things.
So he wrote them. All of them. On 9 KB.
One rather suspects a modern team would have opened a Jira ticket, scheduled a sprint planning session, and debated the backlog grooming methodology before writing the first line.
Thompson wrote the entire system in three weeks during August 1969, while his wife was visiting family. One week for the filesystem. One week for the shell. One week for the editor and assembler. Brian Kernighan, a colleague at Bell Labs, named the result. Multics served many users simultaneously. Thompson's system served one. Multi became Uni. Multics became Unics, then Unix. The pun was intentional.
The Language
The PDP-7 version worked, but it was trapped. Written entirely in assembly language, bound to one machine's instruction set. When Unix moved to the PDP-11, a faster 16-bit minicomputer, it had to be rewritten from scratch. Still assembly. Still immovable.
Dennis Ritchie solved this. Between 1969 and 1973, working alongside Thompson at Bell Labs, he developed the C programming language. Not to build a product. Not to found a startup. To make Thompson's system portable.
C evolved from Thompson's B, which itself derived from Martin Richards' BCPL at Cambridge. Each generation stripped away complexity. BCPL was designed for compiler writing. B simplified it for systems programming. C added data types, structures, and just enough abstraction to describe hardware without hiding it. The language was as minimal as the system it was built for.
In the summer of 1973, Thompson and Ritchie rewrote Unix in C on the PDP-11. Not the first operating system written in a high-level language — Multics itself was written in PL/I — but the first where that choice paid off. Port the compiler, and the system followed to a different machine. Quite the upgrade from $75 per session.
Ritchie published The C Programming Language with Kernighan in 1978. 228 pages. The entire language in a book you could carry in one hand. The "Hello, World" programme on page 6 became the most imitated first programme in computing history. More than fifty years later, C remains among the most widely used programming languages in the world. Not bad for a tool that was built to port a game.
The Pipes
In 1964, nine years before pipes existed, Doug McIlroy wrote an internal memo at Bell Labs:
"We should have some ways of coupling programs like garden hose: screw in another segment when it becomes necessary to massage data in another way."
McIlroy was head of the Computing Techniques Research Department, the group that created Unix. He had been thinking about programme composition for nearly a decade. In 1973, Thompson implemented pipes in one night. The syntax was beautiful in its economy:
ls | grep '\.c$' | wc -l
Three programmes. Three jobs. Connected by two characters. The output of one becomes the input of the next. No frameworks. No message brokers. No orchestration layer. Just text flowing between tools that each do one thing well.
This was not just a technical feature. It was a design philosophy made executable. If every programme reads text and writes text, then every programme can talk to every other programme. The interface is universal. The contract is implicit. The composition is infinite.
McIlroy distilled it into what became the Unix philosophy:
"Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface."
Not a manifesto. A lesson from having 9 KB and no room for excess.
The Spread
Here is where history takes an improbable turn.
AT&T, Bell Labs' parent company, was bound by a 1956 antitrust consent decree that prohibited it from selling products outside the telephone business. AT&T could not sell software. So they licensed Unix to universities for a nominal fee, including the source code.
Terribly inconvenient for AT&T. Rather fortunate for civilisation.
The University of California at Berkeley received Unix in 1974. A graduate student named Bill Joy took an interest. Joy wrote the vi editor and the C shell, and later led the work that put TCP/IP networking into BSD. He co-founded Sun Microsystems in 1982 on the strength of what he had built at Berkeley. The university's modifications became the Berkeley Software Distribution: BSD.
The consent decree that prevented AT&T from selling software effectively seeded Unix across every university computer science department in the Western world. The students who learned on Unix built the systems we now depend on. Antitrust enforcement as open-source policy. One does appreciate the irony.
From BSD came FreeBSD (1993), NetBSD (1993), and OpenBSD (1995). But the path was not smooth. In 1992, USL, AT&T's subsidiary, sued BSDi and the University of California, claiming BSD still contained proprietary Unix code. For two years, every BSD derivative operated under legal uncertainty. Nobody could say with confidence whether the code was clean. The lawsuit settled in February 1994: of 18,000 files, exactly three were found to be problematic. Three. But the damage was done. Two years of doubt had frozen BSD adoption during exactly the window when a Finnish student's hobby kernel, announced on comp.os.minix in 1991, was gathering momentum.
Linus Torvalds wrote a Unix-inspired kernel from scratch. Announced in August 1991, released as Linux 1.0 in March 1994. Not derived from AT&T code. Not from BSD. Inspired by the design, reimplemented from zero. Torvalds himself later said that if 386BSD or FreeBSD had been available and unencumbered at the time, he probably would never have written Linux. The lawsuit that was meant to protect Unix intellectual property ended up creating its greatest competitor. One does rather enjoy that particular irony.
Linux now runs on every Android phone, most web servers, the top 500 supercomputers (all of them, since 2017), and the Ingenuity Mars helicopter. But it does not run alone. FreeBSD powers Netflix's entire CDN, serving a substantial share of internet traffic at peak hours. Sony chose FreeBSD as the basis of the operating system for the PlayStation 4 and PlayStation 5. Apple's macOS runs atop Darwin, which carries BSD code in its kernel. Every iPhone, every iPad, every Mac traces part of its lineage to a corridor at Berkeley in the 1970s. Between BSD on the consoles, the Apple devices, and the content delivery networks, and Linux on the servers and phones, Thompson's 9 KB experiment owns the planet twice over.
The Timeline
1964: Multics begins. McIlroy writes the garden-hose memo.
1969: Bell Labs exits Multics in April. Thompson writes Unix on a PDP-7 in three weeks.
1973: Unix is rewritten in C. Thompson implements pipes in one night.
1974: Unix arrives at Berkeley.
1978: Kernighan and Ritchie publish The C Programming Language.
1982: Bill Joy co-founds Sun Microsystems.
1991: Torvalds announces Linux on comp.os.minix.
1992: USL sues BSDi and the University of California.
1993: FreeBSD and NetBSD are released.
1994: The lawsuit settles. Linux 1.0 ships.
1995: OpenBSD is released.
2000: The last Multics installation is decommissioned.
2011: Dennis Ritchie dies.
The Lesson
Every good thing in this story was born from constraint.
9 KB forced Thompson to write only what he needed. Assembly forced Ritchie to invent a portable language. A consent decree forced AT&T to share. Nothing planned. Everything necessary.
The people who built Unix did not set out to change computing. Thompson wanted to play a game. Ritchie wanted to port a system. McIlroy wanted programmes to talk to each other. Joy wanted better networking. Each solved the problem in front of them, nothing more, and the solutions composed because they were small enough to compose.
Today, a fresh React project can ship a quarter of a gigabyte of dependencies on a machine with 16 GB of memory. The constraint is gone. And with it, quite evidently, the elegance.
The developers who built Unix had no choice but to be economical. We have every choice. The question is whether we still make it.
Thompson and Ritchie received the Turing Award in 1983. Ritchie received the National Medal of Technology from President Clinton in 1999, alongside Thompson. He died on 12 October 2011, one week after Steve Jobs. The internet mourned Jobs. Ritchie's death barely made the news. Jobs built products on Ritchie's foundation. The foundation does not tend to make the front page.