15 March 2013. PyCon, Santa Clara. Solomon Hykes steps onto the stage for a five-minute lightning talk entitled “The Future of Linux Containers.”
The problem was genuine. Shipping code between environments broke things.
Different library versions, different kernel configurations, different ideas
of what /usr/local was for. Docker wrapped Linux containers into
a portable format: build once, ship anywhere. For deployment, it was,
and remains, genuinely brilliant.
Then the industry decided it was equally brilliant for development.
One does wish they had checked.
The Original
Package your application. Ship the image. The server runs precisely what you built. No surprises from the host, no dependency drift, no “works on my machine” followed by the predictable silence of someone who knows it does not work on anyone else’s.
Key assumption: the target is a Linux server. Docker uses kernel features (namespaces, cgroups) that exist natively in the Linux kernel. Runs directly on the host. No overhead worth mentioning.
The Copy
By 2016, docker-compose up had become the default onboarding
instruction. Clone the repo. Run compose. Make tea. Wait rather longer than
the tea requires.
Docker was no longer for shipping to servers. Docker was for developing on your MacBook. Which does not run Linux.
The Missing Context
This is where the pattern fractures. The original context (a Linux server) and the copied context (a developer’s laptop running macOS) are architecturally incompatible in ways that compound with every layer of abstraction.
On Linux, Docker runs natively. The filesystem is local. Latency is measured in nanoseconds. On macOS, Docker runs a Linux virtual machine, first HyperKit, then Apple Virtualisation Framework, now Docker VMM. Bind mounts cross a virtualisation boundary. The filesystem is no longer local; it is negotiated.
The numbers, courtesy of Paolo Mainardi’s 2025 benchmarks on an M4 Pro, bear this out.
And then there is the matter of what you are actually downloading.
node:20 ships an entire Debian installation at 1.1 GB. The Node.js tarball from nodejs.org is 28 MB.
An entire operating system to run JavaScript. On your own machine. Where
JavaScript already runs.
The Cascade
The truly instructive part is not any single cost. It is the sequence: each step presented as the natural, reasonable consequence of the one before, each one adding friction that was not in the original estimate.
You are running a Linux VM on your Mac to execute a process that runs natively on your Mac. The engineering equivalent of driving to your neighbour’s house via Heathrow.
The Timeline
Context matters not merely technically, but historically. Docker did not arrive as a development tool. It arrived as a deployment tool and was gradually reinterpreted, a pattern this series has encountered rather frequently.
The Irony
Nobody questioned whether the tool for shipping to servers was also the right tool for writing code on a laptop. It said “consistent environments” on the tin. That was apparently sufficient.
Kelsey Hightower put it rather well: “You haven’t mastered a tool until you understand when it should not be used.”
Docker for deployment: genuine solution, genuine problem. Docker for local development on non-Linux machines: a deployment tool cosplaying as a development environment.
The README That Could Have Been
The README could say:
npm install && npm start
But that would require trusting developers to install software on their own machines. And we stopped doing that the moment “consistency” became more important than “working.”
Docker is a marvellous deployment tool. The critique is not Docker itself. It is the assumption that a deployment tool is automatically a development tool. Use Docker for what it was designed for: shipping to Linux servers. For local development: install your runtime. Run your code. Trust your machine.