The Invoice ■ Episode 16
"One language for frontend and backend! Share code! One team! No context-switching!"
Splendid. Let us examine what you are actually paying for.
In 1995, Brendan Eich built JavaScript in ten days. For form validation and image rollovers. The language was designed to make buttons change colour when you hovered over them. It was not designed to serve HTTP requests, manage database connections, handle authentication, or run at scale. Thirty years later, it does all of these things. The language has not changed. The expectations have.
The Resource Invoice
Express: 20,000 requests per second. Go: 40,000. Rust: 60,000. On AWS Lambda, Node.js averages 20 milliseconds per invocation. Rust: 1.12 milliseconds. Eighteen times faster for the same work. Not a different workload. Not a different algorithm. The same function, the same input, the same output. Eighteen times the latency because of the runtime.
The real cost is memory. Express idles at 30 to 50 MB. Go: 5 to 10 MB. Rust: 1 to 2 MB. V8's garbage collector runs whether you need it or not. When it pauses, your latency spikes. You are paying for the runtime, not your application. The runtime costs more than the code it runs, which is a rather expensive way to serve JSON.
The Architecture Invoice
One thread. That is your entire backend. Like a server in 1983. Every CPU-intensive operation blocks every other request. Image processing, PDF generation, data aggregation, cryptographic verification: anything that needs the CPU for more than a few milliseconds queues everything behind it.
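The blocking is easy to demonstrate. A sketch, using a busy-wait loop as a stand-in for CPU-bound work such as PDF generation (the 100 ms figure is illustrative):

```javascript
// One thread: a synchronous CPU-bound loop stalls the event loop, so a
// timer that was due after 10 ms cannot fire until the loop is free.
let timerFired = false;
setTimeout(() => { timerFired = true; }, 10);

// Stand-in for "PDF generation": ~100 ms of pure CPU work.
const end = Date.now() + 100;
while (Date.now() < end) {} // busy-wait; no other callback can run

// 100 ms have elapsed and the 10 ms timer is overdue, yet untouched:
console.log("timer fired during CPU work?", timerFired); // false
```

Every request in flight waits out that loop, because there is nowhere else for their callbacks to run.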
The fix is not fixing the runtime. The fix is containerisation. Cannot use multiple cores? Spin up multiple containers. Cannot manage containers? Add Kubernetes. Cannot manage Kubernetes? Add a managed Kubernetes service. Cannot afford the managed service? Add a cloud cost optimisation tool. The entire cloud-native orchestration stack exists, in part, to compensate for a language that does not know what to do with your lovely CPU cores.
Go uses all cores by default. Rust uses all cores by default. Node.js uses one, then asks for an orchestration platform. One does wonder how we ended up here.
The Type Safety Invoice
JavaScript has no types. The number 1 and the string "1" are loosely equal. An array plus a number is a string. null is an object. These are not bugs. They are the specification. The language was designed for ten-line form validators, not ten-thousand-line API servers.
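Each of those claims is checkable in any Node.js REPL:

```javascript
// Loose equality coerces the string to a number before comparing.
console.log(1 == "1");            // true

// The empty array coerces to "" and + becomes string concatenation.
console.log([] + 1);              // "1"
console.log(typeof ([] + 1));     // "string"

// typeof null has reported "object" since 1995; the spec keeps it.
console.log(typeof null);         // "object"
```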
The fix: TypeScript. A transpilation layer that adds types to JavaScript, at the cost of imposing a mandatory build step on a language that was designed to run directly. Type checking consumes 95 per cent of compilation time. A 22,000-line project takes 13 seconds. Large codebases: three minutes per compile. Every save. Every change. Three minutes of waiting for the privilege of knowing that your variables have types.
The workaround: bypass tsc with swc (96 per cent faster) and lose type checking at build time. You have added a type system, then disabled it for performance. Marvellous.
The Security Invoice
npm hosts 3.2 million packages. In 2024, 500,000 were malicious: 98.5 per cent of all malware across every language ecosystem. Not 98.5 per cent of npm malware. 98.5 per cent of all malware, across every registry, in every language. npm is the world's largest software registry, and also, by a considerable margin, the world's largest malware distribution platform.
In September 2025, eighteen packages with 2.6 billion weekly downloads were compromised in a single supply chain attack. debug. chalk. Packages that live in every Node.js project on earth. In your dependency tree right now.
The insecurity is not merely npm's. It is architectural. Node.js loads third-party code at runtime with full system access. Every require() call trusts the package it loads not to read your filesystem, not to open network connections, not to exfiltrate environment variables. Go compiles dependencies once at build time. Rust links statically. The attack surface is different by design.
The Energy Invoice
Pereira et al. (2021) measured energy consumption across programming languages, normalised to C = 1.00. JavaScript comes in at roughly 4.5 times the energy of C; Go at about 3.2; Rust at 1.03. At data centre scale, those ratios become infrastructure decisions.
Tenable rewrote one Node.js service in Rust: 700 CPUs saved, 300 GB RAM freed, 50 per cent latency reduction. One service. Not the entire backend. One service that happened to be doing actual work instead of waiting for I/O, which is the moment Node.js stops being competent and starts being expensive.
The Alternative
For I/O-bound real-time workloads (chat, streaming, webhooks, notification systems), Node.js is genuinely competent. The event loop was designed for exactly this pattern: wait for data, process it briefly, wait again. If your backend is mostly waiting, Node.js is a reasonable choice. This is not damning with faint praise. This is acknowledging what the tool was designed for.
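The pattern in miniature, with setTimeout standing in for network or database latency (the millisecond figures are illustrative):

```javascript
// Three simulated I/O calls run concurrently on one thread; total wall
// time tracks the slowest call (~80 ms), not the sum (~170 ms).
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fakeFetch(name, ms) {
  await sleep(ms);        // "waiting": the event loop serves others
  return `${name}: ok`;   // "process it briefly"
}

async function main() {
  const t0 = Date.now();
  const results = await Promise.all([
    fakeFetch("db", 80),
    fakeFetch("cache", 30),
    fakeFetch("webhook", 60),
  ]);
  return { results, elapsedMs: Date.now() - t0 };
}

main().then(({ results, elapsedMs }) => {
  console.log(results.join(", "), `in ~${elapsedMs} ms`);
});
```

One thread, no blocking, and the waiting is free. This is the workload the event loop was built for.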
For everything else: Rust gives you a single binary with no garbage collector, native concurrency, and static linking at 1 to 2 MB. Go gets you halfway there with goroutines and a fast garbage collector, though Google's GC still takes its cut. Neither needs a transpilation step, a bundler, or a package manager at runtime.
The Pattern
Brendan Eich built JavaScript to validate forms. Ryan Dahl put it on the server, then called node_modules "an irreparable mistake" and built Deno to fix it. The runtime's own architect moved on. One does wonder whether the ecosystem noticed.
Built in ten days for form validation. Express: 20K RPS. Rust: 60K. Idle memory: 30-50 MB vs 1-2 MB. One thread. 500K malicious npm packages (98.5% of all malware). The creator called node_modules an irreparable mistake and built Deno. Your language choice is an infrastructure decision. And a climate decision.