Vivian Voss

Why We Render Everything in the Browser

web performance · javascript · architecture

On Second Thought ■ Episode 07

Open the utility provider's login page. The fan stirs. The cursor hesitates. A laptop with eight gigabytes of memory and a respectable amount of silicon spends a measurable second arranging itself to display a username field, a password field, and a button. The HTTP Archive Web Almanac, 2025 edition, puts the median home page at 697 KB of JavaScript before a single character of content arrives. The browser has been asked, with a fair amount of ceremony, to do what a server in 1996 already did, and to do it on a battery, on a metered connection, on a chassis warm enough to taste.

This is the post about that.

The Axiom

The reflex is universal. Every new web project begins with a frontend framework, a build pipeline, a bundler, and a hydration strategy. The server, where the data already lives, has been demoted to a JSON tap. Rendering is the browser's problem now. Nobody quite remembers when this was decided.

It was not decided in any single meeting, or with any single argument. It compounded. Each project chose the framework because the previous project had; each developer learned the framework because that was the job posting. By the time anyone thought to question the assumption, the assumption had become the architecture, the curriculum, the conference circuit, and the hiring funnel. Reaching for a server-rendered page in 2026 feels, to many engineers, like reaching for COBOL: technically defensible, professionally suspect.

That feeling is the topic.

The Origin

Brendan Eich wrote JavaScript at Netscape in May 1995, in ten days, for form validation and image rollovers. The language shipped without a standard library worth the name. It was, at the time, an entirely reasonable thing for a small extension language to lack; nobody was building applications with it. They were validating that the email field contained an at-sign before the form posted.
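The language's entire original job description fits in a handful of lines. A sketch of that 1995-era check, with the function name and sample values invented for illustration:

```javascript
// The work JavaScript was invented for: reject a form before it posts
// if the email field has no at-sign. The bar really was this low.
function looksLikeEmail(value) {
  // One "@" somewhere past the first character counts as valid.
  return value.indexOf("@") > 0;
}

// In a 1995 browser this would hang off the form's onsubmit handler;
// here the check is exercised directly.
console.log(looksLikeEmail("vivian@example.com")); // true
console.log(looksLikeEmail("no-at-sign-here"));    // false
```

That is the whole job the language was hired for. Everything that follows in this post is what happened after the job description changed without anyone rewriting it.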

Ten years later, Jesse James Garrett gave the next pattern its name. His 2005 essay coined AJAX and described how Gmail and Google Maps used asynchronous XMLHttpRequest to update parts of a page without a full reload. The technique was real, the gain was real, and the use cases were narrow: applications in which a user lived for hours and interacted hundreds of times per session. Backbone arrived in 2010 (Jeremy Ashkenas), Angular the same year (Misko Hevery, Adam Abrons), Ember in 2011, React in 2013 (Jordan Walke at Facebook). Each was a sincere response to a real problem: the desktop application in a browser. Gmail, Maps, Figma. Real applications, real argument.

From Ten Days to Seven Hundred Kilobytes

1995: JavaScript written in ten days, no standard library
2005: AJAX named; Gmail and Maps
2010: Backbone; Angular.js
2013: React at Facebook
2018: the SPA is the default
2025: a median of 697 KB of JavaScript to render a heading

The pattern then crossed every threshold. Marketing pages, brochure sites, restaurant menus, government forms, regional rail booking, doctor's surgery appointments. The argument that justified Gmail's architecture was never re-examined for the brochure, because re-examining the argument required a different skill set than building the brochure had required. By 2018, the SPA was treated as the default architecture of the web. By 2020, "I'll just spin up a Next.js project" was the unexamined first move for static content.

The shift was not theoretical; it was a hiring problem. A generation of frontend developers learned React in a bootcamp, then a job, then another job, and never learned a server language. The job market rewarded React on the CV; nobody was hiring for "knows what a session cookie is". TypeScript arrived as the local repair: a build-time wrapper meant to compensate for the deficiencies of the language the team had chosen because it was the only one anyone knew. The fix was added before the question was asked.

Episode 02 of this series left a question open: what happens when a profession trains on tools instead of foundations? This is its first practical consequence. The architecture follows the curriculum. The web becomes a thing that requires a 700 KB bundle to display a heading because the people who build it cannot, professionally, build it any other way.

The Cost

Median page weight in 2025, per the HTTP Archive Web Almanac: 697 KB of JavaScript on home pages, 632 KB on inner pages. The 90th percentile reaches 1,979 KB. The 10th percentile sits at 87 KB. The lower bound is, demonstrably, achievable. Most teams are nowhere near it.

JavaScript per Page, 2025

10th percentile: 87 KB, the achievable floor
Median: 697 KB, where the typical team is
90th percentile: 1,979 KB

The first cost is paid downstream, on the device, in electricity. On a mid-range Android over 4G, those numbers translate into eight seconds to Time to Interactive on a representative page, more on a slow connection. Mobile data, in EU roaming, is not free. The cost that is harder to name on an invoice is the wattage. Each page load now drives the client CPU and GPU as if it were running a workload, not displaying text. Frameworks parse, hydrate, diff, reconcile, and re-render; the silicon spins; the fan, in a quiet office, becomes audible. Battery sensors register the drain in measurable single-digit percentages per twenty minutes of casual reading. The user is, in literal terms, paying for the rendering with their own battery and their own data plan, on hardware they bought, for content the server already had. There is no return on the spending, because the page was a printed leaflet at heart.
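The numbers above are reproducible on the back of an envelope. A sketch of the arithmetic, in which the throughput and tariff are illustrative assumptions rather than measurements; only the 697 KB figure comes from the Almanac:

```javascript
// Back-of-envelope cost of the median bundle. The bandwidth and
// price-per-MB below are illustrative assumptions, not measurements.
const bundleKB = 697;          // median JS per home page, HTTP Archive 2025
const effective4GKbps = 1500;  // assumed effective 4G throughput, ~1.5 Mbit/s
const pricePerMB = 0.05;       // assumed metered-data tariff, EUR per MB

// Time just to move the JavaScript over the wire, in seconds.
const transferSeconds = (bundleKB * 8) / effective4GKbps;

// Data cost of the bundle per page load, in EUR.
const costPerLoad = (bundleKB / 1024) * pricePerMB;

console.log(transferSeconds.toFixed(1));
console.log(costPerLoad.toFixed(3));
```

Transfer alone accounts for only a few of the eight seconds; parsing, hydrating, and re-rendering spend the rest, and the battery pays for both halves.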

Multiplied across a billion daily web sessions, the unnecessary draw climbs into terawatt-hours per year, a figure documented in the W3C Web Sustainability Guidelines and the Sustainable Web Manifesto reference dataset. The electricity is only the easiest column to count. Beneath it sits the hardware that consumed it: the lithium for the laptop battery that drains faster, the rare earths in the radio that keeps the data flowing, the cobalt in the datacentre's storage, the copper in the racks, the cooling water that the racks then warm. Megatons of natural resources, quietly enlisted on the side of a marketing page that did not need any of them. There is, here, an inconvenient observation. The same conferences that run sustainability tracks ship frameworks of half a megabyte per visit; the same engineer who publishes a sustainability-conscious LinkedIn post will, on Monday morning, push a pull request that adds 200 KB of JavaScript to a marketing page. The contradiction holds itself politely apart, and is, on close inspection, untenable. One cannot ship the bundle and lecture on pollution in the same conversation.

Accessibility collapses when the page depends on JavaScript to exist. Screen readers must wait for the framework to construct the DOM before they can announce anything; if the JavaScript fails to load, the page is empty. Crawlers must execute the JavaScript to index the content, which means search-engine indexing has become a secondary industry of headless browsers, and the cost of that industry is borne by datacentres no one sees on the invoice. The browser tab is now the heaviest process on the average laptop. Most of what it computes is layout for content that the server already had, in a format the server could already serve.

Then, having pacified the client, the same language was asked to manage the server. Node.js, originally an event-driven runtime for I/O-bound real-time workloads (which it does competently), was deployed in whole cohorts of new projects to do work that PHP, Ruby, Python and Go had been doing efficiently for decades. The reason was not technical: Node.js is not faster than Go, not smaller than Rust, not safer than Python. It was deployed because the team knew only the one language and could not, professionally, choose another. The build pipeline that wraps the language to compensate for its deficiencies on the client was duplicated on the server. Every architectural decision compounds. The bill is paid twice: once at the client, once at the rack.

Around all of it, the dependency cloud, and a quietly inconvenient truth at the bottom of it. JavaScript was shipped, in 1995, without a standard library worthy of the name; npm appeared in 2010 to fill the vacuum, and the vacuum filled, in turn, with hundreds of transitive packages from strangers, then thousands. It is, by some distance, the public software repository most reliably compromised in routine use. The 2024 Sonatype State of the Software Supply Chain report counted more than half a million malicious packages across public registries; npm carried the bulk of them. left-pad in 2016, event-stream in 2018, ua-parser-js in 2021, and the September 2025 incident in which eighteen packages with 2.6 billion weekly downloads were compromised in a single coordinated attack, are the headlines a senior engineer can name. The thousand smaller incidents are the ones nobody tracks.

The Bill, in Three Layers

Downstream, the client: CPU, GPU, fan, battery, mobile data, screen reader, headless crawler; paid in watts, milliseconds, and accessibility.

Upstream, the server: Node.js displaces PHP, Ruby, Python, Go, not for speed or size; paid in CPU cycles and megawatt-hours at the rack.

Around it, the dependency cloud: npm, with roughly 500,000 malicious packages observed, and the September 2025 incident of 18 packages with 2.6 billion weekly downloads; supervised by Dependabot, Snyk, and Renovate, opening PRs for other servers; paid in pull requests, Slack triage, and another vendor.

The maintenance of the dependency cloud is then outsourced to a second cloud, the supervisory cloud: Dependabot, Snyk, Renovate, GitHub Advanced Security, the long tail of supply-chain-monitoring SaaS. Each one runs on servers, consumes electricity, opens pull requests for other servers to review, sends Slack notifications for engineers to triage, and produces dashboards for managers to look at. The supervisory cloud has, in turn, its own dependencies, its own build pipelines, its own pricing tiers. The wrappers wrap the wrappers. One does, at some point, lose count of the clouds.

The question that closes this section is the obvious one, and the one the industry has spent fifteen years pretending not to ask. If the answer to a security failure in your dependencies is to add another vendor, another runtime, another monitoring service, and another bill, you have not solved the problem. You have rented a louder version of it. None of this would exist in a language whose standard library could carry the weight of a real application. Rust, Go, Python, Ruby, even PHP all ship with first-party batteries that npm has spent a decade and a half badly approximating, with the supply chain to match. The vacuum was self-inflicted, in the choice of the language at the bottom of the stack.

The Question

The proofs are not theoretical. They are in production, and they have been for years.

Wikipedia: roughly 60 million articles, MediaWiki on PHP 8.3 with MySQL/MariaDB and an aggressive caching layer, server-rendered, faster on a budget Android over a coffee-shop wireless than most marketing single-page applications. The Wikimedia infrastructure handles hundreds of thousands of views per second, and it does so by caching server-rendered HTML, not by shipping a runtime to the client.

GOV.UK: 67 million citizens, designed by the Government Digital Service from 2012 onward, with progressive enhancement mandated by the GOV.UK service manual for every government service. HTML first, CSS second, JavaScript only as enhancement. The principle is documented, the policy is enforced, the result is a public-sector portal that opens on a Nokia from 2007 and an iPhone from 2024 alike. It is not glamorous. It is, demonstrably, robust.

Hacker News: server-rendered, written in Arc (a Lisp dialect) by Paul Graham and Robert Morris from 2007, opens before the loading spinner of any modern dashboard. The site is famously austere, deliberately so. It is also, after eighteen years, still online and still fast.

HEY and Basecamp, the products of 37signals: server-rendered with Hotwire (the HTML-over-the-wire toolkit that DHH and his team published in 2020, built around Ruby on Rails), fully interactive, no client-side framework. The interactive elements arrive over the wire as HTML fragments, swapped into place by a tiny client-side library. The model is older than the SPA, applied to a modern email client, and it works.
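The model is small enough to sketch in full. A miniature of the fragments-over-the-wire pattern, with every name and piece of markup invented for illustration: the server renders the fragment; the client's only job is the swap.

```javascript
// The HTML-over-the-wire model in miniature: the server owns the data
// and the rendering; the client only swaps a returned fragment into place.
// All names and markup here are invented for illustration.

// Server side: render an unread-count badge as an HTML fragment.
function renderUnreadBadge(unread) {
  return `<span id="unread-badge" class="badge">${unread}</span>`;
}

// Client side, simulated as a plain object standing in for the DOM.
// In a browser this would be one line:
//   document.querySelector("#unread-badge").outerHTML = fragment;
function swap(dom, id, fragment) {
  return { ...dom, [id]: fragment };
}

const page = { "unread-badge": `<span id="unread-badge" class="badge">0</span>` };
const updated = swap(page, "unread-badge", renderUnreadBadge(12));
console.log(updated["unread-badge"]);
```

The swap is the entire client-side "framework" in this pattern, which is why Turbo and HTMX can stay in single-digit or low double-digit kilobytes: everything above the swap lives on the server.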

HTMX, by Carson Gross at Big Sky Software (2020): around 14 KB minified and gzipped, doing what JavaScript frameworks an order of magnitude larger claim to do. Carson Gross's argument is the simplest version of the case in this post: most of what we want a SPA for can be done with attributes on plain HTML, and the rest can wait.

Netflix's logged-out homepage, 2017: the team removed client-side React entirely from the landing page and rebuilt the language switcher in fewer than 300 lines of vanilla JavaScript. Time to Interactive improved by more than 50%, and Addy Osmani's writeup of the change remains one of the canonical case studies for the cost of unnecessary client-side rendering. Netflix did not abandon React; it kept it server-side, where the value is, and removed it from the client, where the cost is.

Cloudflare Pingora, 2022: Cloudflare published a blog post describing how their nginx-and-Lua proxy, after years of scaling, had hit the limits of its architecture, and was replaced with a Rust-based proxy of their own design. The new proxy serves over a trillion requests per day, with about 70% less CPU and 67% less memory than the old one. The lesson is not "Rust beats Lua" in any tribal sense; it is "a small, focused, efficient server kernel beats a tower of accreted layers, and the savings show up on the electricity bill the same week".

Seven Proofs in Production

Wikipedia: MediaWiki on PHP 8.3; ~60M articles, server-rendered, faster than most SPAs.
GOV.UK: 67M citizens served; progressive enhancement mandated by the service manual.
Hacker News: Arc (a Lisp dialect), 2007; opens before any dashboard's loading spinner does.
HEY / Basecamp: Hotwire, since 2020; HTML fragments over the wire, no client-side framework.
HTMX: ~14 KB minified and gzipped; attributes on plain HTML; Carson Gross, Big Sky, 2020.
Netflix (logged-out): React removed from the client; Time to Interactive improved by more than 50%; language switcher in under 300 lines of vanilla JS.
Cloudflare Pingora: Rust replaces nginx + Lua; over a trillion requests per day; ~70% less CPU, ~67% less memory.

Each shipped, each interactive. Reactivity solved in kilobytes.

A word on the standard objection, before it appears. The reflexive defence of the SPA is "but we need reactivity". Reactivity has not been an obstacle to server-rendered architectures for at least a decade. Hotwire's Turbo Streams, HTMX's hx-swap and hx-trigger attributes, Phoenix LiveView's WebSocket-driven HTML diffs, Laravel Livewire, ASP.NET Blazor Server: all of them deliver partial updates, optimistic UI, two-way binding, even real-time collaborative behaviour, by sending small HTML fragments over the wire instead of shipping a runtime to interpret JSON. The total client-side weight, in every one of these patterns, is in single-digit kilobytes. Reactivity is a solved problem. The framework industry sells it back as if it were not.

Each of these stacks chose a different server language: PHP for MediaWiki, Ruby for Rails and Hotwire, Common Lisp for Hacker News, Python for Django sites in the same family, Go for many modern static-content tools, Rust for Pingora and the modern wave. On a FreeBSD server, any of these is a single pkg install away, with one service manager and one configuration tree. The language is the smaller question; what matters is that the rendering happens on the server, where the data and the session already live, in a runtime that does not have to be re-shipped to every visitor.

Rust is, for the modern build, particularly hard to argue against. The combination of a single static binary, no garbage collector, native concurrency, and the kind of compile-time safety that removes whole classes of runtime error is, today, the easiest case to make. But the case is not "use Rust". The case is "render where the data is".
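Stripped to its kernel, "render where the data is" is one function that turns data into a finished page. A minimal sketch, with the data, route, and markup invented for illustration; the language here is incidental, which is exactly the point:

```javascript
// "Render where the data is", in one function. The server already holds
// the data; the page is a string; the client receives finished HTML and
// runs nothing. Data and markup below are invented for illustration.
const articles = [
  { title: "On Second Thought, Episode 07", slug: "episode-07" },
  { title: "On Second Thought, Episode 02", slug: "episode-02" },
];

function renderIndex(items) {
  const list = items
    .map((a) => `<li><a href="/posts/${a.slug}">${a.title}</a></li>`)
    .join("\n");
  return `<!doctype html>
<html><head><title>Archive</title></head>
<body><ul>
${list}
</ul></body></html>`;
}

const html = renderIndex(articles);
console.log(html);
```

Wire that function to any HTTP server, in this language or in Rust, Go, PHP, or Ruby, and the client-side bundle size is zero.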

The larger question is this. What if rendering happened where the data already lived, and the data did not need to be guarded from a thousand strangers we never invited into the project? Most of the cost of the modern web, paid in milliseconds, in power, in megabytes, in accessibility, in indexability, in carbon, in security incidents, in supervisory SaaS, would simply not exist. The fastest bundle is the one that is never shipped. The safest dependency is the one that was never installed.

JavaScript was written in ten days in 1995, without a standard library. AJAX in 2005 had a real argument; Gmail and Maps earned it. The brochure inherited it without re-examination. The 2025 median page ships 697 KB of JavaScript before any content arrives. The bill arrives in three layers: client power and battery, a duplicated server runtime, and a dependency cloud supervised by a supervisory cloud. Wikipedia on PHP, GOV.UK with mandated progressive enhancement, Hacker News in Arc, HEY on Hotwire, HTMX at 14 KB, Netflix removing client React for a 50% Time to Interactive improvement, and Cloudflare Pingora replacing nginx-and-Lua at a trillion requests per day with 70% less CPU prove the alternative is in production. The fastest bundle is the one that is never shipped.