Progressive Abandonment: Why We Stopped Supporting IE5 (And Why We Didn't Have To)

A philosophical exploration of web development, progressive enhancement, and the cost of "progress"
"You Call That Classic Web Dev?" |
After spending an hour debugging why wide ANSI terminal art was breaking our page layout on a retro browser interface, I called it "classic web dev."
The response: "This is literally why I'm a systems architect and not a frontend designer, LOL"
And honestly? Fair. We had just debugged "make wide content stay inside a box" - supposedly the most basic thing in web development. But the journey wound through layer after layer of CSS box model quirks, all of it just to make a rectangle stay inside another rectangle.
Meanwhile, in systems architecture: "Distributed consensus across 10,000 nodes? Yeah, I can reason about that." But CSS? "Why won't this box behave??"
Frontend is chaos held together by duct tape and the tears of developers who just wanted things to line up.
The Disgruntled Truth About Node.js and Progressive Enhancement
Then came the kicker: "Insert disgruntled comment here about nodejs bleeding frontend into backend and progressive enhancement kicking all ya'lls ass"
The irony is rich. The web development community spent years touting progressive enhancement as a best practice:
The Promise: "Build for the baseline, enhance for modern browsers!"
The Reality: Build the entire app in React with 47 npm packages, require ES2020 and WebAssembly, break on anything older than Chrome 90.
We're not doing progressive enhancement. We're doing progressive hope - starting with modern browsers and praying it degrades gracefully to... well, not IE5, that's for sure.
And Node.js bleeding frontend chaos into the backend? It handed backend developers a whole new set of problems - the same dependency churn and tooling sprawl the frontend was already drowning in. Meanwhile, this project runs a single Go binary with no build step and no dependency tree. Frontend complexity is kicking everyone's ass, even when we actively try to avoid it.
The Three Phases of Web Development
The real story isn't just about technical choices. It's about a philosophical shift that happened in three distinct phases:
- Phase 1: Progressive Enhancement (1990s-2000s). Build for a universal baseline, then layer on enhancements for more capable browsers.
- Phase 2: The Great Abandonment (2010s, the SPA era). Assume modern JavaScript everywhere, build framework-first, and leave older browsers behind.
- Phase 3: The Rediscovery (late 2010s-present). Server-side rendering, htmx, Astro, Remix, and server components quietly reinvent the principles of Phase 1.
The Punchline: we spent a decade abandoning progressive enhancement, then reinventing it under new names at ten times the complexity.
What This Project Proves
Meanwhile, this 1999-browser-compatible approach does all of this correctly by default: server-rendered HTML that works everywhere, no JavaScript frameworks, no build pipeline.
The architecture proves something important: Nothing about web technology required breaking backwards compatibility. We just stopped caring.
The tech stack from 25 years ago is completely capable. This project runs a 2025 AI API through a 1999 browser; the limiting factor is CSS box model quirks, not fundamental impossibility.
What changed wasn't capability, it was willingness. None of the breakage was necessary. It was a choice: we traded universal compatibility for developer convenience.
The Deeper Realization: We Could Still Support IE5
Here's where it gets interesting. Not only could we have maintained compatibility, but had we stuck to progressive enhancement principles, we could theoretically still support browsers as old as IE5 today.
Imagine a modern framework written with this mindset:
- Baseline: HTML 4.01, CSS 2.1, ES3 (works on IE5)
- Layer 1: ES5 features (IE9+)
- Layer 2: Modern CSS (2010s+)
- Layer 3: ES6+ (2015+)
- Layer 4: Cutting edge (present)
At every layer, it degrades gracefully. This project proves it's possible because it does exactly this: the baseline document works everywhere, and each newer browser simply gets more.
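Here's a minimal sketch of how a single response can carry every layer at once. The gating happens in the browser, not the server: old engines drop CSS at-rules they don't recognize and skip scripts with unknown types, so an IE5-era browser only ever sees the baseline markup. The page contents and the enhance.mjs file name are illustrative assumptions, not part of this project.

```go
// One document, every layer: old browsers render the baseline and
// silently ignore the parts they don't understand.
package main

import (
	"fmt"
	"log"
	"net/http"
)

const page = `<html>
<head>
<title>Layered page</title>
<style>
/* Layer 2: modern CSS. Engines that predate @supports drop the block. */
@supports (display: grid) { body { display: grid; } }
</style>
</head>
<body>
<h1>Baseline: plain HTML that anything since the 1990s can render</h1>
<!-- Layers 3-4: type="module" scripts are skipped by pre-ES6 browsers. -->
<script type="module" src="/enhance.mjs"></script>
<noscript>Still fully usable without JavaScript.</noscript>
</body>
</html>`

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, page)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The design choice here is that the server stays dumb: every client gets the same bytes, and the browser's own error handling does the layering.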
The counterarguments fall apart under scrutiny.
The Thought Experiment: Adaptive Server-Side Rendering
Taking it even further: with a little inventiveness, you could have HTML 4.01 upgrade to newer HTML versions purely server-side. This goes beyond normal progressive enhancement into something more interesting.
Traditional Progressive Enhancement | Adaptive Server-Side Rendering
---|---
Ship one baseline document; the browser upgrades itself with CSS and JS | Detect the client first; the server ships markup matched to what the browser can handle
Enhancement logic lives on the client | Enhancement logic lives on the server, before the response is sent
We used to do this! The early 2000s had WAP-versus-desktop detection, mobile sites served different HTML, and server-side browser sniffing was standard practice.
This goes beyond progressive enhancement - it's capability-aware rendering. Same backend logic, adaptive presentation layer. And CGI scripts could have done this in 1995. We just chose not to keep doing it.
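As a rough illustration, here's what capability-aware rendering could look like in a Go handler. The tier names and User-Agent checks are assumptions made for the sketch - real detection would need a much more careful matcher - but the shape is the point: one backend, several presentation layers.

```go
// A sketch of capability-aware rendering: the server picks the markup
// tier before a single byte of presentation ships.
package main

import (
	"fmt"
	"log"
	"net/http"
	"strings"
)

// tier guesses a rendering tier from the User-Agent header.
// These checks are illustrative, not a production sniffer.
func tier(ua string) string {
	switch {
	case strings.Contains(ua, "MSIE 5"):
		return "html4" // strict HTML 4.01, no CSS or JS assumed
	case strings.Contains(ua, "MSIE"):
		return "html4+css" // older IE: HTML 4.01 plus conservative CSS
	default:
		return "html5" // modern browsers get the full document
	}
}

func handler(w http.ResponseWriter, r *http.Request) {
	// Same backend logic for everyone; only the presentation adapts.
	switch tier(r.Header.Get("User-Agent")) {
	case "html4":
		fmt.Fprint(w, `<HTML><BODY><H1>Hello</H1><P>Baseline markup.</P></BODY></HTML>`)
	case "html4+css":
		fmt.Fprint(w, `<html><head><style>p{margin:1em}</style></head><body><h1>Hello</h1><p>A little enhanced.</p></body></html>`)
	default:
		fmt.Fprint(w, `<!DOCTYPE html><html><body><h1>Hello</h1><p>Full modern document.</p></body></html>`)
	}
}

func main() {
	http.HandleFunc("/", handler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Swap the switch for whatever detection you trust; what matters is that the adaptation happens server-side, exactly as a 1995 CGI script could have done it.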
Historical Precedent: eBay's C CGI Architecture
This isn't just theory. Real companies scaled to massive success with this exact philosophy.
Early eBay (AuctionWeb, 1995-1997) ran on C/C++ CGI scripts on a single Linux box in Pierre Omidyar's house:
The Stack | What It Handled | Why It Worked
---|---|---
C/C++ CGI scripts on one box | Millions of dollars in auction transactions | Compiled code, no runtime dependencies, nothing between the request and the response
They scaled this to millions of dollars in transactions before having to rewrite. When they did rewrite (to Java/J2EE in the 2000s), it was arguably more complex and harder to maintain.
Compare that to today, where a comparable service starts with a framework, a package manager, and a dependency tree before it serves its first request.
Why Go is the Spiritual Successor
This is why Go feels right for this project. It's the closest spiritual successor to "compile a binary that you just run" as a way of shipping a webapp - short of actually doing it in C.
Go's philosophy maps almost one-to-one onto the C CGI model: a single static binary, no runtime to install, request in, response out.
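To make that concrete, here's a minimal sketch of the whole model. The handler body and port are illustrative, but the workflow is real: `go build`, run the binary, and you have a webapp.

```go
// The entire webapp: one file, one static binary, nothing to install.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func handler(w http.ResponseWriter, r *http.Request) {
	// Request in, response out - the same shape as a 1995 CGI script.
	fmt.Fprint(w, "<html><body><h1>Hello from one static binary</h1></body></html>")
}

func main() {
	http.HandleFunc("/", handler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The standard library even ships net/http/cgi, so the same handler could run as a literal CGI program if you wanted to go full 1995.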
The eBay engineers would look at this Go code and say "Yeah, that's basically what we were doing, but with less segfaults."
Compared to Node.js:
"First install node, then npm install, oh wait wrong node version, install nvm, now install dependencies, oh this package needs python2.7 and node-gyp, oh this package is deprecated, oh security vulnerabilities..."
Compared to Go:
go build → done.
Go gives you C's simplicity and speed, but with memory safety and a garbage collector built in.
Go is "worse is better" incarnate. It deliberately limits features to maintain simplicity. It refuses to add generics for years because they complicate the language. It has a GC because memory safety matters more than the last 5% of performance for most applications.
For web services, CLIs, and network tools? Go is perfect. It's C for people who want to ship code instead of debug pointer arithmetic.
The Uncomfortable Truth
This project isn't just a fun retro experiment or an exercise in nostalgia. It's a proof of concept that the entire web industry chose wrong.
We didn't have to break backwards compatibility. The web didn't have to leave anyone behind. We could have kept building from a universal baseline and layering enhancements on top. Instead we chose framework-first development that assumes the newest browsers and treats everything older as someone else's problem.
And the industry is slowly admitting this (htmx, Astro, Remix, server components), but still won't go all the way back to true progressive enhancement because that would mean admitting we wasted 15 years.
What We Lost
When we abandoned progressive enhancement and backwards compatibility, we lost more than just support for old browsers: we gave up the universal reach that came free with a baseline-first web. What did we gain? Developer convenience, and a pile of complexity to go with it. Was it worth the trade? This project suggests: probably not.
The Path Forward
The uncomfortable realization is that we can't put the genie back in the bottle. The web has moved on. But we can learn from this.
Projects like this serve as reminders. The old way wasn't primitive - it was correct. We abandoned working principles for hype, then had to painfully rediscover them.
The modern web is finally circling back to admitting the old way was right, but with 10x the complexity and 0x the backwards compatibility.
Conclusion
This project runs a 2025 AI API on a 1999 browser. It shouldn't be remarkable, but it is - because we've convinced ourselves it's impossible.
It's not impossible. It never was. We just stopped trying.
Written after debugging CSS box model issues for an hour and having an existential crisis about the state of web development. The boxes finally stayed inside other boxes. Frontend is still chaos. We still chose wrong. Go is still great. "Worse is better" is still vindicated.
Generated as a demonstration of the principles discussed within. This HTML was written in the style of 1995 eBay - pure, simple, and capable of handling millions. No CSS files. No JavaScript frameworks. No build pipeline. Just HTML. It just works.