Progressive Abandonment
Why We Stopped Supporting IE5 (And Why We Didn't Have To)
A philosophical exploration of web development, progressive enhancement, and the cost of "progress"

"You Call That Classic Web Dev?"

After spending an hour debugging why wide ANSI terminal art was breaking our page layout on a retro browser interface, I called it "classic web dev."

The response: "This is literally why I'm a systems architect and not a frontend designer, LOL"

And honestly? Fair. We had just debugged "make wide content stay inside a box" - supposedly the most basic thing in web development. But the journey involved:

  1. "Elements are disappearing!" (they weren't, they were scrolled off-screen)
  2. "Is it CSS corruption?" (no, it was horizontal layout explosion)
  3. "Try overflow-x!" (CSS3, won't work on IE5)
  4. "Try max-width!" (breaks padding calculations due to box model)
  5. "Try table-layout: fixed!" (not sufficient alone)
  6. "Try max-width: 0 on cells!" (breaks the entire header)
  7. "Add a specific class!" (finally works)
  8. "Do we even need overflow-x?" (nope, simpler without it)

All of this just to make a rectangle stay inside another rectangle.

Meanwhile, in systems architecture: "Distributed consensus across 10,000 nodes? Yeah, I can reason about that." But CSS? "Why won't this box behave??"

Frontend is chaos held together by duct tape and the tears of developers who just wanted things to line up.
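For the record, the shape that finally behaved is simple once you see it: a fixed-layout table with an explicit class on the cell holding the wide content, and no overflow-x anywhere. Below is a minimal sketch in Go (the project's language), serving that markup from a standard-library handler - the class names, widths, and handler are invented for illustration, not lifted from the project's actual code.

  package main

  import (
      "html/template"
      "net/http"
  )

  // Hypothetical sketch of the pattern described above: table-layout:fixed keeps
  // the table from expanding to fit its content, and the (invented) "art" class
  // gives the cell an explicit width so the fixed layout has something to honor.
  var page = template.Must(template.New("page").Parse(`
  <html>
  <head>
  <style type="text/css">
    table.frame { table-layout: fixed; width: 100%; }
    td.art      { width: 100%; }
    pre         { margin: 0; }
  </style>
  </head>
  <body>
  <table class="frame" border="1" cellpadding="4" cellspacing="0">
    <tr><td class="art"><pre>{{.}}</pre></td></tr>
  </table>
  </body>
  </html>`))

  func main() {
      http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
          // Stand-in for the wide ANSI art that blew up the layout.
          page.Execute(w, "imagine 300 columns of box-drawing characters here")
      })
      http.ListenAndServe(":8080", nil)
  }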


The Disgruntled Truth About Node.js and Progressive Enhancement

Then came the kicker: "Insert disgruntled comment here about nodejs bleeding frontend into backend and progressive enhancement kicking all y'all's ass"

The irony is rich. The web development community spent years touting progressive enhancement as a best practice:

The Promise: "Build for the baseline, enhance for modern browsers!"

The Reality: The entire app gets built in React with 47 npm packages, requires ES2020 and WebAssembly, and breaks on anything older than Chrome 90

We're not doing progressive enhancement. We're doing progressive hope - starting with modern browsers and praying it degrades gracefully to... well, not IE5, that's for sure.

And Node.js bleeding frontend chaos into the backend? It handed backend developers a whole new set of problems that used to stay on the frontend: npm dependency trees, build pipelines, and version churn.

Meanwhile, this project runs a single Go binary that serves plain, server-rendered HTML - nothing else in the stack.

Frontend complexity is kicking everyone's ass, even when we actively try to avoid it.


The Three Phases of Web Development

The real story isn't just about technical choices. It's about a philosophical shift that happened in three distinct phases:

Phase 1: Progressive Enhancement (1990s-2000s)
  • Server-side rendering was the default
  • JavaScript enhanced functionality; the site didn't require it
  • Sites worked on everything and degraded gracefully
  • Philosophy: "Build for baseline, enhance up"
  • Accessibility was a natural outcome, not a checkbox

Phase 2: The Great Abandonment (2010s - The SPA Era)
  • "JavaScript is the platform now"
  • Client-side rendering for everything
  • Single Page Applications became the default for EVERYTHING
  • "Progressive enhancement is outdated"
  • SEO died, accessibility died, performance tanked
  • "Just require modern browsers lol"
  • npm packages for the simplest operations
  • Build pipelines that take longer than the code itself

Phase 3: The Rediscovery (Late 2010s-Present)
  • "Wait, SPAs are slow and break everything"
  • Next.js/Remix/Astro bring back server-side rendering (and call it "innovation")
  • htmx/Alpine/"HTML-first" frameworks emerge
  • "Islands architecture" is invented (it's just progressive enhancement with a new name)
  • Performance budgets force everyone to rethink their approach
  • Core Web Vitals punish heavy JavaScript

The Punchline:

We spent a decade:

  1. Abandoning progressive enhancement
  2. Building SPAs that broke accessibility, SEO, and performance
  3. Creating frameworks to "solve" problems we ourselves created
  4. Rebranding progressive enhancement as "modern best practices"
  5. Acting like we invented something new

What This Project Proves

Meanwhile, this 1999-browser-compatible approach gets accessibility, SEO, and performance right by default.

The architecture proves something important: Nothing about web technology required breaking backwards compatibility. We just stopped caring.

The Proof Points

The tech stack from 25 years ago is completely capable: this project runs a 2025 AI API through a 1999 browser. The limiting factor is CSS box model quirks, not fundamental impossibility.

What changed wasn't capability; it was willingness. None of the modern machinery was necessary - it was a choice. We traded universal compatibility for developer convenience.


The Deeper Realization: We Could Still Support IE5

Here's where it gets interesting. Not only could we have maintained compatibility, but had we stuck to progressive enhancement principles, we could theoretically still support browsers as old as IE5 today.

Imagine a modern framework written with this mindset:

Baseline: HTML 4.01, CSS 2.1, ES3 (works on IE5)
  • Server-side rendering
  • Semantic HTML
  • Form-based interactions
  • Progressive enhancement via feature detection
Layer 1: ES5 features (IE9+)
  • JSON parsing
  • Array methods
  • Better DOM APIs
Layer 2: Modern CSS (2010s+)
  • Flexbox and Grid for modern browsers
  • CSS variables where available
  • Fallbacks to tables/floats for old browsers
Layer 3: ES6+ (2015+)
  • Arrow functions, modules
  • Async/await
  • Modern JavaScript conveniences
Layer 4: Cutting Edge (present)
  • WebAssembly
  • Web Components
  • Modern APIs

At every layer, it degrades gracefully. This project proves it's possible because it does exactly this, starting from the form-based baseline sketched below.
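To make the baseline concrete, here is a minimal sketch of that bottom layer in Go (the stdlib-only setup described later in this piece): a plain HTML form that round-trips through the server, fully functional with zero JavaScript, so every later layer is purely additive. The route, field names, and placeholder logic are invented for illustration.

  package main

  import (
      "html/template"
      "net/http"
  )

  // Baseline layer: a form POST and a server-rendered page. A later layer
  // could enhance this same form with script where the browser supports it;
  // nothing below depends on that happening.
  var page = template.Must(template.New("ask").Parse(`
  <html><body>
    <form action="/ask" method="post">
      <input type="text" name="q" size="40">
      <input type="submit" value="Ask">
    </form>
    {{if .}}<p>{{.}}</p>{{end}}
  </body></html>`))

  func ask(w http.ResponseWriter, r *http.Request) {
      var answer string
      if r.Method == http.MethodPost {
          // Placeholder for the real work (e.g. calling the upstream API).
          answer = "You asked: " + r.FormValue("q")
      }
      page.Execute(w, answer)
  }

  func main() {
      http.HandleFunc("/ask", ask)
      http.ListenAndServe(":8080", nil)
  }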

The counterarguments fall apart under scrutiny.


The Thought Experiment: Adaptive Server-Side Rendering

Taking it even further: with a little inventiveness, you could have HTML 4.01 upgrade to newer HTML versions purely server-side. This goes beyond normal progressive enhancement into something more interesting.

Traditional Progressive Enhancement:
  • Server sends the same HTML to everyone
  • Client detects features, enhances with JavaScript
  • Degradation happens client-side

Adaptive Server-Side Rendering:
  • Server detects browser capabilities
  • IE5 gets: <table>, <div>, inline styles
  • Modern browsers get: <article>, <section>, CSS Grid
  • Same logical content, optimal markup for each client
  • No polyfills, no feature-detection JavaScript, no wasted bytes

We used to do this! In the early 2000s, sites detected WAP vs. desktop clients, mobile sites served different HTML, and server-side browser sniffing was standard practice.

This goes beyond progressive enhancement - it's capability-aware rendering. Same backend logic, adaptive presentation layer. And CGI scripts could have done this in 1995. We just chose not to keep doing it.
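A sketch of what that could look like, again in Go, with a deliberately crude User-Agent check standing in for real capability detection - the heuristic, template names, and content below are illustrative only:

  package main

  import (
      "html/template"
      "net/http"
      "strings"
  )

  // Two renderings of the same logical content: legacy markup for ancient
  // engines, semantic HTML for everything else. One handler, one data model.
  var (
      legacy = template.Must(template.New("legacy").Parse(
          `<table width="100%"><tr><td><font size="4">{{.Title}}</font><br>{{.Body}}</td></tr></table>`))
      modern = template.Must(template.New("modern").Parse(
          `<article><h1>{{.Title}}</h1><section>{{.Body}}</section></article>`))
  )

  type pageData struct{ Title, Body string }

  // isLegacy is a stand-in for real capability detection; matching "MSIE 5"
  // in the User-Agent is only meant to illustrate the idea.
  func isLegacy(r *http.Request) bool {
      return strings.Contains(r.UserAgent(), "MSIE 5")
  }

  func handler(w http.ResponseWriter, r *http.Request) {
      data := pageData{Title: "Auctions", Body: "Same content, different markup."}
      if isLegacy(r) {
          legacy.Execute(w, data)
          return
      }
      modern.Execute(w, data)
  }

  func main() {
      http.HandleFunc("/", handler)
      http.ListenAndServe(":8080", nil)
  }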


Historical Precedent: eBay's CGI Architecture

This isn't just theory. Real companies scaled to massive success with this exact philosophy.

Early eBay (AuctionWeb, 1995-1997) ran on CGI scripts - Perl at first, with the hot paths later rewritten in C++ - served from a single box in Pierre Omidyar's house:

The Stack:
  • CGI scripts (Perl at first, then C++)
  • Apache
  • Flat files, then eventually a database
  • That's it.

What It Handled:
  • Millions of auctions
  • Thousands of concurrent users
  • Payment processing
  • User authentication
  • Search functionality
  • On ONE server for quite a while

Why It Worked:
  • Server-side rendering = no client complexity
  • Stateless CGI = naturally scalable (just add more processes)
  • Simple HTML forms = worked on ANY browser
  • Direct and to the point, no abstraction layers
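The request model is easy to picture in code. Go's standard library still ships a CGI adapter, so a sketch of that 1995-style cycle - one process per request: read the form, print HTML, exit - looks like the following. The form fields and output are invented, and a real deployment would sit behind Apache or any other CGI-speaking server.

  package main

  import (
      "fmt"
      "html"
      "log"
      "net/http"
      "net/http/cgi"
  )

  // One request, one process: the web server execs this binary, the binary
  // reads the form, writes an HTML page, and exits. State lives elsewhere
  // (a flat file or a database), which is why the model scales by simply
  // running more processes.
  func main() {
      err := cgi.Serve(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
          item := html.EscapeString(r.FormValue("item"))
          bid := html.EscapeString(r.FormValue("bid"))
          w.Header().Set("Content-Type", "text/html")
          fmt.Fprintf(w, "<html><body><p>Bid of %s recorded for %s.</p></body></html>", bid, item)
      }))
      if err != nil {
          log.Fatal(err)
      }
  }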

They scaled this to millions of dollars in transactions before having to rewrite. When they did rewrite (to Java/J2EE in the 2000s), it was arguably more complex and harder to maintain.

Compare that to what it takes to stand up the equivalent today.


Why Go is the Spiritual Successor

This is why Go feels right for this project. It's the closest spiritual successor to the "compile a binary and just run it" model of building a web app - short of actually doing it in C.

Go's Philosophy:
  • ./proxy ← that's it. No runtime, no interpreter, no node_modules
  • Compiles to a single static binary
  • Cross-compile for any platform
  • No dependency hell
  • Standard library has HTTP server, templating, everything you need
  • Fast
  • Just works

The Connection to C CGI:
  • C CGI (1995): Compile → Binary → Run → Serve HTML
  • Go (2025): Compile → Binary → Run → Serve HTML
  • 30 years apart, same mental model
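Concretely, that mental model fits on one page. Here is a stripped-down sketch of what a binary like this project's proxy looks like - not its actual source, just the standard-library pieces the list above refers to. Running go build -o proxy produces the single file you copy to a server; setting GOOS and GOARCH cross-compiles it for anything else.

  package main

  import (
      "html/template"
      "log"
      "net/http"
  )

  // Everything here is standard library: the HTTP server, routing, templating.
  // Build:         go build -o proxy
  // Cross-compile: GOOS=linux GOARCH=arm64 go build -o proxy
  var index = template.Must(template.New("index").Parse(
      `<html><body><h1>{{.Title}}</h1><p>{{.Message}}</p></body></html>`))

  func main() {
      http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
          index.Execute(w, struct{ Title, Message string }{
              Title:   "It just works",
              Message: "Server-rendered HTML, 1995 style, from a 2025 binary.",
          })
      })
      log.Fatal(http.ListenAndServe(":8080", nil))
  }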

The eBay engineers would look at this Go code and say, "Yeah, that's basically what we were doing, but with fewer segfaults."

Compared to Node.js:

"First install node, then npm install, oh wait wrong node version, install nvm, now install dependencies, oh this package needs python2.7 and node-gyp, oh this package is deprecated, oh security vulnerabilities..."

Compared to Go:

go build → done.

Go gives you C's simplicity and speed, but with memory safety, garbage collection, and a standard library that already covers HTTP and templating.

Go is "worse is better" incarnate. It deliberately limits features to maintain simplicity. It held off on generics for years because they complicated the language. It has a GC because memory safety matters more than the last 5% of performance for most applications.

For web services, CLIs, and network tools? Go is perfect. It's C for people who want to ship code instead of debug pointer arithmetic.


The Uncomfortable Truth

This project isn't just a fun retro experiment or an exercise in nostalgia. It's a proof of concept that the entire web industry chose wrong.

We didn't have to break backwards compatibility. The web didn't have to leave anyone behind. We could have kept building on a stable baseline and enhancing upward for more capable browsers.

Instead, we chose to require the newest browsers, rebuild the toolchain every few years, and call the churn progress.

And the industry is slowly admitting this (htmx, Astro, Remix, server components), but still won't go all the way back to true progressive enhancement because that would mean admitting we wasted 15 years.


What We Lost

When we abandoned progressive enhancement and backwards compatibility, we lost more than just support for old browsers:

  1. Simplicity - A single HTML file could be a complete, functional application
  2. Resilience - Sites worked even when JavaScript failed to load
  3. Accessibility - Semantic HTML made screen readers work by default
  4. Performance - Server-rendered pages loaded instantly
  5. Longevity - Old sites still work decades later (if they followed standards)
  6. Universal Access - The web was truly for everyone, not just the latest hardware

What did we gain? Developer convenience, mostly. Was it worth the trade? This project suggests: probably not.


The Path Forward

The uncomfortable realization is that we can't put the genie back in the bottle. The web has moved on. But we can learn from this:

  1. Progressive enhancement still matters - Build for resilience, not just the happy path
  2. Simplicity is a feature - Fewer dependencies = fewer problems
  3. Server-side rendering works - It always has, we just forgot
  4. Backwards compatibility is possible - We chose to abandon it, not forced to
  5. "Worse is better" isn't just a slogan - It's proof that working solutions beat complex ones

Projects like this serve as reminders. The old way wasn't primitive - it was correct. We abandoned working principles for hype, then had to painfully rediscover them.

The modern web is finally circling back to admitting the old way was right, but with 10x the complexity and 0x the backwards compatibility.


Conclusion

This project runs a 2025 AI API on a 1999 browser. It shouldn't be remarkable, but it is - because we've convinced ourselves it's impossible.

It's not impossible. It never was.

We just stopped trying.

Written after debugging CSS box model issues for an hour and having an existential crisis about the state of web development. The boxes finally stayed inside other boxes. Frontend is still chaos. We still chose wrong. Go is still great. "Worse is better" is still vindicated.


Generated as a demonstration of the principles discussed within.
This HTML was written in the style of 1995 eBay - pure, simple, and capable of handling millions.
No CSS files. No JavaScript frameworks. No build pipeline. Just HTML.
It just works.