Performance Means Progressive Enhancement
Let's assume JavaScript execution is instant. For a single-page application, here's what has to happen before it can do anything useful (sketched in code after this list):
- Fetch the bootstrapping document
- Fetch the JS renderer
- Fetch the content
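To make that waterfall concrete, here's a minimal sketch of a client-rendered boot sequence. The endpoint, the element ID, and the `render` function are all made up for illustration; real frameworks dress this up, but the shape is the same.

```js
// A stand-in for whatever templating the framework would really do.
const render = (content) => `<h1>${content.title}</h1><p>${content.body}</p>`;

// Round trip #1 already happened: the browser fetched an index.html that is
// little more than <div id="app"></div> plus a <script> tag for this bundle.

// Round trip #2: this bundle had to download, parse, and execute before the
// app could even ask for its data.

// Round trip #3: only now does the actual content get requested.
fetch('/api/page-content') // hypothetical endpoint
  .then((response) => response.json())
  .then((content) => {
    document.getElementById('app').innerHTML = render(content);
  });
```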
You can mitigate this in various ways, but the best-case scenario is still at least two round trips, paying the latency cost twice. That doesn't sound bad until you look at the latencies real users put up with: the bottom 10% can only be described as pitiful. And file size, well...
Now let's stop assuming JS execution is instant. Malte Ubl says phones take 1ms to parse 1kB of decompressed JavaScript. DNS can take 100–800ms before the browser even finds the right server. TCP requires another back-and-forth before downloading starts. HTTPS? Have some more round trips. Using a CDN for the framework? That's another DNS query and another HTTPS handshake. Because of all that, one request over 3G can stall anywhere from 200 milliseconds to 3.5 seconds before the file itself even starts loading.
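As a rough back-of-the-envelope, here's how those delays stack up before the app's own code runs a single line. Every number below is an assumption picked from the ranges above, not a measurement:

```js
// Pessimistic-but-plausible figures for one request on 3G (all assumptions).
const dnsLookupMs    = 500; // somewhere in the 100–800ms range
const tcpHandshakeMs = 300; // one extra round trip at 3G latency
const tlsHandshakeMs = 600; // a couple more round trips for HTTPS
const parseMsPerKB   = 1;   // Malte Ubl's 1ms per kB of decompressed JS
const bundleKB       = 250; // a middling framework bundle, decompressed

const stallMs = dnsLookupMs + tcpHandshakeMs + tlsHandshakeMs;
const parseMs = bundleKB * parseMsPerKB;

console.log(`~${stallMs}ms before the download starts, then ~${parseMs}ms just to parse`);
// ≈ 1400ms of stalling plus 250ms of parsing, and nothing has rendered yet.
```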
Two HTTP requests isn't fast enough, if Critical CSS is any indication. (Paying the latency cost once may not be fast enough either, as AMP-HTML emphasizes with its prerendering.)
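(For the unfamiliar: Critical CSS means inlining the styles needed for the first paint directly into the HTML, then fetching the full stylesheet without blocking render. A hedged sketch of that second half, with a made-up file path:)

```js
// After first paint, load the rest of the styles; only the inlined critical
// CSS ever blocks rendering. The path is hypothetical.
window.addEventListener('load', () => {
  const link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/styles/full.css';
  document.head.appendChild(link);
});
```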
Frameworks know this. They all have "isomorphic rendering" efforts, where the same code renders on both the server and the client. But it won't be popular until frameworks require it. And the current implementations run into the "Interface Uncanny Valley" problem.
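Hand-rolled, the idea looks something like the sketch below: one template function that the server calls to send real HTML, and that the browser can call again later for updates. Everything here (names, route, content) is invented for illustration, not any particular framework's API.

```js
// A shared template: plain data in, an HTML string out, so it runs anywhere.
const articleHTML = (article) =>
  `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;

// On the server (Node), the first render goes out as real HTML:
import { createServer } from 'node:http';

createServer((req, res) => {
  const article = { title: 'Hello', body: 'Rendered on the server first.' };
  res.setHeader('Content-Type', 'text/html');
  res.end(`<!doctype html><main id="app">${articleHTML(article)}</main>`);
}).listen(8080);

// In the browser, the very same function can re-render later updates:
//   document.getElementById('app').innerHTML = articleHTML(nextArticle);
```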
This is why I prefer Progressive Rendering + Bootstrapping. I’d love to see more frameworks support this approach:
Users need real HTML. "Placeholder HTML" that only looks functional until the JS is ready isn't good enough, because users do things like click links and submit searches before the scripts finish executing, or after something goes wrong with the JavaScript.
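For example, a search form can ship as real, working HTML first, with JavaScript as an optional upgrade rather than a requirement. The IDs, endpoint, and markup below are assumptions for the sketch:

```js
// The markup ships working on its own, something like:
//   <form id="search" action="/search" method="get">
//     <input type="search" name="q"> <button>Search</button>
//   </form>
//   <div id="results"></div>
// If this script never arrives or never runs, the form still submits normally.

const form = document.getElementById('search');

form?.addEventListener('submit', async (event) => {
  event.preventDefault(); // only intercept once we know we can handle it
  try {
    const params = new URLSearchParams(new FormData(form));
    const response = await fetch(`${form.action}?${params}`);
    document.getElementById('results').innerHTML = await response.text();
  } catch {
    form.submit(); // if anything goes wrong, fall back to the real navigation
  }
});
```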
Our frameworks are a decade behind
The Ember framework's official stance on performance is that browsers will catch up. Indeed, now that Apple devices are "fast enough" for Ember, it's really caught on.
That took 9 years. And if you care about Android, it'll be a while yet.
I do agree with Ember's head, Tom Dale, when he says that without some sort of unifying framework, it's too easy to grow your own that performs even worse. But our current choices aren't enough, either. They're always a decade behind the next thing for the Web. Once one platform catches up, the next highly-limited platform arrives that can't handle them.
Some computers and networks get faster, but others just get more widespread. Now that Ember is almost fast enough for mobile, the Internet of Things, the Physical Web, and other exciting developments are poised to move the goalposts again.
For example, the proposed Physical Web "fat beacons" max out at 40 kilobytes. Technologies like Service Worker are shaking up web thang architecture as we know it. Heck, Mozilla's got a new browser engine that turns performance knowledge on its head: HTML+CSS is faster than <canvas>! The golden path shifts under our feet.
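(For instance, here's a minimal sketch of a Service Worker that falls back to cached HTML when the network fails. The cache name and cached URLs are placeholders, not a prescription:)

```js
// sw.js: pre-cache a couple of pages, then serve from the network with a
// cache fallback.
const CACHE = 'pages-v1';

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE).then((cache) => cache.addAll(['/', '/offline.html']))
  );
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    fetch(event.request).catch(() =>
      caches.match(event.request).then((hit) => hit || caches.match('/offline.html'))
    )
  );
});
```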
No-JS is future-friendly
The only performance strategy that holds true across every platform and network is "do as little as possible". The ability to withhold JS from browsers that can't handle it is critical. What else could work across:
- Game consoles with hyperspecialized capabilities (games never looked better, but their browsing stutters)
- Cheapo tablets and smart TVs, with scads of pixels but a wimpy chip to paint them all with
- Outdated hardware, which grows a longer tail every day
- One core, instead of multiple
- Many slow cores, instead of a few fast ones
- Devices lacking hardware acceleration
- Smart refrigerators and other blasted "Internet of Things" fripperies
- CPU architectures that aren't x64 or ARM
- New devices we've only begun to imagine
- Crowded public WiFi with unpredictable latency
- Fallback data connections because carriers exaggerate their coverage for some reason
- Shaky connections and their Lie-Fi
- Network snags like retransmitted packets, bad routing, reception changes, moving around with your phone, etc.
The ability to cut the mustard is critical. If your site works with just HTML, then you can choose to send underpowered devices and shaky connections only that HTML, on the fly.
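The classic client-side version of that test looks something like this; the exact feature checks and the bundle name are assumptions, and the same decision can just as well be made on the server:

```js
// "Cutting the mustard": only browsers that pass a baseline feature test get
// the enhanced JavaScript at all. Everyone else keeps the working HTML.
if ('fetch' in window && 'Promise' in window && 'querySelector' in document) {
  const script = document.createElement('script');
  script.src = '/enhanced.js'; // hypothetical bundle of enhancements
  document.head.appendChild(script);
}
```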
Nor is no-JavaScript support only a performance fallback: Lanyrd's mobile site works fine without JavaScript, and it skips even parsing JavaScript on older devices to keep things simple and quick.
Whether or not such devices "should" include the Web, they will. People will visit with whatever bizarro browser is nearest, whether they're supposed to or not. As device diversity accelerates, "normal" browsers will no longer be the majority, and the average experience will become less and less useful as a metric. See Tom's "Unpredictable Performance" heading.
We can't rely on browsers, HTTP/2, CDNs, or anything else to make our web fast for us. (Indeed, developers get angry when those things try, if the reactions to Opera Mini and Google Web Light are any sign.) We have to start from the bottom up, with the simplest thing that could possibly work.
[This post is written for #startYourShift's August theme of performance. And the previous month's theme, really.]