
We don’t give people a website any more

The headline is a quote from Hammer and Nails by Stuart Langridge, where he describes how we make every user's browser a "fat client" (my words) by making it do work far beyond rendering some HTML+CSS.

Instead of an HTML page, you get some templates and some JSON data and some build tools, and then that compiler runs in your browser and assembles a website out of the component parts. That’s what a “framework” does… it builds the website, in real time, from separate machine-readable pieces, on the user’s computer, every time they visit the website.

I found this blog post through Brian's tweet.

Tom MacWright, in Second-guessing the modern web, very rightly states a thing I had also thought about a lot: you have to keep aging JS files around and keep serving them. It hit home especially because I had seen this at HolidayCheck, where we ran into a serious SEO issue due to exactly this problem. He describes it like this:

So if they open the ‘about page’, keep the tab open for a week, and then request the ‘home page’, then the home page that they request is dictated by the index bundle that they downloaded last week. This is a deeply weird and under-discussed situation. There are essentially two solutions to it:

  • You keep all generated JavaScript around, forever, and people will see the version of the site that was live at the time of their first page request.

I skipped the second solution. I know we all optimize for round trips and request run times, but in this case, why not just let this one extra request happen and use ETags the way they were meant to be used? MDN says this about them:

The ETag HTTP response header is an identifier for a specific version of a resource. It lets caches be more efficient and save bandwidth, as a web server does not need to resend a full response if the content has not changed.

Most of us don't run pages, or work for companies that run pages, the size of Google or Facebook. This one extra request might save some overall complexity and let you get rid of the hashes at the end of every file name you ship. My 2 cents.
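To make that concrete, here is a minimal sketch in Node (no framework) of serving a bundle with an ETag and answering revalidation requests with 304. The file name app.js and the content-hash-based tag are assumptions for illustration, not anyone's production setup:

    const http = require("http");
    const fs = require("fs");
    const crypto = require("crypto");

    http.createServer((req, res) => {
      // Assumed bundle file; in reality you'd read it once and cache it.
      const body = fs.readFileSync("./app.js");
      // Derive the ETag from the content, here via an MD5 hash.
      const etag = `"${crypto.createHash("md5").update(body).digest("hex")}"`;

      if (req.headers["if-none-match"] === etag) {
        // The client's cached copy is still current:
        // one small round trip, no body sent.
        res.writeHead(304, { ETag: etag });
        res.end();
        return;
      }

      res.writeHead(200, {
        "Content-Type": "application/javascript",
        // "no-cache" means "revalidate every time", not "don't cache".
        "Cache-Control": "no-cache",
        ETag: etag,
      });
      res.end(body);
    }).listen(3000);

The browser does the If-None-Match dance on its own; you just have to send the ETag and answer the revalidation request.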

Tom writes later:

And then there’s the authentication story. If you do SSR on any pages that are custom to the user, then you need to forward any cookies or authentication-relevant information to your API backend and make sure that you never cache the server-rendered result.
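What he describes might look something like the sketch below. The Express setup, the route, and the api.example.com endpoint are all assumptions for illustration, not his code (fetch is global in Node 18+):

    const express = require("express");
    const app = express();

    app.get("/dashboard", async (req, res) => {
      // Forward the user's cookie so the API backend can authenticate them.
      const apiRes = await fetch("https://api.example.com/me", {
        headers: { cookie: req.headers.cookie || "" },
      });
      const user = await apiRes.json();

      // Never cache the server-rendered result: it is specific to this user.
      res.set("Cache-Control", "private, no-store");
      res.send(`<h1>Hello ${user.name}</h1>`);
    });

    app.listen(3000);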

And I just had to quote this part of his article:

It's sad, and I have this view of web tech too: we make our own lives hard. That's why I try to keep this blog as lean as possible. Tell me if you see potential to make it better (and for design tips I am most thankful).

Read it all at https://www.kryogenix.org/days/2020/05/06/hammer-and-nails/
and https://macwright.org/2020/05/10/spa-fatigue.html
and https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/ETag.