Item Flow, Part 1: A new unified concept for layout | WebKit
I really like the idea of unifying some layout values in CSS. If you’ve got any feedback, please chip in!
There’s a new proposal for giving developers more control over styling form controls. I like it.
It’s clearly based on the fantastic work being done by the Open UI group on the select element. The proposal suggests that authors can opt in to the new styling possibilities by declaring:
appearance: base;
So basically the developer is saying “I know what I’m doing—I’m taking the controls.” But browsers can continue to ship their default form styles. No existing content will break.
The idea is that once the developer has opted in, they can then style a number of pseudo-elements.
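It might end up looking something like this (a sketch only—the pseudo-element name here is hypothetical, and the real names in the proposal may well differ):

select {
  appearance: base;
}
/* hypothetical pseudo-element name */
select::picker(select) {
  border: 1px solid currentColor;
  border-radius: 0.5em;
}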
This proposal would apply to pretty much all the form controls you can think of: all the input types, along with select, progress, meter, buttons and more.
But there’s one more element that I wish were on the list: legend.
I know, technically it’s not a form control but legend and fieldset are only ever used within forms.
The legend element is notoriously annoying to style. So a lot of people just don’t bother using it, which is a real shame. It’s like we’re punishing people for doing the right thing.
Wouldn’t it be great if you, as a developer, had the option of saying “I know what I’m doing—I’m taking the controls”:
legend {
appearance: base;
}
Imagine if that nuked the browser’s weird default styles, effectively turning the element into a span or div as far as styling is concerned. Then you could style it however you wanted. But crucially, if browsers shipped this, no existing content would break.
The shitty styling situation for legend (and its parent fieldset) is one of those long-standing annoyances that seems to have fallen down the back of the sofa of browser vendors. No one’s going to spend time working on it when there are more important newer features to ship. That’s why I’d love to see it sneak into this new proposal for styling form controls.
I was in Amsterdam last week. Just like last year I was there to help out Vasilis’s students with a form-based assignment:
They’re given a PDF inheritance-tax form and told to convert it for the web.
Yes, all the excitement of taxes combined with the thrilling world of web forms.
(Side note: this time they were told to style it using the design system from the Dutch railway because the tax office was getting worried that they were making phishing sites.)
I saw a lot of the same challenges again. I saw how students wished they could specify a past date or a future date in a date picker without using JavaScript. And I saw them lamenting the time they spent styling legends that worked across all browsers.
Right now, Mason Freed has an open issue on the new proposal with his suggestion to add some more elements to consider. Both legend and fieldset are included. That gets a thumbs-up from me.
This looks handy: a service to extract the RSS feed of a podcast (y’know—the thing that actually makes a podcast a podcast) from walled gardens that obfuscate the feed’s location: Apple Podcasts, Spotify, and Soundcloud.
Checked in at Fox On the Downs. Sunday roast — with Jessica
The web is open, apps are closed. The majority of web users have installed an ad blocker (which is also a privacy blocker). But no one installs an ad blocker for an app, because it’s a felony to distribute that tool, because you have to reverse-engineer the app to make it. An app is just a website wrapped in enough IP so that the company that made it can send you to prison if you dare to modify it so that it serves your interests rather than theirs.
Rich suggests another reason why the UX of websites on mobile is so shit these days:
The path to installing a native app is well trodden. We search the App Store (or ironically follow a link from a website), hit ‘Get’ and the app is downloaded to our phone’s home screen, ready to use any time with a simple tap.
A PWA can also live on your home screen, nicely indistinguishable from a native app. But the journey to getting a PWA – or indeed any web app – onto your home screen remains convoluted to say the least. This is the lack of equivalence I’m driving at. I wonder if the mobile web experience would suck as badly if web apps could be installed just as easily as native apps?
Here’s a post outlining all the great things you can do in mobile web browsers today: Your App Should Have Been A Website (And Probably Your Game Too):
Today’s browsers are powerhouses. Notifications? Check. Offline mode? Check. Secure payments? Yep, they’ve got that too. And with technologies like WebAssembly and WebGPU, web games are catching up to native-level performance. In some cases, they’re already there.
This is all true. But this post from John Gruber is equally true: One Bit of Anecdata That the Web Is Languishing Vis-à-Vis Native Mobile Apps:
I won’t hold up this one experience as a sign that the web is dying, but it sure seems to be languishing, especially for mobile devices.
As John points out, the problems aren’t technical:
There’s absolutely no reason the mobile web experience shouldn’t be fast, reliable, well-designed, and keep you logged in. If one of the two should suck, it should be the app that sucks and the website that works well. You shouldn’t be expected to carry around a bundle of software from your utility company in your pocket. But it’s the other way around.
He’s right. It makes no sense, but this is the reality.
Ten or fifteen years ago, the gap between the web and native apps on mobile was entirely technical. There were certain things that you just couldn’t do in web browsers. That’s no longer the case: the web caught up quite a while back.
But the experience of using websites on a mobile device is awful. Never mind the terrible performance penalties incurred by unnecessary frameworks and libraries like React and its ilk, there’s the constant game of whack-a-mole with banners and overlays. What’s just about bearable in a large desktop viewport becomes intolerable on a small screen.
This is not a technical problem. This doesn’t get solved by web standards. This is a cultural problem.
First of all, there’s the business culture. If your business model depends on tracking people or pushing newsletter sign-ups, then it’s inevitable that your website will be shite on mobile.
Mind you, if your business model depends on tracking people, you’re more likely to try to push people to download your native app. Like Cory Doctorow says:
50% of web users are running ad-blockers. 0% of app users are running ad-blockers, because adding a blocker to an app requires that you first remove its encryption, and that’s a felony (Jay Freeman calls this ‘felony contempt of business-model’).
Matt May brings up the same point in his guide, How to grey-rock Meta:
Remove Meta apps from your devices and use only the mobile web versions. Mobile apps have greater access to your personal data, provided the app requests those privileges, and Facebook and Instagram in particular (more so than WhatsApp, another Meta property) request the vast majority of those privileges. This includes precise GPS data on where you are, whether or not you are using the app.
Ironically, it’s the strength of the web—and web browsers—that has led to such shitty mobile web experiences. The pretty decent security model on the web means that sites have to pester you.
Part of the reason why you don’t see the same egregious over-use of pop-ups and overlays in native apps is that they aren’t needed. If you’ve installed the app, you’re already being tracked.
But when I describe the dreadful UX of most websites on mobile as a cultural problem, I don’t just mean business culture.
Us, the people who make websites, designers and developers, we’re responsible for this too.
For all our talk of mobile-first design for the last fifteen years, we never really meant it, did we? Sure, we use media queries and other responsive techniques, but all we’ve really done is make sure that a terrible experience fits on the screen.
As developers, I’m sure we can tell ourselves all sorts of fairy tales about why it’s perfectly justified to make users on mobile networks download React, Tailwind, and megabytes more of third-party code.
As designers, I’m sure we can tell ourselves all sorts of fairy tales about why intrusive pop-ups and overlays are the responsibility of some other department (as though users make any sort of distinction).
Worst of all, we’ve spent the last fifteen years teaching users that if they want a good experience on their mobile device, they should look in an app store, not on the web.
Ask anyone about their experience of using websites on their mobile device. They’ll tell you plenty of stories of how badly it sucks.
It doesn’t matter that the web is the perfect medium for just-in-time delivery of information. It doesn’t matter that web browsers can now do just about everything that native apps can do.
In many ways, I wish this were a technical problem. At least then we could lobby for some technical advancement that would fix this situation.
But this is not a technical problem. This is a people problem. Specifically, the people who make websites.
We fucked up. Badly. And I don’t see any signs that things are going to change anytime soon.
But hey, websites on desktop are just great!
Working on this project is great but ten minutes into it and I already miss the resilience of the web. I miss how you have to really fuck things up to make a browser yell at you or implode.
Many interactions are not possible without JavaScript, but that doesn’t mean we should look to write more than we have to. The server doing something useful is a requirement for building an interesting business. The client doing something is often a nice-to-have.
There’s also this:
It’s really fast
One of the arguments for a SPA is that it provides a more reactive customer experience. I think that’s mostly debunked at this point, due to the performance creep and complexity that comes in with a more complicated client-server relationship.
LLMs are good at transforming text into less text
Laurie is really onto something with this:
This is the biggest and most fundamental thing about LLMs, and a great rule of thumb for what’s going to be an effective LLM application. Is what you’re doing taking a large amount of text and asking the LLM to convert it into a smaller amount of text? Then it’s probably going to be great at it. If you’re asking it to convert into a roughly equal amount of text it will be so-so. If you’re asking it to create more text than you gave it, forget about it.
Depending on how much of the hype around AI you’ve taken on board, the idea that they “take text and turn it into less text” might seem like a gigantic back-pedal away from previous claims of what AI can do. But taking text and turning it into less text is still an enormous field of endeavour, and a huge market. It’s still very exciting, all the more exciting because it’s got clear boundaries and isn’t hype-driven over-reaching, or dependent on LLMs overnight becoming way better than they currently are.
I have to agree with John here:
There’s absolutely no reason the mobile web experience shouldn’t be fast, reliable, well-designed, and keep you logged in. If one of the two should suck, it should be the app that sucks and the website that works well. You shouldn’t be expected to carry around a bundle of software from your utility company in your pocket. But it’s the other way around.
There’s absolutely no technical reason why it should be this way around. This is a cultural problem with “modern front-end web development”.
Remember when every company rushed to make an app? Airlines, restaurants, even your local coffee shop. Back then, it made some sense. Browsers weren’t as powerful, and apps had unique features like notifications and offline access. But fast-forward to today, and browsers can do all that. Yet businesses still push native apps as if it’s 2010, and we’re left downloading apps for things that should just work on the web.
This is all factually correct, but alas as Cory Doctorow points out, you can’t install an ad-blocker in a native app. To you and me, that’s a bug. To short-sighted businesses, it’s a feature.
(When I say “ad-blocker”, I mean “tracking-blocker”.)
Yesterday when I mentioned my paranoia of third-party dependencies on The Session, I said:
I’ve built in the option to switch between multiple geocoding providers. When one of them inevitably starts enshittifying their service, I can quickly move on to another. It’s like having a “go bag” for geocoding.
(Geocoding, by the way, is when you provide a human-readable address and get back latitude and longitude coordinates.)
My paranoia is well-founded. I’ve been using Google’s geocoding API, which is changing its pricing model from next March.
You wouldn’t know it from the breathlessly excited emails they’ve been sending about it, but this is not a good change for me. I don’t do that much geocoding on The Session—around 13,000 or 14,000 requests a month. With the new pricing model that’ll be around $15 to $20 a month. Currently I slip by under the radar with the free tier.
So it might be time for me to flip that switch in my code. But which geocoding provider should I use?
There are plenty of slop-like listicles out there enumerating the various providers, but they’re mostly just regurgitating the marketing blurbs from the provider websites. What I need is more like a test kitchen.
Here’s what I did…
I took a representative sample of six recent additions to the sessions section of thesession.org. These examples represent places in the USA, Ireland, England, Scotland, Northern Ireland, and Spain, so a reasonable spread.
For each one of those sessions, I’m taking the location details and feeding them to each geocoding API. I’m deliberately not including the street address: quite often people don’t bother including this information, so I want to see how well the geocoding APIs cope without it.
I’ve scored the results on a simple scale of good (1), so-so (0), and just plain wrong (-1).
Then I tot up those results for an overall score for each provider.
When I tried my six examples with twelve different geocoding providers, these were the results:
| Provider | USA | England | Ireland | Spain | Scotland | Northern Ireland | Total |
|---|---|---|---|---|---|---|---|
| Google | 1 | 1 | 1 | 1 | 1 | 1 | 6 |
| Mapquest | 1 | 1 | 1 | 1 | 1 | 1 | 6 |
| Geoapify | 0 | 1 | 1 | 0 | 1 | 0 | 3 |
| Here | 1 | 1 | 0 | 1 | 0 | 0 | 3 |
| Mapbox | 1 | 1 | 0 | 1 | 1 | -1 | 3 |
| Bing | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
| Nominatim | 0 | 0 | 0 | 0 | -1 | 1 | 0 |
| OpenCage | -1 | 1 | 0 | 0 | 0 | -1 | -1 |
| Tom Tom | -1 | -1 | 0 | 0 | -1 | 1 | -2 |
| Positionstack | 0 | -1 | 0 | -1 | 1 | -1 | -2 |
| Locationiq | -1 | 0 | -1 | 0 | 0 | -1 | -3 |
| Map Maker | -1 | 0 | -1 | -1 | -1 | -1 | -5 |
Some interesting results there. I was surprised by how crap Bing is. I was also expecting better results from Mapbox.
Most interesting for me, Mapquest is right up there with Google.
So now that I’ve got a good scoring system, my next question is around pricing. If Google and Mapquest are roughly comparable in terms of accuracy, how would the pricing work out for each of them?
Let’s say I make 15,000 API requests a month. Under Google’s new pricing plan, that works out at $25. Not bad.
But if I’ve understood Mapquest’s pricing correctly, I reckon I’ll just squeak in under the free tier.
Looks like I’m flipping the switch to Mapquest.
If you’re shopping around for geocoding providers, I hope this is useful to you. But I don’t think you should just look at my results; they’re very specific to my needs. Come up with your own representative sample of tests and try putting the providers through their paces with your data.
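If you want to automate that comparison, here’s a rough harness (a sketch with hypothetical names: providers maps provider names to async geocoding functions, and samples is your own list of test queries with expected coordinates):

// Score one result against the expected coordinates, on the same
// rough scale: 1 (good), 0 (so-so), -1 (just plain wrong).
function score(result, expected) {
  const distance = Math.hypot(result.lat - expected.lat, result.lng - expected.lng);
  if (distance < 0.01) return 1;  // within roughly a kilometre
  if (distance < 0.1) return 0;   // in the general area
  return -1;                      // nowhere near
}

async function compareProviders(providers, samples) {
  const totals = {};
  for (const [name, geocode] of Object.entries(providers)) {
    totals[name] = 0;
    for (const { query, expected } of samples) {
      totals[name] += score(await geocode(query), expected);
    }
  }
  return totals;
}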
If, for some reason, you want to see the terrible PHP code I’m using for geocoding on The Session, here it is.
The Session has been online for over 20 years. When you maintain a site for that long, you don’t want to be relying on third parties—it’s only a matter of time until they’re no longer around.
Some third party APIs are unavoidable. The Session has maps for sessions and other events. When people add a new entry, they provide the address but then I need to get the latitude and longitude. So I have to use a third-party geocoding API.
My code is like a lesson in paranoia: I’ve built in the option to switch between multiple geocoding providers. When one of them inevitably starts enshittifying their service, I can quickly move on to another. It’s like having a “go bag” for geocoding.
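The actual code is PHP, but the shape of the idea is simple. Here’s a JavaScript sketch (the provider list and response handling are illustrative, not The Session’s actual code):

// Every provider has the same signature: address in, {lat, lng} out.
const providers = {
  nominatim: async (address) => {
    const url = 'https://nominatim.openstreetmap.org/search?format=json&q=' +
      encodeURIComponent(address);
    const [first] = await (await fetch(url)).json();
    return { lat: Number(first.lat), lng: Number(first.lon) };
  }
  // ...other providers, each with the same shape
};

// Switching providers means changing this one line.
const geocode = providers['nominatim'];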
Things are better on the client side. I’m using other people’s JavaScript libraries—like the brilliant abcjs—but at least I can self-host them.
I’m using Leaflet for embedding maps. It’s a great little library built on top of OpenStreetMap data.
A little while back I linked to a new project called OpenFreeMap. It’s a mapping provider where you even have the option of hosting the tiles yourself!
For now, I’m not self-hosting my map tiles (yet!), but I did want to switch to OpenFreeMap’s tiles. They’re vector-based rather than bitmap, so they’re lovely and crisp.
But there’s an issue.
I can use OpenFreeMap with Leaflet, but to do that I also have to use the MapLibre GL library. But whereas Leaflet is 148K of JavaScript, MapLibre GL is 800K! Yowzers!
That’s mahoosive by the standards of The Session’s performance budget. I’m not sure the loveliness of the vector maps is worth increasing the JavaScript payload by so much.
But this doesn’t have to be an either/or decision. I can use progressive enhancement to get the best of both worlds.
If you land straight on a map page on The Session for the first time, you’ll get the old-fashioned bitmap map tiles. There’s no MapLibre code.
But if you browse around The Session and then arrive on a map page, you’ll get the lovely vector maps.
Here’s what’s happening…
The maps are embedded using an HTML web component called embed-map. The fallback is a static image between the opening and closing tags. The web component then loads up Leaflet.
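The markup would look something like this (the attribute names here are my guess, not necessarily the component’s real API):

<embed-map latitude="53.3" longitude="-6.3">
  <img src="/path/to/static-map.png" alt="A map showing the location">
</embed-map>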
Here’s where the enhancement comes in. When the web component is initiated (in its connectedCallback method), it uses the Cache API to see if MapLibre has been stored in a cache. If it has, it loads that library:
caches.match('/path/to/maplibre-gl.js')
.then( responseFromCache => {
  if (responseFromCache) {
    // The library is in the cache, so it’s safe to load it.
    // One way to do that (an assumption, not necessarily the
    // author’s actual code): append a script element and let
    // the service worker answer the request from the cache.
    const script = document.createElement('script');
    script.src = '/path/to/maplibre-gl.js';
    document.head.append(script);
  }
});
Then when it comes to drawing the map, I can check for the existence of the maplibreGL object. If it exists, I can use OpenFreeMap tiles. Otherwise I use the old Leaflet tiles.
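Assuming the maplibre-gl-leaflet plugin is what bridges the two libraries, that check might look something like this (a sketch: the style URL is OpenFreeMap’s documented one, but the actual code on The Session may differ):

// `map` is an existing Leaflet map instance.
if (window.L && typeof L.maplibreGL === 'function') {
  // MapLibre GL made it out of the cache: lovely crisp vector tiles.
  L.maplibreGL({
    style: 'https://tiles.openfreemap.org/styles/liberty'
  }).addTo(map);
} else {
  // Fall back to the old-fashioned bitmap tiles.
  L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
    attribution: '&copy; OpenStreetMap contributors'
  }).addTo(map);
}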
But how does the MapLibre library end up in a cache? That’s thanks to the service worker script.
During the service worker’s install event, I give it a list of static files to cache: CSS, JavaScript, and so on. That includes third-party libraries like abcjs, Leaflet, and now MapLibre GL.
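In a service worker, that step looks something like this (the file paths are placeholders rather than The Session’s real ones):

// sw.js
const cacheName = 'static-assets-v1';
const staticFiles = [
  '/css/global.css',
  '/js/abcjs.js',
  '/js/leaflet.js',
  '/js/maplibre-gl.js'
];
addEventListener('install', event => {
  event.waitUntil(
    caches.open(cacheName)
      .then( cache => cache.addAll(staticFiles) )
  );
});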
Crucially this caching happens off the main thread. It happens in the background and it won’t slow down the loading of whatever page is currently being displayed.
That’s it. If the service worker installation works as planned, you’ll get the nice new vector maps. If anything goes wrong, you’ll get the older version.
By the way, it’s always a good idea to use a service worker and the Cache API to store your JavaScript files. As you know, JavaScript is unduly expensive for performance: not only does the JavaScript file have to be downloaded, it then has to be parsed and compiled. But JavaScript stored in a cache during a service worker’s install event is already parsed and compiled.
I wrote a book about service workers. It’s called Going Offline. It was first published by A Book Apart in 2018. Now it’s available to read for free online.
If you want you can read the book as a PDF, an ePub, or .mobi, but I recommend reading it in your browser.
Needless to say the web book works offline. Once you go to goingoffline.adactio.com you can add it to the homescreen of your mobile device or add it to the dock on your Mac. After that, you won’t need a network connection.
The book is free to read. Properly free. Not the kind of “free” where you have to supply an email address first. Why would I make you go to the trouble of generating a burner email account?
The site has no analytics. No tracking. No third-party scripts of any kind whatsoever. By complete coincidence, the site is fast. Funny that.
For the styling of this web book, I tweaked the stylesheet I used for HTML5 For Web Designers. I updated it a little bit to use logical properties, some fluid typography and view transitions.
In the process of converting the book to HTML, I got reacquainted with what I had written almost seven years ago. It was kind of fun to approach it afresh. I think it stands up pretty darn well.
Ethan wrote about his feelings when he put two of his books online, illustrated by that amazing photo that always gives me the feels:
I’ll miss those days, but I’m just glad these books are still here. They’re just different than they used to be. I suppose I am too.
Anyway, if you’re interested in making your website work offline, have a read of Going Offline. Enjoy!
Back in June I documented a bug on macOS in how Spaces (or whatever they call their desktop management thingy now) works with websites added to the dock.
I’m happy to report that after upgrading to Sequoia, the latest version of macOS, the bug has been fixed! Excellent!
Not only that, but there’s another really great little improvement…
Let’s say you’ve installed a website like The Session by adding it to the dock. Now let’s say you get an email in Apple Mail that includes a link to something on The Session. It used to be that clicking on that link would open it in your default web browser. But now clicking on that link opens it in the installed web app!
It’s a lovely little enhancement that makes the installed website truly feel like a native app.
Websites in the dock also support the badging API, which is really nice!
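For anyone who hasn’t tried it, the badging API is pleasingly small:

// Show a badge on the dock icon, then clear it.
navigator.setAppBadge(3);
navigator.clearAppBadge();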
I wonder if there’s much point using wrappers like Electron any more? I feel like they were mostly aiming to get that parity with native apps in having a standalone application launched from the dock.
Now all you need is a website.
The biggest issue remains discovery. Unless you already know that it’s possible to add a website to the dock, you’re unlikely to find out about it. That’s why I’ve got a page with installation instructions on The Session.
Still, the discovery possibilities on Apple’s desktop devices are waaaaay better than on Apple’s mobile devices.
Apple are doing such great work on their desktop operating system to make websites first-class citizens. Meanwhile, they’re doing less than nothing on their mobile operating system. For a while there, they literally planned to break all websites added to the homescreen. Fortunately they were forced to back down.
But it’s still so sad to see how Apple are doing everything in their power to prevent people from finding out that you can add websites to your homescreen—despite (or perhaps because of) the fact that push notifications on iOS only work if the website has been added to the home screen!
So while I’m really happy to see the great work being done on installing websites for desktop computers, I remain disgusted by what’s happening on mobile.
At this point I’ve pretty much given up on Apple ever doing anything about this pathetic situation.
“And so what we did is we started looking at, internally, all of the places where we’re using web technology — so all of our internal web UIs — and realized that they were just really unacceptably slow.”
Why were they slow? The answer: React.
“We realized that our performance, especially on low-end machines, was really terrible — and that was because we had adopted this React framework, and we had used React in probably one of the worst ways possible.”
Checked in at La Corde à Linge. Spätzle
Checked in at Royal 26. Pairing a good book with a glass of Pinot Gris
Checked in at Chez Yvonne. Choucroute garnie — with Jessica