Permacomputing principles
Here are some design principles I can get behind: long-term thinking, resilience, flexibility and seamfulness.
I don’t like magic.
I’m not talking about acts of prestidigitation and illusion. I mean the kind of magic that’s used to market technologies. It’s magic. It just works. Don’t think about it.
I’ve written about seamless and seamful design before. Seamlessness is often touted as the ultimate goal of UX—“don’t make me think!”—but it comes with a price. That price is the reduction of agency.
When it comes to front-end development, my distrust of magic tips over into being a complete control freak.
I don’t like using code that I haven’t written and understood myself. Sometimes it’s unavoidable. I use two JavaScript libraries on The Session. One for displaying interactive maps and another for generating sheet music. As dependencies go, they’re very good but I still don’t like the feeling of being dependent on anything I don’t fully understand.
I can’t stomach the idea of using npm to install client-side JavaScript (which then installs more JavaScript, which in turn is dependent on even more JavaScript). It gives me the heebie-jeebies. I’m kind of astonished that most front-end developers have normalised doing daily trust falls with their codebases.
While I’m mistrustful of libraries, I’m completely allergic to frameworks.
Often I don’t distinguish between libraries and frameworks but the distinction matters here. Libraries are bits of other people’s code that I call from my code. Frameworks are other people’s code that calls bits of my code.
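That distinction is sometimes called inversion of control, and it can be sketched in a few lines. Everything here is hypothetical for illustration — `slugify` and `toyFramework` aren’t real packages:

```javascript
// A library: bits of other people's code that I call from my code.
// My code stays in charge of the control flow.
const slugify = (title) => title.toLowerCase().replace(/\s+/g, '-');
const slug = slugify('The Session'); // 'the-session'

// A framework: other people's code that calls bits of my code.
// This toy "framework" owns the control flow and decides when my handler runs.
const toyFramework = {
  handlers: [],
  onRoute(path, handler) { this.handlers.push({ path, handler }); },
  dispatch(path) {
    const match = this.handlers.find((h) => h.path === path);
    return match ? match.handler() : '404';
  },
};

toyFramework.onRoute('/tunes', () => 'list of tunes'); // I hand my code over
const response = toyFramework.dispatch('/tunes');      // the framework calls it
```

With the library, I could swap in a different slug function tomorrow. With the framework, my handler only runs on the framework’s terms — which is the deeper dependency.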
Think of React. In order to use it, you basically have to adopt its idioms, its approach, its syntax. It’s a deeper level of dependency than just dropping in a regular piece of JavaScript.
I’ve always avoided client-side React because of its direct harm to end users (over-engineered bloated sites that take way longer to load than they need to). But the truth is that I also really dislike the extra layer of abstraction it puts between me and the browser.
Now, whenever there’s any talk about abstractions someone inevitably points out that, when it comes to computers, there’s always some layer of abstraction. If you’re not writing in binary, you don’t get to complain about an extra layer of abstraction making you uncomfortable.
I get that. But I still draw a line. When it comes to front-end development, that line is for me to stay as close as I can to raw HTML, CSS, and JavaScript. After all, that’s what users are going to get in their browsers.
My control freakery is not typical. It’s also not a very commercial or pragmatic attitude.
Over the years, I’ve stopped doing front-end development for client projects at work. Partly that’s because I’m pretty slow; it makes more sense to give the work to a better, faster developer. But it’s also because of my aversion to React. Projects came in where usage of React was a foregone conclusion. I wouldn’t work on those projects.
I mention this to point out that you probably shouldn’t adopt my inflexible mistrustful attitude if you want a career in front-end development.
Fortunately for me, front-end development still exists outside of client work. I get to have fun with my own website and with The Session. Heck, they even let me build the occasional hand-crafted website for a Clearleft event. I get to do all that the long, hard stupid way.
Meanwhile in the real world, the abstractions are piling up. Developers can now use large language models to generate code. Sometimes the code is good. Sometimes it’s not. You should probably check it before using it. But some developers just YOLO it straight to production.
That gives me the heebie-jeebies, but then again, so did npm. Is it really all that different? With npm you dialled up other people’s code directly. With large language models, they first slurp up everyone’s code (like, the whole World Wide Web), run a computationally expensive process of tokenisation, and then give you the bit you need when you need it. In a way, large language model coding tools are like a turbo-charged npm with even more layers of abstraction.
It’s not for me but I absolutely understand why it can work in a pragmatic commercial environment. Like Alice said:
Knitting is the future of coding. Nobody knits because they want a quick or cheap jumper, they knit because they love the craft. This is the future of writing code by hand. You will do it because you find it satisfying but it will be neither the cheapest nor the quickest way to write software.
But as Dave points out:
And so now we have these “magic words” in our codebases. Spells, essentially. Spells that work sometimes. Spells that we cast with no practical way to measure their effectiveness. They are prayers as much as they are instructions.
I shudder!
But again, this too is nothing new. We’ve all seen those codebases that contain mysterious arcane parts that nobody dares touch. coughWebpackcough. The issue isn’t with the code itself, but with the understanding of the code. If the understanding of the code was in one developer’s head, and that person has since left, the code is dangerous and best left untouched.
This, as you can imagine, is a maintenance nightmare. That’s where I’ve seen the real cost of abstractions. Abstractions often really do speed up production, but you pay the price in maintenance later on. If you want to understand the codebase, you must first understand the abstractions used in the codebase. That’s a lot to document, and let’s face it, documentation is the first casualty of almost every project.
So perhaps my aversion to abstraction in general—and large language models in particular—is because I tend to work on long-term projects. This website and The Session have lifespans measured in decades. For these kinds of projects, maintenance is a top priority.
Large language model coding tools truly are magic.
I don’t like magic.
People use “enshittification” to describe platform decay. What I’m describing here is one of the mechanisms that makes that decay feel personal. It’s the constant conversion of your attention into a KPI.
Update: Never mind! It turns out that Google’s issue is with unreachable robots.txt files, not absent robots.txt files. They really need to improve their messaging. Stand down everyone.
A bit has been flipped on Google Search.
Previously, the Googlebot would index any web page it came across, unless a robots.txt file said otherwise.
Now, a robots.txt file is required in order for the Googlebot to index a website.
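For illustration, here’s about the smallest valid robots.txt you can have — one that permits all crawlers to index everything (`User-agent` and `Disallow` are directives from the Robots Exclusion Protocol; an empty `Disallow` means nothing is off-limits):

```
User-agent: *
Disallow:
```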
This puzzles me. Until now, Google was all about “organising the world’s information and making it accessible.” This switch-up will limit “the world’s information” to “the information on websites that have a robots.txt file.”
They’re free to do this. Despite what some people think, Google isn’t a utility. It’s a business. Other search engines are available, with different business models. Kagi. Duck Duck Go. Google != the World Wide Web.
I am curious about this latest move with Google Search though. I’d love to know if it only applies to Google’s search bot. Google has other bots out crawling the web: Adsbot-Google, Google-Extended, Googlebot-Image, GoogleOther, Mediapartners-Google. I’m probably missing a few.
If the new default only applies to the searchbot and doesn’t include, say, the crawler that’s fracking the web in order to train Google’s large language model, then this is how things work now:
It would be good to get some clarity on this. Alas, the Google Search team are notoriously tight-lipped so I’m not holding my breath.
Great minds think alike! I have a very similar HTML web component on the front page of The Session called input-autosuggest.
But perhaps the death of search is good for the future of the web. Perhaps websites can be free of dumb rankings and junky ads that are designed to make fractions of a penny at a time. Perhaps the web needs to be released from the burden of this business model. Perhaps mass readership isn’t possible for the vast majority of websites and was never really sustainable in the first place.
My mind boggles at the thought of using a generative tool based on a large language model to do any kind of qualitative user research, so every single thing that Gregg says here makes complete sense to me.
Suppose somebody is using a blade. Perhaps they’re in the bathroom, shaving. Or maybe they’re in the kitchen, preparing food.
Suppose they cut themselves with that blade. This might have happened because the blade was too sharp. Or perhaps the blade was too dull.
Either way, it’s going to be tricky to figure out the reason just by looking at the wound.
But if you talk to the person, not only will you find out the reason, you’ll also understand their pain.
It’s October. Autumn is wrapping itself around us, squeezing the leaves from the trees. Summer has slipped away, though it gave us a parting gift of a few pleasant days this week to sit outside at lunchtime.
I’ve got a bit of a ritual now for the end of September. I go to Spain and soak up the last of the sun. There’s an Irish music festival in the beautiful medieval town of Cáceres.
It’s not easy to get to, but that’s all part of the ritual. Set out for Madrid or Seville and spend a night there. Then get on a train for the long journey through a landscape straight out of a spaghetti western.
Once you get to Cáceres there’s nothing to do but enjoy the sun, the food, and the music. So much music! Open-air concerts in a medieval square that go well past midnight. Non-stop sessions scattered throughout the town’s pedestrianised streets.
For me, it’s the perfect way to see out the summer.
A fascinating look at the importance of undersea cables, taken from a new book called The Web Beneath the Waves.
Hi Chris. You mentioned you were off to Portugal soon to explore Lisbon and Porto and I promised I’d send along some food tips from my previous visits.
I’ll skip over the obvious. No doubt you’ll seek out pasteis de nata in Lisbon. And I’m sure someone will convince you to have a francesinha in Porto (perhaps at the tail end of a beery night out).
Personally, I think one of Portugal’s treasures is its tinned fish. Find a spot where you can peruse a selection and have a tin with a beer or a glass of excellent Portuguese wine.
In Lisbon there’s Sol E Pesca, just down the street from the Time Out market.
In Porto there’s Prova, though the focus there is more on cheese.
A lot of the best tinned fish will hail from Matosinhos, a northern suburb of Porto. I recommend making your way up there.
Check out the fish market there, which is also the former home to a digital design school where I spent a week teaching a few years back. At lunch time you can pick out a fish from the market and take it straight to Taberna Lusitana to have them cook it for you.
In the evening, every place in Matosinhos hauls a grill out onto the street to cook sardines. It smells wonderful!
Take every opportunity that comes your way to eat the local percebes—goose barnacles—hand-harvested in risky conditions from the Atlantic coastline.
There are lots of seafood restaurants in Matosinhos but I can personally recommend O Gaveto. Myself and Jessica were enticed in by the owner one evening as we stood outside admiring the fish tank. We ended up having an astoundingly delicious seafood rice.
We also witnessed a mysterious gathering of robed figures bedecked with chains who ate from a large pot filled with a dark mixture. When we asked our waiter about it, he told us it was “the brotherhood of the lamprey!”
Oh, and when you’re in Porto you absolutely must have tripas à moda do Porto—an excellent tripe stew that costs next to nothing and tastes great no matter where you get it.
If you’re eating out along the waterfront, there’s a spot a little further along from the usual touristy spots called Vinhas d’Alho. Get one of the outside tables if you can for a great view of the Port places across the river. Pick out one you like the look of and go for a Port tasting.
Even if you don’t go for a Port tasting, be sure to have a Port Tonico at some point—it’s like a more refreshing version of a gin and tonic, made with white Port.
That’s all I can think of right now. I’m afraid I can’t give you an address for the most memorable meal I had in Porto:
The most unexpected thing I ate in Porto was when I wandered off for lunch on my own one day. I ended up in a little place where, when I walked in, it was kind of like that bit in the Western when the music stops and everyone turns to look. This was clearly a place for locals. The owner didn’t speak any English. I didn’t speak any Portuguese. But we figured it out. She mimed something sandwich-like and said a word I wasn’t familiar with: bifana. Okay, I said. Then she mimed the universal action for drinking, so I said “agua.” She looked at me with a very confused expression. “Agua!? Não. Cerveja!” Who am I to argue? Anyway, she produced this thing which was basically some wet meat in a bun. It didn’t look very appetising. But this was the kind of situation where I couldn’t back out of eating it. So I took a bite and …it was delicious! Like, really, really delicious.
Search has bent in quality towards its earliest days, difficult to navigate and often unhelpful. And the remedy may be the same as it was a quarter century ago.
After the sort of winters we have had to endure recently, the spring does seem miraculous, because it has become gradually harder and harder to believe that it is actually going to happen.
George Orwell on the coming of spring during the darkest of times:
It comes seeping in everywhere, like one of those new poison gases which pass through all filters.
The atom bombs are piling up in the factories, the police are prowling through the cities, the lies are streaming from the loudspeakers, but the earth is still going round the sun, and neither the dictators nor the bureaucrats, deeply as they disapprove of the process, are able to prevent it.
This was a day of big conversations, but also one of connection, curiosity, and optimism.
Seeing it all laid out like this really drives home just how much was packed into Research By The Sea.
Throughout the day, speakers shared personal reflections, bold ideas, and practical insights, touching on themes of community, resilience, ethics, and the evolving role of technology.
Some talks brought hard truths about the impact of AI, the complexity of organisational change, and the ethical dilemmas researchers face. Others offered hope and direction, reminding us of the power of community, the importance of accessibility, and the need to listen to nature, to each other, and to the wider world.
Research By The Sea was last Thursday. I’m still digesting it all.
In short, it was excellent. The venue, how smoothly everything was organised, the talks …oh boy, the talks!
Benjamin did a truly superb job curating this line-up. Everyone really brought their A-game.
As predicted, this wasn’t a day of talks just for researchers. It was far more like a dConstruct. This was big, big picture stuff. Themes of hope, community, nature, technology, inclusion and resilience.
I overheard more than one person in the breaks saying “this was not what I was expecting!” They were saying it in a very positive way, though I wouldn’t be surprised if there were a silent minority in the audience who were miffed that they weren’t getting a day of practical research techniques devoid of politics.
As host, I had the easiest job of the day. All I had to do was say a few words of introduction for each speaker, then sit back down and enjoy every minute of every talk.
The one time when I had to really work was the panel discussion at the end of the day. I really enjoy moderating panels. I’ve seen enough bad panels to know what does and doesn’t work. But this one was tough. The panelists were all great, but because the themes were soooo big, I was worried about it all getting a bit too high-falutin’. People seemed to enjoy it though.
All in all, it was a superb day. If you came along, thank you!
Gotta be honest, #ResearchByTheSea is one of the best conferences I’ve been to in yeeeeeears. So many good, useful, inspiring, thoughtful, provocative talks. Much more about ethics and power and possibility than I’d expected.
Loved it. Thank you, @clearleft.com!
Research by the Sea was one of the best conferences I’ve been to in yeeeeeears. So many good, useful, inspiring, thoughtful, provocative talks. Much more about ethics and power and possibility than I’d expected. None of the ‘utopian bullshit’ you usually get at a product or digital conference, to quote one of the speakers!
From 2005 to 2015 Clearleft ran the dConstruct event here in Brighton (with one final anniversary event in 2022).
I had the great pleasure of curating dConstruct for a while. I’m really proud of the line-ups I put together.
It wasn’t your typical tech event, to put it mildly. You definitely weren’t going to learn practical techniques to bring back into work on Monday morning. If anything, it was the kind of event that might convince you to quit your job on Monday morning.
The talks were design-informed, but with oodles of philosophy, culture and politics.
As you can imagine, that’s not an easy sell. Hence why we stopped running the event. It’s pretty hard to convince your boss to send you to a conference like that.
Sometimes I really miss it though. With everything going on in the tech world right now (and the world in general), it sure would be nice to get together in a room full of like-minded people to discuss the current situation.
Well, here’s the funny thing. There’s a different Clearleft event happening next week. Research By The Sea. On the face of it, this doesn’t sound much like dConstruct. But damn if Benjamin hasn’t curated a line-up of talks that sound very dConstructy!
Those all sound like they’d fit perfectly in the dConstruct archive.
Research By The Sea is most definitely not just for UX researchers—this sounds to me like the event to attend if, like me, you’re alarmed by everything happening right now.
Next Thursday, February 27th, this is the place to be if you’ve been missing dConstruct. See you there!
I’m going to be hosting Research By The Sea on Thursday, February 27th right here in Brighton. I’m getting very excited and nervous about it.
The nervousness is understandable. I want to do a good job. Hosting a conference is like officiating a wedding. You want to put people at ease and ensure everything goes smoothly. But you don’t want to be the centre of attention. People aren’t there to see you. This is not your day.
As the schedule has firmed up, my excitement has increased.
See, I’m not a researcher. It would be perfectly understandable to expect this event to be about the ins and outs of various research techniques. But it’s become clear that that isn’t what Benjamin has planned.
Just as any good researcher or designer goes below the surface to explore the root issues, Research By The Sea is going to go deep.
I mean, just take a look at what Steph will be covering:
Steph discusses approaches in speculative fiction, particularly in the Solarpunk genre, that can help ground our thinking, and provide us—as researchers and designers—tenets to consider our work, and, as humans, to strive towards a better future.
Sign me up!
Michael’s talk covers something that’s been on my mind a lot lately:
Michael will challenge the prevailing belief that as many people as possible must participate in our digital economies.
You just know that a talk called In defence of refusal isn’t going to be your typical conference fare.
Then there are talks about accessibility and intersectionality, indigenous knowledge, designing communities, and navigating organisational complexity. And I positively squealed with excitement when I read Cennydd’s talk description:
The world is crying out for new visions of the future: worlds in which technology is compassionate, not just profitable; where AI is responsible, not just powerful.
See? It’s very much not just for researchers. This is going to be a fascinating day for anyone who values curiosity.
If that’s you, you should grab a ticket. To sweeten the deal, use the discount code JOINJEREMY to get a chunky 20% off the price — £276 for a conference ticket instead of £345.
Be sure to nab your ticket before February 15th when the price ratchets up a notch.
And if you are a researcher, well, you really shouldn’t miss this. It’s kind of like when I’ve run Responsive Day Out and Patterns Day; sure, the talks are great, but half the value comes from being in the same space as other people who share your challenges and experiences. I know that makes it sound like a kind of group therapy, but that’s because …well, it kind of is.
If someone uses an LLM as a replacement for search, and the output they get is correct, this is just by chance. Furthermore, a system that is right 95% of the time is arguably more dangerous than one that is right 50% of the time. People will be more likely to trust the output, and likely less able to fact check the 5%.