
‘An Overwhelmingly Negative And Demoralizing Force’: What It’s Like Working For A Company That’s Forcing AI On Its Developers - Aftermath

Grim reading from the games industry, especially if you work at Shopify where the CEO has just mandated that you have to use this shite.

Open source devs say AI crawlers dominate traffic, forcing blocks on entire countries - Ars Technica

As it currently stands, both the rapid growth of AI-generated content overwhelming online spaces and aggressive web-crawling practices by AI firms threaten the sustainability of essential online resources. The current approach taken by some large AI companies—extracting vast amounts of data from open-source projects without clear consent or compensation—risks severely damaging the very digital ecosystem on which these AI models depend.

Go To Hellman: AI bots are destroying Open Access

AI companies with billions to burn are hard at work destroying the websites of libraries, archives, non-profit organizations, and scholarly publishers, anyone who is working to make quality information universally available on the internet.

FOSS infrastructure is under attack by AI companies

More on how large language bots are DDOSing the web:

LLM scrapers are taking down FOSS projects’ infrastructure, and it’s getting worse.

Please stop externalizing your costs directly into my face

Over the past few months, instead of working on our priorities at SourceHut, I have spent anywhere from 20-100% of my time in any given week mitigating hyper-aggressive LLM crawlers at scale.

This matches my experience with The Session. In fact, while I had this article open in a tab, I had to go deal with a tsunami of large language model bots. It’s really fucking depressing.

Please stop legitimizing LLMs or AI image generators or GitHub Copilot or any of this garbage. I am begging you to stop using them, stop talking about them, stop making new ones, just stop. If blasting CO2 into the air and ruining all of our freshwater and traumatizing cheap laborers and making every sysadmin you know miserable and ripping off code and books and art at scale and ruining our fucking democracy isn’t enough for you to leave this shit alone, what is?
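In the meantime, mitigation mostly starts with blocking the crawlers that at least identify themselves. Here’s a minimal sketch of the idea, assuming an Express-style Node server; the user-agent tokens are illustrative examples, not a complete list:

```typescript
// A minimal sketch, assuming an Express-style Node server.
// The bot tokens below are illustrative examples, not a complete list.
import express from "express";

const BLOCKED_BOTS = ["GPTBot", "ClaudeBot", "CCBot", "Bytespider"];

const app = express();

app.use((req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";
  // Refuse any request whose user agent matches a known LLM crawler.
  if (BLOCKED_BOTS.some((bot) => ua.includes(bot))) {
    res.status(403).send("Automated crawling for model training is not permitted.");
    return;
  }
  next();
});

app.get("/", (_req, res) => {
  res.send("Hello, humans.");
});

app.listen(3000);
```

Of course that only catches the polite bots. The really aggressive ones spoof their user agents and rotate IP addresses, which is how projects end up blocking entire countries just to stay online.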

Build It Yourself | Armin Ronacher’s Thoughts and Writings

We’re at a point in most ecosystems where pulling in libraries is not just the default action, it’s seen positively: “Look how modular and composable my code is!” Actually, it might just be a symptom of never wanting to type out more than a few lines.

It always amazes me when people don’t view dependencies as liabilities. To me it feels like the coding equivalent of going to a loan shark. You are asking for technical debt.

There are entire companies who are making a living off supplying you with the tools needed to deal with your dependency mess. In the name of security, we’re pushed to having dependencies and keeping them up to date, despite most of those dependencies being the primary source of security problems.

But there is a simpler path. You write code yourself. Sure, it’s more work up front, but once it’s written, it’s done.
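To make that concrete with a hypothetical example (not from Armin’s post): a debounce helper is the sort of thing that’s a dozen lines to write yourself, yet routinely gets pulled in as yet another dependency.

```typescript
// A hypothetical example: a debounce helper written in-house instead of
// installed as a dependency. Once it's written, it's done.
function debounce<T extends (...args: any[]) => void>(fn: T, wait: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>) => {
    if (timer !== undefined) {
      clearTimeout(timer);
    }
    timer = setTimeout(() => fn(...args), wait);
  };
}

// Usage: avoid firing an expensive handler on every keystroke.
const onInput = debounce((value: string) => {
  console.log("search for", value);
}, 250);
```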

Plane GPS systems are under sustained attack - is the solution a new atomic clock? - BBC News

A fascinating look at the modern equivalent of the Longitude problem.

Cold Album Drumming - full-album drum covers by Brad Frost

This is a great new musical project from Brad:

Brad Frost plays drums to the albums he knows intimately, but has never drummed to before. Cover to cover. No warm-up. No prep. Totally cold. What could possibly go wrong?

I really enjoyed watching all of The Crane Wife and In Rainbows.

Hallucinations in code are the least dangerous form of LLM mistakes

The moment you run LLM generated code, any hallucinated methods will be instantly obvious: you’ll get an error. You can fix that yourself or you can feed the error back into the LLM and watch it correct itself.

Compare this to hallucinations in regular prose, where you need a critical eye, strong intuitions and well developed fact checking skills to avoid sharing information that’s incorrect and directly harmful to your reputation.

With code you get a powerful form of fact checking for free. Run the code, see if it works.
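A contrived sketch of what that looks like in practice; the hallucinated method name here is invented for illustration:

```typescript
// Contrived illustration: a hallucinated method is caught the moment you
// run the code. (The method name "toTitleCase" is made up; it doesn't exist.)
const greeting = "hello world";

// A real method behaves as expected:
console.log(greeting.toUpperCase()); // "HELLO WORLD"

// The hallucinated one blows up immediately at runtime:
// TypeError: greeting.toTitleCase is not a function
// (The `as any` cast is only there so TypeScript doesn't reject it even
// earlier, at compile time.)
console.log((greeting as any).toTitleCase());
```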

trot

Working on this project is great but ten minutes into it and I already miss the resilience of the web. I miss how you have to really fuck things up to make a browser yell at you or implode.

AI is Stifling Tech Adoption | Vale.Rocks

Want to use all those great features that have been landing in browsers over the past year or two? View transitions! Scroll-driven animations! So much more!

Well, your coding co-pilot is not going to be of any help.

Large language models, especially those on the scale of many of the most accessible, popular hosted options, take humongous datasets and long periods to train. By the time everything has been scraped and a dataset has been built, the set is on some level already obsolete. Then, before a model can reach the hands of consumers, time must be taken to train and evaluate it, and then even more to finally deploy it.

Once it has finally released, it usually remains stagnant in terms of having its knowledge updated. This creates an AI knowledge gap. A period between the present and AI’s training cutoff. This gap creates a time between when a new technology emerges and when AI systems can effectively support user needs regarding its adoption, meaning that models will not be able to service users requesting assistance with new technologies, thus disincentivising their use.

So we get this instead:

I’ve anecdotally noticed that many AI tools have a ‘preference’ for React and Tailwind when asked to tackle a web-based task, or even to create any app involving an interface at all.
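To be fair, those new features degrade gracefully if you reach for them yourself rather than waiting for a co-pilot to catch up. A rough sketch of progressive enhancement with the View Transitions API (the API is real; the surrounding structure is just illustrative):

```typescript
// Sketch of progressive enhancement with the View Transitions API, the sort
// of recent feature an LLM trained before its release simply doesn't know.
function updateContent(render: () => void) {
  // Feature-detect: older browsers (and older type definitions) won't have it.
  const doc = document as Document & {
    startViewTransition?: (cb: () => void) => void;
  };

  if (doc.startViewTransition) {
    // Supported: the browser animates between the old and new DOM states.
    doc.startViewTransition(render);
  } else {
    // Not supported: just update the DOM. The page still works.
    render();
  }
}

// Usage: swap in new content, animated where the browser allows it.
updateContent(() => {
  const main = document.querySelector("main");
  if (main) {
    main.textContent = "Freshly rendered content";
  }
});
```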

Software Folklore ― Andreas Zwinkau

Detective stories and tales of bughunting in software and hardware.

Sometimes bugs have symptoms beyond belief. This is a collection of such stories from around the web.

What I’ve learned about writing AI apps so far | Seldo.com

LLMs are good at transforming text into less text

Laurie is really onto something with this:

This is the biggest and most fundamental thing about LLMs, and a great rule of thumb for what’s going to be an effective LLM application. Is what you’re doing taking a large amount of text and asking the LLM to convert it into a smaller amount of text? Then it’s probably going to be great at it. If you’re asking it to convert into a roughly equal amount of text it will be so-so. If you’re asking it to create more text than you gave it, forget about it.

Depending on how much of the hype around AI you’ve taken on board, the idea that they “take text and turn it into less text” might seem like a gigantic back-pedal away from previous claims of what AI can do. But taking text and turning it into less text is still an enormous field of endeavour, and a huge market. It’s still very exciting, all the more exciting because it’s got clear boundaries and isn’t hype-driven over-reaching, or dependent on LLMs overnight becoming way better than they currently are.
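As a sketch of what that “less text out than in” shape looks like in code, here’s a hypothetical summarisation helper; the model call itself is deliberately left abstract rather than tied to any particular vendor’s API:

```typescript
// Sketch of the "large text in, small text out" shape of an LLM application.
// The model call itself is passed in, because the interesting part is the
// prompt shape, not any particular vendor's API.
type ModelCall = (prompt: string) => Promise<string>;

async function summarize(longDocument: string, callModel: ModelCall): Promise<string> {
  const prompt = [
    "Summarise the following document in three sentences.",
    "Do not add information that is not in the document.",
    "",
    longDocument,
  ].join("\n");

  // By construction the output is far smaller than the input, which is
  // exactly the kind of task Laurie's rule of thumb says LLMs do well.
  return callModel(prompt);
}
```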

HTML Is Actually a Programming Language. Fight Me | WIRED

When haters deny HTML’s status as a programming language, they’re showing they don’t understand what a language really is. Language is not instructing an interlocutor what to do in a way that leaves no room for other interpretations; it is better and richer than that. Like human language, HTML is conversational. It is remarkably adept at adapting to context. It can take a different shape on any machine, from a desktop browser or an e-reader screen to a mobile app or a screen reader for the blind (so long as that device is built to present hypertext).

Hell, yeah!

Ultimately, even as HTML has become the province of professionals, it cannot be gatekept. This is what makes so many programmers so anxious about the web, and sometimes pathetically desperate to maintain the all-too-real walls they’ve erected between software engineers and web developers.

Hell, yeeeeaaaaahhh!!!

What other programmers might say dismissively is something HTML lovers embrace: Anyone can do it. Whether we’re using complex frameworks or very simple tools, HTML’s promise is that we can build, make, code, and do anything we want.

CSS wants to be a system - daverupert.com

CSS wants you to build a system with it. It wants styles to build up, not flatten down.

Truth!

Tabloid: the clickbait headline programming language

Tabloid is a Turing-complete programming language for writing programs in the style of clickbait news headlines.

I don’t have time to learn React - Keith Cirkel

React is a non-transferable skill.

React proponents might claim that React will teach you modern UI, but from what I’ve seen it barely copes with modern UI. autofocus is broken, custom elements don’t work in all but the experimental version, using any “modern” features like dialog or popovers requires useEffect, and the synthetic event system teaches you so little about how DOM actually works. This isn’t modern UI, it’s UI from 2013 at its inception. I don’t have the time left in my career to pick up UI paradigms that haven’t evolved much beyond when Barack Obama was in office.

When I mentor early career developers and they ask me what they should learn, I can’t say React, they don’t have time. I mean sure, pick up enough React to land you the inevitable job doing it, but it’s not going to level up your career.

New CSS that can actually be used in 2024 | Thomasorus

Logical properties, container queries, :has, :is, :where, min(), max(), clamp(), nesting, cascade layers, subgrid, and more.

Help us choose the final syntax for Masonry in CSS | WebKit

I really like the way that the thinking here is tied back to Bert Bos’s original design principles for CSS.

This is a deep dive into the future of CSS layout—make a cup of tea and settle in for some good nerdiness!

Living In A Lucid Dream

I love the way that Claire L. Evans writes.