These blog posts are fascinating to read. I don't have a personal blog, but if I did I'm sure I would've written a very similar post as I've been wrestling with similar thoughts over the last few weeks. I have the distinct sense that I will look back on February 2026 as an inflection point, where AI crossed over from being an interesting parlor trick to something that fundamentally and irreversibly altered what I do day-to-day. It's bittersweet, for sure - it feels inevitable that the craft of software development that I've loved for years will be seen as an archaic relic at some point in the not too distant future. It may be several years yet before the impact is broadly felt (the full impact of today's frontier models has yet to be felt by the general public - to say nothing of models that will be released in the next few years) but this train doesn't seem to be slowing down anytime soon. This post was a helpful reminder that who I am is not defined by the code I write (or don't write) - there's so much more to life than code.
One part of me wants to resist and tell you that our craft is not becoming an archaic relic; the other part already knows you're right. We just can't put the genie back in the bottle, and now's a good time to recalibrate your passion.
I look at it like this: Yes, AI can write code. It can write it much faster than I can. Sometimes it can also write it better than I can.
But: programming languages, libraries, and abstractions are not going away. It is still possible (and might always be possible) to get deep into the weeds of Python or Rust or whatever to understand how those work and really harness them to their full potential, or develop them further. It just won't be _compulsory_ (in most industries) if your only goal is to trade lines of code for dollars in your bank account.
> (the full impact of today's frontier models has yet to be felt by the general public - to say nothing of models that will be released in the next few years)
We definitely saw some kind of non-linear step-function jump in quality around the beginning of the year - it's hard to express how good Claude Opus/Sonnet 4.6 is now. However, I wonder if we're going to see the same kind of improvement from here. It's kind of like we got to the 80% point, but the next 20% is going to be a lot harder and take longer than that first 80% (Pareto principle). Also, as more and more of the code out there is AI-generated, it's going to be like the snake eating its own tail: training models on AI-generated code doesn't seem like it will lead to improvements.
There was some 3D printer slicer software I needed that wouldn't run; when I finally figured out why, it turned out to be an out-of-date GLIBC. I've used Debian since like 2008 and Ubuntu since the mid-2010s, so I'm accustomed to PPAs and whatnot, but something in me broke and I wanted to finally try something more bleeding-edge. I nearly went for Fedora, but the version I wanted to try didn't even boot (I don't like to waste any time on command-line incantations anymore), so I looked up EndeavourOS. I don't remember how I found it - I think a friend said someone they knew used it (turns out they don't, LOL) - so I gave it a shot.
I had bad experiences with Arch before because of Manjaro, but in hindsight I think the main issues I had were more to do with how insanely nuanced Pacman can get. When you update packages you have to know what you're doing, or things will update in weird ways; it's not like Debian or Ubuntu upgrades, which install and uninstall what you do and don't need unless you tell them to be that nuanced.
Long-term stability is less important for gaming computers than having the most cutting-edge (and theoretically highest-performance) drivers. That's why the community leans so heavily towards Arch.
I'm genuinely curious as to what the key differences are (especially those that would cause someone to switch), as someone who is pretty tech savvy but whose use of Linux as a daily driver is admittedly pretty weak.
You usually try a few distros until you find the one that does whatever you need, and then you stick with it for 15 years ;)
From my own experience: 15 years ago, when Linux was very niche (outside of academia), it was hard to use. Random rare errors would pop up. On Windows you would know someone who knew what to do, but with Linux? So I chose Ubuntu, because it had the most support. A solution to almost any error could be found on the AskUbuntu forums. But if you had a friend, you would choose their system and get help from them. I once had university admins very happy to help me with something and even give me some tips.
Nowadays it really doesn't matter that much, other than extra-easy installation of drivers (Pop!_OS?) and of the programs you initially used on Windows (with an LLM everything is already easy anyway; on MATE it takes 10 minutes thanks to a special GUI app store).
BUT there are reasons to switch. Like Ubuntu's pushing of very annoying snaps, making it very hard to get Firefox without a snap. Snaps are annoying because they don't have a cleanup mechanism, so old versions just clog your hard drive. They take forever to launch, and it's just not a good idea for a browser. I don't mind snaps for other things.
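For what it's worth, the disk clogging is at least recoverable: `snap list --all` marks superseded revisions as "disabled", and those can be removed. A minimal sketch (the sample output below is made up to keep this self-contained; on a real system you'd pipe the actual `snap list --all` output instead):

```shell
# Hypothetical sample of `snap list --all` output; column layout assumed
# to match snapd's (Name, Version, Rev, Tracking, Publisher, Notes).
sample='Name     Version   Rev   Tracking       Publisher  Notes
firefox  121.0     3600  latest/stable  mozilla    disabled
firefox  122.0     3650  latest/stable  mozilla    -
core20   20240111  2182  latest/stable  canonical  disabled'

# Disabled rows are old, kept-around revisions; print the removal command
# for each one (pipe to `sh` only after eyeballing the list).
echo "$sample" | awk '/disabled/ {print "sudo snap remove " $1 " --revision=" $3}'
```

Going forward, `sudo snap set system refresh.retain=2` tells snapd to keep only two revisions per snap, which limits how much old versions can pile up.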
There is also Desktop Environment support and support for hidpi monitors and such.
Other than that, there is a bit of philosophy. Like super-FOSS and idealistic, like Debian (I guess? Please correct me if I'm wrong). Or more business-aligned, like Red Hat/Fedora. Or elitist distros that like to waste their users' time and make them read the fdisk manual, like Arch, where you have to format your hard drive without GParted or any other GUI.
I'm no pro, but that's a little of what came to mind if you wanted to know what mattered in the past.
Not OP, but for some it might be the availability of the latest package versions (say, you've heard a new major version of Bash or Vim was released today and are wondering how soon it might show up in your distro's packages), and, as someone else mentioned, less update stress due to the lack of "major version bumps" - just remember to subscribe to https://archlinux.org/news/ and watch out for entries requiring "manual intervention".
I would say EndeavourOS is the "Ubuntu" to Arch, if you will. The installer is easy, and it comes with "yay" out of the box, a frontend to Pacman that holds your hand in just the right ways. If I want to update my OS, I type "yay" into a terminal, hit enter, confirm the packages that need updating (or select which ones I want), type my password, and that's it. In the past with Manjaro I did a system update with Pacman, and problems ensued.
Fellow SourceTree apologist here. It remains one of the first things I install on a new machine. I'll do simple stuff directly in the CLI, but stick with SourceTree for anything moderately complicated (as you've mentioned).
> It'd be great to change the default branch used for creating new workspaces.
Yeah you can actually change this now! If you click the repo name you can make changes to the "setup script". If you added `git checkout -b "branch name"` it would run that on every new workspace instance.
At the moment it's mostly Cursor or VS Code, but I was actually thinking of SourceTree. I'd like to look at the pending changes and manage the commits myself, and I could do that if I could add "open -a SourceTree ." as a custom command. I didn't see a place to edit a setup script, is that just on the filesystem?
And if you're one of today's lucky 10,000 and haven't heard of the concept of "lucky 10,000", you can read the relevant XKCD here: https://xkcd.com/1053/
For me, it's the fact that content generated by an LLM is fundamentally different than content that comes directly from a search index, but displaying them alongside each other conflates the two. Most people don't know the difference, and place the same level of importance (or maybe even more importance) on AI-generated content. Yes, this content is convenient. However, if the content isn't accurate or correct (which it may or may not be, given that it's just a statistically likely sequence of tokens) then is it actually beneficial as a whole?
That's a big part of what makes this game enjoyable - a clue that is very obvious to one person might not even cross the mind of someone else. To anyone reading this who hasn't played, it's definitely worth giving it a try.
Agreed - big fan of Codenames in general, but it plays best when you're playing against/alongside people you've known for a while. The metagaming aspect of structuring clues around who your partner is really takes it to the next level.
Great short story. Several times while reading it, I wished that I could download Abelique on the app store and try it out - I guess I'll have to settle for picking up my sketchbook instead.