One Cup of Cappuccino Then I Go, by Paola Pivi

Saw this print while I was in Europe and liked it enough to buy a copy. Finally got it framed after several months, and now it’s on my wall. Turned out great.

Framed print of 'One Cup of Cappuccino Then I Go' by Paola Pivi. The print features a scene of many cappuccinos on the ground with a leopard walking in the background.

🔗 The internet used to be ✨fun✨

Lots of interesting posts here about the personal web, both current and old school. I’ve been ducking in and out of this for a week now. Via the HV Discord.

Users of Go: don’t fear the zero value. Resist the urge to use string pointers for things that can be left unset. We need not live like Java developers (let’s not even mention null and undefined that our poor JavaScript brethren have to deal with). Learn to embrace the one nothing we have.
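
To make that concrete, here’s a minimal sketch of what I mean (the type and field names are made up for illustration):

```go
// A small example of leaning on the zero value instead of a *string.
package main

import "fmt"

type Config struct {
	// Left unset, Theme is "" (the zero value), which we can treat as
	// "use the default". No pointer, no nil checks.
	Theme string
}

func themeOrDefault(c Config) string {
	if c.Theme == "" {
		return "light"
	}
	return c.Theme
}

func main() {
	var c Config                   // the zero value is perfectly usable as-is
	fmt.Println(themeOrDefault(c)) // "light"

	c.Theme = "dark"
	fmt.Println(themeOrDefault(c)) // "dark"
}
```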

Complexity Stays At the Office

It’s interesting to hear what others like to explore during their spare time, like setting up Temporal clusters or playing with frontend frameworks built atop five other frameworks built on React. I guess the thinking is that since we use these technologies for our jobs, it’s helpful to keep abreast of them.

Not me. Not any more. Back in the day I may have thought similarly. I may even have had a passing fancy for stuff like this, revelling in its complexity with the misguided assumption that it’ll equal power (well, to be fair, it would equal leverage). But I’ve been burned by this complexity one too many times. Why, just now, I’ve spent the last 30 minutes running into problem after problem trying to find a single root cause of something. It’s a single user interaction, but because it involves 10 different systems, it means looking in 10 different places, each one having its own issues blocking forward progress.

So I am glad to say that those days are behind me. Sure, I’ll learn new tech like Temporal if I need to, but I don’t go out looking for these anymore. If I want to build something, it would be radically simple: Go, SQLite or PostgreSQL, server-side rendered HTML with a hint of JavaScript. I may not achieve the leverage these technologies offer, but by gosh I’m not going to put up with the complexity baggage that comes with them.

Bought some “genuine” EarPods as emergency headphones I can keep in my bag. Wired earbuds are not my preferred listening device. Well, at least they’ll act as a deterrent against forgetting my headphones again.

A pair of white USB-C earbuds in a box with the name EarPods.

Well, it’s finally happened. I’ve left for work without my headphones. 🙁

Worse Is Better

Gabz’s latest post about ChatGPT’s ability to write “good” reviews gave me pause:

Here is the thing, every now and then I write about things that I like, among them, video games. I am also aware that although I am not a professional reviewer, I’d like my posts to come across with a certain level of, I don’t know, some quality/standards/mission/quest. […] Whenever I write a post about something I liked it is in a very unprofessional manner, or informal manner, rather, as if I was just talking to you. I just capture my thoughts as I type and perhaps that is the reason I am always digressing and all over the place.

I’m not one to tell Gabz how he should write on his own site, but I will give my opinion as a reader. And it’s this: I’d rather hear a review in your voice than one from some GPT. I read it because it sounds like a fellow human wrote it.

Anyone can put together a review for a piece of media with the “professional” (read: monotonous) air of a GPT. I’m sure there are a bunch of sites using ChatGPT for this right now (that is, if you can get through the barrage of ads they throw in your face). But that’s not why I come to your site. I read what you write because you wrote it. It’s your opinion, written in your own style.

That’s not to say that you shouldn’t post things from a GPT, or use it to make your writing better. Anyone who’s seen what I’ve posted knows of the various DALL-E images I’ve made over the past several months. Use it as the tool that it is, but be cautious about using it to write for you.

P.S. That’s generally why I prefer podcasts with a more casual tone over the more “produced” shows.

Photo Bucket Galleries and Using the HTML Popover API

Spent a bit more time on Photo Bucket this evening. Tonight I started working on galleries, which’ll work more or less like albums.

At the moment you can create a gallery and add a photo to it. Most of the work so far has been backend so the UI is pretty rough. Eventually you’ll be able to do things like rearrange photos within galleries, but for the moment they’ll just be added to the end.

I did need to add some modals to the UI, such as the New Gallery modal that asks for the gallery name. This gave me the opportunity to try out the new popover API. And yeah, it does exactly what it says on the tin: add the popover attribute to an element and it becomes a working popover (at least in the latest version of Vivaldi). Must say it’s impressive that this is now possible with HTML alone.
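
For reference, this is roughly the shape of the markup. The IDs, labels and form endpoint here are made up for illustration, not lifted from Photo Bucket:

```html
<!-- The button toggles the popover; no JavaScript needed. -->
<button popovertarget="new-gallery">New</button>

<div id="new-gallery" popover>
  <form method="post" action="/galleries">
    <label>Gallery name <input type="text" name="name"></label>
    <button type="submit">Create</button>
  </form>
</div>
```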

The initial version of the modals used a <div> for the popover target. And while that worked, there were some small annoyances. First was that the form within the popover didn’t get focus when the popover was displayed. It would be nice to click “New” and start typing out the gallery name. But this is a small thing that’s easily solvable with JavaScript, so it’s no big deal.

The second, slightly larger one, was that dismissing the popover by clicking outside of it doesn’t eat the click. If you were to click a sidebar link while the New Gallery modal is open, you’d end up on that newly selected page. I’m not a fan of this. Dismissing the popover feels like its own user gesture, and I fear the user accidentally activating things when all they’re trying to do is dismiss the popover (it’s not in place now, but I am planning to dim the background when the Create Gallery modal is visible).

Fortunately, there’s a simple solution to this. It turns out that replacing the <div> element with a <dialog> element solves both problems. It works seamlessly with the new popover attributes, and showing the dialog gives focus to the form and eats the click when the user dismisses it.
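
So the sketch above becomes the same markup with the <div> swapped for a <dialog> (again, names are illustrative only):

```html
<!-- Same as before, but with <dialog>: the popover attribute still
     handles show/hide, while the form gets focus on open and the
     dismissing click no longer falls through to elements underneath. -->
<button popovertarget="new-gallery">New</button>

<dialog id="new-gallery" popover>
  <form method="post" action="/galleries">
    <label>Gallery name <input type="text" name="name"></label>
    <button type="submit">Create</button>
  </form>
</dialog>
```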

Perfect. Looks like I’ve delayed the need for JavaScript a little longer (it will come eventually; it always will).

Yep, I think 22 minutes at 150°C works for toasting frozen hot cross buns in a cold oven. Toasty and warm without being burnt or too hot to eat. Might even be able to push it to 23 minutes but I probably wouldn’t go higher than 25.

I kinda wish there was a fan edit of Mad Men that condensed all 7 seasons into one that contains only the advertising aspects of the show. All the domestic, love interest, personal flashbacks, etc. aspects — unless they relate directly to the advertising plot lines — I can probably live without.

Photo Bucket lives, at least in an alpha state. It’s being used for the Folio Red Gallery, which would eventually consist of project screenshots that didn’t make it into the posts themselves. It looks ugly, and there are pretty large feature gaps, but it’s finally serving images.

Lost Album, Found

For the past 3.5 years, I’ve been searching on and off for a particular album: the original soundtrack to the David Attenborough series The Private Life of Plants. This lost album was quite elusive. It wasn’t on any of the streaming services, and I couldn’t find a digital copy to buy. The only place I found that had anything was some defunct online music store, archived by the Wayback Machine, that offered a handful of tracks to download.

Archived version of 2ndsight circa 2006 within the Wayback Machine
The album as it appeared in the archived version of 2ndsight. Note the five downloadable tracks.

It was better than nothing, so I downloaded what they had. But I still did searches for the full album occasionally, believing it to be out there somewhere, in one form or another.

Tonight, while doing a few other things, I tried another web search, this time including “BBC” in the search query. And I don’t know what the heck is wrong with the public web searches because I actually got a hit. Turns out that someone — maybe the original site owner — acquired the domain, put a site together, and uploaded a copy of the album for visitors to listen to online.

The modern version of 2ndsight, which has the full version of Music From The Private Life of Plants available to listen to
The full track list of Music From The Private Life of Plants on the modern version of 2ndsight.

You can imagine that I stopped what I was doing and focused on getting a copy. The site offered downloads, but they weren’t completely working. After a bit of time in the browser Dev Tools, though, I was able to organise my own. Yes, this may not be completely legal, but I’m justifying it this one time, since there’s no legitimate way to buy it and no knowing how long this site will be up.

It took a bit of time, but the full album now lives safely on my music server.

The album Music From The Private Life of Plants as it appears in Alto Catalogue, my music server
Music From The Private Life of Plants safe and sound on Alto Catalogue, my music server.

So the search is over. The long elusive album has been found! 🎉

Another attempt at working out how best to heat up hot cross buns from the freezer in a cold oven. Tried 150°C for 20 minutes. It’s better. I think it’s pretty close. But still not warm enough. Maybe 22 minutes next time.

Holding pattern.

A flock of seagulls flying around some chips on the ground while a person walks by, disturbing them.

Implicit Imports To Load Go Database Drivers Considered Annoying (By Me)

I wish Go’s approach to loading database drivers didn’t involve implicitly importing them as packages. If drivers had to be passed around explicitly, package authors would be more likely to get the driver from the caller, rather than load a driver themselves.

I’ve been bitten by this recently, twice. I’m using a GitHub Linux runner to build an ARM version of something that needs to use SQLite. As far as I can tell, it’s not possible to build an ARM binary with CGO enabled on these runners (at least, not without installing a bunch of dependencies — I’m not that desperate yet).

I’m currently using an SQLite driver that doesn’t require CGO, so all my code builds fine. There also exists a substantially more popular SQLite driver that does require CGO, and twice I’ve tried importing packages which used this driver, thereby breaking the build. These packages don’t allow me to pass in a database connection explicitly, and even if they did, I’m not sure it would help: they’re still importing this SQLite driver that needs CGO.

So what am I to do? As long as I need to build ARM versions, I can’t use these packages (not that I need an ARM version, but it makes testing in a Linux VM running on an M1 Mac easier). I suppose I could roll my own, but it would be nice not to do so. It’d be much better for me to load the driver myself, and pass it to these packages explicitly.
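
Here’s a rough sketch of the approach I’d prefer these packages took. The package and driver names are purely illustrative (I’m using modernc.org/sqlite as an example of a CGO-free driver; the actual packages involved aren’t named here):

```go
// The application imports and registers the driver once, at the edge,
// and hands the resulting *sql.DB to any library that needs it.
package main

import (
	"database/sql"
	"log"

	_ "modernc.org/sqlite" // CGO-free SQLite driver, registered as "sqlite"
)

// openGallery stands in for a third-party package's entry point. Because
// it accepts the connection from the caller, it never needs to import a
// driver itself, and it can't silently drag CGO into my build.
func openGallery(db *sql.DB) error {
	_, err := db.Exec(`CREATE TABLE IF NOT EXISTS galleries (id INTEGER PRIMARY KEY, name TEXT)`)
	return err
}

func main() {
	db, err := sql.Open("sqlite", "photobucket.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	if err := openGallery(db); err != nil {
		log.Fatal(err)
	}
}
```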

So yeah, I wish this was better.

P.S. When you see the error message “unable to open database file: out of memory (14)” when you try to open an SQLite database, it may just mean the directory it’s in doesn’t exist.
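
A small sketch of the workaround, in case future me hits this again (paths and driver are illustrative, as above):

```go
// Creating the parent directory up front avoids the misleading
// "unable to open database file: out of memory (14)" error.
package main

import (
	"database/sql"
	"log"
	"os"
	"path/filepath"

	_ "modernc.org/sqlite"
)

func main() {
	dbPath := "data/photobucket.db"

	// Make sure the directory the database lives in actually exists.
	if err := os.MkdirAll(filepath.Dir(dbPath), 0o755); err != nil {
		log.Fatal(err)
	}

	db, err := sql.Open("sqlite", dbPath)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// sql.Open is lazy; Ping forces the file to be opened (or created).
	if err := db.Ping(); err != nil {
		log.Fatal(err)
	}
}
```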

Deciding where data files should be placed on a Linux system. It’s a bit strange how /var/lib was chosen for this, instead of something like /var/local. I would’ve thought that’d make more sense, much like how binaries are placed in /usr/local.

Shows how much of a romantic I am: it took a 10 minute tram ride before I realised why so many people were carrying flowers. 💐

🔗 Most People Won’t

Via A Learning a Day by Rohan. It resonated with me as well.

Spent some time this evening working on my image hosting tool. It’s slowly coming along, but wow do I suck at UI design (the “Edit Photo” screen needs some rebalancing).

Screenshot of a browser showing an image admin section with a grid of images.
Screenshot of a browser showing a single image form with an image on the right and two text fields on the left.

It’s hot cross bun season again, and as always, I have to relearn how I heated them up last year. I thought the 15:150 rule would suffice: 15 minutes from frozen in a 150°C oven, starting from cold. They were edible, but they weren’t warm enough for my taste. I’ll have to bump it up next time.