Long Form Posts

    2024 Song of The Year

    It’s Christmas Eve once again, which means it’s time for the Song of The Year for 2024. Looking at the new and rediscovered albums for the year, there are quite a few to choose from.

    The runners-up are pretty much all from Lee Resevere, a new artist I’ve started listening to, and include:

    But there can only be one winner, and this year it’s Oxygene, Pt. 20 by Jean-Michel Jarre. 👏

     A globe is depicted with a skull emerging from its surface, set against a blue background with the text 'Jean-Michel Jarre Oxygene Trilogy' above it.

    Oxygene is actually an album in my regular rotation, but I always stopped listening to it after Part 19. The strange organ at the start of Part 20 put me off. But one day this year, feeling a little down, I decided to work my way through it and give it a listen, and after the first 30 or so seconds, it turned into quite a lovely piece. A nice contrast to the rest of the disc, and a suitable conclusion to the album itself. I’ve even grown to like the organ at the start.

    Honourable Mentions this year include:

    An actual bumper crop this year in terms of music. Let’s hope 2025 is just as good.

    That Which Didn't Make The Cut

    I did a bit of a clean-up of my projects folder yesterday, clearing out all the ideas that never made it off the ground. I figured it’d be good to write a few words about each one before erasing them from my hard drive for good.

    I suppose the healthiest thing to do would be to just let them go. But what can I say? Should a time come in the future where I wish to revisit them, it’d be better to have something written down than not. It wouldn’t be the first time I wished this was so.

    Anyway, here are the ones that were removed today. I don’t have dates of when these were made or abandoned, but it’s likely somewhere between 2022 and 2024.

    Interlaced

    This was an idea for a YouTube client1 that would’ve used YouTube’s RSS feeds to track subscriptions. The idea came about during a time when I got frustrated with YouTube’s ads. I think it was an election year and I was seeing some distasteful political ads that really turned me off. This would’ve been a mobile app, most likely built using Flutter, and possibly with a server component to get this working with Chromecast, although I had no idea how that would work.

    This never got beyond the UI mock-up stage, mainly because the prospect of working on something this large seemed daunting. Probably just as well, as YouTube solved the ads problem for me with the release of YouTube Premium.

    Auto-generated description: A smartphone interface mockup displays a channels list with annotations highlighting features like a navigation tab, subscription indicators, filter options, and a Chromecast button.

    Red Crest

    I thought I could build my own blogging engine and this is probably the closest I got (well, in recent years). This project began as an alternative frontend for Dave Winer’s Drummer, rendering posts that would be saved in OPML. But it eventually grew into something of its own with the introduction of authoring features.

    I got pretty far on that front, allowing draft posts and possibly even scheduled posts (or at least the mechanics for scheduled posts). One feature I did like was the ability to make private posts. These would be interleaved with the public ones once I logged in, giving me something of a hybrid between a blogging CMS and a private journal. It was also possible to get these posts via a private RSS feed. I haven’t really seen a CMS do something quite like this. I know of some that allow posts to be visible to certain cohorts of readers, but nothing for just the blog author.

    In the end, it all got a bit much. When I started preparing the screen for uploading and managing media, I decided it wasn’t worth the effort. After all, there were so many other blogging CMSes already out there that did 90% of what I wanted.

    Reno

    As in “Renovation”. Not much to say about this one, other than it being an attempt to make a Pipe Dreams clone. I think I was exploring a Go-based game library and I wanted to build something relatively simple. This didn’t really go any further than what you see here.

    Auto-generated description: A grid of dark squares is displayed on a computer screen, with one square featuring two horizontal white lines.
    Auto-generated description: A grid of interconnected circuit-like lines on a dark background.
    Tileset free for anyone who wants it.

    SLog

    Short for “Structured Log”. This was a tool for reading JSON log messages, like the ones produced by zerolog. It’s always difficult to read these in a regular text editor, so being able to list them in a table made sense to me. This one was built for the terminal, but I did make a few other attempts at building something for this: one using a web-based GUI tool, and another as a native macOS app. None of these went very far — turns out there’s a lot of tedious code involved — but this version was probably the furthest along before I stopped work.

    Despite appearing on this list, I think I’ll keep this one around. The coding might be tedious, but I still need something like this, and spending the time to build it properly might be worth it one day.

    Auto-generated description: A terminal window displays log messages with levels and a table summarizing error, ID, level, message, and time values.
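
    For what it’s worth, the core idea is tiny. Here’s a rough Go sketch of the sort of filter I mean — not SLog’s actual code, just an illustration that turns zerolog-style JSON lines into tab-separated rows:

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// formatLogLine parses a single JSON log line (zerolog uses "level",
// "time", and "message" keys by default) and returns a tab-separated
// row. Non-JSON lines are passed through untouched.
func formatLogLine(line string) string {
	var entry map[string]any
	if err := json.Unmarshal([]byte(line), &entry); err != nil {
		return line
	}
	get := func(key string) string {
		if v, ok := entry[key]; ok {
			return fmt.Sprint(v)
		}
		return ""
	}
	return get("level") + "\t" + get("time") + "\t" + get("message")
}

func main() {
	// Read log lines from stdin, e.g.: ./myapp 2>&1 | slog
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		fmt.Println(formatLogLine(sc.Text()))
	}
}
```

    Of course, the tedious part is everything after this: column widths, filtering, paging, and all the extra fields a real log line carries.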

    Miscellany

    Here are all the others that didn’t even get to the point that warranted a screenshot or a paragraph of text:

    • s3-browse: a TUI tool for browsing S3 buckets. This didn’t go beyond simply listing the files of a directory.
    • scorepeer: An attempt to make a collection of online score-cards much like the Finska one I built.
    • withenv: Preconfigure the environment for a command with the values of an .env file (there must be something out there that does this already).
    • About 3 aborted attempts to make a wiki-style site using Hugo (one called “Techknow Space” which I thought was pretty clever).

    I’m sure there’ll be more projects down the line that would receive the same treatment as these, so expect similar posts in the future.


    1. Or possibly a Peertube client. ↩︎

    A Summer Theme

    Made a slight tweak to my blog’s theme today, to “celebrate” the start of summer.

    I wanted a colour scheme that felt suitable for the season, which usually means hot, dry conditions. I went with one that uses yellow and brown as the primary colours. I suppose red would’ve been a more traditional representation of “hot”, but yellow felt like a better choice to invoke the sensation of dry vegetation. I did want to make it subtle though: it’s easy for a really saturated yellow to be quite garish, especially when used as a background.

    My original idea was to use yellow as the link colour, but there wasn’t a good shade that worked well with a white background while still having decent contrast1. So I pivoted, making the background yellow instead, and throwing in a brown for the link colour. That improved the contrast dramatically, and helped make the theme a little more interesting.

    One thing I did do was make it conditional on the data-theme attribute in the html tag, leaving me the option of adding a theme picker in the future. If you’re interested in the CSS, here it is:

    :root[data-theme="summer"] {
        --background: #FFFEF5;
        --link: #895238;
        --link_visited: #895238;
    }
    
    @media (prefers-color-scheme: dark) {
        :root[data-theme="summer"] {
            --text: #f8f8f2;
            --link: #fce98a;
            --link_visited: #fce98a;
            --background: #30302a;
        }
    }
    

    I plan to keep this theme for the next three months, then look at changing it again when summer turns into autumn. It’s probably not a great colour scheme, as I generally don’t have the patience for making minute adjustments to get the style “just right”. I guess it follows on from my feeling of the season: I generally don’t like summer and I just want to get it over with. Perhaps doing something small like this is a way of enjoying it a little more.


    1. It was much easier for the dark theme. ↩︎

    Delta of the Defaults 2024

    It’s a little over a year since Duel of the Defaults, and I see that Robb and Maique are posting their updates for 2024, so I thought I’d do the same. There’ve only been a few changes since last year, so much like Robb, I’m only posting the delta:

    • Notes: Obsidian for work. Notion for personal use if the note is long-lived. But I’ve started using Micro.blog notes and Strata for the more short-term notes I make from day to day.
    • To-do: To-do’s are still going to where my notes go, which is now Strata. Although I’m still using Google Keep for shopping lists.
    • Browser: Safari’s out from all machines. It’s Vivaldi everywhere now. Except iPad, where I don’t have a choice.
    • Presentations: Believe it or not, I haven’t had to make a presentation since last year. I am still paying for iA Presenter, and despite some thoughts, I think I’ll continue to use it for future presentations. I should also add that I’m using iA Writer for prose writing, not that I do much of that either.
    • Social Clients: Still Tusky for now, but I’m wondering how long it’ll be before I install a BlueSky client too.
    • Journalling App: I didn’t include this in last year’s list, but it’s worth mentioning here as I’ve moved away from Day One to a home grown web-app, similar to the one built by Kev Quirk.
    • POSSE: Micro.blog, EchoFeed. Also a new category, now that I’m doing this a bit more.

    Looking At Coolify

    While reading Robb Knight’s post about setting up GoToSocial in Coolify, I got curious as to what this Coolify project actually is. I’m a happy user of Dokku, but being one with magpie tendencies, plus always on the lookout for ways to make the little apps I make for myself easier to deploy, I thought I’d check it out.

    So I spun up a Coolify instance on a new Hetzner server this morning and tried deploying a simple Go app, complete with automatic deployments when I push changes to a Forgejo repository. And yeah, I must say it works pretty well. I haven’t done anything super sophisticated, such as setting up a database or anything. But it’s almost as easy as deploying something with Dokku, and I’m pleased that I was able to get it working with my Forgejo setup1.

    Anyway, this post is just a few things I want to make a note of for the next time I want to set up a Coolify instance. It’s far from a comprehensive set-up guide: there’s plenty of documentation on the project website. But here are a few things I’d like to remember.

    Changing the Proxy to Caddy: Soon after setting up your Coolify instance, you probably want to change the proxy to Caddy, just so that you can easily get Let’s Encrypt certificates. Do this before you set up a domain, as you’ll need direct access to Coolify via the port.

    Go to “Servers → localhost” and in the “Proxy” tab, stop the current proxy. You then have the option of changing it to Caddy.

    Setting Up a Domain For Coolify Itself: Once you’ve changed the proxy, you’ll probably want to set up a domain to avoid accessing Coolify via IP address and port number. You can do so by going to “Settings,” and within “Instance Settings” changing “Domain”.

    If you prepend your domain with https, a certificate will be set up for you. I’m guessing it’s using Let’s Encrypt for this, which is fine. I’d probably do likewise if I had to set it up manually.

    Deploying From a Private Forgejo Repository: To deploy from a private Forgejo repository, follow the Gitea integration instructions on setting up a private key. This basically involves creating a new key in “Keys And Tokens”, and adding it as a key in Forgejo.

    The Add Key modal showing options to generate an RSA or elliptical curve key
    The Add Key modal

    As far as I’m aware, it’s not possible to change an application source from a public Git repo to a private one. I tried that and I got a few deploy errors, most likely because I didn’t set the key. I had to delete it and start from scratch.

    Setting a Domain For a Project: Setting up a domain is pretty simple: just add a new A record pointing to the IP address of the service the application is running on. Much like the Coolify domain, prefacing your domain with https will provision a TLS certificate for you (docs):

    The Domain settings for the deployable project resource
    The Domain settings for the deployable project resource

    Unlike Dokku, your app doesn’t need to support the PORT environment variable. You should be able to start listening on a port and simply set up a mapping in the project page. The default seems to be port 3000, just in case you’re not interested in changing it:
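
    To illustrate (this is my own hello-world sketch, not anything Coolify ships): a deployable Go app can simply hard-code its port and rely on the mapping:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// greeting is split out from the handler so it's easy to test.
func greeting() string {
	return "Hello from Coolify"
}

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, greeting())
	})
	// No PORT env var lookup needed: listen on a fixed port and map
	// it in the Coolify project page (3000 is the default mapping).
	log.Fatal(http.ListenAndServe(":3000", nil))
}
```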

    Automatic Deployments From Forgejo: Coolify looks to have some nice integrations with GitHub, but that doesn’t help me and my use of Forgejo. So the setup is a little more manual: adding a web-hook to automatically deploy when pushing commits to Forgejo. In Coolify, you’d want to use the Gitea web-hook:

    The web-hook settings for the deployable project resource, with the Gitea web-hook highlighted
    The Gitea web-hook is the one to use

    You’ll need to generate the web-hook secret yourself. Running head -c 64 /dev/urandom | base64 or similar should give you something somewhat secure.

    Setting up the web-hook on Forgejo’s side was a little confusing. Clicking “Add Webhook” just brought up a list of integrations, which I’m guessing are geared towards particular forms of web-hook payloads. You want to select the “Forgejo” one.

    Project web-hooks in Forgejo, with the Gitea web-hook URL from Coolify set as the target, the secret set, and everything else left as the default
    How the web-hook looks on Forgejo's side

    Use the URL that Coolify is showing for the Gitea web-hook, leave the method as “POST”, and set the secret you generated. The rest you can configure based on your preferences.

    So that’s it. So far I’m liking it quite a bit, and I look forward to going a bit further than simple Go apps that serve a static message (some of the pre-canned applications look interesting). I’d like to try it for a bit longer before I consider it as a replacement for Dokku, but I suspect that may eventually happen.

    A screenshot of a browser window with a plain text message saying: 'Hello World. This is deployed via Coolify via a private repo that is auto-deployed.'
    Hello Coolify

    1. I say “it’s almost as easy” as Dokku, but one thing going for Coolify is that I don’t need to SSH into a Linux box to do things. When it comes to creating and operating these apps, doing it from a dashboard is a nicer experience. ↩︎

    Cropping A "Horizontal" PocketCast Clip To An Actual Horizontal Video

    Finally fixed the issue I was having with my ffmpeg incantation to crop a PocketCast clip. When I was uploading the clip to Micro.blog, the video wasn’t showing up. The audio was fine, but all I got for the visuals was a blank void1.

    For those that are unaware, clips from PocketCast are always generated as vertical videos. You can change how the artwork is presented between vertical, horizontal, or square; but that doesn’t change the dimensions of the video itself. It just centers it in a vertical video geared towards TikTok, or whatever the equivalent clones are.

    This, I did not care for. So I wanted to find a way to crop the videos to dimensions I find more reasonable (read: horizontal).

    Here’s the ffmpeg command I’m using to do so. This takes a video of the “horizontal” PocketCast clip type and basically does a crop at the centre to produce a video with the 16:9 aspect ratio. This post shows how the cropped video turns out.

    ffmpeg -i <in-file> \
      -vf "crop=iw:iw*9/16:(iw-ow)/2:(ih-oh)/2, scale=640:360" \
      -vcodec libx264 -c:a copy <out-file>
    

    Anyway, back to the issue I was having. I suspect the cause was that the crop was producing a video with an uneven width. When I uploaded the cropped video to Micro.blog, I saw in the logs that Micro.blog was downscaling the video to a height of 360. This was using a version of the command that didn’t have the scale filter, and the original clip was 1920 x 1080. If you downscale it while maintaining the original 16:9 aspect ratio, the new dimensions should be 640 x 360. But for some reason, the actual width of the cropped video was 639 instead.

    I’m not sure if this was the actual problem. I had no trouble playing the odd-width video in QuickTime. The only hint I had that this might be a problem was when I tried downscaling in ffmpeg myself, and ffmpeg threw up an error complaining that the width was not divisible by two. After forcing the video size to 640 x 360, and uploading it to Micro.blog, the video started coming through again. So there might be something there.

    Anyway, it’s working now. And with everything involving ffmpeg, once you get something working, you never touch it again. 😄


    1. Not that there’s much to see. It’s just the podcast artwork. Not even a rendered scrubber. ↩︎

    WeblogPoMo AMA #3: Best Music Experience

    I’m on a roll with these, but I must warn you, this streak may end at any time. Anyway, today’s question is from Hiro, who asked it of Gabz, and which I discovered via Robb:

    @gabz What’s the best music-related experience of your life so far?

    Despite attending only a handful of concerts in my life — live music is not really my jam — I’ve had some pretty wonderful music-related experiences, both through listening to music and by performing it. Probably my most memorable experience was playing in the pit orchestra for our Year 10 production of Pippin. This was during the last few weeks before the show opened, and we attended a music camp for a weekend to do full-day rehearsals with the music director. The director had a reputation of being a bit of a hard man, prone to getting a bit angry, and not afraid to raise his voice. It was intimidating to me at the time, but in hindsight I can appreciate that he was trying to get the best from us. And with us being a group of teenage boys who were prone to losing focus, I think we were deserving of his wrath.

    One evening, we were rehearsing late, and the director was spending a lot of time going through some aspect of the music. I can’t remember what was being discussed, but it was one of those times where everyone was tired, yet each of us knew what we were meant to be doing and was still happy to be working. You feel something special during those moments, when the group is doing its best, not out of coercion, but because everyone is trying to “get the work done”.

    Probably a very close second was discovering Mike Oldfield for the first time. This was probably when I was 11 or 12, and I wasn’t a big music listener back then (I mean, we did have a home stereo, but I wasn’t listening to a Walkman or anything like that). Dad was working one night and I came up to him. He then started playing track 1 of Tubular Bells II, thinking that I would appreciate it. I was more intrigued than anything at first, as it wasn’t the type of music I was used to at the time: long, instrumental pieces. Yet I found it to be decent, and something I could see myself liking in the future1. He then played track 7, and I was absolutely hooked after that.


    1. In my experience, the tracks that take some time to grow to like turn out to be the best ones to listen to. ↩︎

    WeblogPoMo AMA #2: One Thing I Wish I Could Change About Myself

    Here’s my answer to another question asked by Annie for WeblogPoMo AMA. This was previously answered by Keenan, Estebanxto, Kerri Ann, and Lou Plummer:

    If you could instantly change one internal pattern/thing about yourself, what would it be?

    My answer is that I wish I found it easier meeting new people. Not only am I quite introverted, I’m also really shy, and I find it extremely hard to introduce myself to new people in social situations. That is, if I ever find myself going to these social situations. I rarely do, and if I do attend, I usually stay quietly to the side, keeping with company that I know. It was at one time bad enough that I’d find excuses to avoid going out to see those I do know.

    I’m trying to get better at this. For starters, I’m no longer staying away from friends, and I am trying to make the effort to go to more social events as they come. It’s still not great though, and I do struggle when I’m around a group of strangers. I guess the secret is just practice, and maybe trying to make a game of it: setting goals like saying hello to at least one new person every hour or so. I don’t think I’ll ever get over my shyness, but I’m hoping I can find a way to at least manage it a little better than I have been.

    Phaedra, The lmika Track Arrangement

    I recently learnt that the version of Phaedra I’ve been listening to for the past 15 years had not only the wrong track order, but also the wrong track names. This is not entirely surprising, given how this version was… ah, acquired.

    But after learning what the order and names should’ve been, I think I still prefer my version. And yes, that’s probably because I’m used to it, but if the official album were to have these names and this order, I think it would actually work really well. I may go so far as to say that if I got a copy of the official album, I’d probably change it to match the version I’ve been listening to.

    In case you’re curious, here’s how the tracks are named in my version:

    Official Version                                  | lmika Version
    Phaedra                                           | Mysterious Semblance At The Strand Of Nightmares
    Mysterious Semblance At The Strand Of Nightmares  | Phaedra
    Movements Of A Visionary                          | Sequent ‘C’
    Sequent ‘C’                                       | Movements Of A Visionary

    I’m actually a little surprised that my version of Sequent ‘C’ is officially called Movements Of A Visionary and vice versa. The name Movements Of A Visionary gives it a more mysterious feeling, which fits well with the small, soft, reverb-filled piece of music that it is. As for the track which has that name officially… well, I just assumed the name Sequent ‘C’ made the most logical sense for a piece of music with a sequencer in the key of C. I don’t have an explanation for Phaedra or Semblance other than “long piece == long title,” but Phaedra just feels like a title that fits better for a piece of music that predominantly features a Mellotron.

    The tracks in the version I listen to are arranged in the following order:

    No. | Official Version Name                            | lmika Version Name
    1.  | Sequent 'C'                                      | Movements Of A Visionary
    2.  | Phaedra                                          | Mysterious Semblance At The Strand Of Nightmares
    3.  | Mysterious Semblance At The Strand Of Nightmares | Phaedra
    4.  | Movements Of A Visionary                         | Sequent 'C'

    The fact that Phaedra is the first track in the official version makes sense, given that on vinyl it would’ve taken up an entire side, but I reckon starting the album with a small, soft piece — acting almost like a prelude — whets the appetite for the heavier stuff. This would be track two, which is 17 minutes long and quite dynamic in its contrast across the piece. You then climb down from that into what I thought was the title track, which — given that it appears as the third one in my version — gives the artists an opportunity to have something simpler act as the centrepiece of the album. Then you end with a relatively lively piece with a driving sequencer, which finishes with a decisive C(7) chord, making it clear that the album is now over.

    So that’s how I’d name and arrange the tracks in this album. I don’t want to say that Tangerine Dream got it wrong but… they did get it pretty wrong. 😀

    My Favourite Watch

    All the nostalgia for digital watches of the ’90s and early 2000s, following the release of a retro desk clock shaped like a large Casio digital watch, got me thinking of the watches I owned growing up. I started off as a Casio person but eventually moved on to Timex watches. I was pretty happy with all the watches I owned, but my favourite was the Timex Datalink USB Sports Edition, which stood head and shoulders above the rest.

    Auto generated description:  A Timex Ironman digital watch with a black strap displays the time as 3:41 and is water-resistant up to 100 metres
    Source: Hamonoaneraea (site no longer online)

    Not only was this watch featureful out of the box — having the usual stopwatch, timers, and alarms — it was also reprogrammable. There was some Windows software that allowed you to install new modes and arrange them in the mode menu. I remember a few of these, such as a mode allowing you to browse data arranged in a tree; a simple note taking mode; and a horizontal game of Tetris.

    There was also an SDK, allowing you to build new modes in assembly. I remember building a score-keeping mode, where you could track points for a game between two or four competitors, with an optional auxiliary counter used to keep track of things like fouls. I also remember building a dice-rolling mode, allowing you to roll up to 6 dice, with each die having between 2 and 9 sides, and the total automatically displayed to you.

    I never used these modes for anything — I’m neither sporty nor much of a gamer to have any real need for tracking scores or rolling multiple dice — but they were super fun to build, and I got a fair bit of experience learning assembly from it. And the SDK was pretty well built, with predefined entry points for the mode, reacting to events like button presses, and displaying things on the LCD. The fact that the SDK came with a random-number generator, which wasn’t even used with any of the built-in modes, just showed how well Timex thought about what was possible with this watch.

    This was the last watch I regularly wore: I’ve moved to using phones to keep track of time. But it was a great watch while it lasted.

    Why I Keep Multiple Blogs

    Kev Quirk wrote a post yesterday wondering why people have multiple blogs for different topics:

    A few people I follow have multiple blogs that they use for various topics, but I don’t really understand why. […] I personally prefer to have a single place where I can get all your stuff. If you’re posting about something I’m not interested in, I’ll just skip over it in my RSS feed. I don’t have to read everything in my feed reader.

    I’ve written about this before, and after taking a quick look at that post, most of those reasons still stand. So if you’ve read that post, you can probably stop reading this one at reason number two (unless you’re listening to the audio narration of this, in which case, please listen on as that last post predated that feature 🙂).

    I’m currently keeping four separate blogs: this one, one for checkins to places I’ve visited, one for remembering how to do something for work, and one for projects I’m working on1. This arrangement came about after a few years of spinning out and combining topics to and from a single blog, generally following the tension I felt after publishing something, wondering if that was the right place for it. As strange as it is to say it, this multi-blogging arrangement gives me the lowest amount of tension for writing online.

    There are a few reasons for this. First is that for certain topics, I like an easy way to reference posts quickly. This is the primary reason why I keep that work-related reference blog, so that when I’m faced with a software-related problem I know I’ve seen in the past, I can quickly lookup how I solved it. I’ve tried keeping those posts here, but it was always difficult finding them again amongst all the frequent, day-to-day stuff.

    It mainly comes down to the online reading experience. Categories can only do so much, and that’s if I’m categorising posts rigorously, which is not always the case. Here, the posts are displayed in full, encouraging the reader to browse. But for my reference blog, a list of bare links works better for going directly to the post I need.

    The second reason is the writing experience. For me, certain CMSes work better for certain types of posts. Micro.blog works well for micro-posts or mid-sized posts like this one, but for longer ones, I prefer the editors of either Scribbles or Pika. I don’t know why this is. Might be because all the code-blocks I tend to use on those blogs are easier to write using a WYSIWYG editor rather than Markdown.

    And finally, it’s a good excuse to try out multiple CMSes. I have no rational explanation for this one: it’s an arrangement that costs me more money and requires learning new software. Might be that I just like the variety.

    So that’s why I keep multiple blogs. I do recognise that it does make it harder for others to find my writing online, not to mention following along using RSS. But that’s a tradeoff I’m willing to make for a writing and reading arrangement that works better for me. Of course, like I said in my previous post, this might change in due course.


    1. Actually, I have a fifth blog which is for projects I’m working on that I rather keep private. Oh, and a sixth, which is a travel blog that I really should maintain better. Might be that I have a few too many blogs. ↩︎

    On Panic, iA, and Google Drive

    I see that Panic is shutting down their Google Drive integration in their Android app, much like iA did a few weeks ago. This doesn’t affect me directly: even though I am a user of both Android and Google Drive, I regret to say that I don’t use apps from either company on my phone (I do use a few things from both on my Apple devices).

    But I do wonder why Google is enacting policies that push developers away from using Drive as general-purpose user storage. That’s what Drive was meant to be used for, no? Does Google not think that adding these security conditions, and not getting back to developers trying to satisfy them, pushes the scale between security and usefulness a bit too far out of balance? Are they thinking through the implications of any of this at all?

    If you were to ask me, my guess would probably be that no, they’re not thinking about it. In fact, I get the sense that they’re making these decisions unconsciously, at least at an organisational level. Probably someone said to the Drive division that they need to “improve security” and that their performance will be measured against them doing so. So they drafted up these conditions and said “job done” without thinking through how it may affect the actual usefulness of Drive.

    And it just reveals to me how large Google is, possibly too large to know why it does anything at all. It’s not like they’re being malicious or anything: they’re a victim of their own success, with way too many product lines making zero dollars that distract them from their raison d’être, which is getting that sweet, sweet ad money. After all, what does Drive matter to Google in terms of increasing advertising revenue? It’s probably a division making a loss more than anything else.

    I suppose, given that I do use both Drive and Android, that I should care more about it. And yeah, I care enough to write about it, but that’s barely above the level of mild curiosity I’m feeling as to why Google is letting this happen. Might be that I’ve just gotten numb to Google not caring about their own products themselves.

    Passing

    Three nights ago, and two months before her 94th birthday, my Nonna, my maternal grandmother, suffered a stroke. She’s now in palliative care and there’s no telling how much longer she has left. Over the last few years she was slowing down, yet was still quite aware and was able to do many things on her own, even travel to the shops by bus. She had a scare over the weekend but was otherwise in reasonably good health. So all of this is incredibly sudden.

    I was unsure as to whether I actually wanted to write this post. I did have a draft planned yesterday, with the assumption that she wouldn’t make it through the night. Delaying it any further did not seem right. Neither did making this a eulogy or a display of public grief — that’s not how I like to do things. But to not acknowledge that any of this is happening felt just as wrong, at least for now.

    But what seemed right was a public declaration that I love her and I’ll miss her. I consider myself lucky to have said that to her in person, while she was lucid.

    So, what now? Timelines at this stage are uncertain. Would it be hours? Days? Who can say. I guess following that would be the funeral and other matters pertaining to the estate, but that won’t happen for a week or so. What about today? Does one simply go about their day as one normally would? Does life go on? Seems wrong that it should be so, yet I’m not sure there’s anything else that I’m capable of doing. Just the daily routine smeared with sadness and loss.

    I heard someone say that grief comes from love, that you can’t have one without the other. I can attest to that, but the edges of that double-edged sword are razor sharp. I know that eventually the pain will dull, and all that will remain are the memories. All it takes is time.

    Tools And Libraries I Use For Building Web-Apps In Go

    I think I’ve settled on a go-to set of tools and libraries for building web-apps in Go. It used to be that I would turn to Buffalo for these sorts of projects, which is sort of a “Ruby on Rails but for Go” type of web framework. But I get the sense that Buffalo is no longer being maintained. And although it was easy to get a project up and running, it was a little difficult to go beyond the CRUD-like layouts that it would generate (or it didn’t motivate me enough to do so). Plus, all that JavaScript bundling… ugh! Huge pain to upgrade any of that.

    Since I’ve moved away from Buffalo, I’m now left to do more of the work up-front, but I think it helps me be a little more deliberate in how I build something. And after getting burned by Buffalo shutting down, I thought it was time to consider a mix of tools and libraries that would give me the greatest level of code stability while still being relatively quick to get something up and running.

    So, here’s my go-to list of tools and libraries for building web-apps in Go.

    • HTTP Routing: For this, I use Fiber. I suppose using Go’s built-in HTTP router is probably the best approach, but I do like the utility Fiber gives for doing a lot of the things that go beyond what the standard library provides, such as session management and template rendering. Speaking of…
    • Server Side Templating: Nothing fancy here. I just use Go’s template engine via Fiber’s Render integration. It has pretty much all I need, so I don’t really look at anything else.
    • Database: If I need one, then I’ll first take a look at SQLite. I use the modernc.org SQLite driver, as it doesn’t require cgo, making deployments easier (more on that later). If I need something a bit larger, I tend to go with PostgreSQL using the pgx driver. I would also like to use StormDB if I could, but it doesn’t play well with how I like to deploy things, so I tend to avoid it nowadays.
    • Database ORM: I don’t really use an ORM (too much PTSD from using the various Java ORMs), but I do use sqlc to generate the Go code that interacts with the database. It’s not perfect, and it does require some glue code which is tedious to write. But what it does it does really well, and it’s better than writing all that SQL marshalling code from scratch.
    • Database Migration: I’ve tried using golang-migrate before, and we do use it at work for PostgreSQL databases, but it doesn’t work well with the modernc.org Sqlite driver. So I ended up writing my own. But if it makes sense to use golang-migrate, I will.
    • JavaScript: I try to keep my JavaScript usage to a minimum, favouring vanilla JavaScript if I only need a few things. For anything else, I usually turn to Stimulus.js, which adds just enough “magic” for the slightly more involved pieces of front-end logic. I’m also looking at HTMX, and have tried it for a few things, but I’ve yet to use it for a complete project. I use esbuild if I need to bundle my JavaScript, but I’m trying to go “builderless” for most things nowadays, relying on import maps and just serving the JavaScript as is.
    • CSS: Much like JavaScript, I still prefer to use vanilla CSS served directly for most things. I tend to start new projects by importing Simple.css by Kev Quirk. It makes the HTML look good right out of the gate, though it does make each project look a little “samey”; that’s up to me to address.
    • Live Reloading: I’ve only recently become a convert to live reloading. I did use it when I was bundling JavaScript, but since moving away from that, plus doing most things server-side anyway, I needed something that would rebuild the entire app. I’ve started using Air for this, and it’s… fine. There are certain things that I don’t like about it — particularly that it tends to favour configuration over convention — but it does the job.
    • Deployment: Finally, when I’m ready to deploy something, I do so using Dokku running on a Linux server. I bundle the app in a Docker container, using a Go builder image and a scratch image for the run-time container (this scratch container has nothing else in it, not even libc, which is why I use the modernc.org SQLite driver). All I need to do is run git push, and Dokku does the rest. Dokku also makes it easy to provision PostgreSQL databases with automated backups, and HTTPS certificates using Let’s Encrypt. Deploying something new does involve logging into the remote server to run some commands, but having been burned by PaaS providers that are either too pricy, or not pricy enough to stay in business, I’ve found this setup to be the most stable way to host apps.
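    To make that deployment step a little more concrete, here’s roughly what such a multi-stage Dockerfile could look like. This is just a sketch: the Go version, module layout, and port are assumptions, not my actual setup.

    ```dockerfile
    # Build stage: compile a fully static binary. CGO_ENABLED=0 is what makes
    # the modernc.org SQLite driver necessary, as the final image has no libc.
    FROM golang:1.22 AS build
    WORKDIR /src
    COPY . .
    RUN CGO_ENABLED=0 go build -o /app ./cmd/server

    # Run stage: scratch contains nothing at all, not even a shell.
    FROM scratch
    COPY --from=build /app /app
    EXPOSE 3000
    ENTRYPOINT ["/app"]
    ```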

    So, that’s my setup. It’s a collection that’s geared towards keeping the code low maintenance, even if it may come at the cost of scalability. I can’t tell you anything about that myself: I’m not running anything that has more than a couple of users anyway, and most things I’m running are only being used by myself. But I think that’s a problem for later, should it ever arise.

    Micro-fiction: Get A Horse

    Trying something new here. I came up with the concept of this short-story while riding home on the tram yesterday. The rest of it sort-of fell into place when I woke up at 5AM this morning, unable to get back to sleep. Hope you enjoy it.

    Josh was riding the scooter on the city footpath, not trying super hard to avoid the other pedestrians. He was going at a speed that was both unsafe and illegal, but it was the only speed he knew that would prevent that horse from showing up. Besides, he had something that he needed to do, and it was only at such reckless speeds that he knew that that thing would work. Well, he didn’t know; but being at his wits' end after trying everything else, he had to try this. He picked his target ahead and sped up towards it. Good thing he was wearing his helmet.

    Josh had never used these sorts of scooters before the collision two weeks ago. He was walking to work that day when he saw someone on such a scooter coming towards him, helmet on head. The rider was going at a ridiculous speed, and Josh tried to get out of his way as he approached, but the rider turned towards him, not slowing down at all. Josh tried again but was not fast enough. The rider ran straight into him and bowled him over onto the footpath. Before Josh could gather himself, the rider slapped his helmet onto Josh’s head and shouted, “Get a horse!” He got back onto the scooter and sped away.

    Josh got up, fighting the various aching muscles from the fall. He dusted himself down, took the helmet from his head and looked at it. It was very uncharacteristic of those worn by scooter riders. Most of them were plastic things, either green or orange, yet this one was grey, made of solid leather that was slightly fuzzy to the touch. Josh looked inside the rim and found some printed writing: Wilkinsons Equestrian Helmet. One side fits all. The word one was underlined with black marker.

    Josh put the helmet in his backpack and was about to resume his commute when he stopped in place. Several metres away, a white horse stood, staring at him. Or at least it looked like a horse. The vision was misty and slightly transparent, giving the sense that it was not real. Yet after blinking and clearing his eyes, it didn’t go away. Josh started to move towards it, and when he was just within arm’s reach, it disappeared. Josh shook his head and started walking. But when he turned the next corner, there it was again: a horse, standing in the middle of the footpath several metres away, staring at him intently.

    Since that day, the horse had haunted Josh. On his walk, at his workplace, in his home, even on the tram. Always staring, always out of reach. Either standing in his path or following close behind him. The vision would go whenever Josh approached it, only to reappear when he turned to look in another direction. Naturally, no one else could see it. When the horse was in a public place, people seemed to instinctively walk around it, yet when he asked them if they could see it, they had no idea what he was talking about. And Josh couldn’t do anything to stop seeing it. At every waking hour of the day, from when he got out of bed to when he got back in, there it was, always staring. Never looking away.

    And he knew it had something to do with that helmet. He tried a few things to dispel the vision, such as leaving the helmet at home or trying to give it to random strangers (who always refused it). Yet nothing worked. That is, nothing other than what had worked on him. Now was the time to test that theory out.

    His target was ahead, a man in a business suit walking at a leisurely pace. He had his back to Josh, so he couldn’t see Josh turn his scooter towards him and accelerate. The gap between them rapidly closed, and Josh made contact with the man, slowing a little to avoid significant injury, but still fast enough to knock him over. Josh got off the scooter and stood by the man, sprawled on the footpath. Once again the horse appeared, as he knew it would. He looked down to see the man starting to get up. Josh had to go for it now! He took his helmet from his head, slapped it on the man and shouted, “Get a horse!”

    Josh got back on the scooter and sped away for a few seconds, then stopped to look behind him. He saw the man back on his feet, helmet in hand, looking at it much like Josh did a fortnight ago. He saw the horse as well, but this time it had its back to Josh, staring intently at the man, though Josh could see that the man hadn’t noticed it yet. He saw the man put the helmet by the side of the road and walk away, turning a corner. The horse was fading from Josh’s eyes, yet it was still visible enough for Josh to see it follow the man around the corner, several metres behind.

    Select Fun From PostgreSQL

    Using PostgreSQL these last few months reminds me of just how much fun it is to work with a relational database. DynamoDB is very capable, but I wouldn’t call it fun. It’s kinda boring, actually. Not that that’s a bad thing: one could argue that “boring” is what you want from a database.

    Working with PostgreSQL, on the other hand, has been fun. There’s no better word to describe it. It’s been quite enjoyable designing new tables and writing SQL statements.

    Not sure why this is, but I’m guessing it’s got something to do with working with a schema. It exercises the same sort of brain muscles1 as designing data structures or architecting an application. Much more interesting than dealing with a schemaless database, where someone could simply say “ah, just shove this object into a DynamoDB table.”

    It’s either that, or just that PostgreSQL has a more powerful query language than what DynamoDB offers. I mean, DynamoDB’s query capabilities need to be pretty restricted, thanks to how it stores its data. That’s the price you pay for scale.


    1. My brain muscles couldn’t come up with a better term here. 😄 ↩︎

    Rubber-ducking: Of Config And Databases

    It’s been a while since my last rubber-ducking session. Not that I’m in the habit of seeking them out: I mainly haven’t been in a situation where I needed one. Well, that chance came up yesterday, when I was wondering whether to put queue configuration in the database as data, or in the environment as configuration.

    This one’s relatively short, as I was leaning towards one method over the other before I started. But doubts remained, so having the session was still useful.

    So without further ado, let’s dive in. Begin scene.

    L: Hello

    🦆: Oh, you’re back. It’s been a while. How did that thing with the authorisation go?

    L: Yeah, good. Turns out doing that was a good idea.

    🦆: Ah, glad to hear it. Anyway, how can I help you today?

    L: Ok, so I’m working on this queue system that works with a database. I’ve got a single queue working quite well, but I want to extend it to something that works across multiple queues.

    🦆: Okay

    L: So I’m wondering where I could store the configuration for these queues: either in the database as data, or in the environment as configuration. I’m leaning towards the database, as: A) a reference to the queue needs to be stored alongside each item anyway, and B) if we wanted to add more queues, we could almost do so by simply adding rows.

    🦆: “almost do so?”

    L: Yeah, so this is where I’m a little unsure. See, I don’t want to spend a lot of effort building out the logic to deal with relaunching the queue dispatcher when the rows change. I’d rather the dispatcher just read how the queues are configured during startup and stick with that until the application is restarted.

    🦆: Okay

    L: And such an approach is closer to configuration. In fact, it could be argued that having the queues defined as configuration would be better, as adding additional queues could be an activity that is considered “intentional”, with a proper deployment and release process.

    I wonder if a good middle-ground might be to have the queues defined in the database as rows, yet managed via the migration script. That way, we can have the best of both worlds.

    🦆: Why not just go with configuration?

    L: The main reason is that I don’t want to add something like string representations of the queue to each queue item. I’m okay if it was just a UUID, since I’d imagine PostgreSQL could handle such fields relatively efficiently. But adding queue names like “default” or “test” as a string on each queue item seems like a bit of a waste.

    🦆: Do they need to be strings? Could they be like an enum?

    L: I’d rather they were strings, as I want this arrangement to be relatively flexible. You know, “policy vs. mechanism” and all that.

    🦆: So how would this look in the database?

    L: Well, each row for a queue would have a string, say a queue name. But each individual queue item would reference the queue via its ID.

    🦆: Okay, so it sounds like adding it to the database yet managing it with the migration script is the way to go.

    L: Yeah, that is probably the best approach.

    🦆: Good. I’m glad you could come away with a decision.

    L: Yeah, honestly that was the way I was leaning anyway. But I’m glad that I can completely dismiss the configuration approach now.

    🦆: Okay, good. So I’m guessing my job is done here.

    L: Yeah, thanks again.

    🦆: No worries.
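    For what it’s worth, the approach we landed on — queues as rows, seeded by the migration script — could look something like the following. All table and column names here are hypothetical:

    ```sql
    -- Migration: define the queues table and seed the known queues.
    -- New queues are added by a later migration, not at runtime.
    CREATE TABLE queues (
        id   UUID PRIMARY KEY,
        name TEXT NOT NULL UNIQUE
    );

    INSERT INTO queues (id, name) VALUES
        ('00000000-0000-0000-0000-000000000001', 'default'),
        ('00000000-0000-0000-0000-000000000002', 'test');

    -- Each item references its queue by ID, so the queue name
    -- isn't repeated as a string on every row.
    CREATE TABLE queue_items (
        id       UUID PRIMARY KEY,
        queue_id UUID NOT NULL REFERENCES queues (id),
        payload  JSONB NOT NULL
    );
    ```

    The dispatcher would then read the queues table once at startup, which keeps the “restart to reconfigure” behaviour discussed above.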

    About Those STOP Messages

    John Gruber, discussing political spam text messages on Daring Fireball:

    About a month ago I switched tactics and started responding to all such messages with “STOP”. I usually send it in all caps, just like that, because I’m so annoyed. I resisted doing this until a month ago thinking that sending any reply at all to these messages, including the magic “STOP” keyword, would only serve to confirm to the sender that an actual person was looking at the messages sent to my phone number. But this has actually worked. Election season is heating up but I’m getting way way fewer political spam texts now. Your mileage may vary, but for me, the “STOP” response works.

    As someone who used to work for a company that operated an SMS messaging gateway, allow me to provide some insight into how this works. When you send an opt-out keyword — usually “STOP”1, although there are a few others — it would be received by our messaging gateway, and your number would be added to an opt-out list for that sender. From that point on, any attempt by that sender to send a message to your number would fail.

    Maintaining these opt-out lists is a legal requirement with some significant penalties, so the company I worked for took this quite seriously. Once, the service maintaining this list went down, and we couldn’t tell whether someone had opted out or not. We actually stopped all messaging completely until we got that service back up again. I still remember that Friday afternoon (naturally, it happened on a Friday afternoon).

    Now, if memory serves, there was a way for a sender to be notified when an opt-out occurred. This was mainly for customers that decided to take on the responsibility — and thus legal accountability — of maintaining the opt-out lists themselves. There were a handful of customers that had this enabled, and it was something that we had to enable for them on the backend, but most customers simply delegated this responsibility to us (I can’t remember if customers that had this feature off could still receive opt-out notifications).

    Finally, there was a variant of the “STOP” message with which someone could opt out of any message sent from our gateway, basically adding themselves to a global opt-out list that applied to every sender. The only way to be removed from this list was to call support, so I wouldn’t recommend doing this unless you know you’ll never need another 2FA code via SMS again.
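    The per-sender behaviour described above can be sketched in Go. This is purely illustrative; the names, keyword list, and structure are my own invention, not the actual gateway I worked on:

    ```go
    package main

    import (
    	"fmt"
    	"strings"
    )

    // optOutKeywords is an illustrative set of opt-out keywords.
    // The exact list (and its case sensitivity) varies by gateway.
    var optOutKeywords = map[string]bool{
    	"STOP": true, "UNSUBSCRIBE": true, "CANCEL": true, "QUIT": true,
    }

    // Gateway keeps a per-sender opt-out list: sender -> set of recipients.
    type Gateway struct {
    	optOuts map[string]map[string]bool
    }

    func NewGateway() *Gateway {
    	return &Gateway{optOuts: make(map[string]map[string]bool)}
    }

    // HandleReply records an opt-out when a reply matches a keyword,
    // normalising case and whitespace first.
    func (g *Gateway) HandleReply(sender, recipient, body string) {
    	if optOutKeywords[strings.ToUpper(strings.TrimSpace(body))] {
    		if g.optOuts[sender] == nil {
    			g.optOuts[sender] = make(map[string]bool)
    		}
    		g.optOuts[sender][recipient] = true
    	}
    }

    // CanSend reports whether sender may still message recipient.
    func (g *Gateway) CanSend(sender, recipient string) bool {
    	return !g.optOuts[sender][recipient]
    }

    func main() {
    	g := NewGateway()
    	g.HandleReply("campaign-a", "+15551234567", "stop")
    	fmt.Println(g.CanSend("campaign-a", "+15551234567")) // false: opted out
    	fmt.Println(g.CanSend("campaign-b", "+15551234567")) // true: the list is per-sender
    }
    ```

    The key point is that last line: opting out of one sender says nothing about any other sender, which is why the global list was a separate mechanism.
    
    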

    Addendum: customers never had access to these opt-out lists, but I believe they could find out when a message they tried to send was blocked. This is because they would be charged per message sent, and if a message was blocked, they would receive a credit. There was also an API to return the status of a message, so if you knew the message ID, it was possible to call the API to find out whether a message was blocked.


    1. I can’t remember if this is case insensitive, although I think it is. ↩︎

    My Home Computer Naming Scheme

    I enjoyed Manton’s post about the naming scheme he uses for Micro.blog servers. I see these names pop up in the logs when I go to rebuild my blog, each with a Wikipedia link explaining the origins of the name (that’s a really nice touch).

    Having a server or desktop naming scheme is one of those fun little things to do when working with computers. Growing up, we named our home desktops after major characters from Lord of the Rings, such as Bilbo or Frodo, but I never devised a scheme for myself when I started buying my own computers. I may have kept it up if we were doing likewise at work, but when AWS came onto the scene, the prevailing train of thought was to treat your servers like cattle rather than pets. Granted, that’s probably the correct approach, especially when the lifecycle of a particular EC2 instance could be as short as a few minutes.

    But a few years ago, after buying a new desktop and setting up the old one as a home server, I found I needed a way to name them, and figured now was the time for a naming scheme. Being a fan of A Game of Thrones, both the books and the TV series, I came up with one based on the major houses of Westeros.

    So, to date, here are the names I’ve chosen:

    • Stark — the M2 Mac Mini that I use as my desktop
    • Tully — the Intel Mac Mini that I use as my home server
    • Crow — a very old laptop that I occasionally use when I travel (this one is a reference to the Night’s Watch)

    I think at one point I had an Intel NUC called Ghost, a reference to Jon Snow’s direwolf, but I haven’t used it in a while so I may be misremembering things. I also don’t have a name for my work laptop: it’s simply called “work laptop.”

    Go Feature Request: A 'Rest' Operator for Literals

    Here’s a feature request for Go: shamelessly copying JavaScript and adding support for the “rest” operator (JavaScript calls this use of it “spread”) in literals. Go does have such an operator, but it only works for spreading slices into variadic function calls. I was writing a unit test today and thinking to myself that it would be nice to use this operator in both slice and struct literals as well.

    This could be useful for making copies of values without modifying the originals. Imagine the following bit of code:

    type Vector struct { X, Y, Z int }
    oldInts := []int{3, 4}
    oldVec := Vector{X: 1}
    
    newInts := append([]int{1, 2}, oldInts...)
    newVec := oldVec
    newVec.Y = 2
    newVec.Z = 3
    

    Now imagine how it would look if rest operators in literals were supported:

    type Vector struct { X, Y, Z int }
    oldInts := []int{3, 4}
    oldVec := Vector{X: 1}
    
    newInts := []int{1, 2, oldInts...}
    newVec := Vector{Y: 2, Z: 3, oldVec...}
    

    I hope you’ll agree that it looks a bit neater than the former. Certainly it looks more pleasing to my eyes. True, this is a contrived example, but the code I’m writing for real is not too far off from this.
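    As an aside, the slice half of this is easy enough to approximate today with a small generic helper (and newer Go versions ship slices.Concat in the standard library, which does the same thing). This is just a sketch, not part of the proposal:

    ```go
    package main

    import "fmt"

    // Concat returns a new slice containing the elements of each input
    // slice in order, leaving the originals untouched.
    func Concat[T any](slices ...[]T) []T {
    	var out []T
    	for _, s := range slices {
    		out = append(out, s...)
    	}
    	return out
    }

    func main() {
    	oldInts := []int{3, 4}
    	newInts := Concat([]int{1, 2}, oldInts)
    	fmt.Println(newInts) // [1 2 3 4]
    	fmt.Println(oldInts) // [3 4]: unchanged
    }
    ```

    There’s no equivalent trick for struct literals, though, which is where the language change would really earn its keep.
    
    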

    On the other hand, Go does prefer clarity over brevity; and I have seen some JavaScript codebases that use these operators to an absurd level, making the code terribly hard to read. But I think the Go user-base is pretty good at moderating itself, and just because something could result in unreadable code doesn’t make it a foregone conclusion. Just look at Go’s use of type parameters.

    Anyway, if the Go team is looking for things to do, here’s one.
