Long Form Posts

    Running PeerTube In Coolify

    A guide for setting up a basic PeerTube instance on Coolify using a docker-compose file.

    Attending the DDD Melbourne 2025 Conference

    Yesterday, I attended the DDD Melbourne 2025 conference. This was in service of my yearly goal to get out more, to be around people more often than I have been. So the whole reason I attended was to meet new people. That didn’t happen: I said hi to a few people I once worked with, and spoke to a few sponsors, but that was it. So although I marked it off my goal list, it wasn’t a huge success.

    Auto-generated description: A stage featuring a Melbourne 2025 banner is set up in a theatre with people seated and colourful lighting.

    But a dev conference is still a dev conference, and I’d thought I’d write a few notes on the sessions I attended, just to record what I did get out of it.

    Keynote

    Emerging trends in robots, by Sue Keay.

    The keynote was an interesting session about the state of robotics in Australia. Didn’t get a lot of specifics, but I did get a name for the robot I saw once in a Tokyo department store that, let’s just say, left an impression on me.

    Auto-generated description: A speaker is presenting on stage with a slide displaying images of humanoid robots.

    First Session

    Are you overcomplicating software development? I certainly have been…, by Ian Newmarch.

    This speaker was absolutely preaching my gospel around complexity in software development. But it took someone to go deeper into why developers are prone to take an inherently complex practice and add additional complexity (so-called “accidental complexity”). This is mainly due to human factors: ego, fear, imposter syndrome, and to some extent, to keep the job interesting.

    Auto-generated description: A large screen displays a quote by Neal Ford about developers and complexity, set in a grand hall with colorful lighting.

    Very relatable. The only real way to mitigate this is going back to principles such as avoiding premature abstraction, YAGNI, and KISS. The thing about principles is that it’s always a little hard to know when you need them. So remember to always keep a focus on the problem - what you’re trying to solve - and working with other people can help here.

    Second Session

    How to Design Your Next Career Move, by Emily Conaghan.

    This speaker went through a process of how one could reflect on what they want out of their career, and how to come up with what they need to do to bridge the gap to get to it. The process is rather methodical, which is not a bad thing, and there’s a whole workbook component to this. This might be something that’s personally worth doing though: it does feel like I’m drifting aimlessly a little.

    Third Session

    The Lost Art of good README documentation, by Swapnil Ogale.

    Auto-generated description: A speaker is presenting on The lost art of good README documentation to an audience in a formal room with flags and ornate decor.

    I found this one to be quite good. It touched on the properties of what makes a good README for a project, and why you’d want one (the reason being that a developer’s or user’s trust in a project directly relates to its supporting documentation). In short, a good README should have:

    • A project overview: basically answering the questions of what this project is, why it exists, and why one should use it.
    • How-to instructions: how does one install it, get started using it, etc.
    • How does one engage and contribute to the project: how they can get help, contribute changes, etc.
    • Credits, license and contact details

    But even though these could be described as what a “good” README looks like, a takeaway is there’s no such thing as a bad README, apart from not having any README at all.

    One other thing that I didn’t know was that READMEs are traditionally capitalised so that they appear near the top in an alphanumerical listing of files. That was interesting to know.

    Lunch

    Yeah, this was the hardest part of the day. But it’s amazing how much time you can kill just by waiting in lines.

    Fourth Session

    Being consistently wrong, by Tom Ridge.

    I was expecting this to be a bit more general, like techniques for keeping an open mind or learning from one’s mistakes. But it was largely focused on task estimation, which is a weakness of mine, but seeing that this was after lunch and I was getting a bit tired around this time, I only half listened. The takeaways I did get were: the importance of measuring how long tasks take to travel across the board, how long they’re in progress, in review, etc.; using those measurements to determine capacity with formulas derived from queuing theory; keeping the amount of work in progress low; and keeping task duration variance low by slicing.

    Auto-generated description: A speaker standing at a podium addresses an audience with a presentation slide that reads Estimation is waste.

    These are all valid points, although I’m not sure how applicable they are to how we work at my job. But it may be a worthy talk to revisit if that changes.

    Fifth Session

    On The Shoulders Of Giants - A Look At Modern Web Development, by Julian Burr.

    Despite being a backend developer by day, I am still curious about the state of web development. This talk was a good one, where the speaker went through the various “milestones” of major technical developments in web technology - such as when JavaScript and jQuery were introduced, when AJAX became a thing, and when CSS was developed (I didn’t know CSS was devised at CERN).

    Auto-generated description: A presenter stands in front of an audience, showing a slide with two individuals wearing t-shirts that read I blame Immanuel for this behavior.

    Going back in time was fun (R.I.P. Java applets & Flash) but it seems the near-term future is all React, all the time. And not just React in the traditional sense, but React used for zero-hydration server side components (Qwik) and out-of-order streaming (React Suspense). Not sure that appeals to me. One thing that does appeal, though, is Vite, which is becoming the build tool du jour for frontend stuff. This I may look at, since it looks simple enough to get started with.

    Some other fun things: JavaScript style sheets was a thing, and Houdini still is a thing.

    Sixth Session

    Dungeons and… Developers? Building skills in tech teams with table top role playing games, by Kirsty McDonald.

    This was the talk that got me in the door, in some respects. I’ve heard of role-playing games being a thing for scenario planning, so the idea of using them for team development and practising responses to things like production incidents appealed to me. The game consisted of the normal things you’d expect from a role-playing game, like character cards, a game master, and scenario events with a random-number generator component to them.

    Auto-generated description: Four people are sitting on a stage with a large screen behind them displaying a web page and a wooden surface with a geometric object.

    I’ve never played D&D before, so I was curious as to how these games actually ran. Fortunately, I was not disappointed, as the last part of the talk was walking through an example game with a couple of volunteers from the audience. Definitely a talk worth staying back for.

    Locknote

    Coding Like it’s 2005, by Aaron Powell

    This was a fun look-back on the state of the art of web development back in 2005, before jQuery, AJAX, and decent editors, when annoying workarounds in JavaScript and CSS were necessary to get anything working in Internet Explorer. This was just before my time as a practicing dev, and apparently trying to replicate rich-client applications in the web browser was all the rage, which was something I missed. It was mainly focused on Microsoft technology, something I don’t have a lot of personal experience in, but I did get flashbacks of using Visual Studio 2003 and version 1 of Firefox.

    Auto-generated description: A computer screen displays a code editor with HTML and ASP.NET source code open. Auto-generated description: A lecturer stands in front of a large projected screen displaying a webpage titled Retro ChatGPT with colorful sections of text.

    Lots of fun going down memory lane (R.I.P. clearfix & YUI; rot in hell, IE6 😛).

    Overall

    I was contemplating not showing up to this, and even while I was there, I was considering leaving at lunchtime, but overall I’m glad that I stayed the whole day. It got me out of the house, and I learnt a few interesting things. And let me be clear: DDD Melbourne and the volunteers did an excellent job! It was a wonderfully run conference with a lot of interesting speakers. I hope to see some of the talks on YouTube later.

    But, I don’t think I’ll be going to a conference by myself again. I mean, it’s one thing to go if work asks you to: I can handle myself in that situation. But under my own volition? Hmm, it would be much easier going with someone else, just so that I have someone to talk to. It’s clear that I need to do something about my fear of approaching someone I don’t know and start speaking to them. Ah well, it was worth a try.

    An Incomplete List of DRM-Free Media Stores

    A collection of links to online stores that sell DRM-Free media.

    Apple AI in Mail and What Could Be

    Apple AI features in Mail currently do not help me. But they could, if Apple invited us to be more involved in what constitutes an important email.

    First Impressions of the Cursor Editor

    Trying out the Cursor editor to build a tool to move Micro.blog posts.

    Playing Around With MacOS Image Playground

    Trying out MacOS Image Playground.

    UCL: Some Updates

    Made a few minor changes to UCL. Well, actually, I made one large change. I’ve renamed the foreach builtin to for.

    I was originally planning to have a for loop that worked much like other languages: you have a variable, a start value, and an end value, and you’d just iterate over the loop until you reach the end. I don’t know how this would’ve looked, but I imagined something like this:

    for x 0 10 {
        echo $x
    }
    # numbers 0..9 would be printed.
    

    But this became redundant after adding the seq builtin:

    foreach (seq 10) { |x|
        echo $x
    }
    

    This was in addition to all the other useful things you could do with the foreach loop1, such as loop over lists and hashes, and consume values from iterators. It’s already a pretty versatile loop. So I elected to go the Python way and just made it so that the for loop is the loop to use to iterate over collections.

    This left an opening for a loop that dealt with guards, so I also added the while loop. Again, much like most languages, this loop would iterate over a block until the guard becomes false:

    set x 0
    while (lt $x 5) {
        echo $x
        set x (add $x 1)
    }
    echo "done"
    

    Unlike the for loop, this is unusable in a pipeline (well, unless it’s the first component). I was considering having the loop return the result of the guard when it terminates, but I realised that would be either false, nil, or anything else that was “falsy.” So I just have the loop return nil. That said, you can break from this loop, and if the break call had a value, that would be used as the result of the loop:

    set x 0
    while (lt $x 5) {
        set x (add $x 1)
        if (ge $x 3) {
            break "Ahh"
        }
    } | echo " was the break"
    

    The guard is optional, and if left out, the while loop will iterate forever.

    The Set! Builtin

    Many of these changes come from using UCL for my job, and one thing I found myself doing recently is writing a bunch of migration scripts. These scripts needed to get data from a database, and that data may or may not be present. If it’s not, I want the script to fail immediately so I can check my assumptions. This usually results in constructs like the following:

    set planID (ls-plans | first { |p| eq $p "Plan Name" } | index ID)
    if (not $planID) {
        error "cannot find plan"
    }
    

    And yeah, adding the if block is fine - I do it all the time when writing Go - but it would be nice to assert this when you’re trying to set the variable, for no reason other than the fact that you’re thinking about nullability while writing the expression to fetch the data.

    So one other change I made was to add the set! builtin. This will basically set the variable only if the expression is not nil. Otherwise, it will raise an error.

    set! planID (ls-plans | first { |p| eq $p "Missing Plan" } | index ID)
    # refusing to set! `planID` to nil value
    

    This does mean that ! and ? are now valid characters to appear in identifiers, just like Ruby. I haven’t decided whether I want to start following the Ruby convention of question marks indicating a predicate or bangs indicating a mutation. Not sure that’s going to work out now, given that the bang is being used here to assert non-nullability. In either case, could be useful in the future.


    1. And the seq builtin ↩︎

    About My New Cooler's Programming Feature

    There’s lots to like about my new cooler, but the programming feature is not one of them. My old unit had a very simple timer with two modes: turn cooler on after N hours, or turn cooler off after N hours. Anything else requires manual intervention.

    Auto-generated description: A wall-mounted thermostat with buttons labeled for power, mode, and temperature adjustment options.
    The old control panel (turns out I did have a photo, albeit an old one). Set the mode: cool/vent (fan), the power setting, then tap Timer Select to choose between turn on or off after N hours.

    I really liked this simple setup. Many times in summer, when the days are warm but not hot, and the nights are cool, it was nice to turn the cooler’s fan on and set the timer to turn it off after 2 to 3 hours, maybe longer if the days were a bit warmer1. The cooler will simply pull cool air from outside and circulate it around the room. This was enough for me to get to sleep, at which point the cooler would shut itself off in the middle of the night.

    With the new cooler comes a control panel that is effectively a cheap Android phone, so it’s capable of much more. You can now set a program that has four separate modes set to four separate times of the day. Each day of the week now has its own program too. A particular mode can either be a desired temperature setting, or “off”.

    Auto-generated description: A digital control panel for a Seeley International air conditioning system displays a program schedule with times and statuses.
    The new control panel, showing the four mode settings for a particular day of the week.

    To recreate what I previously had, I would now have to choose a specific shutoff time for every day of the week. No longer am I able to set the running time based on how I feel: it has to be an actual timestamp, with several taps involved if you want to change it. This timestamp can only be set up to 11:59 PM, so if you want the unit to shut off after midnight, you’ll have to remember to choose the program for the next day.

    Oh, and mercy on you if you want a timestamp that doesn’t land on the hour. The minutes can only be changed by 1, so you’ll be tapping 30 times if you want the unit to shut off at the half hour.

    Auto-generated description: A digital thermostat screen displays settings including start time, day selection, temperature, and status options.
    Adjusting the settings of a particular mode, one minute at a time.

    You also have no control over the fan speed. That was another nice thing about the old unit: you set the speed to what you want, and then you set the timer. The unit will stay in that mode until it shuts off. I don’t want the fan to be blowing a gale when I’m trying to get to sleep, so the fan was usually set to the lowest or second-lowest setting.

    These new programming modes only have a temperature setting, so if the house is warm, the cooler will crank up the fan until it reaches half-speed or just above - speeds I usually use in the middle of a very hot day. This means noise that will change in intensity as the target temperature is reached. I’m not a great sleeper, so any additional noise like that is disruptive.

    Auto-generated description: A Seeley International MagiQtouch controller displays temperature settings and a schedule for cooling with the message PREWET IN PROGRESS.
    The unit operating in program mode.

    So I’m a little sad that I lost this simple timer-based approach to operating the cooler. I’m not even sure who this programming feature is built for. It sort of exists in that nether region where it’s too complicated for the simple things, yet useless for anything other than a set weekly routine. I set my cooler based on the weather conditions which, you may be surprised to know, does not fall into a nice weekly routine. Granted, it may make it possible to use this to recreate the simple timer approach I had before: I just preset everything and only activate the program when I want it. And yeah, it’ll probably be fine, but I do feel like I’ve lost something.

    Update: Apparently the cooler does have a shut-off-after-N-hours feature. It’s just buried in the settings menu. The post still stands, as it would’ve been nice if this were a feature of the Program mode, but at least there’s something I can use.


    1. If the days and nights are hot, I don’t bother with the timer and just leave it running all night long. ↩︎

    On Slash Pages Versus Blog Posts

    Interesting discussion on ShopTalk about slash pages and whether blog posts may make more sense for some of them. Chris and Dave make the point that blog posts have the advantage of syndicating updates, something that static pages lack on most CMSs. It’s a good point, and a tension I feel occasionally. Not so much on this site, but there’ve been several attempts where I tried to make a site for technical knowledge, only to wonder whether a blog or a wiki makes more sense. I’d like the pages to be evergreen, yet I also like to syndicate updates when I learn new stuff.

    I’ve currently settled on the blog format for now, and it’s fine - tags tend to help here - but I wonder if something smarter could be useful. One idea I have is a page with “sections”, where each one could be seen as a mini blog post. You add and modify sections over time, and when you do, each section would be syndicated individually. Yet the page will be rendered as a whole when viewing it in the browser. It’s almost like the RSS feed contains diffs for the page, albeit something self-contained and readable by humans. There might be a CMS that does this already; I haven’t looked. But I get the sense that most RSS feeds of wiki pages actually contain a diff, or a simple message saying “this page has been updated.” There’s nothing to suggest that what’s out there has these section semantics.

    In lieu of that, I like the idea proposed by Chris and Dave where you basically publish new versions of these slash pages as blog posts and redirect the slash URL to the latest one, kind of like a bookmark. I may start doing this for some of them, starting with /defaults which is, conveniently, already a blog post.

    Project Update: DSL Formats For Interactive Fiction

    Still bouncing around things to work on at the moment. Most of the little features have been addressed, and I have little need to add anything pressing for the things I’ve been working on recently. As for the large features, well apathy’s taking care of those. But there is one project that is tugging at my attention. And it’s a bit of a strange one, as part of me just wants to kill it. Yet it seems to be resisting.

    About 6 months ago, I started working on some interactive fiction using Evergreen. I got most of the story elements squared away, but many of the interactive elements were still left to be done. And as good as Evergreen is for crafting the story, I found it a little difficult tracking down all the scripts I needed to write, debug and test.

    So in early November, I had a look at porting this story over to a tool of my own, called Pine Needle (yes, the name is a bit of a rip-off). Much like Evergreen, the artefact of this is a choose-your-own-adventure story implemented as a static webpage. Yet the means of building the story couldn’t be more different. Seeing that I’m more comfortable working with code and text files, I eschewed building any UI in favour of a tool that simply ran from the command line.

    But this meant that I needed some way to represent the story in text. Early versions simply had the story hard-coded in Go, but it wasn’t long before I started looking at using a DSL. My first attempt was a hand-built one based on Markdown with some additional meta-elements. The goal was to keep boilerplate to a minimum, with the meta-elements getting out of the way of the prose. Here’s a sample of what I had working so far:

    // Three dashes separate pages, with the page ID following on.
    // Also these are comments, as the hash is reserved for titles.
    --- tulips-laundry-cupboard
    
    You open the cupboard door and look at the shelf
    above the brooms. There are a couple of aerosol cans up there,
    including a red one that says "Begone Insecticide".
    You bring it out and scan the active ingredients. There are a
    bunch of letters and numbers back there, and none of them have
    the word "organic."
    
    \choice 'Take Insecticide' to=tulips-take-insecticide
    \choice 'Leave Insecticide' to=tulips-leave-insecticide
    
    --- tulips-take-insecticide
    
    You return to the tulips with the insecticide, and start 
    spraying them. The pungent odour of the spray fills the air, 
    but you get the sense that it's helping a little.
    
    \choice 'Continue' to=tulips-end
    
    --- tulips-leave-insecticide
    
    You decide against using the can of insecticide. You put the
    can back on the shelf and close the cupboard door.
    
    \choice 'Look Under The Trough' to=tulips-laundry-trough
    \choice 'Exit Laundry' to=tulips-exit-laundry
    

    The goal was to have the meta-elements look like LaTeX macros - for example, \option{Label}{target-screen} - but I didn’t get far in finishing the parser for this. And I wasn’t convinced it had the flexibility I wanted. LaTeX macros rely pretty much on positional arguments, but I knew I wanted key-value pairs to make it easier to rely on defaults, plus easier to extend later.

    I did imagine a fully LaTeX inspired DSL for this, but I quickly dismissed it for how “macro-heavy” it would be. For reference, here’s how I imagined it:

    \screen{tulips-laundry-cupboard}{
      You open the cupboard door and look at the shelf
      above the brooms. There are a couple of aerosol cans up there,
      including a red one that says "Begone Insecticide".
      You bring it out and scan the active ingredients. There are a
      bunch of letters and numbers back there, and none of them have
      the word "organic."
    
      \choice{Take Insecticide}{to=tulips-take-insecticide}
      \choice{Leave Insecticide}{to=tulips-leave-insecticide}
    }
    \screen{tulips-take-insecticide}{
      You return to the tulips with the insecticide, and start 
      spraying them. The pungent odour of the spray fills the air, 
      but you get the sense that it's helping a little.
    
      \choice{Continue}{to=tulips-end}
    }
    \screen{tulips-leave-insecticide}{
      You decide against using the can of insecticide. You put the
      can back on the shelf and close the cupboard door.
    
      \choice{Look Under The Trough}{to=tulips-laundry-trough}
      \choice{Exit Laundry}{to=tulips-exit-laundry}
    }
    

    I wasn’t happy with the direction of the DSL, so I looked for something else. I briefly had a thought about using JSON. I didn’t go so far as to try it, but the way this could work is something like this:

    {"screens": {
      "id": "tulips-laundry-cupboard",
      "body": "
        You open the cupboard door and look at the shelf
        above the brooms. There are a couple of aerosol cans up 
        there, including a red one that says \"Begone Insecticide\".
        You bring it out and scan the active ingredients. There
        are a bunch of letters and numbers back there, and none of 
        them have the word \"organic.\"
      ",
      "options": [
        {
          "screen":"tulips-take-insecticide",
          "label":"Take Insecticide",
        },
        {
          "screen":"tulips-leave-insecticide",
          "label":"Leave Insecticide",
        }
      ]
    }, {
      "id":"tulips-take-insecticide",
      "body":"
        You return to the tulips with the insecticide, and start 
        spraying them. The pungent odour of the spray fills the air, 
        but you get the sense that it's helping a little.
      ",
      "options": [
        {
          "screen":"tulips-end",
          "label":"Continue"
        }
      ]
    }}
    

    I generally like JSON as a transport format, but it didn’t strike me as a format that suited the type of data I wanted to encode. Most of what this format would contain would be prose, which I’d prefer to keep as Markdown. But this would clash with JSON’s need for explicit structure. Setting aside the additional boilerplate this structure would require, all the prose would have to be encoded as one big string, which didn’t appeal to me. Also no comments, especially within string literals, which is a major deal breaker.

    So, the current idea is to use something based on XML. This has some pretty significant benefits: editors have good support for XML, and Go has an unmarshaller which can read XML directly into Go structures. JSON has this too, but I think XML is also a pretty decent format for editing documents by hand, so long as you keep the elements to a minimum.

    I think one aspect that turned people off XML back in the day was format designers’ embrace of XML’s ability to represent hierarchical data without leaning into its use as a language for documents. The clunky XML documents I had to deal with were purely used to encode structure, usually in a way that mapped directly to a domain’s class model. You had formats where you needed 10 nested elements to encode a single bit of information, which were a pain to read or edit by hand. These were usually dismissed by the designers with promises like, “Oh, you won’t be editing this by hand most of the time. You’ll have GUI design tools to help you.” But these were awful to use too, and that’s if they were available, which they usually were not (did you know building GUIs is hard?)

    If you have an XML format that skews closer to HTML rather than to something that’s representable in JSON, I think it could be made to work. So yesterday I had a go at seeing whether this could work for Pine Needle. Here’s what I’ve got so far:

    <?xml version="1.0"?>
    <story>
    <screen id="tulips-laundry-cupboard">
      You open the cupboard door and look at the shelf
      above the brooms. There are a couple of aerosol cans up 
      there, including a red one that says "Begone Insecticide".
      You bring it out and scan the active ingredients. There
      are a bunch of letters and numbers back there, and none of 
      them have the word "organic."
    
      <option screen="tulips-take-insecticide">Take Insecticide</option>
      <option screen="tulips-leave-insecticide">Leave Insecticide</option>
    </screen>
    <screen id="tulips-take-insecticide">
      You return to the tulips with the insecticide, and start 
      spraying them. The pungent odour of the spray fills the air, 
      but you get the sense that it's helping a little.
    
      <option screen="tulips-end">Continue</option>
    </screen>
    <screen id="tulips-leave-insecticide">
      You decide against using the can of insecticide. You put the
      can back on the shelf and close the cupboard door.
    
      <option screen="tulips-laundry-trough">Look Under The Trough</option>
      <option screen="tulips-exit-laundry">Exit Laundry</option>  
    </screen>
    </story>
    

    The idea is that the prose will still be Markdown, so things like blank lines will still be respected (the parser strips all the leading whitespace, allowing one to easily indent the prose). Attributes satisfy the key/value requirement for the elements, and I get the features that make this easy to modify by hand, such as comments and good editor support.

    I think it’s going to work. It would require some custom code, as Go’s unmarshaller doesn’t quite like the mix of prose and declared <option> elements, but I think it’s got the bones of a decent format for this interactive fiction. Already I’m coming up with ideas of how to add script elements and decompose fragments into sub-files to make it easier to test.
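
    To give a sense of what that custom code is up against, here’s a minimal sketch of the plain struct-tag approach (hypothetical type names; this isn’t Pine Needle’s actual code). The ,chardata field collects the prose and the <option> elements unmarshal into their own slice, though where each option sat within the prose is lost along the way:

    package main
    
    import (
        "encoding/xml"
        "fmt"
        "strings"
    )
    
    // Hypothetical types for illustration only.
    type Option struct {
        Screen string `xml:"screen,attr"` // target screen ID
        Label  string `xml:",chardata"`   // link text
    }
    
    type Screen struct {
        ID      string   `xml:"id,attr"`
        Body    string   `xml:",chardata"` // the Markdown prose between the tags
        Options []Option `xml:"option"`
    }
    
    type Story struct {
        Screens []Screen `xml:"screen"`
    }
    
    func main() {
        const doc = `<story>
          <screen id="tulips-take-insecticide">
            You return to the tulips with the insecticide, and start
            spraying them.
    
            <option screen="tulips-end">Continue</option>
          </screen>
        </story>`
    
        var story Story
        if err := xml.Unmarshal([]byte(doc), &story); err != nil {
            panic(err)
        }
    
        s := story.Screens[0]
        fmt.Println(s.ID, "->", s.Options[0].Screen)
        fmt.Println(strings.TrimSpace(s.Body))
    }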

    I’ll talk more about this project in the future if I’m still working on it. I don’t know if the story that started all this will see the light of day. I’ve gone through it a few times, and it’s not great. But shipping stuff you’re proud of comes from shipping stuff you’re not proud of, and given how far along it is, it probably deserved to be released in one form or another. That’s probably why it’s been saved from the chopping block so far1.


    1. Yes, this is probably just a rationalisation for trying to minimise sunk-costs, but I’ve got nothing else to work on, so why not this? ↩︎

    Gallery of Fake Logos For Test Organisations

    In lieu of anything else, I thought I’d put together a gallery of the logos I use for test organisations. I occasionally need to create fake organisations for testing at work, and to add a bit of amusement to the mix, I sometimes make fake logos for them. These logos are typically made using ChatGPT, although if I’m particularly bored, I sometimes touch them up by hand (nothing too drastic, usually things like cropping or adding titles). Most of the fake organisations are film and media production studios, as that’s typically the client base of our product.

    I do this mainly for fun, but there is some utility to it. A user can be a member of zero or more organisations, and can change the one they’re acting in at any time. Having a unique avatar for each one helps in distinguishing which one I have active. I do get cute with the names, and for that, I make no apologies. 🙂

    2024 Year In Review

    It’s a few minutes to 12:00 PM on the 1st of January 2025 as I publish this. Thanks to time-zones, that means it’s just about to turn 12:00 AM one hour to the west of Greenwich, meaning that it’s still 2024 in much of the world to the west of the prime meridian. So I’m technically still within the window of time where I could say I got a year in review post out for 2024.

    Turns out it’s not the first time I’ve used this excuse. Looking at other posts I’ve published on the 1st of January, I’ve done this twice before. And I guess this pattern of posting a year in review on the first of the next year makes some sense: it’s usually a quiet day, with nothing open, and I’m usually a little tired and listless1.

    That means there’s usually a large block of time available to me, and despite what I wrote yesterday, it is a good practice to do some reflecting, however brief it may be.

    So I figured I might as well drag iA Writer out and scribble out a brief review of the year that was.

    Online Presence

    In my year in review for 2023, I made a point of wanting to cut down the number of domains I acquire. This is still an ongoing process, but year-on-year, I’m down 2 domains net, which is an improvement. 2024 saw the acquisition of 7 new domains, of which 2 I’m using for something, and 1 I plan to keep around:

    2024  2023  Domains
      23    25  Registered
      17    16  With auto-renew turned on
      15    13  Currently used for something
       2     3  Not currently used for something but worth keeping
       0     1  Want to remove but stuck with it because it’s been shared by others

    I’m still trying to keep this blog alive by posting regularly here. I must’ve felt more comfortable with doing so, as it’s been a record-breaking year, with 840 posts, beating the previous record by 594 posts. The initial fear of falling out of the practice has subsided to one where I find joy in posting here. That’s probably why the post count is so much higher.

    Of course the quality of a blog doesn’t correlate with one’s ability to “post the most”, and I do feel like there have been times where I felt a little blasé about what I write here. I do want to be better here, or at least be a little more conscious about it. But not at the expense of turning this into solely a soapbox/marketing/punditry site: there’s plenty of those out there already. So you’ll probably continue to see the cringey, irrelevant, lame, and potentially disposable posts here. I’ll just try to make sure that it’s balanced out with posts that are actually good.

    Some notable posts of the year:

    One other thing about blogs: I still have that wandering eye for other blogging CMSs. I did start three other blogs, of which two I shut down and rolled into this one. That just leaves one new blog created this year that I’m planning to keep separate, mainly because I feel the subject matter warrants a dedicated site.

    Projects

    2024 didn’t feel like a big project year, probably because most of the projects I’ve started I kept to myself. But I did manage to release a couple this year:

    • Sidebar For Tiny Theme: This came with the addition of recommendations to Micro.blog, and the addition of sidebars listing those recommendations in a couple of the themes. I use Tiny Theme, which didn’t have a way to do this, so what started as a blog post eventually became a Micro.blog plugin. It’s been good seeing this adopted by other bloggers.
    • Cyber Burger: A Pico-8 game, which I hoped would’ve only taken a few weeks to build. Overall it took 3 months, largely because I spent large stretches of time not working on it. But I’m glad I actually got this finished and released.

    The other things I’ve worked on that I built mainly for myself:

    • UCL: A tool-command language, similar to TCL, that I’ve built for work. The state of this is smack-bang in the middle of “sort-of-usable” and I’ll need to spend some effort cleaning it up and documenting it if I ever want it to see a wide release. I’m not sure that I want that though, at least not yet.
    • Blogging Tools: A set of tools that help me post here. What started as something to assist with making photo galleries has grown to a suite of tools dealing with images, audio and video. Really useful for my needs.
    • Tape Playback Site: A private site for browsing old cassette tapes that have been digitised. I still have a pile of tapes I need to digitise though (are you getting a sense that I find it hard to finish things? 😀).
    • Nano-Journal: My version of Kev Quirk’s web-based journalling app. After adding Git synchronisation and attachment support, I now use it for all my journalling. It doesn’t do everything that Day One does (off-line support’s a big one) but it does enough.

    Finally, some of the projects that have been abandoned:

    • Photo Bucket: My third attempt at making something to publish image galleries online. I think the need for this has been made somewhat redundant with Blogging Tools.
    • I also started two interactive fiction stories in Evergreen that I haven’t finished yet. One is close, and I probably should get it done. The other is about halfway finished and probably won’t see the light of day.

    Books And Media

    Given that I exceeded my reading goal for 2023, I thought I could push myself a bit more for 2024 and go for finishing 10 books. Sadly, I was nowhere near meeting it, only finishing 4 this year:

    Useful Not True, Hell Yeah or No, Magician (The Riftwar Saga, Book 1), Twenty Bits I Learned about Making Websites

    (not pictured: Twenty Bits I Learned about Making Websites)

    I think I probably started more books than I’ve finished. Not that I have to finish every book I’ve started, but I think my problem is one of focus. The books I have started could be interesting, and I have plans to finish some of them. I just need to spend more time reading.

    I’m not a movie person but I did manage to watch a few this year. Here are the ones I’ve posted reviews for:

    There was a period of time where I felt burnt out by scripted TV shows, favouring YouTube over much else. I eventually got back into watching a few “high production value” shows, some of which I enjoyed:

    And some that I enjoyed far less:

    Onwards To 2025

    I know it’s cliché to look back on the last year and feel pretty crappy about it. And yeah, not every day of your life is going to be great. There’ve been some rotten days in 2024 that I haven’t included in this review (I have written about them, so check out the archive if you’re curious).

    So, was 2024 a good year? Well, I’ll start by saying that it hasn’t been a wholly remarkable year. There’s no one event that I can point to and say “this is what made 2024 great.” Maybe this past year was like that for others, but such events didn’t happen to me2. And recency bias has made it difficult for me to say there were more good days than bad (the last few months have been rough).

    But I wouldn’t say 2024 was a bad year. Certainly a busy one, with a lot going on. But on balance, I’d say it’s been one of the better ones.

    So that’s it, year in review for 2024 done. Have a happy new year and onwards to a great 2025.


    1. I don’t do much on New Year’s Eve, electing to go to bed early. Yet I usually get woken up at midnight for various reasons: fireworks, messages from people wishing me a Happy New Year. Last night was no exception. ↩︎

    2. Well, that’s not entirely true. Becoming an uncle was an example of such an event. But again, such personal news is currently outside the scope of this post. ↩︎

    Home Screen Of 2024

    It’s just turned 3:00 in the afternoon, and I was alternating between the couch and the computer desk, racking my brain on what to do. With no ongoing projects - a few ideas have been bouncing around, yet none has grabbed me so far, and I had nothing else in a state where I could just slip on some music or a podcast and work on it - and seeing a few others make similar posts on their blogs, I figured I’d talk about my home screens.

    A smartphone home screen features various app icons arranged across three panels, set against a blue floral background.

    I realised that I haven’t actually done this before, mainly because my home screens change very slowly (the background hardly ever). Dramatic changes usually come about when I’m setting up a new phone.

    And yet, I do want to talk a little about the apps I have at the moment, and I did want to make sure I had a record of how the home screens looked. And seeing that I needed to keep myself occupied doing something, now is as good a time as any.

    Screen One

    Auto-generated description: A mobile home screen displays weather information, calendar events, and various app icons against a blurred background of yellow and gray flowers.

    This screen contains two widgets - the date and weather widget at the top, and the calendar widget on the right - plus a small collection of apps placed deliberately where they are. The apps I have here are not necessarily the most used (although two of them are) but I like having easy access to them for various reasons.

    Aside from the widgets, the apps I have on this screen - from left to right, top to bottom - are as follows:

    • Micropub Checkin: A silly little Flutter app I use for adding check-ins to lmika.day. The app’s in a bit of a neglected state, but I still use it as I get value from tracking places I’ve been.
    • Strata: The notes app from Micro.blog. This is where I write my short-term notes. I use Google Keep for shopping lists, but everything else goes here.
    • Alto: A music app I wrote, and the main music app I listen to.
    • Pocket Casts: The podcast player app I use. Apart from the web-browser, this and Alto are two of the most used apps I have on my phone.
    • VSReader: Another silly little Flutter app. This is a test build for an RSS reader I was working on a couple of months ago. It’s been a while since I’ve opened this, and I probably should just kill it given that I haven’t made any recent changes to it.
    • Google Wallet: Google’s digital wallet (well, at least their current iteration of their digital wallet). I use it mainly for my train ticket but I do have my credit card in there, just in case I walk out without my “real” wallet.

    The items in the dock are as follows:

    • Phone: My family and I still use the phone quite frequently so this app has remained in the dock since I set the phone up.
    • Messages: This is Androidā€™s messaging app. Much like the phone, I communicate with family mostly via SMS, and now RCS, messages.
    • Play Store: I rarely go to the Play Store, so there’s no real need for this icon to be here. But I haven’t got around to removing it yet.
    • Vivaldi: My web browser of choice.
    • The right-most icon changes based on the last used app, which I’m not a huge fan of, as it occasionally changes just as I go to tap it and I launch the wrong app by mistake.

    Screen Two

    Auto-generated description: A smartphone screen displays various app icons arranged in a grid over a floral background.

    A grab-bag of apps I frequently use. Some of them probably should be on the first screen, but since real-estate is at a bit of a premium I just keep them here, and swipe over when I need them.

    From left to right, top to bottom, the apps on this screen are as follows:

    • PTV: The Victorian public transport app. I usually use it to know the arrival time of the tram I take going home. Also useful for trip planning.
    • Plex: I generally don’t watch things on my phone, but before I got my Nvidia Shield, I used this Plex app to Chromecast shows to the TV. It was never great at it though, as it sometimes disconnected from the Chromecast session while the video was running, leaving me with no means of stopping it until I unplugged the Chromecast.
    • Kindle: Kept here as I occasionally use it to read books if I’ve read through my RSS feeds.
    • ChatGPT: I don’t use ChatGPT on my phone that often, but it does occasionally come in useful when a web-search proves fruitless.
    • FastMail: My email provider of choice. Given how often I use it, this is arguably one of those apps that should be on the first screen.
    • Pager Duty: The twenty-four-hour on-call paging software I had to use for work. I’m no longer on the on-call roster so it’s probably something I can remove.
    • WhatsApp: What I use for messaging friends. I don’t like the fact that I have a Meta app on my phone, but that’s what my friends chose to use so I’m stuck with it (it’s also better than Viber, which is what we used before).
    • WireGuard: Personal VPN, although I’m currently not using WireGuard for anything right now. I like to keep it mainly because I like the logo.
    • Discord: I’m a member of a few Discord servers, but I use the mobile client mainly to check into the Hemispheric Views Discord.
    • Notion: Where I store my “long term” notes, at least for now.
    • Tusky: Mastodon client.
    • Splitwise: Group expense management and splitting app. This was useful during our European trip last year, where each of us would take turns paying for the group.
    • SunSmart: Used to track the current and forecasted UV index. Useful around this time of year if I’m planning to be outside for an extended period of time.
    • Micro.blog: The Micro.blog app, although I occasionally use the web version too.
    • 1Password: My password manager of choice.
    • Realestate.com: Used to browse real-estate, out of curiosity more than anything else.
    • Spotify: My “secondary” music app. I don’t use it for anything that I regularly listen to, but it’s occasionally useful for those once-off tracks.
    • Google Authenticator: Where I keep my 2FA codes.
    • Day One: Before I moved to a web-based journalling app, I used this Day One client for writing journal entries. It wasn’t perfect: there were always syncing delays to/from the Apple platform instances of Day One. But it was fine.
    • Slack: Used mainly for work.
    • Camera: I’m not sure why I have this here, since I almost always use the double power-button tap to bring up the camera. I guess I moved it here from screen one and never removed it.

    Screen Three

    Auto-generated description: A smartphone home screen displays a vibrant wallpaper of yellow flowers and foliage, with apps like Booking, Airalo, and Emirates icons visible at the top.

    This is a screen I hardly ever use, as it’s mainly reserved for apps that are useful while travelling. The Booking.com and Emirates apps I can probably remove: I was using them mainly to track flights and accommodation during my European trip last year.

    The only one worth keeping is Airalo, which allows you to buy and set up data SIMs that work overseas. This has been really useful to me during my last couple of trips, and I hope to keep using it for trips in the future. It doesn’t offer a lot of data, but any data is better than zero data, as my friends - who kept asking to use my data when we were out of WiFi range - can attest.

    2024 Song of The Year

    It’s Christmas Eve once again, which means it’s time for the Song of The Year for 2024. Looking at the new and rediscovered albums for the year, there are quite a few to choose from.

    The runners-up are pretty much all from Lee Rosevere, a new artist I’ve started listening to, and include:

    But there can only be one winner, and this year it’s Oxygene, Pt. 20 by Jean-Michel Jarre. 👏

     A globe is depicted with a skull emerging from its surface, set against a blue background with the text 'Jean-Michel Jarre Oxygene Trilogy' above it.

    Oxygene is actually an album in my regular rotation, but I always stopped listening to it after Part 19. The strange organ at the start of Part 20 put me off. But one day this year, feeling a little down, I decided to work my way through it and give it a listen, and after the first 30 or so seconds, it turned into quite a lovely piece. A nice contrast to the rest of the disc, and a suitable conclusion to the album itself. I’ve even grown to like the organ at the start.

    Honourable Mentions this year include:

    An actual bumper crop this year in terms of music. Let’s hope 2025 is just as good.

    That Which Didn't Make The Cut

    I did a bit of a clean-up of my projects folder yesterday, clearing out all the ideas that never made it off the ground. I’d figured it’d be good to write a few words about each one before erasing them from my hard drive for good.

    I suppose the healthiest thing to do would be to just let them go. But what can I say? Should a time come in the future where I wish to revisit them, it’d be better to have something written down than not. It wouldn’t be the first time I wished this was so.

    Anyway, here are the ones that were removed today. I don’t have dates of when these were made or abandoned, but it’s likely somewhere between 2022 and 2024.

    Interlaced

    This was an idea for a YouTube client1 that would’ve used YouTube’s RSS feeds to track subscriptions. The idea came about during a time when I got frustrated with YouTube’s ads. I think it was an election year and I was seeing some distasteful political ads that really turned me off. This would’ve been a mobile app, most likely built using Flutter, and possibly with a server component to get this working with Chromecast, although I had no idea how that would work.
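
    The feed side of this would’ve been straightforward enough: every YouTube channel exposes a public Atom feed at https://www.youtube.com/feeds/videos.xml?channel_id=CHANNEL_ID, so tracking subscriptions is mostly a matter of polling a list of those. Here’s a rough Go sketch of that part (hypothetical code; I never got as far as writing any feed handling):

    package main
    
    import (
        "encoding/xml"
        "fmt"
        "net/http"
    )
    
    // Minimal shape of a YouTube channel's Atom feed.
    type feed struct {
        Entries []struct {
            Title     string `xml:"title"`
            Published string `xml:"published"`
        } `xml:"entry"`
    }
    
    func main() {
        // CHANNEL_ID is a placeholder for a real channel ID.
        const url = "https://www.youtube.com/feeds/videos.xml?channel_id=CHANNEL_ID"
    
        resp, err := http.Get(url)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
    
        var f feed
        if err := xml.NewDecoder(resp.Body).Decode(&f); err != nil {
            panic(err)
        }
        for _, e := range f.Entries {
            fmt.Println(e.Published, e.Title)
        }
    }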

    This never got beyond the UI mock-up stage, mainly because the prospect of working on something this large seemed daunting. Probably just as well, as YouTube solved the ads problem for me, with the release of YouTube Premium.

    Auto-generated description: A smartphone interface mockup displays a channels list with annotations highlighting features like a navigation tab, subscription indicators, filter options, and a Chromecast button.

    Red Crest

    I thought I could build my own blogging engine and this is probably the closest I got (well, in recent years). This project began as an alternative frontend for Dave Winer’s Drummer, rendering posts that would be saved in OPML. But it eventually grew into something of its own with the introduction of authoring features.

    I got pretty far on that front, allowing draft posts and possibly even scheduled posts (or at least the mechanics for scheduled posts). One feature I did like was the ability to make private posts. These would be interleaved with the public ones once I logged in, giving me something of a hybrid between a blogging CMS and a private journal. It was also possible to get these posts via a private RSS feed. I haven’t really seen a CMS do something quite like this. I know of some that allow posts to be visible to certain cohorts of readers, but nothing for just the blog author.

    In the end, it all got a bit much. When I started preparing the screens for uploading and managing media, I decided it wasn’t worth the effort. After all, there were so many other blogging CMSs already out there that did 90% of what I wanted.

    Reno

    As in “Renovation”. Not much to say about this one, other than it being an attempt to make a Pipe Dreams clone. I think I was exploring a Go-based game library and I wanted to build something relatively simple. This didn’t really go any further than what you see here.

    Auto-generated description: A grid of dark squares is displayed on a computer screen, with one square featuring two horizontal white lines.
    Auto-generated description: A grid of interconnected circuit-like lines on a dark background.
    Tileset free for anyone who wants it.

    SLog

    Short for “Structured Log”. This was a tool for reading JSON log messages, like the ones produced by zerolog. It’s always difficult to read these in a regular text editor, and being able to list them in a table made sense to me. This one was built for the terminal, but I did make a few other attempts at building something for this: one using a web-based GUI tool, and another as a native MacOS app. None of these went very far - turns out there’s a lot of tedious code involved - but this version was probably the furthest along before I stopped work.

    Despite appearing on this list, I think I’ll keep this one around. The coding might be tedious, but I still have need of something like this, and spending the time to build it properly might be worth it one day.

    Auto-generated description: A terminal window displays log messages with levels and a table summarizing error, ID, level, message, and time values.
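
    To give a rough idea of the core of it (a from-scratch sketch, not the code in the screenshot above), reading zerolog-style JSON lines and laying a few common fields out as columns only takes a handful of lines; it’s everything around that - scrolling, filtering, handling arbitrary fields - that gets tedious:

    package main
    
    import (
        "bufio"
        "encoding/json"
        "fmt"
        "os"
    )
    
    // Read JSON log lines (e.g. zerolog output) from stdin and print a few
    // fields as columns. Field names assume zerolog's defaults.
    func main() {
        fmt.Printf("%-25s  %-7s  %s\n", "TIME", "LEVEL", "MESSAGE")
    
        sc := bufio.NewScanner(os.Stdin)
        for sc.Scan() {
            var entry map[string]any
            if err := json.Unmarshal(sc.Bytes(), &entry); err != nil {
                continue // not a JSON line; skip it
            }
            fmt.Printf("%-25v  %-7v  %v\n", entry["time"], entry["level"], entry["message"])
        }
    }

    Piping a service’s output through it (something like my-app 2>&1 | slog) is how I’d imagine using it.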

    Miscellany

    Here are all the others that didn’t even get to the point that warranted a screenshot or a paragraph of text:

    • s3-browse: a TUI tool for browsing S3 buckets. This didn’t go beyond simply listing the files of a directory.
    • scorepeer: An attempt to make a collection of online score-cards much like the Finska one I built.
    • withenv: Preconfigure the environment for a command with the values of an .env file (there must be something out there that does this already; there’s a rough sketch of the idea below this list).
    • About 3 aborted attempts to make a wiki-style site using Hugo (one called "Techknow Space", which I thought was pretty clever).
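
    As a quick illustration of the withenv idea mentioned above (hypothetical code, not anything recovered from the original attempt), the whole thing boils down to reading KEY=VALUE lines and handing them to the child process:

    package main
    
    import (
        "bufio"
        "fmt"
        "os"
        "os/exec"
        "strings"
    )
    
    // withenv: run a command with the contents of .env added to its environment.
    func main() {
        if len(os.Args) < 2 {
            fmt.Fprintln(os.Stderr, "usage: withenv command [args...]")
            os.Exit(1)
        }
    
        env := os.Environ()
        if f, err := os.Open(".env"); err == nil {
            defer f.Close()
            sc := bufio.NewScanner(f)
            for sc.Scan() {
                line := strings.TrimSpace(sc.Text())
                if line == "" || strings.HasPrefix(line, "#") {
                    continue // skip blanks and comments
                }
                env = append(env, line) // lines are expected to be KEY=VALUE
            }
        }
    
        cmd := exec.Command(os.Args[1], os.Args[2:]...)
        cmd.Env = env
        cmd.Stdin, cmd.Stdout, cmd.Stderr = os.Stdin, os.Stdout, os.Stderr
        if err := cmd.Run(); err != nil {
            os.Exit(1)
        }
    }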

    I’m sure there’ll be more projects down the line that would receive the same treatment as these, so expect similar posts in the future.


    1. Or possibly a Peertube client. ↩︎

    A Summer Theme

    Made a slight tweak to my blog’s theme today, to “celebrate” the start of summer.

    I wanted a colour scheme that felt suitable for the season, which usually means hot, dry conditions. I went with one that uses yellow and brown as the primary colours. I suppose red would’ve been a more traditional representation of “hot”, but yellow felt like a better choice to evoke the sensation of dry vegetation. I did want to make it subtle though: it’s easy for a really saturated yellow to be quite garish, especially when used as a background.

    My original idea was to use yellow as the link colour, but there wasn’t a good shade that worked well with a white background and had decent contrast1. So I pivoted, making the background yellow instead, and throwing in a brown for the link colour. That improved the contrast dramatically, and helped to make the theme a little more interesting.

    One thing I did do was make it conditional on the data-theme attribute in the html tag, leaving me the option of adding a theme picker in the future. If you’re interested in the CSS, here it is:

    :root[data-theme="summer"] {
        --background: #FFFEF5;
        --link: #895238;
        --link_visited: #895238;
    }
    
    @media (prefers-color-scheme: dark) {
        :root[data-theme="summer"] {
            --text: #f8f8f2;
            --link: #fce98a;
            --link_visited: #fce98a;
            --background: #30302a;
        }
    }
    

    I plan to keep this theme for the next three months, then look at changing it again when summer turns into autumn. It’s probably not a great colour scheme, as I generally don’t have the patience for making minute adjustments to get the style “just right”. I guess it follows on from my feeling of the season: I generally don’t like summer and I just want to get it over with. Perhaps doing something small like this is a way of enjoying it a little more.


    1. It was much easier for the dark theme. ↩︎

    Delta of the Defaults 2024

    It’s a little over a year since Dual of the Defaults, and I see that Robb and Maique are posting their updates for 2024, so I thought I’d do the same. There’ve only been a few changes since last year, so much like Robb, I’m only posting the delta:

    • Notes: Obsidian for work. Notion for personal use if the note is long-lived. But I’ve started using Micro.blog notes and Strata for the more short-term notes I make from day to day.
    • To-do: To-dos are still going where my notes go, which is now Strata. Although I’m still using Google Keep for shopping lists.
    • Browser: Safari’s out from all machines. It’s Vivaldi everywhere now. Except iPad, where I don’t have a choice.
    • Presentations: Believe it or not, I haven’t had to make a presentation since last year. I am still paying for iA Presenter, and despite some thoughts, I think I’ll continue to use it for future presentations. I should also add that I’m using iA Writer for prose writing, not that I do much of that either.
    • Social Clients: Still Tusky for now, but I’m wondering how long it’ll be before I install a BlueSky client too.
    • Journalling App: I didn’t include this in last year’s list, but it’s worth mentioning here as I’ve moved away from Day One to a home grown web-app, similar to the one built by Kev Quirk.
    • POSSE: Micro.blog, EchoFeed. Also a new category, now that I’m doing this a bit more nowadays.

    Looking At Coolify

    While reading Robb Knight’s post about setting up GoToSocial in Coolify, I got curious as to what this Coolify project actually is. I’m a happy user of Dokku, but being one with magpie tendencies, plus always on the lookout for ways to make the little apps I make for myself easier to deploy, I thought I’d check it out.

    So I spun up a Coolify instance on a new Hetzner server this morning and tried deploying a simple Go app, complete with automatic deployments when I push changes to a Forgejo repository. And yeah, I must say it works pretty well. I haven’t done anything super sophisticated, such as setting up a database or anything. But it’s almost as easy as deploying something with Dokku, and I’m pleased that I was able to get it working with my Forgejo setup1.

    Anyway, this post is just a few things I want to make a note of for next time I want to set up a Coolify instance. It’s far from a comprehensive set-up guide: there’s plenty of documentation on the project website. But here are a few things I’d like to remember.

    Changing the Proxy to Caddy: Soon after setting up your Coolify instance, you probably want to change the proxy to Caddy, just so that you can easily get Let’s Encrypt certificates. Do this before you set up a domain, as you’ll need direct access to Coolify via the port.

    Go to “Servers → localhost” and in the “Proxy” tab, stop the current proxy. You then have the option of changing it to Caddy.

    Setting Up a Domain For Coolify Itself: Once you’ve changed the proxy, you’ll probably want to set up a domain so as to avoid accessing it via IP address and port number. You can do so by going to “Settings” and, within “Instance Settings”, changing “Domain”.

    If you prepend your domain with https, a certificate will be set up for you. I’m guessing it’s using Let’s Encrypt for this, which is fine. I’d probably do likewise if I had to set it up manually.
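
    Before prepending https, it’s worth checking that the domain already resolves to the server, since Let’s Encrypt needs to be able to reach Coolify at that name. A quick check with dig (the domain below is just a placeholder) looks something like this:

    # Check that the domain's A record points at the Coolify server.
    # "coolify.example.com" is a placeholder; expect your server's public IP in the output.
    dig +short coolify.example.com A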

    Deploying From a Private Forgejo Repository: To deploy from a private Forgejo repository, follow the Gitea integration instructions on setting up a private key. This basically involves creating a new key in “Keys And Tokens” and adding it as a key in Forgejo.

    The Add Key modal showing options to generate an RSA or elliptic curve key
    The Add Key modal

    As far as I’m aware, it’s not possible to change an application source from a public Git repo to a private one. I tried that and I got a few deploy errors, most likely because I didn’t set the key. I had to delete it and start from scratch.

    Setting a Domain For a Project: Setting up a domain is pretty simple: just add a new A record pointing to the IP address of the server the application is running on. Much like the Coolify domain, prefacing your domain with https will provision a TLS certificate for you (docs):

    The Domain settings for the deployable project resource
    The Domain settings for the deployable project resource

    Unlike Dokku, your app doesn’t need to support the PORT environment variable. You should be able to start listening on a port and simply set up a mapping on the project page. The default seems to be port 3000, just in case you’re not interested in changing it.
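
    Once the domain and port mapping are in place, a quick curl against the new domain (again, a placeholder below) is an easy way to confirm that the proxy is routing to the app and that the certificate was issued:

    # Expect your app's response (and a valid certificate) rather than a proxy error page.
    # "app.example.com" is a placeholder for the domain you configured on the resource.
    curl -i https://app.example.com/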

    Automatic Deployments From Forgejo: Coolify looks to have some nice integrations with GitHub, but that doesn’t help me and my use of Forgejo. So the setup is a little more manual: adding a web-hook to automatically deploy when pushing commits to Forgejo. In Coolify, you’d want to use the Gitea web-hook:

    The web-hook settings for the deployable project resource, with the Gitea web-hook highlighted
    The Gitea web-hook is the one to use

    You’ll need to generate the web-hook secret yourself. Running head -c 64 /dev/urandom | base64 or similar should give you something somewhat secure.
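
    For what it’s worth, openssl is another option for the “or similar” part; it also spits out a random base64 string that works fine as a secret:

    # Another way to generate a random, base64-encoded web-hook secret.
    openssl rand -base64 48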

    Setting up the web-hook on Forgejo’s side was a little confusing. Clicking “Add Webhook” just brought up a list of integrations, which I’m guessing are geared towards particular forms of web-hook payload. You want to select the “Forgejo” one.

    Project web-hooks in Forgejo, with the Gitea web-hook URL from Coolify set as the target URL, the secret set, and everything else left as the default
    How the web-hook looks on Forgejo's side

    Use the URL that Coolify shows for the Gitea web-hook, leave the method as “POST”, and set the secret you generated. The rest you can configure based on your preferences.
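
    If you’d rather script this step than click through Forgejo’s UI, something along these lines should create an equivalent push web-hook, assuming Forgejo still mirrors Gitea’s web-hook API here. The domain, repository path, token, and placeholder values are all just illustrations, and the accepted “type” value may differ between versions:

    # Create the push web-hook via the Gitea-compatible API instead of the Forgejo UI.
    # All values below are placeholders; FORGEJO_TOKEN is a personal access token.
    curl -X POST "https://forgejo.example.com/api/v1/repos/<owner>/<repo>/hooks" \
      -H "Authorization: token $FORGEJO_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{
            "type": "gitea",
            "active": true,
            "events": ["push"],
            "config": {
              "url": "<the Gitea web-hook URL shown in Coolify>",
              "content_type": "json",
              "secret": "<the secret you generated>"
            }
          }'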

    So that’s it. So far I’m liking it quite a bit, and I look forward to going a bit further than simple Go apps that serve a static message (some of the pre-canned applications look interesting). I’d like to try it for a bit longer before I consider it as a replacement for Dokku, but I suspect that may eventually happen.

    A screenshot of a browser window with a plain text message saying: 'Hello World. This is deployed via Coolify via a private repo that is auto-deployed.'
    Hello Coolify

    1. I say “it’s almost as easy” as Dokku, but one thing going for Coolify is that I don’t need to SSH into a Linux box to do things. When it comes to creating and operating these apps, doing it from a dashboard is a nicer experience. ↩︎

    Cropping A "Horizontal" PocketCast Clip To An Actual Horizontal Video

    Finally fixed the issue I was having with my ffmpeg incantation to crop a PocketCast clip. When I was uploading the clip to Micro.blog, the video wasn’t showing up. The audio was fine, but all I got for the visuals was a blank void1.

    For those that are unaware, clips from PocketCast are always generated as vertical videos. You can change how the artwork is presented between vertical, horizontal, or square; but that doesn’t change the dimensions of the video itself. It just centers it in a vertical video geared towards TikTok, or whatever the equivalent clones are.

    This, I did not care for. So I wanted to find a way to crop the videos to dimensions I find more reasonable (read: horizontal).

    Here’s the ffmpeg command I’m using to do so. This takes a video of the “horizontal” PocketCast clip type and basically does a crop at the centre to produce a video with the 16:9 aspect ratio. This post shows how the cropped video turns out.

    ffmpeg -i <in-file> \
      -vf "crop=iw:iw*9/16:(iw-ow)/2:(ih-oh)/2, scale=640:360" \
      -vcodec libx264 -c:a copy <out-file>
    

    Anyway, back to the issue I was having. I suspect the cause was that the crop was producing a video with an uneven width. When I uploaded the cropped video to Micro.blog, I saw in the logs that Micro.blog was downscaling the video to a height of 360. This was using a version of the command that didn’t have the scale filter, and the original clip was 1920 x 1080. If you downscale it while maintaining the original 16:9 aspect ratio, the new dimensions should be 640 x 360. But for some reason, the actual width of the cropped video was 639 instead.

    I’m not sure if this was the actual problem. I had no trouble playing the odd-width video in QuickTime. The only hint I had that this might be a problem was when I tried downscaling in ffmpeg myself, and ffmpeg threw up an error complaining that the width was not divisible by two. After forcing the video size to 640 x 360, and uploading it to Micro.blog, the video started coming through again. So there might be something there.
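
    If this ever comes up again, ffprobe is a quick way to check the actual dimensions of the cropped file before uploading it anywhere (the filename here is just a placeholder):

    # Print the width and height of the first video stream, e.g. "640x360".
    ffprobe -v error -select_streams v:0 \
      -show_entries stream=width,height -of csv=s=x:p=0 cropped-clip.mp4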

    Anyway, it’s working now. And as with everything involving ffmpeg, once you get something working, you never touch it again. 😄


    1. Not that there’s much to see. It’s just the podcast artwork. Not even a rendered scrubber. ↩︎

    WeblogPoMo AMA #3: Best Music Experience

    I’m on a roll with these, but I must warn you, this streak may end at any time. Anyway, today’s question is from Hiro, who asked it of Gabz, and which I discovered via Robb:

    @gabz What’s the best music-related experience of your life so far?

    Despite attending only a handful of concerts in my life (live music is not really my jam), I’ve had some pretty wonderful music-related experiences, both through listening to music and by performing it. Probably my most memorable experience was playing in the pit orchestra for our Year 10 production of Pippin. This was during the last few weeks before the show opened, and we attended a music camp for a weekend to do full-day rehearsals with the music director. The director had a reputation of being a bit of a hard man, prone to getting a bit angry, and not afraid to raise his voice. It was intimidating to me at the time, but in hindsight I can appreciate that he was trying to get the best from us. And with us being a group of teenage boys prone to losing focus, I think we were deserving of his wrath.

    One evening, we were rehearsing late, and the director was spending a lot of time going through some aspect of the music. I can’t remember what was being discussed, but it was one of those times where everyone was tired, yet each of us knew what we were meant to be doing and was still happy to be working. You feel something special in those moments, when the group is doing its best, not out of coercion but because everyone is trying to “get the work done”.

    Probably a very close second was discovering Mike Oldfield for the first time. This was probably when I was 11 or 12, and I wasn’t a big music listener back then (I mean, we did have a home stereo, but I wasn’t listening to a walkman or anything like that). Dad was working one night and I came up to him. He then started playing track 1 of Tubular Bells II, thinking that I would appreciate it. I was more intrigued than anything at first, as it wasn’t the type of music I was used to at the time: long, instrumental pieces. Yet I found it to be decent, and something I could see myself liking in the future1. He then played track 7, and I was absolutely hooked after that.


    1. In my experience, the tracks that take some time to grow to like turn out to be the best ones to listen to. ↩︎
