One other thing I found this morning during my exploration of Markdown and Asciidoc is that many tools have a problem with JSON code blocks containing JavaScript-like comments. They’re reported as syntax errors, and sometimes they break the syntax highlighting. They’re still included in the rendered HTML, but it feels to me like the tools do so begrudgingly. GitLab even marks them up with a red background colour.

Why so strict? The code blocks are for human consumption, and it’s really useful to annotate them occasionally. I always find myself adding remarks like “this is the new line”, or removing a large, irrelevant chunk of JSON and replacing it with an ellipsis to indicate that I’ve done so.

I know that some Markdown parsers support line annotations, but each one has a different syntax, and they don’t work for every annotation I want to make. But you know what does? Comments! I know how to write them, they’re easy to add, and they’re the same everywhere. Just let me use them in blocks of JSON code, please.

Oh, and also let me add trailing commas too.
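To illustrate, here’s the kind of annotated block I mean (the field names are made up for the example). Strict JSON parsers reject both the comment and the trailing comma, which is exactly the problem:

```json
{
  "retries": 3,
  // this is the new line
  "timeout": "30s",
}
```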

Asciidoc, Markdown, And Having It All

Took a brief look at Asciidoc this morning.

This is for that Markdown document I’ve been writing in Obsidian. I’ve been sharing it with others using PDF exports, but its importance has grown to the point where I need to start properly maintaining a change log. And also… sharing via PDF exports? What is this? Microsoft Word in the 2000s?

So I’m hoping to move it to a GitLab repo. GitLab does support Markdown with integrated Mermaid diagrams, but not Obsidian’s extension for callouts. I’d like to keep these callouts, as I’ve used them in quite a few places.

While browsing through GitLab’s help guide on Markdown extensions, I came across their support for Asciidoc. I haven’t tried Asciidoc before, and after taking a brief look at it, it seemed like a format better suited to the type of document I’m working on. It has an auto-generated table of contents, built-in support for callouts, proper title and heading separation; just features that work better than Markdown for long, technical documents. The syntax also supports a number of text-based diagram formats, including Mermaid.

However, as soon as I started porting the document over to Asciidoc, I found that it’s no Markdown in terms of mind share. Tool support is quite limited; in fact, it’s pretty bad. There’s nothing like iA Writer for Asciidoc, with the split-screen source text and live preview that updates as you make changes. There are loads of these tools for Markdown, so many that I can’t keep track of them (the name of the iA Writer alternative always eludes me).

Code editors should work, but they’re not perfect either. GoLand supports Asciidoc, but not with embedded Mermaid diagrams. At least not out of the box: I had to get a separate JAR, which took around 10 minutes to download. Even now I’m fighting with the IDE, trying to get it to find the Mermaid CLI tool so it can render the diagrams. I encountered none of these headaches when using Markdown: GoLand supports embedded Mermaid diagrams just fine. I guess I could try VS Code, but download it just for this one document? Hmm.

In theory the de facto CLI tool should work, but to get Mermaid diagrams working there I need to download a Ruby gem and bundle it with the CLI tool (this is in addition to the same Mermaid command-line tool GoLand needs). Why this isn’t bundled by default in the Homebrew distribution is beyond me.

So for now I’m abandoning my wish for callouts and just sticking with Markdown. This is probably the best option, even if you set tooling aside. After all, everyone knows Markdown, a characteristic of the format that I shouldn’t simply ignore. Especially for these technical documents, where others are expected to contribute changes as well.

It’s a bit of a shame though. I still think Asciidoc could be better for this form of writing. If only those who make writing tools would agree.

Addendum: after drafting this post, I found that GitLab actually supports auto-generated tables of contents in Markdown too. So while I may not have it all with Markdown (such as callouts), I can still have a lot.

Must say I enjoyed The Rest Is History’s recent podcast on Dragons. They go into how these mythical beasts developed over the years, how they’re seen differently in different cultures, and how they entered the mainstream. Just watch out for the odd spoiler for House of the Dragon series 1. 🎙️

Eight months in and I’m still enjoying writing technical documents in Obsidian. I’d never really appreciated how well it works for this form of writing. I wish we were using it for our knowledge base instead of Confluence.

Key ring.

Image of a keychain with the phrase 'Everyone I know has gone to Europe and all I got is all this work to do' with stars below Europe, and laptop and briefcase emoji below 'Work to do'

It’s always after you commit to a deadline that you find the tasks that you forgot to do.

I think if I ever created a Tetris game for the TI-83 graphing calculator, I would call it “Tetris Instruments.”

My Position On Blocking AI Web Crawlers

I’m seeing a lot of posts online about sites and hosting platforms blocking web crawlers used for AI training. I can completely understand their position, and fully support them: it’s their site and they can do what they want.

Allow me to lay my cards on the table. My current position is to allow these crawlers to access my content. I’m choosing to opt in, or rather, not to opt out. I’m probably in the minority here (well, the minority of those I follow), but I do have a few reasons for this, the principal one being that I use services like ChatGPT and get value from them. So preventing them from training their models on my posts feels hypocritical to me. It’s the same reason I don’t opt out of GitHub Copilot crawling my open-source projects (although that’s a little more theoretical, as I’m not a huge user of Copilot). To some, this position might sound weird, and when you consider the gulf between the value these AI companies get from scraping the web versus the value I get from them as a user, it may seem downright stupid. And if you approach it from a logical perspective, it probably is. But hey, we’re in the realm of feelings, and right now this is just how I feel. Of course, if I were to make a living out of this site, it would be a different story. But I don’t.

And this leads to the tension I see between site owners making decisions regarding their own content, and services making decisions on behalf of their users. This site lives on Micro.blog, so I’m governed by what Manton chooses to do or not do regarding these crawlers. I’m generally in favour of what Micro.blog has chosen so far: allowing people to block these scrapers via “robots.txt” but not yet blocking requests based on their IP address. I’m aware that others may not agree, and I can’t, in principle, reject the notion of a hosting provider choosing to block these crawlers at the network layer. I am, and will continue to be, a customer of such services.
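For reference, opting out via “robots.txt” is a two-line affair per crawler. A sketch for OpenAI’s crawler (GPTBot is the user agent OpenAI documents; other vendors publish their own):

```
User-agent: GPTBot
Disallow: /
```

It relies on the crawler honouring the file, of course, which is part of why some people want network-level blocks as well.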

But I do think some care should be taken, especially when it comes to customers (and non-customers) asking these services to add these network blocks. You may have good reason to demand this, but just remember there are users of these services whose opinions may differ. I’d personally prefer a mechanism where you opt into these crawlers, and that’s probably the option I’d take (or probably not; my position is not that strong). I know that’s not possible under all circumstances, so I’m not going to cry too much if all that’s offered is a blanket ban.

I will make a point about some comments I’ve seen that, taken uncharitably, imply that creators who have no problem with these crawlers don’t care about their content. I think such opinions should be worded carefully. I know how polarising the use of AI currently is, and making such remarks, particularly within posts that are already heated due to the author’s feelings regarding these crawlers, risks spreading that heat to those who read them. The tone gives the impression that creators okay with these crawlers don’t care about what they push online, or should care more than they do. That might be true for some (it might even be true for me once in a while), but making such blanket assumptions can come off as a little insulting. And look, I know that’s not what they’re saying, but it can come across that way at times.

Anyway, that’s my position as of today. Like most things here, it may change over time, and if I become disenchanted with these companies, I’ll join the blockade. But for the moment, I’m okay with sitting this one out.

Finally did something today that I should’ve done a long time ago: buy a UPS. Hopefully power outages will no longer bring down my Mac Mini server while I’m away (power is usually quite reliable when I’m home, but as soon as I leave for any extended period of time… 🪫).

A box containing a Digitech 650VA Line Interactive Uninterruptible Power Supply (UPS) is placed on a wooden kitchen table.

Sometimes I wonder how and why my work email address got onto various B2B marketing email lists. “Want to buy some network gear, or set up a meeting with our account manager?” What? No! Even if I wanted to, that’s not a decision I’m authorised to make.

In today’s demonstration of the gulf between taste and ability, may I present my attempt at fixing the fence extension:

Auto-generated description: A partially damaged wooden fence with a makeshift repair tied together using green string and an olive tree branch protruding through the slats.

Part of the challenge was getting to it. I had to hack out a path through the overgrown beds:

Auto-generated description: A fenced backyard features dense, overgrown plants with long, slender leaves amid a paved ground.

Trust me when I say that this is an improvement. 😅

Checked out of the Cockatiel Cafe and heading home to Melbourne. Always a little melancholy leaving Canberra, but I’m sure to be back soon enough. As for the “residents” I was looking after, I’ll be seeing them again real soon. More posts then, I’m sure.

Two cockatiels, one white and the other yellow, perched on a cage in a room.

One of these days, I’m going to make a change to a Dockerfile or a GitHub workflow, and it’s going to work the first time.

🔗 How the “Nutbush” became Australia’s unofficial national dance

It’s amusing: I grew up thinking everyone did this, right up until a few years ago when someone from overseas told me they’d never learnt the dance. Anyway, this is totally a thing. The last wedding I attended, we absolutely did the Nutbush. 😄

Been asked to do a routine task today. This is the fifth time I’ve started it, the fifth time I said to myself “hmm, I should probably automate this,” and the fifth time I just did it manually. Now wondering if that was time well spent.

Two columns, the left one with the heading 'This Universe' and five evenly sized boxes with the label 'Nah, it'll take too long. I'll do it next time', the right one with the heading 'Alternate Universe' with one large box 250% the height of the left boxes with the label 'Build the thing', and six small boxes taking up 50% the height of the left boxes

📝 New post on Sidetracks over at Workpad: Blogging Gallery Tool

macOS has cat, but not tac. Fortunately, Vim came to the rescue with this command:

:global/^/move 0

Source: Superuser
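For the terminal-inclined, the same result can be had without opening Vim. This is just a sketch using the classic sed idiom for reversing lines, which works with both BSD and GNU sed:

```shell
# Reverse lines without tac: G appends the hold space to each line,
# h saves the accumulated (reversed) buffer, and $!d suppresses
# everything except the final line, which holds the full reversal.
printf 'one\ntwo\nthree\n' | sed '1!G;h;$!d'
# prints: three, two, one (one per line)
```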

Thinking About Plugins In Go

Thought I’d give Go’s plugin package a try for something. It seems to work fine for absolutely simple things. But start importing any dependencies and it becomes a non-starter. You start seeing this sort of error message when you try to load the plugin:

plugin was built with a different version of package golang.org/x/sys/unix

Looks like the host and plugins need to have exactly the same dependencies. To be fair, the package documentation says as much, and also states that the best use of plugins is for dynamically loaded modules built from the same source. But that doesn’t help with what I’m trying to do, which is encoding a bunch of private struct types as Protobuf messages.

So it might be that I’ll need to find another approach. I wonder how others would do this. An embedded scripting language would probably not be suitable here, since I’m dealing with Protobuf and byte slices. Maybe building the plugin as a C shared object? That could work, but then I’d lose all the niceties that come from using Go’s type system.

Another option would be something like WASM. It’s interesting seeing WASM modules become a bit of a thing for plugin architectures. There’s even a Go runtime to host them. The only question is whether they’d have the same facilities as a regular process, like network access, or whether they’re completely sandboxed and you, as the plugin host, would need to add support for those facilities.

I guess I’d find out if I were to spend any more time looking at this. But this has been a big enough distraction already. Building a separate process to shell out to would work just fine, so that’s probably what I’ll ultimately do.
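Since I mentioned the shell-out approach: a minimal sketch of what that might look like, piping raw bytes to a helper process over stdin and reading the result from stdout. The real helper would be a small binary that speaks Protobuf; here I’m using cat as a stand-in so the sketch actually runs, and encodeViaHelper is a made-up name:

```go
package main

import (
	"bytes"
	"fmt"
	"os/exec"
)

// encodeViaHelper shells out to a helper binary, writing payload to its
// stdin and returning whatever it emits on stdout. In a real setup the
// helper would encode the bytes as a Protobuf message; the contract is
// just "bytes in, bytes out" over the standard streams.
func encodeViaHelper(helper string, payload []byte) ([]byte, error) {
	cmd := exec.Command(helper)
	cmd.Stdin = bytes.NewReader(payload)
	var out bytes.Buffer
	cmd.Stdout = &out
	if err := cmd.Run(); err != nil {
		return nil, err
	}
	return out.Bytes(), nil
}

func main() {
	// cat is a stand-in helper that simply echoes its input back.
	got, err := encodeViaHelper("cat", []byte("hello"))
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s\n", got)
}
```

The appeal over plugins is that the helper is a separate build with its own dependency tree, so the version-mismatch problem disappears entirely.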

It’s easy for me to say this now, but I would pay a non-zero number of dollars for a set of well-designed and well-curated sites that could replace Know Your Meme, Fandom, and all those song-lyric sites. I’d be fine if they also hosted ads, so long as there were only one or two, and none of them were video.

I’d be curious to know if there are any Go apps that use the plugin package. I’m not aware of any myself; most seem to use things like shell-outs or embedded languages. The package itself seems little more than an experiment, so I’m not that surprised, but it is a little disappointing.