The Alluring Trap Of Tying Your Fortunes To AI
Kind of realised something about these AI tools: it’s easy to tie your fortunes to the existence and operations of these models. This might seem like a good thing — for those unskilled at a particular domain, it may be possible to get something off the ground quickly. But it’s ultimately a trap: if what is produced is not what you’re looking for, and there’s no way to get it to be what you’re looking for, you’re really left with no recourse. You want or need to make a change, but you yourself don’t have the skills to produce what these models are producing: that’s probably why you’re using these models. So what are you going to do? Settle for what was produced? Try a different LLM?
Worse still, you’re dependent on these AI companies for as long as you need to produce this output. It might be that, in time, the use of these models becomes so widespread that relying on AI is similar to relying on a persistent internet connection. But I know for myself that I’m not comfortable with having this dependency in my life just yet.
This trap is obvious when you think about it for a few minutes. But it really hit home when I was tinkering with a logo design in ChatGPT and getting nowhere. I don’t possess the skills needed to produce such a logo myself, and I couldn’t get the LLM to change it the way that I wanted. So I either had to keep asking it to make corrections — “no, make it larger still” — or just accept it as it is.
That’s not to say I don’t think there’s room for AI in any area of production. I still think using it as an assistant, in areas where you yourself possess the skills to “take over” should you need to, is useful1. Or in situations where it doesn’t really matter if what is produced is not exactly what you or others expect.
That’s pretty much the case here: this logo I was trying to generate was just a bit of fun, and I haven’t tied my salary to the correct output of the model. But those that have — “vibe coders” and those starting tech businesses without any human coders — should probably beware. It’s an alluring trap, and it could spring when you least expect it.
-
I think GitHub naming their AI coding assistant “Copilot” is genius. The assistant can be involved in the creation of code, but at any time the human can “take over”, much like a pilot can “take over” control of a plane. ↩︎