In 1996 I knew nothing about music theory. Zero. I couldn't tell a major scale from a minor one. My knowledge of harmony was limited to "this sounds good" and "this sounds like crap." And yet, something inside me knew perfectly well when a 64-line pattern worked and when it didn't.
Fast Tracker 2 didn't compose music. It was a pattern editor with samples. You placed the notes, one by one, in a vertical grid. You decided which sample played on which channel. You adjusted the volume, panning, effects. The tool had no opinion. It didn't suggest anything. If you placed a note that sounded horrible, it sounded horrible. Period.
And yet, things like News-o-tronik, which placed second worldwide in Music Contest 6, came out of there. Over seventy songs that are still on ModArchive came out of there. A way of understanding music that I've never lost came out of there.
Now I use AI to work with audio. And the conversation I see on social media seems both familiar and absurd in equal measure.
"AI is going to replace musicians."
"AI has no soul."
"AI is cheating."
"AI is the future."
All of that sounds like the nineties debates about trackers and samplers. "That's not real music." "You don't play any instrument." "You just copy sounds from others."
Spoiler: the tool has never been the problem or the solution.
When I work on the sound atmosphere for The Bunker, I use AI to generate textures, ambiences, sound layers that I then process, cut, mix and destroy until they fit with what I have in my head. Sometimes I generate twenty variations of an ambient sound and throw them all away. Sometimes the first iteration gives me a clue I wouldn't have found on my own.
But AI doesn't know what I want. It doesn't know that the silence in the bunker has to feel heavy, not empty. It doesn't know that the water dripping has to be irregular in a specific way that generates anxiety without being annoying. It doesn't know that there's a difference between "unsettling" and "uncomfortable" and that I want the second one.
I know that. I knew it at nineteen moving samples around a grid, and I still know it now.
People talk about AI as if it were an entity with judgment. It's not. It's an options generator. A producer of raw material. Very fast, very abundant, very indifferent to whether what it produces is good or bad.
The judgment is still yours.
Fast Tracker 2 didn't compose music. I composed music with Fast Tracker 2. The difference seems semantic, but it's fundamental. The tool amplifies what you already have inside. If you have nothing, it amplifies the void.
I've seen people generate "music with AI" that sounds technically correct and emotionally dead. I've seen people use AI as a starting point to create things that gave me goosebumps. The difference isn't in the tool. It's in who uses it and for what.
In developing The Bunker, sound is as important as mechanics. The player spends long periods in relative silence. That silence has to have texture. It has to feel inhabited by something unseen. AI helps me iterate quickly on those textures. But the decision of which one works and which doesn't is still completely manual.
Thirty years ago, in a poorly lit room, I learned that the ear knows things that theory doesn't explain. That there's something in the combination of certain sounds that generates a physical, emotional response that you can't fully rationalize.
That hasn't changed.
Tools change. Judgment is trained over years. The ability to recognize when something resonates and when it doesn't is the only thing that really matters.
AI doesn't compose music.
Fast Tracker 2 didn't either.
I do.
And so do you, if you have something to say.