When I first saw Autechre's Gantz Graf in 2002 I assumed I was watching a data-driven pop video. I later learned this wasn't the case, and that the audio reactivity in the video was all hand-synced, frame by frame, by the director.
Fortunately, I didn't learn this until after I had decided I was gonna get in on the game and had begun working out how it was done. This is how fictions find their way into reality.
I started around 2009, and achieved my goal in 2013 when I made Prisms with 65daysofstatic, possibly the first authentically audio-driven algorithmic pop video. Scroll the page to see a few renders.
Since then hardware has accelerated and tools have evolved. What was once an overnight render in Cinema 4D can now run in realtime in Unreal Engine. And what I once needed C++ for I can now run in a Processing sketch.
I have my own tools too. "Mono" is the name of my live MIDI/OSC viz rig, and "Fourtrack" is my audio analysis system. Fourtrack works alone, or with Unity, Houdini, Unreal, Processing, or, theoretically, any other scriptable renderer.
This page is the family album for my tools. A few snapshots of their children. Some are loved only by their parents, but they're all part of the journey. Every year this trick gets easier. More is to come.