AI is a new substrate for algorithms
Because we all wanted a new edition of Cracking the Coding Interview
Originally posted on LinkedIn.
For decades, good algorithms were shaped by the silicon running them. Cache lines, branch prediction, pipeline stalls: the hardware's characteristics dictated the code.
AI algorithms deserve the same mindset.
LLMs have performance characteristics too. Fast and tireless on some things, unreliable on others. Better at generation than verification, or sometimes the reverse. Context has a sweet spot. Token costs and latency stack fast. The algorithms we build on top of models should be shaped by all of this.
When I think about an AI system at Levelpath, it mostly comes down to two decisions: how to decompose the problem, and how to structure the creator/evaluator loop around each piece. Decompose too coarsely and the model drowns; too finely and orchestration overhead eats the gains. Pair a strong creator with a weak evaluator and you ship garbage confidently; flip it and you burn tokens forever.
What makes this harder is that these characteristics keep moving. Context windows grow. Reasoning gets cheaper. What was a 10-step decomposition last year is one prompt today. We design for current models, knowing we can pull constraints away as the substrate evolves.
The software engineer’s job is changing, and the shape of the new job is this: designing algorithms for the AI substrate.
