
Velocity Is a System Property

35 commits. 23 hours. 11,139 lines inserted across 80 new files. Roughly 78 story points if you’re counting.

Here’s what shipped: a GPU-accelerated physarum simulation running WebGL2 compute shaders with dual CPU/GPU fallback. A desktop windowing system — draggable, z-ordered, state-persisted across sessions. An AI chatbot on Cloudflare Workers streaming SSE through a structured context layer. Cross-browser WebGL context recovery for Safari 18. Interactive shader controls with 12 real-time parameters and 4 respawn patterns. Media carousels with fullscreen lightbox. Responsive iPad layouts across 3 breakpoints. A feed page. SEO infrastructure. Performance passes — WebP conversion, latin-only font subsetting, resource prefetch.

One developer. Not a team. Not a sprint. One person and an AI, building from architecture to deployment in a single sustained session.

[Figure: commit breakdown — 35 commits · 23 hrs · 11,139 loc: desktop shell 5,553 · ai chatbot 2,088 · physarum gpu 1,824 · media + lightbox 563 · interactive mode 517 · safari webgl fix 125 · seo + perf 70]
23 hours. one gradient. 11,139 insertions.

The work came in bursts — a 3-hour foundation sprint at 3am, a 90-minute UI blitz, a 12-hour gap where the Safari WebGL bug lived rent-free in my head, then a final midnight session to wire up the chatbot. The cumulative pace: roughly 3.4 points per hour of active development.

[Figure: commit timeline, 3am–1am — foundation 18 pts · content 12 · ui + physics 26 · 4 · safari fix 6 · feed + chatbot 12 — ~78 pts over 23 hrs]
35 commits across 23 hours. each dot is a commit. each block is a sprint.

The easy explanation is “AI wrote the code.” That’s the wrong read.

AI doesn’t have taste. It doesn’t know that physarum should be the soul of the site while the chatbot is furniture. It doesn’t decide that windowing needs persistent z-ordering or that Safari’s WebGL context loss demands a four-commit investigation. It doesn’t feel the moment where the glass lens distortion shader turns a UI panel from flat to alive. The human provides the gradient — the direction of travel. AI provides the bandwidth to walk it without stopping to type.

What actually produced this velocity was architectural coherence.

Every component follows the same shape: Astro file, scoped styles, CSS custom properties, IntersectionObserver trigger. The desktop shell uses one drag handler everywhere. The physarum simulation abstracts GPU and CPU behind a clean interface so it degrades without branching. Blog visualizations — including the ones in this post — share one animation pattern: scroll-triggered, stroke-dasharray, staggered reveals. When the system is internally consistent, the number of unique decisions per feature drops. And compressed decisions are exactly what AI executes best — not invention, but pattern completion across a learnable surface.
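That shared animation pattern reduces to a small amount of pure style math: each SVG path hides itself by setting its dash length to its own path length, then transitions the dash offset to zero on a staggered delay once an IntersectionObserver fires. The sketch below is illustrative, not the site's actual code — the function names, duration, and stagger step are all assumptions.

```typescript
// Sketch of the shared reveal pattern: scroll-triggered, stroke-dasharray,
// staggered. An IntersectionObserver (not shown) would apply these styles
// when a figure enters the viewport. Names and constants are illustrative.

interface RevealStyle {
  strokeDasharray: string;   // dash length = path length, so one dash spans the path
  strokeDashoffset: string;  // target offset; the path starts fully offset (hidden)
  transition: string;        // staggered so sibling paths draw in sequence
}

function revealStyles(
  pathLengths: number[],
  durationMs = 900,
  stepMs = 120,
): RevealStyle[] {
  return pathLengths.map((len, i) => ({
    strokeDasharray: `${len}`,
    strokeDashoffset: "0", // animated from `${len}` down to 0 = fully drawn
    transition: `stroke-dashoffset ${durationMs}ms ease ${i * stepMs}ms`,
  }));
}
```

Because the styles are computed rather than hand-tuned per figure, every new visualization inherits the same motion for free — one fewer decision per feature.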

Velocity isn’t a property of the developer or the tool. It’s emergent from the system — the coupling between architectural intuition and execution bandwidth. Reduce the decision surface, and everything accelerates.

[Figure: decision surface — layout? animation? style? state? error? perf? a11y? — compressed into component · css vars · observer]
patterns compress decisions. decisions compress time.

The proof by failure

The four-commit Safari WebGL fix is the counterargument to “AI just generates code.”

View Transitions were silently destroying the WebGL context during page navigation. The first fix was wrong — it removed transition:persist from the canvas and added a CSS fallback as a context-loss safety net. The second commit was a revert. The third disabled native View Transitions on Safari 18 entirely, which worked for Chrome but broke the fallback animation. The fourth caught the edge case: persist the fallback=swap flag across navigations.

AI accelerated every iteration — generating context recovery handlers, testing lifecycle hooks, proposing alternate approaches. But the human had to diagnose. Had to understand that Safari’s rendering pipeline handles WebGL context ownership differently during CSS transitions. Velocity without comprehension is typing faster into the void.
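A context recovery handler of the kind mentioned above is only a few lines, which is exactly why AI can iterate on it quickly. This is a hedged sketch, not the site's implementation; the canvas is typed structurally (`CanvasLike` is an invented name) so the wiring is visible without DOM type definitions.

```typescript
// Sketch of a WebGL context-recovery handler. On loss, preventDefault()
// signals the browser that we intend to restore; on restore, GPU state
// (shaders, buffers, textures) must be rebuilt from scratch.
// CanvasLike and reinit are illustrative names, not the site's API.

type LossEvent = { preventDefault(): void };

interface CanvasLike {
  addEventListener(type: string, fn: (e: LossEvent) => void): void;
}

function installContextRecovery(canvas: CanvasLike, reinit: () => void): void {
  canvas.addEventListener("webglcontextlost", (e) => {
    e.preventDefault(); // without this, the browser never restores the context
  });
  canvas.addEventListener("webglcontextrestored", () => {
    reinit(); // recompile shaders, re-upload buffers and textures
  });
}
```

The handler is mechanical; the hard part — knowing that Safari drops the context during CSS-driven transitions at all — is the diagnosis the code can't do for you.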

[Figure: attempt → fail → revert → rethink → resolve]
attempt. fail. revert. rethink. resolve.

Emergence

The physarum simulation behind this site solves mazes. Not through intelligence — through emergence. Each agent follows three local rules: sense the trail ahead, turn toward concentration, deposit a trace. No agent knows the maze. The global solution appears anyway.
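The three rules fit in a single agent step. This is an illustrative CPU-side version — the real simulation runs these rules in shaders on the GPU with a CPU fallback, and every constant, name, and sensor angle below is an assumption, not the site's tuning.

```typescript
// One physarum agent step: sense the trail ahead, turn toward
// concentration, move, deposit a trace. Grid wraps at the edges.
// All constants are illustrative.

interface Agent { x: number; y: number; heading: number; }

const SENSOR_DIST = 9;     // how far ahead each sensor samples the trail
const SENSOR_ANGLE = 0.5;  // radians between the center and side sensors
const TURN_RATE = 0.3;     // radians turned per step
const DEPOSIT = 1.0;       // trail laid down each step

function sample(trail: Float32Array, w: number, h: number, x: number, y: number): number {
  const xi = ((Math.round(x) % w) + w) % w; // wrap around edges
  const yi = ((Math.round(y) % h) + h) % h;
  return trail[yi * w + xi];
}

function step(agent: Agent, trail: Float32Array, w: number, h: number): void {
  // 1. Sense the trail ahead: center, left, right.
  const sense = (offset: number) =>
    sample(trail, w, h,
      agent.x + Math.cos(agent.heading + offset) * SENSOR_DIST,
      agent.y + Math.sin(agent.heading + offset) * SENSOR_DIST);
  const c = sense(0), l = sense(SENSOR_ANGLE), r = sense(-SENSOR_ANGLE);

  // 2. Turn toward the highest concentration (random when both sides beat center).
  if (c < l && c < r) agent.heading += (Math.random() < 0.5 ? 1 : -1) * TURN_RATE;
  else if (l > r) agent.heading += TURN_RATE;
  else if (r > l) agent.heading -= TURN_RATE;

  // 3. Move forward and deposit a trace.
  agent.x = (agent.x + Math.cos(agent.heading) + w) % w;
  agent.y = (agent.y + Math.sin(agent.heading) + h) % h;
  trail[(Math.round(agent.y) % h) * w + (Math.round(agent.x) % w)] += DEPOSIT;
}
```

No line of this knows anything about mazes; run thousands of agents over a decaying, diffusing trail field and the paths appear anyway.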

Same principle here. Simple composable patterns. Clear architectural constraints. A human providing the gradient. An AI providing the execution surface. The velocity isn’t the achievement. The system that produces it is.

Incredible things happen when you stop trying to control the uncontrollable.