ai · 8 min read · Apr 28, 2026

Neural Networks and ODEs Compute Primitive Recursion via Dynamics, Not Composition

Bournez shows that recurrent ReLU networks, polynomial ODEs, and discrete polynomial maps all express primitive recursive functions through the evolution of fixed dynamical systems rather than through symbolic subroutine chaining.

Source: arxiv/cs.LG · Olivier Bournez

Recurrent ReLU networks, polynomial ODEs, and polynomial maps equivalently compute primitive recursion via bounded iteration and continuous dynamics.

  • All three frameworks—RNNs, polynomial ODEs, discrete maps—express primitive recursive functions through fixed dynamical systems.
  • Composition emerges from trajectory evolution, not from explicit closure rules or subroutine calls.
  • Time bounds are themselves primitive recursive; inputs are raw integer vectors.
  • Polynomial ODEs robustly perform rounding and phase selection via continuous flow; fixed polynomial maps cannot.
  • ReLU gates enable exact branching; step-size parameters in discrete maps recover continuous-time benefits with discretization trade-offs.
  • These models shape dynamical trajectories through clocks and error correction, structurally unlike symbolic programming.
  • Framework enables studying subrecursive hierarchies by restricting time, polynomial degree, or discretization resources.
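The "composition from trajectory evolution" idea in the bullets above can be sketched in a few lines. This is an illustrative toy, not the paper's construction: a single fixed update map, applied for a primitive-recursive number of steps, realizes the primitive recursion h(n+1) = f(n, h(n)) (here, factorial), and a ReLU gate gives exact branching (truncated subtraction) without any if-statement. All names are hypothetical.

```python
def relu(x):
    """ReLU gate: the nonlinearity that enables exact branching."""
    return max(x, 0.0)

def step(state):
    """One tick of a fixed update map on (counter, accumulator).

    The same map is applied at every step, so the 'program' lives in
    the dynamics of the trajectory, not in any subroutine structure.
    """
    k, acc = state
    return (k + 1.0, acc * (k + 1.0))

def factorial_by_dynamics(n):
    # The running time (n steps) is itself primitive recursive in n.
    state = (0.0, 1.0)
    for _ in range(n):
        state = step(state)
    return int(state[1])

def monus(a, b):
    # Truncated subtraction max(a - b, 0) via a single ReLU gate:
    # exact branching implemented purely by the activation function.
    return relu(a - b)
```

Note that `step` never inspects `n`; termination is imposed externally by the primitive recursive time bound, exactly as in the paper's bounded-iteration framing.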

Astrobobo tool mapping

  • Reading Queue: Add this paper to a queue tagged 'theory' and schedule 45 min to work through the main theorem and one equivalence proof (RNN or ODE).
  • Knowledge Capture: After reading, record the three key asymmetries (polynomial maps lack rounding, ReLU lacks continuous control, ODEs lack tunable step sizes) as a comparison table for future reference.
  • Focus Brief: Summarize in one page: 'How does my current neural architecture exploit or ignore dynamical properties?' Use the paper's framework to audit your model.

Frequently asked

  • Can a fixed polynomial ODE compute any primitive recursive function? Yes, according to Bournez's theorem. Any primitive recursive function can be compiled into a fixed polynomial ODE system with bounded iteration. The ODE operates on real-valued states and uses continuous-time flow to perform rounding and phase selection robustly, though the time bound itself must be primitive recursive.
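The rounding claim can be made concrete with a standard toy flow (this is an illustration, not the paper's specific ODE): the dynamics x' = -sin(2πx) has stable equilibria exactly at the integers, so following the flow "rounds" the state to the nearest integer robustly. The sine here is generated in the paper's setting by carrying (sin, cos) as extra polynomial-ODE state variables; for brevity this sketch calls `math.sin` directly and integrates with forward Euler.

```python
import math

def rounding_flow(x0, t_end=5.0, dt=1e-3):
    """Forward-Euler integration of x' = -sin(2*pi*x).

    Integers are stable fixed points and half-integers are unstable,
    so the trajectory converges to the nearest integer for any x0 not
    sitting exactly on a half-integer: rounding by continuous flow.
    """
    x = x0
    for _ in range(int(t_end / dt)):
        x -= dt * math.sin(2.0 * math.pi * x)
    return x
```

A fixed polynomial map has no analogous mechanism: applied once, it cannot contract every neighborhood of an integer onto that integer, which is the asymmetry the paper highlights.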
Cite
APA
Olivier Bournez. (2026, April 28). Neural Networks and ODEs Compute Primitive Recursion via Dynamics, Not Composition. Astrobobo Content Engine (rewrite of arxiv/cs.LG). https://astrobobo-content-engine.vercel.app/article/neural-networks-and-odes-compute-primitive-recursion-via-dynamics-not-compositio-104fc8
MLA
Olivier Bournez. "Neural Networks and ODEs Compute Primitive Recursion via Dynamics, Not Composition." Astrobobo Content Engine, 28 Apr 2026, https://astrobobo-content-engine.vercel.app/article/neural-networks-and-odes-compute-primitive-recursion-via-dynamics-not-compositio-104fc8. Based on "arxiv/cs.LG", https://arxiv.org/abs/2604.24356.
BibTeX
@misc{astrobobo_neural-networks-and-odes-compute-primitive-recursion-via-dynamics-not-compositio-104fc8_2026,
  author       = {Olivier Bournez},
  title        = {Neural Networks and ODEs Compute Primitive Recursion via Dynamics, Not Composition},
  year         = {2026},
  url          = {https://astrobobo-content-engine.vercel.app/article/neural-networks-and-odes-compute-primitive-recursion-via-dynamics-not-compositio-104fc8},
  note         = {Astrobobo rewrite of arxiv/cs.LG, https://arxiv.org/abs/2604.24356},
}
