Learning turbulence closures via nudging sidesteps solver backprop
A data-assimilation-inspired approach trains neural network turbulence models on DNS data without embedding them in solvers, reducing computational cost and improving stability.
Source: arxiv/cs.LG · Ashwin Suriyanarayanan, Melissa Adrian, Dibyajyoti Chakraborty, Romit Maulik
Nudging-based training lets neural turbulence closures learn from DNS data without costly backpropagation through the solver or the stability issues that come with it.
- A-posteriori learning embeds neural closures in the solver but requires expensive backpropagation through solver steps and can destabilize training.
- A-priori learning uses filtered DNS data directly but assumes filter properties that do not match the effects of the actual numerical discretization.
- Continuous data assimilation (nudging) treats DNS as sparse observations and trains closures offline without modifying the solver (a minimal code sketch follows this list).
- The nudging approach avoids adjoints, reduces computational cost, and maintains long-term stability when the closure is deployed in LES.
- The learned model generalizes across numerical schemes and temporal discretizations better than traditional closure models.
- The neural network never has to be embedded inside the solver, lowering the barrier to adoption in existing simulation codes.
- Addresses the mismatch between assumed filter properties and real numerical discretization errors that destabilizes standard approaches.
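To make the workflow concrete, here is a minimal sketch of nudging-based closure training. Every specific choice in it is an assumption for illustration, not taken from the paper: a toy 1D periodic Burgers equation as the coarse model, synthetic placeholder "DNS" snapshots, a constant nudging strength `mu`, and a small PyTorch MLP as the closure.

```python
# Sketch: continuous-data-assimilation (nudging) training of a neural closure.
# Toy setup; the coarse model, DNS placeholder, mu, and MLP are illustrative only.
import numpy as np
import torch
import torch.nn as nn

N, L, nu, dt, mu = 64, 2 * np.pi, 1e-2, 1e-3, 5.0  # mu = nudging strength (assumed)
x = np.linspace(0.0, L, N, endpoint=False)
dx = L / N

def coarse_rhs(u):
    """Resolved coarse-grid tendency: -u u_x + nu u_xx (central differences, periodic)."""
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    return -u * ux + nu * uxx

# Placeholder "DNS" snapshots on the coarse grid; in practice these would be
# filtered or interpolated fields from a stored DNS trajectory.
steps = 2000
u_dns = np.sin(x)[None, :] * np.cos(0.5 * dt * np.arange(steps))[:, None]

# Nudged coarse run: the relaxation term stands in for the missing subgrid closure.
u = u_dns[0].copy()
states, targets = [], []
for n in range(1, steps):
    nudge = mu * (u_dns[n] - u)      # forcing toward the DNS "observations"
    states.append(u.copy())
    targets.append(nudge.copy())     # recorded as the closure tendency to learn
    u = u + dt * (coarse_rhs(u) + nudge)

# Offline supervised fit: no backpropagation through the solver, no adjoints.
X = torch.tensor(np.array(states), dtype=torch.float32)
Y = torch.tensor(np.array(targets), dtype=torch.float32)
closure = nn.Sequential(nn.Linear(N, 128), nn.ReLU(), nn.Linear(128, N))
opt = torch.optim.Adam(closure.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(closure(X), Y)
    loss.backward()
    opt.step()
# At deployment, closure(u) replaces the nudging term in the unmodified LES time loop.
```

The point of the sketch is the decoupling: the only gradients computed are through the small network during the offline regression, never through the time-stepping solver.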
Astrobobo tool mapping
- Knowledge Capture: Record the key insight that nudging decouples closure training from solver internals. Capture the three-step workflow (DNS data → nudging loss → offline model) as a reusable template for your domain.
- Focus Brief: Summarize the paper's claim and limitations in a one-page brief for your team. Highlight that this method trades end-to-end differentiability for stability and ease of integration.
- Reading Queue: Queue related papers on continuous data assimilation and neural operator learning to understand the broader context of offline physics-informed training.
Frequently asked
- What is nudging, and why use it here? Nudging, or continuous data assimilation, treats high-fidelity DNS data as sparse observations and uses a forcing term to guide a coarse-grid model toward those observations. In this work, it lets a neural network closure learn the required subgrid stress without being embedded inside the LES solver, reducing computational cost and avoiding the stability issues caused by filter mismatch (a schematic form of the nudged equation is given below).
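In generic notation (not necessarily the paper's symbols), the nudged coarse-grid equation adds a relaxation term of strength $\mu$ toward the DNS field mapped onto the coarse grid by an interpolation or filtering operator $\mathcal{I}_h$:

```latex
\frac{\partial \bar{u}}{\partial t}
  = \mathcal{N}(\bar{u})
  + \mu \left( \mathcal{I}_h(u_{\mathrm{DNS}}) - \bar{u} \right)
```

Here $\mathcal{N}$ denotes the resolved coarse dynamics; the recorded nudging tendency $\mu(\mathcal{I}_h(u_{\mathrm{DNS}}) - \bar{u})$ supplies the supervised target for the neural closure, which replaces the relaxation term once deployed in LES.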
Cite
APA
Suriyanarayanan, A., Adrian, M., Chakraborty, D., & Maulik, R. (2026, April 28). Learning turbulence closures via nudging sidesteps solver backprop. Astrobobo Content Engine (rewrite of arxiv/cs.LG). https://astrobobo-content-engine.vercel.app/article/learning-turbulence-closures-via-nudging-sidesteps-solver-backprop-dee209
MLA
Suriyanarayanan, Ashwin, et al. "Learning turbulence closures via nudging sidesteps solver backprop." Astrobobo Content Engine, 28 Apr. 2026, https://astrobobo-content-engine.vercel.app/article/learning-turbulence-closures-via-nudging-sidesteps-solver-backprop-dee209. Based on "arxiv/cs.LG", https://arxiv.org/abs/2604.23874.
BibTeX
@misc{astrobobo_learning-turbulence-closures-via-nudging-sidesteps-solver-backprop-dee209_2026,
author = {Suriyanarayanan, Ashwin and Adrian, Melissa and Chakraborty, Dibyajyoti and Maulik, Romit},
title = {Learning turbulence closures via nudging sidesteps solver backprop},
year = {2026},
url = {https://astrobobo-content-engine.vercel.app/article/learning-turbulence-closures-via-nudging-sidesteps-solver-backprop-dee209},
note = {Astrobobo rewrite of arxiv/cs.LG, https://arxiv.org/abs/2604.23874},
}