The wide range of scales of motion present in turbulent flow poses a major challenge for the prediction of fluid-dynamical processes. Feasible simulation strategies require solving computational problems with reduced complexity, for example by filtering out the smaller scales of motion and/or numerically solving the governing equations on a coarse grid. These necessary simplifications induce errors in the flow prediction, which can be mitigated by so-called closure models. In this presentation, we will explain how complexity reduction motivates the use of i) (data-driven) stochastic closure models to account for discretisation errors and ii) data assimilation methods to steer predictions towards observations. In particular, we focus on assimilating flow statistics to compensate for the prediction error, which can be measured relatively straightforwardly when the flow is in a statistically steady state. Additionally, if historical flow data is available, these flow statistics can be modelled as time series and thereby provide ‘synthetic data’. In turn, the stochastic model together with a data assimilation scheme provides a computationally efficient stand-alone model for coarse-grained turbulence.
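
To make the idea concrete, the following Python sketch illustrates, in a toy setting and not as the presented method, how a stochastic closure term and statistics-based data assimilation can be combined: a single coarse-grained mode is driven by a stochastic forcing (standing in for the discretisation error), and its drift is nudged so that a running energy statistic is steered towards a reference value. All model choices and parameter values (nu, sigma, E_ref, gain, tau_stat) are illustrative assumptions.

    # Toy sketch (illustrative only): one coarse-grained mode with a stochastic
    # closure term, nudged so that a running energy statistic approaches a
    # reference statistic. All parameters are assumed values for illustration.
    import numpy as np

    rng = np.random.default_rng(0)

    dt = 1e-2        # coarse time step
    n_steps = 200_000
    nu = 0.5         # linear damping of the resolved mode
    sigma = 0.3      # amplitude of the stochastic closure (models discretisation error)
    E_ref = 0.08     # reference statistic: mean energy <u^2>/2 from reference data
    gain = 20.0      # strength of the statistics-based nudging
    tau_stat = 5.0   # averaging timescale of the running statistic

    u = 0.0          # coarse-grained state
    E_run = E_ref    # running (exponentially weighted) estimate of <u^2>/2

    for _ in range(n_steps):
        # update the running statistic with an exponential moving average
        E_run += (0.5 * u**2 - E_run) * dt / tau_stat

        # proportional nudging: adjust the damping so that the modelled statistic
        # relaxes towards the reference (a small offset remains for finite gain)
        nu_eff = nu - gain * (E_ref - E_run)

        # Euler-Maruyama step: nudged drift plus stochastic closure forcing
        u += -nu_eff * u * dt + sigma * np.sqrt(dt) * rng.standard_normal()

    print(f"reference mean energy {E_ref:.3f}, modelled mean energy {E_run:.3f}")

In this sketch the nudging acts on a statistic rather than on the instantaneous state, mirroring the idea of assimilating flow statistics in a statistically steady state; without the nudging term, the toy mode would equilibrate at a mean energy set solely by the damping and the stochastic closure amplitude.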