Laminarization strongly affects wall heat transfer and remains a challenging phenomenon for turbulence modeling.
Recent studies have shown that several well-known turbulence models fail to
provide accurate predictions when applied to mixed convection flows with
significant re-laminarization effects. One such model, a well-validated
cubic nonlinear eddy-viscosity model, was observed to miss this feature entirely.
This paper investigates the reasons for this failure through a detailed
comparison with the baseline Launder–Sharma model. The difference is attributed
to the method of near-wall damping. A range of tests has been conducted, and
two noteworthy findings are reported for the case of flow re-laminarization.
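To make the notion of near-wall damping concrete, recall that the Launder–Sharma model attenuates the eddy viscosity through a damping function of the local turbulent Reynolds number. Its standard published form (symbols as conventionally defined; this is background, not a result of the present paper) is:

```latex
\nu_t = C_\mu \, f_\mu \, \frac{k^2}{\tilde{\varepsilon}}, \qquad
f_\mu = \exp\!\left[\frac{-3.4}{\left(1 + \mathrm{Re}_t/50\right)^2}\right], \qquad
\mathrm{Re}_t = \frac{k^2}{\nu \, \tilde{\varepsilon}},
```

where $\tilde{\varepsilon}$ is the homogeneous dissipation rate. As $\mathrm{Re}_t$ falls near the wall during re-laminarization, $f_\mu$ drives $\nu_t$ toward zero, which is the mechanism a model must capture to reproduce the effect.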