Observations show an almost ubiquitous presence of extra mixing in low-mass
upper giant branch stars. The most commonly invoked explanation for this is the
thermohaline instability. One-dimensional stellar evolution models include
prescriptions for thermohaline mixing, but our ability to make direct
comparisons between models and observations has thus far been limited. Here, we
propose a new framework to facilitate direct comparison: using carbon-to-nitrogen
abundance ratios from the SDSS-IV APOGEE survey as a probe of mixing, together with
a fluid parameter known as the reduced density ratio computed from one-dimensional
stellar evolution programs, we compare the observed amount of extra mixing on the upper
giant branch to predicted trends from three-dimensional fluid dynamics
simulations. By applying this method, we are able to place empirical
constraints on the efficiency of mixing across a range of masses and
metallicities. We find that the observed amount of extra mixing is strongly
correlated with the reduced density ratio and that trends between the reduced
density ratio and fundamental stellar parameters are robust across choices of
modeling prescription. We show that stars with available mixing data tend to
have relatively low density ratios, which should inform the regimes selected
for future simulation efforts. Finally, we show that there is increased mixing
at low values of the reduced density ratio, which is consistent with current
hydrodynamical models of the thermohaline instability. The introduction of this
framework sets a new standard for theoretical modeling efforts, as it makes possible
validation not only of the amount of extra mixing but also of trends between the
degree of extra mixing and fundamental stellar parameters.
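For readers unfamiliar with the quantity, the following is a minimal sketch of how the reduced density ratio is commonly defined in the fingering-convection literature; the symbols R_0, r, tau, and the gradient and diffusivity notation below follow that common usage and are assumptions here, as the paper's own conventions may differ slightly:

\[
R_0 \equiv \frac{\nabla - \nabla_{\mathrm{ad}}}{(\varphi/\delta)\,\nabla_{\mu}},
\qquad
r \equiv \frac{R_0 - 1}{\tau^{-1} - 1},
\qquad
\tau \equiv \frac{\kappa_{\mu}}{\kappa_{T}},
\]

where \(\nabla\), \(\nabla_{\mathrm{ad}}\), and \(\nabla_{\mu}\) are the actual, adiabatic, and composition gradients, and \(\kappa_{\mu}\) and \(\kappa_{T}\) are the compositional and thermal diffusivities. Under these conventions, thermohaline (fingering) instability is expected for \(1 < R_0 < \tau^{-1}\), i.e. \(0 < r < 1\), with \(r \to 0\) approaching the boundary of overturning convection and \(r \to 1\) approaching marginal stability, which is consistent with the abstract's statement that mixing increases at low values of the reduced density ratio.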