Buoyancy Instabilities in Galaxy Clusters: Convection Due to Adiabatic Cosmic Rays and Anisotropic Thermal Conduction

Abstract

Using a linear stability analysis and two- and three-dimensional nonlinear simulations, we study the physics of buoyancy instabilities in a combined thermal and relativistic (cosmic ray) plasma, motivated by the application to clusters of galaxies. We argue that cosmic ray diffusion is likely to be slow compared to the buoyancy time on large length scales, so that cosmic rays are effectively adiabatic. If the cosmic ray pressure $p_{\rm cr}$ is $\gtrsim 25\%$ of the thermal pressure, and the cosmic ray entropy ($p_{\rm cr}/\rho^{4/3}$, where $\rho$ is the thermal plasma density) decreases outwards, cosmic rays drive an adiabatic convective instability analogous to Schwarzschild convection in stars. Global simulations of galaxy cluster cores show that this instability saturates by reducing the cosmic ray entropy gradient and driving efficient convection and turbulent mixing. At larger radii in cluster cores, the thermal plasma is unstable to the heat-flux-driven buoyancy instability (HBI), a convective instability generated by anisotropic thermal conduction and a background conductive heat flux. Cosmic-ray-driven convection and the HBI may contribute to redistributing metals produced by Type Ia supernovae in clusters. Our calculations demonstrate that adiabatic simulations of galaxy clusters can artificially suppress the mixing of thermal and relativistic plasma; anisotropic thermal conduction allows more efficient mixing, which may contribute to cosmic rays being distributed throughout the cluster volume.

Comments: submitted to ApJ; 15 pages and 12 figures; abstract shortened to < 24 lines; for high-resolution movies see http://astro.berkeley.edu/~psharma/clustermovie.htm
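For reference, the convection criterion summarized in the abstract can be written in Schwarzschild-like form. This is a minimal restatement of the abstract's statement, not an equation taken from the paper itself: treating the cosmic rays as an adiabatic relativistic fluid with adiabatic index $4/3$, the plasma is convectively unstable when the cosmic ray entropy decreases outwards,

\[
\frac{d}{dr}\ln\!\left(\frac{p_{\rm cr}}{\rho^{4/3}}\right) < 0,
\]

the direct analogue of the Schwarzschild criterion $d\ln(p/\rho^{\gamma})/dr < 0$ for a thermal gas with adiabatic index $\gamma$. For context, the HBI is a distinct, conduction-mediated instability, generally understood to operate where the temperature increases outwards, as it does at larger radii in cluster cores.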
