The buoyant rise of hot plasma bubbles inflated by active galactic nuclei outflows in galaxy clusters can heat the cluster gas and thereby compensate for the radiative energy losses of this material. Numerical simulations of this effect often show the complete disruption of the bubbles, followed by the mixing of the bubble material with the surrounding cluster gas, due to fluid instabilities on the bubble surface. This prediction is inconsistent with observations of apparently coherent bubble structures in clusters. We derive a general linear-regime description of the growth of instabilities on the surface between two fluids under the influence of a gravitational field, viscosity, surface tension provided by a magnetic field, and relative motion of the two fluids with respect to each other. We demonstrate that Kelvin–Helmholtz instabilities are always suppressed if the fluids are viscous. They are also suppressed in the inviscid case for fluids of very different mass densities. We show that the effects of shear viscosity, as well as a magnetic field in the cluster gas, can prevent the growth of Rayleigh–Taylor instabilities on relevant scalelengths. Rayleigh–Taylor instabilities on parsec scales are suppressed even if the kinematic viscosity of the cluster gas is reduced by two orders of magnitude compared to the value given by Spitzer for a fully ionized, unmagnetized gas. Similarly, magnetic fields exceeding a few μG result in an effective surface tension preventing the disruption of bubbles. For more massive clusters, instabilities on the bubble surface grow faster. This may explain the absence of thermal gas in the north-west bubble observed in the Perseus cluster compared to the apparently more disrupted bubbles in the Virgo cluster.
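As a minimal sketch of the interplay described above, the following evaluates the classical inviscid dispersion relation for a plane interface between two fluids, combining the Kelvin–Helmholtz (shear), Rayleigh–Taylor (buoyancy), and surface-tension terms; a magnetic field parallel to the interface enters as an effective surface tension. This is an illustration under textbook assumptions, not the paper's full viscous analysis, and all numerical inputs are hypothetical.

```python
def n_squared(k, rho1, rho2, dU, g, T):
    """Square of the linear growth rate for a perturbation of wavenumber k.

    Fluid 1 lies below fluid 2; gravity g > 0 points downward into fluid 1.
    With fractional densities a_i = rho_i / (rho1 + rho2), the classical
    inviscid relation reads
        n^2 = k^2 a1 a2 dU^2 + g k (a2 - a1) - k^3 T / (rho1 + rho2),
    and n_squared > 0 signals an unstable (exponentially growing) mode.
    """
    a1 = rho1 / (rho1 + rho2)
    a2 = rho2 / (rho1 + rho2)
    return (k**2 * a1 * a2 * dU**2          # Kelvin-Helmholtz (shear) term
            + g * k * (a2 - a1)             # Rayleigh-Taylor (buoyancy) term
            - k**3 * T / (rho1 + rho2))     # surface tension (e.g. magnetic)

# A light fluid below a dense one (rho1 < rho2) is RT-unstable without tension:
print(n_squared(1.0, 0.1, 1.0, 0.0, 1.0, 0.0) > 0)   # True
# Sufficient (e.g. magnetic) surface tension stabilizes the same mode:
print(n_squared(1.0, 0.1, 1.0, 0.0, 1.0, 10.0) < 0)  # True
```

The three terms make the qualitative results of the abstract explicit: shear drives growth, a stable stratification (a1 > a2) damps it, and surface tension suppresses short-wavelength (large-k) modes most strongly.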