A Dependence of the Tidal Disruption Event Rate on Global Stellar Surface Mass Density and Stellar Velocity Dispersion

Abstract

The rate of tidal disruption events (TDEs), R_TDE, is predicted to depend on stellar conditions near the supermassive black hole (SMBH), which are on difficult-to-measure sub-parsec scales. We test whether R_TDE depends on kpc-scale global galaxy properties, which are observable. We concentrate on stellar surface mass density, Σ_M*, and velocity dispersion, σ_v, which correlate with the stellar density and velocity dispersion of the stars around the SMBH. We consider 35 TDE candidates, with and without known X-ray emission. The hosts range from star-forming to quiescent to quiescent with strong Balmer absorption lines. The last (often with post-starburst spectra) are overrepresented in our sample by a factor of 35^{+21}_{-17} or 18^{+8}_{-7}, depending on the strength of the Hδ absorption line. For a subsample of hosts with homogeneous measurements, Σ_M* = 10^9-10^10 M_⊙/kpc^2, higher on average than for a volume-weighted control sample of Sloan Digital Sky Survey galaxies with similar redshifts and stellar masses. This is because (1) most of the TDE hosts here are quiescent galaxies, which tend to have higher Σ_M* than the star-forming galaxies that dominate the control sample, and (2) the star-forming hosts have higher average Σ_M* than the star-forming control galaxies. There is also a weak suggestion that TDE hosts have lower σ_v than the quiescent control galaxies. Assuming that R_TDE ∝ Σ_M*^α × σ_v^β, and applying a statistical model to the TDE hosts and control sample, we estimate α = 0.9 ± 0.2 and β = -1.0 ± 0.6. This is broadly consistent with R_TDE being tied to the dynamical relaxation of stars surrounding the SMBH.
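As a rough illustration of the scaling quoted above (a minimal Python sketch, not the paper's statistical model; the galaxy property values below are hypothetical), the best-fit exponents can be used to compare the implied relative TDE rates of two galaxies:

```python
# Hedged sketch, not code from the paper: evaluate the relative TDE rate
# implied by the power-law scaling R_TDE ∝ Σ_M*^α × σ_v^β, using the
# best-fit exponents α = 0.9 (±0.2) and β = -1.0 (±0.6) quoted in the abstract.
# The normalization is arbitrary, so only ratios between galaxies are meaningful.

ALPHA = 0.9    # exponent on stellar surface mass density Σ_M*
BETA = -1.0    # exponent on stellar velocity dispersion σ_v

def relative_tde_rate(sigma_mstar, sigma_v, alpha=ALPHA, beta=BETA):
    """Unnormalized relative TDE rate for Σ_M* in M_sun/kpc^2 and σ_v in km/s."""
    return sigma_mstar ** alpha * sigma_v ** beta

# Illustrative comparison: a dense, low-dispersion host vs. a typical control galaxy
# (both sets of input values are hypothetical, chosen only to show the scaling).
host = relative_tde_rate(1e10, 60.0)
control = relative_tde_rate(1e9, 100.0)
print(f"rate(host) / rate(control) ≈ {host / control:.1f}")
```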
