
    Thermal Conduction and Multiphase Gas in Cluster Cores

    We examine the role of thermal conduction and magnetic fields in the cores of galaxy clusters through global simulations of the intracluster medium (ICM). In particular, we study the influence of thermal conduction, both isotropic and anisotropic, on the condensation of multiphase gas in cluster cores. Previous hydrodynamic simulations have shown that cold gas condenses out of the hot ICM in thermal balance only when the ratio of the cooling time ($t_{\rm cool}$) to the free-fall time ($t_{\rm ff}$) is less than $\approx 10$. Since thermal conduction is significant in the ICM and suppresses local cooling at small scales, it is imperative to include it in such studies. We find that anisotropic (along local magnetic field lines) thermal conduction does not influence the condensation criterion for a general magnetic geometry, even if the thermal conductivity is large. However, with isotropic thermal conduction cold gas condenses only if conduction is suppressed (by a factor $\lesssim 0.3$) with respect to the Spitzer value. Comment: 7 pages, 4 figures; replaced by the MNRAS-accepted version
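
    For reference, a common set of working definitions for the two timescales in this condensation criterion (a sketch following standard ICM conventions; the exact forms adopted in the simulations may differ) is

        t_{\rm cool} = \frac{(3/2)\, n\, k_B T}{n_e\, n_i\, \Lambda(T)}, \qquad
        t_{\rm ff} = \sqrt{\frac{2r}{g(r)}}, \qquad
        \min_r\!\left(\frac{t_{\rm cool}}{t_{\rm ff}}\right) \lesssim 10 \;\Rightarrow\; \text{multiphase condensation},

    where $n$ is the total particle number density, $n_e$ and $n_i$ are the electron and ion number densities, $\Lambda(T)$ is the cooling function, and $g(r)$ is the local gravitational acceleration at radius $r$.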

    Practical Bayesian optimization in the presence of outliers

    Inference in the presence of outliers is an important field of research, as outliers are ubiquitous and may arise across a variety of problems and domains. Bayesian optimization is a method that relies heavily on probabilistic inference. This enables outstanding sample efficiency, because the probabilistic machinery provides a memory of the whole optimization process. However, that virtue becomes a disadvantage when the memory is populated with outliers, inducing bias in the estimation. In this paper, we present an empirical evaluation of Bayesian optimization methods in the presence of outliers. The empirical evidence shows that Bayesian optimization with robust regression often produces suboptimal results. We then propose a new algorithm which combines robust regression (a Gaussian process with a Student-t likelihood) with outlier diagnostics to classify data points as outliers or inliers. By using a scheduler for the classification of outliers, our method is more efficient and converges better than standard robust regression. Furthermore, we show that even in controlled situations with no expected outliers, our method is able to produce better results. Comment: 10 pages (2 of references), 6 figures, 1 algorithm
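
    As a rough illustration of the scheduled outlier-classification loop described above, the sketch below is not the authors' implementation: the function names, the standardized-residual diagnostic, the expected-improvement acquisition, and the use of scikit-learn's GaussianProcessRegressor as a stand-in for the Student-t Gaussian process are all assumptions made for this sketch.

    # Minimal sketch of Bayesian optimization with a scheduled outlier diagnostic.
    # Assumptions: scikit-learn's GaussianProcessRegressor stands in for the paper's
    # Student-t GP; outliers are flagged via standardized residuals; the acquisition
    # is plain expected improvement. All names are illustrative, not the paper's API.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def classify_inliers(X, y, threshold=3.0):
        """Flag points whose standardized GP residual exceeds `threshold` as outliers."""
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
        mu, sigma = gp.predict(X, return_std=True)
        z = np.abs(y - mu) / np.maximum(sigma, 1e-9)
        return z < threshold  # boolean mask of inliers

    def expected_improvement(gp, X_cand, y_best):
        """Expected improvement for minimization."""
        mu, sigma = gp.predict(X_cand, return_std=True)
        sigma = np.maximum(sigma, 1e-9)
        gamma = (y_best - mu) / sigma
        return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))

    def robust_bo(objective, bounds, n_init=5, n_iter=30, diag_every=5, rng=None):
        rng = rng or np.random.default_rng(0)
        dim = bounds.shape[0]
        X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, dim))
        y = np.array([objective(x) for x in X])
        inliers = np.ones(len(y), dtype=bool)
        for t in range(n_iter):
            if t % diag_every == 0:            # the "scheduler": rerun diagnostics only occasionally
                inliers = classify_inliers(X, y)
            gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
            gp.fit(X[inliers], y[inliers])     # fit the surrogate on inliers only
            X_cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(1000, dim))
            x_next = X_cand[np.argmax(expected_improvement(gp, X_cand, y[inliers].min()))]
            X = np.vstack([X, x_next])
            y = np.append(y, objective(x_next))
            inliers = np.append(inliers, True) # treat new point as inlier until re-checked
        return X[np.argmin(y)], y.min()

    In this toy loop the diagnostic is re-run only every diag_every evaluations, which captures the scheduling idea (avoiding the cost of reclassifying the whole optimization memory at every iteration) without reproducing the paper's specific classifier or likelihood.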