
    Observing the clustering properties of galaxy clusters in dynamical dark-energy cosmologies

    We study the clustering properties of galaxy clusters expected to be observed by various forthcoming surveys, both in the X-ray and, through the thermal Sunyaev-Zel'dovich effect, in the sub-mm regime. Several different background cosmological models are assumed, including the concordance ΛCDM and various cosmologies with dynamical evolution of the dark energy. Particular attention is paid to models with a significant contribution of dark energy at early times, which affects the process of structure formation. Past light-cone and selection effects in cluster catalogues are carefully modeled by realistic scaling relations between cluster mass and observables and by properly taking into account the selection functions of the different instruments. The results show that early dark-energy models are expected to produce significantly lower values of the effective bias and of both the spatial and angular correlation amplitudes with respect to the standard ΛCDM model. Among the cluster catalogues studied in this work, those based on eRosita, Planck, and South Pole Telescope observations turn out to be the most promising for distinguishing between the various dark-energy models. Comment: 16 pages, 10 figures. A&A in press.
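
    For reference, the effective bias and clustering amplitude of a selected cluster sample are conventionally defined as follows; this is the standard textbook form, written in LaTeX, and is only assumed to match the conventions adopted in the paper, with M_lim(z) standing for the mass limit implied by each survey's selection function:

        b_{\rm eff}(z) \;=\; \frac{\int_{M_{\rm lim}(z)}^{\infty} b(M,z)\, n(M,z)\, \mathrm{d}M}{\int_{M_{\rm lim}(z)}^{\infty} n(M,z)\, \mathrm{d}M},
        \qquad
        \xi_{\rm cl}(r,z) \;\simeq\; b_{\rm eff}^{2}(z)\, \xi_{\rm m}(r,z),

    where n(M,z) is the halo mass function, b(M,z) the linear halo bias, and \xi_{\rm m} the matter correlation function. Early dark energy modifies n(M,z) and b(M,z), which is why such models lower both the effective bias and the spatial and angular correlation amplitudes of the observed samples.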

    The XMM Cluster Survey: The interplay between the brightest cluster galaxy and the intra-cluster medium via AGN feedback

    Using a sample of 123 X-ray clusters and groups drawn from the XMM Cluster Survey first data release, we investigate the interplay between the brightest cluster galaxy (BCG), its black hole, and the intra-cluster/group medium (ICM). It appears that for groups and clusters with a BCG likely to host significant AGN feedback, gas cooling dominates in those with Tx > 2 keV, while AGN feedback dominates below this temperature. This may be understood through the sub-unity exponent found in the scaling relation we derive between the BCG mass and cluster mass over the halo mass range 10^13 < M500 < 10^15 Msol, together with the lack of correlation between radio luminosity and cluster mass, such that BCG AGN in groups can have a relatively more energetic influence on the ICM. The Lx-Tx relation for systems with the most massive BCGs, or those with BCGs co-located with the peak of the ICM emission, is steeper than that for those with the least massive and most offset BCGs, which instead follows self-similarity. This is evidence that a combination of central gas cooling and powerful, well-fuelled AGN causes the departure of the ICM from pure gravitational heating, with the steepened relation crossing self-similarity at Tx = 2 keV. Importantly, regardless of their black hole mass, BCGs are more likely to host radio-loud AGN if they are in a massive cluster (Tx > 2 keV) and again co-located with an effective fuel supply of dense, cooling gas. This demonstrates that the most massive black holes appear to know more about their host cluster than they do about their host galaxy. The results lead us to propose a physically motivated, empirical definition of 'cluster' and 'group', delineated at 2 keV. Comment: Accepted for publication in MNRAS; replaced to match corrected proof.
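
    The argument about the sub-unity exponent can be made explicit with a one-line scaling (the symbol \beta is introduced here for illustration and is not taken from the paper): if M_{\rm BCG} \propto M_{500}^{\beta} with \beta < 1, then

        \frac{M_{\rm BCG}}{M_{500}} \;\propto\; M_{500}^{\,\beta - 1}, \qquad \beta - 1 < 0,

    so the BCG, and hence the energy its AGN can inject, is relatively larger compared with its host halo in low-mass groups than in massive clusters, consistent with AGN feedback dominating below Tx = 2 keV.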

    Hybridizing and applying computational intelligence techniques

    As computers are increasingly relied upon to perform tasks of increasing complexity affecting many aspects of society, it is imperative that the underlying computational methods performing those tasks offer high effectiveness and scalability. A common solution employed to perform such complex tasks is the use of computational intelligence (CI) techniques. CI techniques use approaches influenced by nature to solve problems in which traditional modeling approaches fail due to impracticality, intractability, or mathematical ill-posedness. While CI techniques can perform considerably better than traditional modeling approaches when solving complex problems, the scalability of a given CI technique alone is not always optimal. Hybridization is a popular process by which a better-performing CI technique is created from the combination of multiple existing techniques in a logical manner. In the first paper in this thesis, a novel hybridization of two CI techniques, accuracy-based learning classifier systems (XCS) and cluster analysis, is presented that improves upon the efficiency and, in some cases, the effectiveness of XCS. A number of tasks in software engineering are performed manually, such as defining the expected output in model transformation testing. Especially as the number and size of projects that rely on such manual tasks continue to grow, it is critical that automated approaches are employed to reduce or eliminate the manual effort so that these tasks scale efficiently. The second paper in this thesis details a novel application of a CI technique, multi-objective simulated annealing, to the task of test case model generation, reducing the effort required to manually update the expected transformation output. --Abstract, page iv
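
    As an illustration of the technique named above, the following is a minimal, generic Python sketch of multi-objective simulated annealing using weighted-sum scalarization; the objective functions, neighbour operator, and cooling parameters are placeholders, not the thesis's actual implementation:

        import math
        import random

        def weighted_cost(solution, objectives, weights):
            # Scalarize several objectives into one cost (one common MOSA variant).
            return sum(w * f(solution) for f, w in zip(objectives, weights))

        def simulated_annealing(initial, neighbour, objectives, weights,
                                t_start=1.0, t_end=1e-3, cooling=0.95, steps_per_t=50):
            current = initial
            current_cost = weighted_cost(current, objectives, weights)
            best, best_cost = current, current_cost
            t = t_start
            while t > t_end:
                for _ in range(steps_per_t):
                    candidate = neighbour(current)  # random local modification
                    cand_cost = weighted_cost(candidate, objectives, weights)
                    delta = cand_cost - current_cost
                    # Always accept improvements; accept worse moves with Boltzmann probability.
                    if delta < 0 or random.random() < math.exp(-delta / t):
                        current, current_cost = candidate, cand_cost
                        if current_cost < best_cost:
                            best, best_cost = current, current_cost
                t *= cooling  # geometric cooling schedule
            return best, best_cost

    The caller supplies an initial solution, a neighbour function that perturbs it, and the list of objective functions with their weights; how the thesis actually encodes test case models and objectives is not stated in the abstract.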

    A brief history of learning classifier systems: from CS-1 to XCS and its variants

    The direction set by Wilson’s XCS is that modern Learning Classifier Systems can be characterized by their use of rule accuracy as the utility metric for the search algorithm(s) discovering useful rules. Such searching typically takes place within the restricted space of co-active rules for efficiency. This paper gives an overview of the evolution of Learning Classifier Systems up to XCS, and then of some of the subsequent extensions of Wilson’s algorithm to different types of learning.
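
    As background to the accuracy-based utility metric mentioned above, the fitness update in Wilson's XCS is commonly written as follows (standard XCS notation; shown here only for orientation, since the paper itself is a survey):

        \kappa_j = \begin{cases} 1 & \text{if } \varepsilon_j < \varepsilon_0 \\ \alpha\,(\varepsilon_j/\varepsilon_0)^{-\nu} & \text{otherwise} \end{cases},
        \qquad
        \kappa_j' = \frac{\kappa_j}{\sum_{k \in [A]} \kappa_k},
        \qquad
        F_j \leftarrow F_j + \beta\,(\kappa_j' - F_j),

    where \varepsilon_j is a rule's prediction error, \varepsilon_0 the error tolerance, [A] the current action set of co-active rules, and F_j the fitness used by the genetic search, so rules are selected for how accurately they predict payoff rather than for how large the payoff is.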

    A Serendipitous Galaxy Cluster Survey with XMM: Expected Catalogue Properties and Scientific Applications

    This paper describes a serendipitous galaxy cluster survey that we plan to conduct with the XMM X-ray satellite. We have modeled the expected properties of such a survey for three different cosmological models, using an extended Press-Schechter (Press & Schechter 1974) formalism, combined with a detailed characterization of the expected capabilities of the EPIC camera on board XMM. We estimate that, over the ten-year design lifetime of XMM, the EPIC camera will image a total of ~800 square degrees in fields suitable for the serendipitous detection of clusters of galaxies. For the presently favored low-density model with a cosmological constant, our simulations predict that this survey area would yield a catalogue of more than 8000 clusters, ranging from poor to very rich systems, with around 750 detections above z=1. A low-density open Universe yields similar numbers, though with a different redshift distribution, while a critical-density Universe gives considerably fewer clusters. This dependence of catalogue properties on cosmology means that the proposed survey will place strong constraints on the values of Omega-Matter and Omega-Lambda. The survey would also facilitate a variety of follow-up projects, including the quantification of evolution in the cluster X-ray luminosity-temperature relation, the study of high-redshift galaxies via gravitational lensing, follow-up observations of the Sunyaev-Zel'dovich effect and foreground analyses of cosmic microwave background maps. Comment: Accepted to ApJ. Minor changes, e.g. presentation of temperature errors as a figure (rather than as a table). LaTeX (20 pages, 6 figures, uses emulateapj.sty).
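
    For reference, the extended Press-Schechter machinery referred to above predicts the expected cluster counts schematically as follows (standard notation; the EPIC capabilities enter only through a limiting mass M_lim(z), so this is an outline of the method rather than the paper's exact prescription):

        \frac{\mathrm{d}N}{\mathrm{d}z\,\mathrm{d}\Omega} = \int_{M_{\rm lim}(z)}^{\infty} n(M,z)\, \frac{\mathrm{d}V}{\mathrm{d}z\,\mathrm{d}\Omega}\, \mathrm{d}M,
        \qquad
        n(M,z)\,\mathrm{d}M = \sqrt{\frac{2}{\pi}}\, \frac{\bar{\rho}_0}{M}\, \frac{\delta_c(z)}{\sigma^2(M)} \left|\frac{\mathrm{d}\sigma}{\mathrm{d}M}\right| \exp\!\left(-\frac{\delta_c^2(z)}{2\sigma^2(M)}\right) \mathrm{d}M,

    where \sigma(M) is the rms linear density fluctuation on mass scale M, \delta_c(z) the linear collapse threshold, and dV/(dz dΩ) the comoving volume element. The cosmological dependence of all three ingredients is what makes the catalogue counts and redshift distribution sensitive to Omega-Matter and Omega-Lambda.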

    Controlled self-organisation using learning classifier systems

    As the complexity of technical systems increases, breakdowns occur more and more often. The mission of organic computing is to tame these challenges by providing degrees of freedom for self-organised behaviour. To achieve these goals, new methods have to be developed. The proposed observer/controller architecture constitutes one way to achieve controlled self-organisation. To improve its design, multi-agent scenarios are investigated. In particular, learning using learning classifier systems is addressed.

    Massive Science with VO and Grids

    There is a growing need for massive computational resources for the analysis of new astronomical datasets. To tackle this problem, we present here our first steps towards marrying two new and emerging technologies: the Virtual Observatory (e.g. AstroGrid) and the computational grid (e.g. TeraGrid, COSMOS, etc.). We discuss the construction of VOTechBroker, which is a modular software tool designed to abstract the tasks of submission and management of a large number of computational jobs to a distributed computer system. The broker will also interact with the AstroGrid workflow and MySpace environments. We discuss our planned usages of the VOTechBroker in computing a huge number of n-point correlation functions from the SDSS data and massive model-fitting of millions of CMBfast models to WMAP data. We also discuss other applications, including the determination of the XMM Cluster Survey selection function and the construction of new WMAP maps. Comment: Invited talk at the ADASS XV conference, published as ASP Conference Series, Vol. XXX, 2005, C. Gabriel, C. Arviset, D. Ponz and E. Solano, eds. 9 pages.
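
    As an example of the kind of embarrassingly parallel workload such a broker distributes, the two-point case of the n-point correlation functions mentioned above is typically estimated from pair counts; the abstract does not say which estimator the authors use, so the Landy-Szalay form below is only a representative choice:

        \hat{\xi}(r) = \frac{DD(r) - 2\,DR(r) + RR(r)}{RR(r)},

    where DD, DR and RR are the normalised data-data, data-random and random-random pair counts in separation bin r. Each separation bin and each random realisation can be evaluated as an independent job, which is the sort of bulk submission and management task a broker like VOTechBroker is meant to abstract.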