
    The XMM Cluster Survey: Forecasting cosmological and cluster scaling-relation parameter constraints

    We forecast the constraints on the values of sigma_8, Omega_m, and cluster scaling-relation parameters which we expect to obtain from the XMM Cluster Survey (XCS). We assume a flat Lambda-CDM Universe and perform a Markov chain Monte Carlo analysis of the evolution of the number density of galaxy clusters that takes into account a detailed simulated selection function. A comparison of our currently observed number of clusters with predictions shows good agreement. We determine the expected degradation of the constraints as a result of self-calibrating the luminosity-temperature relation (with scatter), including temperature measurement errors, and relying on photometric methods for the estimation of galaxy cluster redshifts. We examine the effects of systematic errors in scaling-relation and measurement-error assumptions. Using only (T,z) self-calibration, we expect to measure Omega_m to +-0.03 (and Omega_Lambda to the same accuracy assuming flatness), and sigma_8 to +-0.05, also constraining the normalization and slope of the luminosity-temperature relation to +-6 and +-13 per cent (at 1 sigma) respectively in the process. Self-calibration fails to jointly constrain the scatter and redshift evolution of the luminosity-temperature relation significantly; additional archival and/or follow-up data will improve on this. We do not expect measurement errors or imperfect knowledge of their distribution to degrade constraints significantly. Scaling-relation systematics can easily lead to cosmological constraints 2 sigma or more away from the fiducial model. Our treatment is the first exact treatment to this level of detail, and introduces a new 'smoothed ML' estimate of expected constraints. Comment: 28 pages, 17 figures. Revised version, as accepted for publication in MNRAS. High-resolution figures available at http://xcs-home.org (under "Publications").
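
    The forecasting method summarised above rests on a Markov chain Monte Carlo fit of predicted cluster number counts, with the luminosity-temperature scaling parameters sampled alongside the cosmological ones (self-calibration). A minimal sketch of that idea is given below; the binned counts, the toy scaling of the predicted counts with Omega_m, sigma_8 and the L-T parameters, and the plain Metropolis sampler are all illustrative assumptions, not the XCS analysis itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "observed" cluster counts in coarse (temperature, redshift) bins.
# The real forecast uses a detailed simulated selection function instead.
obs_counts = np.array([[40, 22, 9],
                       [18, 11, 4]])      # rows: T bins, columns: z bins

def predicted_counts(omega_m, sigma_8, lt_norm, lt_slope):
    """Toy expected counts: a fixed fiducial grid rescaled by the cosmological
    parameters and modulated by a power-law L-T normalisation and slope.
    This stands in for the real mass-function + selection-function integral."""
    base = np.array([[38.0, 20.0, 8.0],
                     [17.0, 10.0, 5.0]])
    boost = (sigma_8 / 0.8) ** 8 * (omega_m / 0.3) ** 2
    lt = lt_norm * np.array([1.0, lt_slope, lt_slope ** 2])
    return np.clip(base * boost * lt, 1e-3, None)

def log_like(theta):
    """Poisson log-likelihood of the binned counts (up to a constant)."""
    mu = predicted_counts(*theta)
    return np.sum(obs_counts * np.log(mu) - mu)

# Plain Metropolis sampler over (Omega_m, sigma_8, L-T norm, L-T slope):
# the scaling-relation parameters are sampled jointly, i.e. self-calibrated.
theta = np.array([0.3, 0.8, 1.0, 1.0])
step = np.array([0.01, 0.01, 0.02, 0.02])
log_p = log_like(theta)
chain = []
for _ in range(20000):
    proposal = theta + step * rng.normal(size=theta.size)
    log_p_new = log_like(proposal)
    if np.log(rng.uniform()) < log_p_new - log_p:
        theta, log_p = proposal, log_p_new
    chain.append(theta.copy())

chain = np.array(chain)[5000:]            # discard burn-in
print("posterior means:", chain.mean(axis=0))
print("posterior widths:", chain.std(axis=0))
```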

    A time series classifier

    A time series is a sequence of data measured at successive time intervals. Time series analysis refers to all of the methods employed to understand such data, either with the purpose of explaining the underlying system producing the data or to try to predict future data points in the time series...An evolutionary algorithm is a non-deterministic method of searching a solution space, modeled after biological evolutionary processes. A learning classifier system (LCS) is a form of evolutionary algorithm that operates on a population of mapping rules. We introduce the time series classifier TSC, a new type of LCS that allows for the modeling and prediction of time series data, derived from Wilson's XCSR, an LCS designed for use with real-valued inputs. Our method works by modifying the makeup of the rules in the LCS so that they are suitable for use on a time series...We tested TSC on real-world historical stock data --Abstract, page iii
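
    TSC is described as adapting XCSR's real-valued rule conditions to time-series inputs. The sketch below illustrates the general idea of interval-based rules matched against a sliding window of recent values, with covering when nothing matches; the window length, rule fields, and parameter values are assumptions for illustration and not the actual TSC design.

```python
import numpy as np
from dataclasses import dataclass

WINDOW = 5  # number of most recent observations a rule conditions on (assumed)

@dataclass
class Rule:
    """An XCSR-style rule: one (centre, spread) interval per window position."""
    centres: np.ndarray
    spreads: np.ndarray
    action: int              # e.g. 1 = predict "up", 0 = predict "down"
    prediction: float = 0.0
    fitness: float = 0.1

    def matches(self, window: np.ndarray) -> bool:
        return bool(np.all(np.abs(window - self.centres) <= self.spreads))

def cover(window: np.ndarray, action: int) -> Rule:
    """Create a rule whose intervals are centred on the unmatched window."""
    return Rule(centres=window.copy(),
                spreads=np.full(window.shape, 0.5),
                action=action)

def match_set(population: list, window: np.ndarray, rng) -> list:
    """Collect matching rules; trigger covering if the window matches nothing."""
    matched = [r for r in population if r.matches(window)]
    if not matched:
        new_rule = cover(window, action=int(rng.integers(2)))
        population.append(new_rule)
        matched = [new_rule]
    return matched

# Usage: slide a window over a toy price series and form match sets.
rng = np.random.default_rng(1)
prices = np.cumsum(rng.normal(size=100)) + 50.0
population = []
for t in range(WINDOW, len(prices)):
    match_set(population, prices[t - WINDOW:t], rng)
```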

    Architecting system of systems: artificial life analysis of financial market behavior

    This research study focuses on developing a framework that can be utilized by system architects to understand the emergent behavior of system architectures. The objective is to design a framework that is modular and flexible in providing different ways of modeling the sub-systems of a System of Systems. At the same time, the framework should capture the adaptive behavior of the system, since evolution is one of the key characteristics of a System of Systems. Another objective is to design the framework so that humans can be incorporated into the analysis. The framework should help system architects understand the behavior, as well as the promoters or inhibitors of change, in human systems. Computational intelligence tools have been successfully used in the analysis of Complex Adaptive Systems. Since a System of Systems is a collection of Complex Adaptive Systems, a framework utilizing a combination of these tools can be developed. Financial markets are selected to demonstrate the various architectures developed from the analysis framework --Introduction, page 3
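
    An analysis framework of this kind is typically realised as a set of pluggable agent sub-systems that share a common market environment, so that sub-systems can be swapped or combined. The sketch below shows only that general architectural pattern; the agent types, interfaces, and price-impact rule are assumptions for illustration and not the framework developed in this study.

```python
import random
from abc import ABC, abstractmethod

class AgentSystem(ABC):
    """One pluggable sub-system of the System of Systems (illustrative interface)."""
    @abstractmethod
    def act(self, price: float) -> float:
        """Return this sub-system's net demand (+ buy, - sell) at the current price."""

class NoiseTraders(AgentSystem):
    def act(self, price: float) -> float:
        return random.uniform(-1.0, 1.0)

class TrendFollowers(AgentSystem):
    def __init__(self):
        self.last_price = None
    def act(self, price: float) -> float:
        move = 0.0 if self.last_price is None else price - self.last_price
        self.last_price = price
        return 0.5 if move > 0 else -0.5 if move < 0 else 0.0

class Market:
    """Shared environment: aggregates demand from all sub-systems into a price move."""
    def __init__(self, systems, price=100.0, impact=0.01):
        self.systems, self.price, self.impact = systems, price, impact
    def step(self) -> float:
        demand = sum(s.act(self.price) for s in self.systems)
        self.price *= 1.0 + self.impact * demand
        return self.price

# Sub-systems can be swapped or extended (e.g. with human-in-the-loop agents)
# without touching the market environment, which is the modularity aimed at.
random.seed(2)
market = Market([NoiseTraders(), TrendFollowers()])
prices = [market.step() for _ in range(50)]
```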

    The XMM Cluster Survey: X-ray analysis methodology

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5,776 XMM observations used to construct the current XCS source catalogue. A total of 3,675 > 4-sigma cluster candidates with > 50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg^2. Of these, 993 candidates are detected with > 300 background-subtracted X-ray photon counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface-brightness fitting for these candidates, as well as to estimate redshifts from the X-ray data alone. A total of 587 (122) X-ray temperatures to a typical accuracy of < 40 (< 10) per cent have been measured to date. We also present the methodology adopted for determining the selection function of the survey, and show that the extended-source detection algorithm is robust to a range of cluster morphologies by inserting mock clusters derived from hydrodynamical simulations into real XMM images. These tests show that a simple isothermal beta-profile is sufficient to capture the essential details of the cluster population detected in the archival XMM observations. The redshift follow-up of the XCS cluster sample is presented in a companion paper, together with a first data release of 503 optically-confirmed clusters. Comment: MNRAS accepted, 45 pages, 38 figures. Our companion paper describing our optical analysis methodology and presenting a first set of confirmed clusters has now been submitted to MNRAS.
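
    The isothermal beta-model referred to above describes the cluster surface brightness as S(r) = S_0 * [1 + (r/r_c)^2]^(0.5 - 3*beta). A minimal fitting sketch is shown below; the synthetic radial profile and the use of scipy.optimize.curve_fit are illustrative assumptions, not the XCS surface-brightness pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def beta_profile(r, s0, r_c, beta):
    """Isothermal beta-model: S(r) = S0 * [1 + (r/r_c)^2]^(0.5 - 3*beta)."""
    return s0 * (1.0 + (r / r_c) ** 2) ** (0.5 - 3.0 * beta)

# Toy radial surface-brightness profile standing in for a real XMM one.
r = np.linspace(5.0, 300.0, 30)                       # radius in arcsec
true = beta_profile(r, s0=10.0, r_c=60.0, beta=0.67)
obs = true + np.random.default_rng(2).normal(scale=0.2, size=r.size)

# Least-squares fit of the three beta-model parameters.
popt, pcov = curve_fit(beta_profile, r, obs, p0=[5.0, 50.0, 0.6])
s0_fit, rc_fit, beta_fit = popt
print(f"S0 = {s0_fit:.2f}, r_c = {rc_fit:.1f} arcsec, beta = {beta_fit:.2f}")
```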

    The efficient market hypothesis through the eyes of an artificial technical analyst

    The academic literature has been reluctant to accept technical analysis as a rational strategy of traders in financial markets. In practice, traders and analysts heavily use technical analysis to make investment decisions. To resolve this incongruence, the aim of this study is to translate technical analysis into a rigorous formal framework and to investigate its potential failure or success. To avoid subjectivism, we design an Artificial Technical Analyst. The empirical study presents evidence of past market inefficiencies observed on the Tokyo Stock Exchange. The market can be perceived as inefficient if the technical analyst's transaction costs are below the break-even level derived from technical analysis.
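
    The break-even level mentioned above is the transaction cost at which a technical trading rule's gross profit is exactly exhausted, so the market looks inefficient only to traders whose costs fall below it. The sketch below computes such a break-even cost for a simple moving-average crossover rule on a synthetic price series; both the rule and the data are assumptions for illustration, not the Artificial Technical Analyst of the study.

```python
import numpy as np

def ma_crossover_positions(prices, short=10, long=50):
    """+1 (long) when the short moving average is above the long one, else 0 (flat)."""
    short_ma = np.convolve(prices, np.ones(short) / short, mode="valid")
    long_ma = np.convolve(prices, np.ones(long) / long, mode="valid")
    offset = long - short                       # align the two moving averages
    return (short_ma[offset:] > long_ma).astype(float)

def break_even_cost(prices, positions):
    """Per-trade cost (in log-return terms) at which gross profit is exactly consumed."""
    rets = np.diff(np.log(prices[-len(positions):]))
    gross = np.sum(positions[:-1] * rets)           # return earned while in the market
    n_trades = np.sum(np.abs(np.diff(positions)))   # each position change is one trade
    return gross / n_trades if n_trades > 0 else np.inf

rng = np.random.default_rng(3)
prices = 100.0 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, size=2000)))
pos = ma_crossover_positions(prices)
print(f"break-even one-way cost: {break_even_cost(prices, pos):.4%}")
```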

    Automatic synthesis of fuzzy systems: An evolutionary overview with a genetic programming perspective

    Studies in Evolutionary Fuzzy Systems (EFSs) began in the 1990s and have experienced fast development since then, with applications to areas such as pattern recognition, curve-fitting and regression, forecasting, and control. An EFS results from the combination of a Fuzzy Inference System (FIS) with an Evolutionary Algorithm (EA). This relationship can be established for multiple purposes: fine-tuning of the FIS's parameters, selection of fuzzy rules, learning a rule base or membership functions from scratch, and so forth. Each facet of this relationship creates a strand in the literature, such as membership-function fine-tuning or fuzzy rule-based learning, and the purpose here is to outline some of what has been done in each aspect. Special focus is given to Genetic Programming-based EFSs by providing a taxonomy of the main architectures available, as well as by pointing out the gaps that still prevail in the literature. The concluding remarks address further topics of current research and trends, such as interpretability analysis, multiobjective optimization, and synthesis of a FIS through evolving methods.
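
    One of the purposes listed above, evolutionary fine-tuning of a FIS's membership-function parameters, can be illustrated with a small sketch: a one-input fuzzy model whose triangular membership functions are tuned by a plain generational genetic algorithm against a toy regression target. All of the details (the FIS structure, the GA operators, the target function) are assumptions for illustration rather than a specific EFS from the literature.

```python
import numpy as np

rng = np.random.default_rng(4)

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fis_output(x, params):
    """One-input, two-rule FIS with weighted-average defuzzification.
    params = [a1, b1, c1, a2, b2, c2]; the rule consequents are fixed at 0 and 1."""
    low = tri(x, *params[:3])
    high = tri(x, *params[3:])
    return (low * 0.0 + high * 1.0) / (low + high + 1e-9)

# Target behaviour the FIS should reproduce (toy regression problem).
xs = np.linspace(0.0, 1.0, 101)
target = xs ** 2

def fitness(params):
    return -np.mean((fis_output(xs, params) - target) ** 2)    # higher is better

# Plain generational GA over the 6 membership-function parameters.
pop = rng.uniform(0.0, 1.0, size=(40, 6))
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]                    # truncation selection
    children = parents[rng.integers(0, 20, size=40)] + rng.normal(0.0, 0.05, size=(40, 6))
    pop = np.clip(children, 0.0, 1.0)                          # mutation + bounds

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("tuned membership-function parameters:", np.round(best, 3))
```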

    Design of learning classifier systems based on learning strategies

    In Learning Classifier Systems (LCSs), the learning strategy, which defines how an LCS covers the state-action space of a problem, can be one of the most fundamental options in designing an LCS. There has been no intensive study of the learning strategy that clarifies whether and how it affects the performance of LCSs, and this lack has resulted in a current design methodology for LCSs that does not carefully consider the type of learning strategy. This thesis argues for a design methodology of LCSs based on the learning strategy. That is, the thesis shows that the learning strategy can be an option that determines the potential performance of LCSs, and then claims that LCSs should be designed on the basis of the learning strategy in order to improve their performance. First, the thesis empirically shows that the current design methodology, which does not consider the learning strategy, can be too limited to design a proper LCS for a given problem; this supports the need for a design methodology based on the learning strategy. Next, the thesis presents an example of how an LCS can be designed on the basis of the learning strategy. It empirically shows that an adequate learning strategy, one that improves the performance of the LCS, can be chosen depending on the type of problem difficulty, such as missing attributes. The thesis then draws an inclusive guideline that explains which learning strategy should be used to address which types of problem difficulty. Finally, the thesis shows, for an application of an LCS to a human daily-activity recognition problem, that the adequate learning strategy according to the guideline effectively improves the performance of the application. The thesis concludes that the learning strategy is the option of LCS design that determines the potential performance of LCSs. Thus, before designing any type of LCS, including its applications, the learning strategy should be selected first, because performance degrades when an inadequate learning strategy is employed for the problem to be solved. In other words, LCSs should be designed on the basis of an adequate learning strategy. é›»æ°—é€šäżĄć€§ć­Š201
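
    The abstract does not spell out which learning strategies are compared; a common distinction in the LCS literature is between covering a complete action map (rules for every action in each encountered state) and a best action map (rules only for the action currently believed best). The sketch below illustrates only that distinction, with all details (ternary conditions, covering probability, toy problem) assumed for illustration.

```python
import random

STATE_BITS = 6
ACTIONS = (0, 1)

def general_condition(state):
    """Ternary condition: keep each bit or generalise it to '#' with some probability."""
    return "".join(b if random.random() > 0.33 else "#" for b in state)

def cover_complete(state):
    """Complete-action-map strategy: create one rule per possible action."""
    condition = general_condition(state)
    return [{"condition": condition, "action": a} for a in ACTIONS]

def cover_best(state, best_action):
    """Best-action-map strategy: create a rule only for the action believed best."""
    return [{"condition": general_condition(state), "action": best_action}]

def matches(rule, state):
    return all(c in ("#", s) for c, s in zip(rule["condition"], state))

# Usage: grow two toy populations by covering and compare their sizes.
random.seed(5)
complete_pop, best_pop = [], []
for _ in range(100):
    state = "".join(random.choice("01") for _ in range(STATE_BITS))
    if not any(matches(r, state) for r in complete_pop):
        complete_pop += cover_complete(state)
    if not any(matches(r, state) for r in best_pop):
        best_pop += cover_best(state, best_action=random.choice(ACTIONS))

print(len(complete_pop), "rules (complete map) vs", len(best_pop), "rules (best map)")
```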

    A discrete time approach to option pricing

