
    Data based identification and prediction of nonlinear and complex dynamical systems

    We thank Dr. R. Yang (formerly at ASU), Dr. R.-Q. Su (formerly at ASU), and Mr. Zhesi Shen for their contributions to a number of original papers on which this Review is partly based. This work was supported by ARO under Grant No. W911NF-14-1-0504. W.-X. Wang was also supported by NSFC under Grants No. 61573064 and No. 61074116, as well as by the Fundamental Research Funds for the Central Universities and the Beijing Nova Programme. Peer reviewed. Postprint.

    Decision Support Based on Bio-PEPA Modeling and Decision Tree Induction: A New Approach, Applied to a Tuberculosis Case Study

    The problem of selecting the determinant features that generate an appropriate model structure is a challenge in epidemiological modelling. Disease spread is highly complex, and experts develop their understanding of its dynamics over years. There is an increasing variety and volume of epidemiological data, which adds to the potential confusion. We propose here to make use of that data to better understand disease systems. Decision tree techniques have been extensively used to extract pertinent information and improve decision making. In this paper, we propose an innovative structured approach combining decision tree induction with Bio-PEPA computational modelling, and illustrate the approach through application to tuberculosis. By using decision tree induction, the enhanced Bio-PEPA model shows considerable improvement over the initial model with regard to how well the simulated results match observed data. The key finding is that the developer can express a realistic predictive model using relevant features; used as decision support, this approach empowers the epidemiologist in policy decision making.
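    The decision tree induction step described above can be illustrated with a standard off-the-shelf learner. The sketch below is an assumption-level illustration using scikit-learn on a hypothetical tuberculosis surveillance table; the file name, feature columns and outcome coding are invented for the example and are not taken from the paper or its Bio-PEPA model.

```python
# Minimal sketch of decision tree induction on hypothetical surveillance data.
# The CSV layout and column names are illustrative assumptions; features are
# assumed to be already numerically encoded.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

data = pd.read_csv("tb_cases.csv")               # hypothetical dataset
X = data[["age_group", "hiv_status", "contact_count"]]
y = data["outcome"]                              # e.g. 0 = latent, 1 = active disease

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)

# Inspect the induced rules (the "pertinent information" fed back to the modeller)
print(export_text(tree, feature_names=list(X.columns)))
print("held-out accuracy:", tree.score(X_test, y_test))
```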

    Development and Construction of a new Photoelectron Imaging Spectrometer for Studying the Spectroscopy and Ultrafast Dynamics of Molecular Anions

    We present a detailed account of the development, construction, and commissioning of a new experiment for studying the spectroscopy and ultrafast dynamics of molecular anions in the gas phase. The new instrument incorporates: an electrospray ionisation source, which is capable of generating a vast class of molecular anions; a Wiley-McLaren time-of-flight mass spectrometer; and a compact photoelectron imaging arrangement for anions, which negates the need for pulsed high voltages. We use this instrument in conjunction with a femtosecond laser system to perform the first ultrafast time-resolved photoelectron imaging experiments on molecular anions generated through electrospray ionisation. A method for reconstructing three-dimensional charged-particle distributions from their associated two-dimensional projections on an imaging detector plane is described. This new method utilises: (1) onion-peeling in polar co-ordinates (POP) to perform the reconstruction; and (2) basis-set concepts to significantly enhance the algorithm's computational speed. We compare this new POP algorithm with other reconstruction algorithms and show that it matches the benchmark pBASEX method in accuracy. Importantly, it is also computationally fast, allowing images to be reconstructed as they are acquired in a typical imaging experiment. Original work is presented which investigates the spectroscopy and ultrafast excited-state dynamics of the 7,7,8,8-tetracyanoquinodimethane (TCNQ) radical anion. The photoelectron spectrum of TCNQ– is measured at 3.1 eV and used to gain insight into the electronic structure and geometries of both the anion and neutral states. Time-resolved photoelectron imaging experiments explore the relaxation dynamics of its first excited 1 2B3u state, which we show undergoes internal conversion back to the 2B2g ground state on a timescale of 650 fs. The results also provide evidence of wave packet motion on the excited state, with a characteristic frequency of 30 cm–1. Finally, we describe, for the first time, a formalism which allows ultrafast relaxation timescales to be extracted from the photoelectron angular distributions of isoenergetic photoelectron features. As an example, we use the time-resolved photoelectron angular distributions of a nearly isoenergetic feature in the photoelectron images of TCNQ–. From this model we extract a relaxation time for the 1 2B3u state which quantitatively agrees with those extracted from fits to the features in the photoelectron spectra derived from the images.
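    The 650 fs internal conversion timescale quoted above is the kind of quantity obtained by fitting the delay-dependent photoelectron signal. The sketch below is a generic, hedged illustration of such a fit on synthetic data; it is not the thesis's actual fitting procedure, which would normally also convolve the decay with the instrument response and can use the angular-distribution formalism described above.

```python
# Generic illustration: recover an excited-state lifetime by fitting a
# single-exponential decay to a time-resolved signal (synthetic data only).
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, tau, offset):
    # signal appears at time zero and decays exponentially with lifetime tau
    return offset + amplitude * np.where(t < 0, 0.0, np.exp(-t / tau))

# pump-probe delays in femtoseconds and integrated photoelectron intensity
delays = np.linspace(-200, 3000, 60)
rng = np.random.default_rng(1)
signal = decay(delays, 1.0, 650.0, 0.05) + rng.normal(0, 0.02, delays.size)

popt, pcov = curve_fit(decay, delays, signal, p0=(1.0, 500.0, 0.0))
print(f"fitted lifetime: {popt[1]:.0f} fs (value used to generate the data: 650 fs)")
```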

    Audiovisual processing for sports-video summarisation technology

    In this thesis a novel audiovisual feature-based scheme is proposed for the automatic summarisation of sports-video content. The scope of operability of the scheme is designed to encompass the wide variety of sports genres that come under the description ‘field-sports’. Given the assumption that, in terms of conveying the narrative of a field-sports-video, score-update events constitute the most significant moments, it is proposed that their detection should yield a favourable summarisation solution. To this end, a generic methodology is proposed for the automatic identification of score-update events in field-sports-video content. The scheme is based on the development of robust extractors for a set of critical features, which are shown to reliably indicate the locations of these events. The evidence gathered by the feature extractors is combined and analysed using a Support Vector Machine (SVM), which performs the event-detection process. An SVM is chosen on the basis that it represents the latest generation of machine learning algorithms, built on recent advances in statistical learning, and offers a principled way to optimise the classification performance of a decision hypothesis inferred from a given set of training data. Via a learning phase that utilises a 90-hour field-sports-video training corpus, the SVM infers a score-update event model by observing patterns in the extracted feature evidence. Using a similar but distinct 90-hour evaluation corpus, the effectiveness of this model is then tested generically across multiple genres of field-sports-video, including soccer, rugby, field hockey, hurling, and Gaelic football. The results suggest that, in terms of the summarisation task, both high event retrieval and high content rejection statistics are achievable.
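    As a rough illustration of the SVM-based event-detection step, the sketch below trains a support vector classifier on per-segment feature vectors and flags score-update segments. The feature layout, file names and preprocessing are illustrative assumptions, not the feature extractors developed in the thesis.

```python
# Minimal sketch: classify video segments as score-update events from
# combined audiovisual feature evidence.  Feature columns and files are
# hypothetical placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# each row: one segment's feature vector, e.g. [crowd_audio_level, ...]
X_train = np.load("train_features.npy")   # shape (n_segments, n_features)
y_train = np.load("train_labels.npy")     # 1 = score-update event, 0 = other

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

X_eval = np.load("eval_features.npy")
events = clf.predict(X_eval)              # per-segment event decisions
print("detected score-update segments:", np.flatnonzero(events))
```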

    Chasing a consistent picture for dark matter direct searches

    In this paper we assess the present status of dark matter direct searches by means of Bayesian statistics. We consider three particle physics models for spin-independent dark matter interaction with nuclei: elastic, inelastic and isospin-violating scattering. We briefly present the state of the art for the three models, marginalising over experimental systematics and astrophysical uncertainties. Whatever the scenario, XENON100 appears to challenge the detection regions of DAMA, CoGeNT and CRESST. The first aim of this study is to rigorously quantify the significance of the inconsistency between the XENON100 data and the combined set of detections (DAMA, CoGeNT and CRESST together), performing two statistical tests based on the Bayesian evidence. We show that XENON100 and the combined set are inconsistent at least at the 2 sigma level in all scenarios except inelastic scattering, for which the disagreement drops to the 1 sigma level. Secondly, we consider only the combined set and search for the particle physics model that best accounts for the events, using Bayesian model comparison. The outcome between elastic and isospin-violating scattering is inconclusive, with odds of 2:1, while inelastic scattering is disfavoured with odds of 1:32 because of the CoGeNT data. Our results are robust under reasonable prior assumptions. We conclude that simple elastic scattering remains the best model to explain the detection regions, since the data do not support extra free parameters. Present direct searches are therefore not able to constrain the particle physics interaction of the dark matter. The outcome of the consistency tests implies that either a better understanding of astrophysical and experimental uncertainties is needed, or the dark matter theoretical model is at odds with the data. Comment: 18 pages, 8 figures and 7 tables; minor revisions following referee report. Accepted for publication in Phys.Rev.
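    The Bayesian model comparison quoted above (odds of 2:1 and 1:32) reduces to ratios of model evidences. The toy sketch below shows that computation for two deliberately simple one-parameter Gaussian models; the likelihoods, priors and data are assumptions for illustration only and bear no relation to the paper's dark matter likelihoods.

```python
# Toy illustration of Bayesian model comparison: compute each model's
# evidence Z = integral of likelihood * prior over its parameters, then
# form the Bayes factor (posterior odds under equal model priors).
import numpy as np
from scipy import integrate, stats

data = np.array([1.1, 0.8, 1.3, 0.9, 1.2])   # synthetic "observed" values

def evidence(loglike, prior_pdf, lo, hi):
    integrand = lambda theta: np.exp(loglike(theta)) * prior_pdf(theta)
    z, _ = integrate.quad(integrand, lo, hi)
    return z

# Model A: one free location parameter mu with a flat prior on [0, 2]
loglike_A = lambda mu: stats.norm(mu, 0.2).logpdf(data).sum()
Z_A = evidence(loglike_A, lambda mu: 0.5, 0.0, 2.0)

# Model B: location fixed at 1.0 (no free parameters), so Z_B is just the likelihood
Z_B = np.exp(stats.norm(1.0, 0.2).logpdf(data).sum())

print(f"Bayes factor Z_A / Z_B = {Z_A / Z_B:.2f}  (odds for model A over model B)")
```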

    Operating System Support for Redundant Multithreading

    Failing hardware is a fact, and trends in microprocessor design indicate that the fraction of hardware suffering from permanent and transient faults will continue to increase in future chip generations. Researchers have proposed various solutions to this issue, each with different downsides: specialized hardware components make hardware more expensive to produce and consume additional energy at runtime; fault-tolerant algorithms and libraries enforce specific programming models on the developer; compiler-based fault tolerance requires the source code of all applications to be available for recompilation. In this thesis I present ASTEROID, an operating system architecture that integrates applications with different reliability needs. ASTEROID is built on top of the L4/Fiasco.OC microkernel and extends the system with Romain, an operating system service that transparently replicates user applications. Romain supports single- and multi-threaded applications without requiring access to the application's source code. Romain replicates applications and their resources completely and thereby does not rely on hardware extensions, such as ECC-protected memory. In my thesis I describe how to efficiently implement replication as a form of redundant multithreading in software. I develop mechanisms to manage replica resources and to make multi-threaded programs behave deterministically for replication. I furthermore present an approach to handle applications that use shared-memory channels with other programs. My evaluation shows that Romain provides 100% error detection and more than 99.6% error correction for single-bit flips in memory and general-purpose registers. At the same time, Romain's execution time overhead is below 14% for single-threaded applications running in triple-modular redundant mode. The last part of my thesis acknowledges that software-implemented fault tolerance methods often rely on the correct functioning of a certain set of hardware and software components, the Reliable Computing Base (RCB). I introduce the concept of the RCB and discuss what constitutes the RCB of the ASTEROID system and of other fault tolerance mechanisms. Thereafter I present three case studies that evaluate approaches to protecting RCB components, aiming to achieve a software stack that is fully protected against hardware errors.
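    The replica-comparison idea behind triple-modular redundant execution can be sketched conceptually as majority voting over redundant runs. The toy Python sketch below illustrates only that voting step under stated assumptions; Romain itself replicates whole OS processes and compares externally visible state at the system level, which this snippet does not model.

```python
# Conceptual sketch of triple-modular redundancy: run several replicas of a
# computation and vote on the result; a single diverging replica is outvoted.
from collections import Counter

def run_replicated(func, args, n_replicas=3):
    """Run `func` n_replicas times and return the majority result."""
    outputs = [func(*args) for _ in range(n_replicas)]
    (winner, votes), = Counter(outputs).most_common(1)
    if votes == n_replicas:
        return winner                                 # all replicas agree
    if votes >= (n_replicas // 2) + 1:
        # a minority replica diverged (e.g. due to a bit flip in a real system)
        print(f"corrected a faulty replica by majority vote ({votes}/{n_replicas})")
        return winner
    raise RuntimeError("no majority: uncorrectable divergence between replicas")

result = run_replicated(lambda a, b: a + b, (2, 3))
print("voted result:", result)
```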