
    Links between N-modular redundancy and the theory of error-correcting codes

    N-Modular Redundancy (NMR) is one of the best-known fault-tolerance techniques. Replicating a module to achieve fault tolerance is in some ways analogous to using a repetition code, in which an information symbol is replicated as parity symbols in a codeword. Linear Error-Correcting Codes (ECCs) use linear combinations of information symbols as parity symbols, which in turn generate syndromes for error patterns. These observations indicate links between the theory of ECCs and the use of hardware redundancy for fault tolerance. In this paper, we explore some of these links and show examples of NMR systems in which good and failed elements are identified in a manner similar to error correction with linear ECCs.
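As a minimal sketch of the analogy described above, a triple-modular-redundancy (TMR) voter can be read as a decoder for a length-3 repetition code: pairwise disagreements between modules play the role of a syndrome that locates the failed element. The module outputs and the syndrome table below are illustrative, not taken from the paper.

```python
import numpy as np

# TMR as repetition-code decoding: three copies of one information bit.
outputs = np.array([1, 1, 0])        # module 2 has failed (illustrative)

# Pairwise comparisons act like parity checks:
#   s01 = o0 XOR o1,  s12 = o1 XOR o2
syndrome = (outputs[0] ^ outputs[1], outputs[1] ^ outputs[2])

# Syndrome table for at most one faulty module, as in single-error
# correction with a linear ECC: the syndrome points to the error position.
faulty = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome]

# Majority vote is exactly maximum-likelihood decoding of the repetition code.
voted = int(np.sum(outputs) >= 2)
```

The same idea extends to NMR with N modules, where more elaborate parity combinations identify larger failure patterns.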

    Human Cognition and Emotion Using a Physio-Psychological Approach: A Survey

    A soldier’s military duties demand physical and mental readiness that allow him to support the army fully. This human dimension encompasses Soldier readiness, from training proficiency to motivation, for the Army’s future success. It introduces the concept of holistic fitness: a comprehensive view of the whole person that combines the moral, cognitive, and physical components of the human dimension. The human dimension concept is directly related to the human mind and memory system. In this survey, a system capable of recognizing human emotions from physiological parameters of the human body is discussed. Data from the system are fed to a computer, where they are stored. The stored physiological parameters are retrieved and classified using a support vector machine to generate a data set describing the emotions a person exhibits in a specific situation. The recognized emotions are then aggregated into a grade for the person’s present status, which is used to recommend a suitable working environment.
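The classification step above can be sketched with a minimal linear SVM trained by sub-gradient descent on the hinge loss. The two physiological features (heart rate, skin conductance), the two emotion labels, and all numbers below are illustrative assumptions, not data from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic physiological samples: columns = [heart_rate, skin_conductance]
calm   = rng.normal([70.0, 2.0], [5.0, 0.5], size=(50, 2))
stress = rng.normal([95.0, 6.0], [5.0, 0.5], size=(50, 2))
X = np.vstack([calm, stress])
y = np.hstack([-np.ones(50), np.ones(50)])   # -1 = calm, +1 = stressed

# Standardise features, then run hinge-loss stochastic sub-gradient descent
X = (X - X.mean(axis=0)) / X.std(axis=0)
w, b, lam, lr = np.zeros(2), 0.0, 1e-3, 0.1
for _ in range(200):
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) < 1.0:          # margin violated: full step
            w += lr * (yi * xi - lam * w)
            b += lr * yi
        else:                                # only regularisation shrinkage
            w -= lr * lam * w

accuracy = np.mean(np.sign(X @ w + b) == y)
```

In practice a library SVM (and a multi-class scheme over several emotions) would replace this hand-rolled two-class trainer; the sketch only shows the decision-boundary idea.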

    Scintillating double beta decay bolometers

    We present the results obtained in the development of scintillating Double Beta Decay bolometers. Several Mo- and Cd-based crystals were tested with the bolometric technique. The scintillation light was measured with a second, independent bolometer. A 140 g CdWO_4 crystal was run in a 417 h live-time measurement. Thanks to the scintillation light, the alpha background is easily discriminated, resulting in zero counts above the 2615 keV gamma line of Thallium-208. These results, combined with extremely easy light-detector operation, represent the first tangible proof of the feasibility of this kind of technique. Comment: 15 pages, 8 figures.

    A Comparison of Algorithms for the Construction of SZ Cluster Catalogues

    We evaluate methodologies for constructing an all-sky catalogue of galaxy clusters detected through the Sunyaev-Zel'dovich (SZ) effect. We perform an extensive comparison of twelve algorithms applied to the same detailed simulations of the millimetre and submillimetre sky, based on a Planck-like case. We present the results of this "SZ Challenge" in terms of catalogue completeness, purity, and astrometric and photometric reconstruction. Our results provide a comparison of a representative sample of SZ detection algorithms and highlight important issues in their application. For our study case, we show that the exact expected number of clusters remains uncertain (about a thousand cluster candidates at |b| > 20 deg with 90% purity) and depends on the SZ model, on the detailed sky simulations, and on the algorithmic implementation of the detection methods. We also estimate the astrometric precision of the cluster candidates, found to be of the order of ~2 arcmin on average, and the photometric uncertainty, of order ~30%, depending on flux. Comment: Accepted for publication in A&A: 14 pages, 7 figures. Detailed figures added in Appendix.
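The two headline catalogue metrics used in such a comparison can be sketched directly: completeness is the fraction of true clusters recovered, purity the fraction of detections that are real, both defined through positional matching. The positions and the 2-arcmin match radius below are illustrative, not from the challenge.

```python
import numpy as np

# Illustrative true-cluster and detected-candidate positions (degrees)
true_pos = np.array([[10.0, 5.0], [40.0, -20.0], [100.0, 60.0]])
detected = np.array([[10.01, 5.01], [40.02, -20.0], [200.0, 0.0]])

match_radius = 2.0 / 60.0   # 2 arcmin in degrees

# Flat-sky pairwise separations (adequate for this sketch)
d = np.linalg.norm(detected[:, None, :] - true_pos[None, :, :], axis=-1)

# A detection is matched if any true cluster lies within the radius;
# a true cluster is recovered if any detection lies within the radius.
purity = (d.min(axis=1) < match_radius).mean()
completeness = (d.min(axis=0) < match_radius).mean()
```

Real pipelines use great-circle separations and often flux-dependent match criteria; the sketch only fixes the definitions.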

    Compressed sensing imaging techniques for radio interferometry

    Radio interferometry probes astrophysical signals through incomplete and noisy Fourier measurements. The theory of compressed sensing demonstrates that such measurements may actually suffice for accurate reconstruction of sparse or compressible signals. We propose new generic imaging techniques based on convex optimization for global minimization problems defined in this context. The versatility of the framework notably allows the introduction of specific prior information on the signals, which offers the possibility of significant improvements in reconstruction relative to CLEAN, the standard local matching-pursuit algorithm used in radio astronomy. We illustrate the potential of the approach by studying reconstruction performance on simulations of two different kinds of signals observed with very generic interferometric configurations. The first is an intensity field of compact astrophysical objects; the second is the imprint of cosmic strings in the temperature field of the cosmic microwave background radiation, of particular interest for cosmology. Comment: 10 pages, 1 figure. Version 2 matches the version accepted for publication in MNRAS. Changes include writing corrections, clarifications of arguments, a figure update, and a new subsection 4.1 commenting on the exact compliance of radio interferometric measurements with compressed sensing.
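A minimal instance of the convex-optimization recovery the abstract refers to is ISTA (iterative soft-thresholding) applied to the l1-regularised least-squares problem min_x ||y - Ax||^2 + lam ||x||_1. A random Gaussian matrix stands in for incomplete Fourier sampling; sizes and lam are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 100, 40, 3                 # signal length, measurements, sparsity

# Sparse ground truth and undersampled linear measurements
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = [3.0, -2.0, 1.5]
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true                       # noiseless for simplicity

# ISTA: gradient step on the data term, then soft-threshold (the prox
# operator of the l1 norm), with step 1/L for L the gradient's Lipschitz
# constant, i.e. the squared spectral norm of A.
L = np.linalg.norm(A, 2) ** 2
lam, x = 0.01, np.zeros(n)
for _ in range(500):
    g = x + (A.T @ (y - A @ x)) / L
    x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

Despite measuring only 40 of 100 coefficients, the 3-sparse signal is recovered to small relative error, which is the core compressed-sensing claim.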

    Component separation methods for the Planck mission

    The Planck satellite will map the full sky at nine frequencies from 30 to 857 GHz. The CMB intensity and polarization that are its prime targets are contaminated by foreground emission. The goal of this paper is to compare proposed methods for separating the CMB from foregrounds based on their different spectral and spatial characteristics, and for separating the foregrounds into components of different physical origin. A component separation challenge has been organized, based on a set of realistically complex simulations of sky emission. Several methods have been tested, including those based on internal template subtraction, the maximum entropy method, parametric methods, spatial and harmonic cross-correlation methods, and independent component analysis. Different methods proved effective in cleaning the CMB maps of foreground contamination, in reconstructing maps of diffuse Galactic emission, and in detecting point sources and thermal Sunyaev-Zeldovich signals. The power spectrum of the residuals is, on the largest scales, four orders of magnitude lower than that of the input Galaxy power spectrum at the foreground minimum. The CMB power spectrum was accurately recovered up to the sixth acoustic peak. The point-source detection limit reaches 100 mJy, and about 2300 clusters are detected via the thermal SZ effect over two thirds of the sky. We have found that no single method performs best for all scientific objectives. We foresee that the final component separation pipeline for Planck will involve a combination of methods and iterations between processing steps targeted at different objectives such as diffuse component separation, spectral estimation, and compact source extraction. Comment: Matches the version accepted by A&A. A version with high-resolution figures is available at http://people.sissa.it/~leach/compsepcomp.pd

    Relationships between consecutive long-term and mid-term mobility decisions over the life course: a Bayesian network approach

    Long-term and mid-term mobility decision processes in different life trajectories generate complex dynamics, in which consecutive life events are interrelated and time dependent. This study uses a Bayesian network approach to study the dynamic relationships among residential events, household-structure events, employment/education events, and car-ownership events. Using retrospective data obtained from a web-based survey in Beijing, China, structure learning is first used to discover the direct and indirect relationships between these mobility decisions. Parameter learning is then applied to describe the conditional probabilities and predict the direct and indirect effects of actions and policies in the resulting network. The results confirm the interdependencies between these long-term and mid-term mobility decisions, and provide evidence of the reactive and proactive behavior of individuals and households in the context of various life events over the course of their lives. In this regard, it is important to note that an increase in household size affects car acquisition in the future, while residential events have a synergistic relationship with employment/education events. Moreover, when people’s residential location or workplace/study location moves from an urban district to a suburban or outer-suburban district, this has both lagged and concurrent effects on car acquisition.
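The parameter-learning step mentioned above reduces, for discrete variables, to estimating conditional probability tables as relative frequencies of a child node given its parents. The sketch below assumes structure learning has already produced the edge "household-size increase → car acquisition"; the six retrospective records and both variable names are illustrative.

```python
from collections import Counter

# Illustrative retrospective life-course records (one per respondent-period)
records = [
    {"hh_grew": True,  "bought_car": True},
    {"hh_grew": True,  "bought_car": True},
    {"hh_grew": True,  "bought_car": False},
    {"hh_grew": False, "bought_car": False},
    {"hh_grew": False, "bought_car": False},
    {"hh_grew": False, "bought_car": True},
]

# Count joint occurrences of (parent state, child state)
counts = Counter((r["hh_grew"], r["bought_car"]) for r in records)

def p_buy_given(grew):
    """Maximum-likelihood CPT entry P(bought_car=True | hh_grew=grew)."""
    yes, no = counts[(grew, True)], counts[(grew, False)]
    return yes / (yes + no)

p_grew = p_buy_given(True)     # car acquisition after household growth
p_static = p_buy_given(False)  # car acquisition without household growth
```

A full Bayesian network repeats this for every node given all its parents (often with Dirichlet smoothing), and inference then chains the tables together.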

    MILCA, a Modified Internal Linear Combination Algorithm to extract astrophysical emissions from multi-frequency sky maps

    The analysis of current Cosmic Microwave Background (CMB) experiments is based on the interpretation of multi-frequency sky maps in terms of different astrophysical components, and it requires specifically tailored component separation algorithms. In this context, Internal Linear Combination (ILC) methods have been extensively used to extract the CMB emission from the WMAP multi-frequency data. We present here a Modified Internal Linear Combination Algorithm (MILCA) that generalizes the ILC approach to the case of multiple astrophysical components for which the electromagnetic spectrum is known. In addition, MILCA corrects for the intrinsic noise bias of the standard ILC approach and extends it to a hybrid space-frequency representation of the data. It also allows us to use external templates to minimize the contribution of extra components while still using only a linear combination of the input data. We apply MILCA to simulations of the Planck satellite data in the HFI frequency bands. We explore the possibility of reconstructing the Galactic molecular CO emission in the Planck maps as well as the thermal Sunyaev-Zeldovich effect. We conclude that MILCA is able to accurately estimate those emissions, and it has been successfully used for this purpose within the Planck collaboration. Comment: 13 pages.
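The standard ILC solution that MILCA generalises can be written in a few lines: for a component with known spectrum a across the channels, the minimum-variance linear combination with unit response to that component has weights w = C⁻¹a / (aᵀC⁻¹a), where C is the empirical channel-channel covariance. The 3-channel covariance below is an illustrative stand-in, not Planck data.

```python
import numpy as np

# Known electromagnetic spectrum of the wanted component across channels
# (the CMB is flat in thermodynamic temperature units).
a = np.array([1.0, 1.0, 1.0])

# Illustrative empirical covariance of the three frequency maps
C = np.array([[2.0, 0.5, 0.2],
              [0.5, 1.5, 0.3],
              [0.2, 0.3, 1.0]])

# Minimum-variance weights subject to the unit-response constraint w @ a = 1
Cinv_a = np.linalg.solve(C, a)
w = Cinv_a / (a @ Cinv_a)

unit_response = w @ a        # exactly 1 by construction
variance = w @ C @ w         # minimised output variance, = 1/(a @ Cinv_a)
```

MILCA's extensions add further spectral constraints (one per known component), a noise-bias correction to C, and a space-frequency localisation of the weights, but the constrained-minimum-variance core is the same.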

    CHEX-MATE: A non-parametric deep learning technique to deproject and deconvolve galaxy cluster X-ray temperature profiles

    Temperature profiles of the hot intracluster medium (ICM) of galaxy clusters have a complex non-linear structure that traditional parametric modelling may fail to fully approximate. For this study, we made use of neural networks, for the first time, to construct a data-driven non-parametric model of ICM temperature profiles. A new deconvolution algorithm was then introduced to uncover the true (3D) temperature profiles from the observed projected (2D) temperature profiles. An auto-encoder-inspired neural network was first trained, by learning a non-linear interpolatory scheme, to build the underlying model of 3D temperature profiles in the radial range [0.02-2] R500, using a sparse set of hydrodynamical simulations from the THREE HUNDRED PROJECT. A deconvolution algorithm using a learning-based regularisation scheme was then developed. The model was tested using high- and low-resolution input temperature profiles, such as those expected from simulations and observations, respectively. We find that the proposed deconvolution and deprojection algorithm is robust with respect to the quality of the data, the morphology of the cluster, and the deprojection scheme used. The algorithm can recover unbiased 3D radial temperature profiles with a precision of around 5% over most of the fitting range. We apply the method to the first sample of temperature profiles obtained with XMM-Newton for the CHEX-MATE project and compare it to parametric deprojection and deconvolution techniques. Our work sets the stage for future studies that focus on the deconvolution of the thermal profiles (temperature, density, pressure) of the ICM and of dark matter profiles in galaxy clusters, using deep-learning techniques in conjunction with X-ray, Sunyaev-Zel'dovich (SZ), and optical datasets. Comment: 32 pages, 30 figures, 6 tables. Accepted in A&A.
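The forward operator that such a deprojection must invert can be sketched simply: the observed 2D temperature at projected radius R is an emission-weighted average of the 3D profile along the line of sight. Both profile shapes and the squared-density weight below are illustrative assumptions, not the paper's model or the learned regularisation.

```python
import numpy as np

def t3d(r):
    """Assumed 3D temperature profile (illustrative declining shape)."""
    return 8.0 / (1.0 + (r / 0.5) ** 2) ** 0.3

def n2(r):
    """Assumed squared-density emission weight (illustrative)."""
    return 1.0 / (1.0 + (r / 0.2) ** 2) ** 1.5

def t2d(R, zmax=5.0, nz=2000):
    """Project: emission-weighted line-of-sight average of t3d at radius R."""
    z = np.linspace(0.0, zmax, nz)        # line-of-sight coordinate
    r = np.sqrt(R * R + z * z)            # 3D radius along the sightline
    w = n2(r)
    return np.sum(w * t3d(r)) / np.sum(w)  # weighted mean (Riemann sum)

projected = t2d(0.3)
```

Because the average mixes in gas at larger radii where the profile has declined, the projected value sits below the true 3D value at the same radius; recovering t3d from many such averages is the ill-posed inversion the neural-network regularisation addresses.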

    Polarization leakage in epoch of reionization windows – I. Low Frequency Array observations of the 3C196 field

    Detection of the 21-cm signal coming from the epoch of reionization (EoR) is challenging, especially because, even after removing the foregrounds, the residual Stokes I maps contain leakage from polarized emission that can mimic the signal. Here, we discuss the instrumental polarization of LOFAR and present realistic simulations of the leakages between Stokes parameters. From the LOFAR observations of polarized emission in the 3C196 field, we have quantified the level of polarization leakage caused by the nominal model beam of LOFAR, and compared it with the EoR signal using power-spectrum analysis. We found that at 134-166 MHz, within the central 4° of the field, the (Q,U)→I leakage power is lower than the EoR signal at k < 0.3 Mpc⁻¹. The leakage was found to be localized around a Faraday depth of 0, and the rms of the leakage as a fraction of the rms of the polarized emission was shown to vary between 0.2-0.3%, both of which could be utilized in the removal of leakage. Moreover, we could define an 'EoR window' in terms of the polarization leakage in the cylindrical power spectrum above the PSF-induced wedge and below k∥ ∼ 0.5 Mpc⁻¹, and the window extended up to k∥ ∼ 1 Mpc⁻¹ at all k⊥ when 70% of the leakage had been removed. These LOFAR results show that even a modest polarimetric calibration over a field of view of ≲4° in future arrays like the SKA will ensure that the polarization leakage remains well below the expected EoR signal at scales of 0.02-1 Mpc⁻¹.