68 research outputs found

    Analysis of the high-frequency content in human QRS complexes by the continuous wavelet transform: An automatized analysis for the prediction of sudden cardiac death

    Background: Fragmentation and delayed potentials in the QRS signal of patients have been postulated as risk markers for Sudden Cardiac Death (SCD). The analysis of the high-frequency spectral content may be useful for quantification. Methods: Forty-two consecutive patients with a prior history of SCD or malignant arrhythmias (patients) were compared with 120 healthy individuals (controls). The QRS complexes were extracted with a modified Pan-Tompkins algorithm and processed with the Continuous Wavelet Transform to analyze the high-frequency content (85–130 Hz). Results: Overall, the power of the high-frequency content was higher in patients than in controls (170.9 vs. 47.3 ×10³ nV²·Hz⁻¹; p = 0.007), with a prolonged time to reach the maximal power (68.9 vs. 64.8 ms; p = 0.002). An analysis of the signal intensity (the instantaneous average of cumulative power) revealed a distinct function between patients and controls. The total intensity was higher in patients than in controls (137.1 vs. 39 ×10³ nV²·Hz⁻¹·s⁻¹; p = 0.001) and the time to reach the maximal intensity was also prolonged (88.7 vs. 82.1 ms; p < 0.001). Discussion: The high-frequency content of the QRS complexes was distinct between patients at risk of SCD and healthy controls. The wavelet transform is an efficient tool for spectral analysis of the QRS complexes that may contribute to risk stratification.
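    A minimal sketch of the band-limited CWT computation described above, in Python with PyWavelets. The 1 kHz sampling rate, the Morlet mother wavelet, and the reading of "intensity" as a running mean of cumulative power are assumptions; the abstract does not specify them.

```python
import numpy as np
import pywt

def hf_qrs_power(qrs, fs=1000.0, band=(85.0, 130.0)):
    """Band-limited CWT power of one extracted QRS complex (sketch)."""
    freqs = np.linspace(band[0], band[1], 32)    # target frequencies in Hz
    fc = pywt.central_frequency('morl')          # Morlet centre frequency
    scales = fc * fs / freqs                     # scales that map onto the band
    coefs, _ = pywt.cwt(qrs, scales, 'morl', sampling_period=1.0 / fs)
    p_t = (np.abs(coefs) ** 2).sum(axis=0)       # instantaneous band power
    # "intensity" read here as the running mean of cumulative power --
    # one possible interpretation of the abstract's definition
    intensity = p_t.cumsum() / np.arange(1, p_t.size + 1)
    return p_t.sum(), np.argmax(p_t) / fs, intensity
```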

    Multi-GPU Development of a Neural Networks Based Reconstructor for Adaptive Optics

    Aberrations introduced by atmospheric turbulence in large telescopes are compensated using adaptive optics systems, in which the use of deformable mirrors and multiple sensors relies on complex control systems. Recently, the development of larger telescopes such as the E-ELT or the TMT has created a computational challenge due to the increasing complexity of the new adaptive optics systems. The Complex Atmospheric Reconstructor based on Machine Learning (CARMEN) is an algorithm based on artificial neural networks, designed to compensate for atmospheric turbulence. In recent years, GPUs have proved to be an effective means of speeding up the training of neural networks, and different frameworks have been created to ease their development. This paper presents the implementation of CARMEN in different multi-GPU frameworks, along with its development in a language designed specifically for GPUs, namely CUDA. The CUDA implementation offers the best performance in all the cases presented, although using more than one GPU is only advantageous for large networks.
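    For illustration, a minimal NumPy sketch of the kind of feed-forward reconstructor CARMEN is described as: wavefront-sensor slopes in, reconstruction out. The layer sizes and activation are illustrative guesses, not the published architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyReconstructor:
    """Single-hidden-layer network: sensor slopes -> reconstructed commands."""

    def __init__(self, n_slopes, n_hidden, n_outputs):
        self.W1 = rng.normal(0.0, 0.1, (n_slopes, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_outputs))
        self.b2 = np.zeros(n_outputs)

    def forward(self, slopes):
        # slopes: (batch, n_slopes) wavefront-sensor measurements
        h = np.tanh(slopes @ self.W1 + self.b1)
        return h @ self.W2 + self.b2
```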

    A New Missing Data Imputation Algorithm Applied to Electrical Data Loggers

    Nowadays, data collection is a key process in the study of electrical power networks when searching for harmonics and imbalances among phases. In this context, the absence of data for any of the main electrical variables (phase-to-neutral voltage, phase-to-phase voltage, current in each phase, and power factor) adversely affects any time series study performed. When this occurs, a data imputation process must be carried out to substitute estimated values for the missing data. This paper presents a novel missing data imputation method based on multivariate adaptive regression splines (MARS) and compares it with the well-known technique called multivariate imputation by chained equations (MICE). The results obtained demonstrate how the proposed method outperforms the MICE algorithm.
    Ministerio de Economía y Competitividad; AYA2014-57648-P. Asturias (Comunidad Autónoma), Consejería de Economía y Empleo; FC-15-GRUPIN14-01.
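    A sketch of how MARS-based imputation of one electrical variable from the others might look, using the third-party py-earth package as the MARS implementation. The abstract does not name a library, and this is not the authors' algorithm.

```python
import numpy as np
from pyearth import Earth  # third-party MARS implementation (py-earth)

def mars_impute(X, target_col):
    """Fill NaNs in X[:, target_col] with MARS predictions from the other
    columns; assumes those predictor columns are complete for the rows used."""
    missing = np.isnan(X[:, target_col])
    predictors = np.delete(X, target_col, axis=1)
    model = Earth()
    model.fit(predictors[~missing], X[~missing, target_col])
    X = X.copy()
    X[missing, target_col] = model.predict(predictors[missing])
    return X
```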

    A parametric model of the LARCODEMS heavy media separator by means of multivariate adaptive regression splines

    A model of a cylindrical heavy media separator has been developed in order to predict its optimum operating parameters. To the authors' knowledge, this is the first such application in the literature. The aim of the present research is to predict the separation efficiency based on the adjustment of the device's dimensions and media flow rates. A variety of heavy media separators exist and are extensively used to separate particles by density, and their application in the recycling sector is of growing importance. The cylindrical variety is reported to be the best suited for processing a large range of particle sizes, but the optimization of its operating parameters remains to be documented. The multivariate adaptive regression splines (MARS) methodology has been applied to predict the separation efficiencies using the device dimensions and media flow rates as inputs. The results obtained show that it is possible to predict the device's separation efficiency from the laboratory experiments performed and, therefore, to forecast the results obtainable under different operating conditions.
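    A brief sketch of such a parametric MARS model, again with the third-party py-earth package; the input file and feature set are hypothetical, since the abstract does not list the exact variables.

```python
import numpy as np
from pyearth import Earth  # third-party MARS implementation (py-earth)

# Hypothetical layout: columns 0-3 = device dimensions and media flow rates,
# column 4 = separation efficiency measured in the laboratory experiments.
data = np.loadtxt('larcodems_runs.csv', delimiter=',')  # illustrative file name
X, y = data[:, :4], data[:, 4]

model = Earth(max_degree=2)   # allow pairwise interactions between inputs
model.fit(X, y)
predicted_efficiency = model.predict(X)
```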

    Radon Mitigation Approach in a Laboratory Measurement Room

    Radon gas is the second leading cause of lung cancer, causing thousands of deaths annually. It can be a problem for people or animals in houses, workplaces, schools or any building. Its mitigation has therefore become essential, both to avoid health problems and to prevent radon from interfering with radioactivity measurements. This study describes the implementation of radon mitigation systems at a radioactivity laboratory in order to reduce interference with the different works carried out there. A large set of radon concentration samples was obtained from measurements at the laboratory. Several mitigation methods were considered; the solution finally applied is explained in detail, and it achieved very good results, reducing the radon concentration by 76%.
    Ministerio de Economía y Competitividad; AYA2014-57648-P. Asturias (Comunidad Autónoma), Consejería de Economía y Empleo; FC-15-GRUPIN14-01.

    A Hybrid Algorithm for Missing Data Imputation and Its Application to Electrical Data Loggers

    The storage of data is a key process in the study of electrical power networks when searching for harmonics and imbalances among phases. The presence of missing data for any of the main electrical variables (phase-to-neutral voltage, phase-to-phase voltage, current in each phase and power factor) adversely affects any time series study and has to be addressed. When this occurs, missing data imputation algorithms, which substitute estimated values for the missing data, are required. This research presents a new missing data imputation algorithm based on Self-Organizing Map neural networks and Mahalanobis distances, and compares it not only with the well-known technique called Multivariate Imputation by Chained Equations (MICE) but also with an algorithm previously proposed by the authors called the Adaptive Assignation Algorithm (AAA). The results obtained demonstrate how the proposed method outperforms both algorithms.
    Ministerio de Economía y Competitividad; AYA2014-57648-P. Asturias (Comunidad Autónoma), Consejería de Economía y Empleo; FC-15-GRUPIN14-01.
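    One plausible reading of the SOM-plus-Mahalanobis scheme, sketched with the third-party MiniSom package: train the map on complete records, then fill a record's missing entries from the weight vector of the unit closest in Mahalanobis distance over the observed components. This is an illustration, not the authors' published algorithm.

```python
import numpy as np
from minisom import MiniSom  # third-party Self-Organizing Map package

def som_impute(X, map_size=(8, 8), n_iter=5000):
    """Fill NaNs in X using SOM unit weights chosen by Mahalanobis distance."""
    complete = X[~np.isnan(X).any(axis=1)]
    som = MiniSom(map_size[0], map_size[1], X.shape[1],
                  sigma=1.0, learning_rate=0.5)
    som.train_random(complete, n_iter)
    weights = som.get_weights().reshape(-1, X.shape[1])
    VI = np.linalg.pinv(np.cov(complete, rowvar=False))  # inverse covariance
    X = X.copy()
    for row in X:
        mask = np.isnan(row)
        if not mask.any():
            continue
        obs = ~mask
        d = weights[:, obs] - row[obs]
        # squared Mahalanobis distance restricted to the observed components
        dist = np.einsum('ij,jk,ik->i', d, VI[np.ix_(obs, obs)], d)
        row[mask] = weights[np.argmin(dist)][mask]  # copy from best unit
    return X
```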

    Comparative study of imputation algorithms applied to the prediction of student performance

    Student performance and its evaluation remain a serious challenge for education systems. Frequently, the recording and processing of students' scores in a specific curriculum have several flaws for various reasons. In this context, the absence of some of the student scores undermines the efficiency of any subsequent analysis carried out in order to reach conclusions. When this is the case, missing data imputation algorithms are needed. These algorithms are capable of substituting, with a high level of accuracy, predicted values for the missing data. This research presents the hybridization of an algorithm previously proposed by the authors, called the adaptive assignation algorithm (AAA), with a well-known technique called multivariate imputation by chained equations (MICE). The results show how the suggested methodology outperforms both algorithms.
    Ministerio de Economía y Competitividad; AYA2014-57648-P. Asturias, Consejería de Economía y Empleo; FC-15-GRUPIN14-01.
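    The chained-equations half of the hybrid can be sketched with scikit-learn's IterativeImputer, a MICE-style imputer; the AAA step is the authors' own algorithm and is not reproduced here. The score matrix is invented for illustration.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# rows = students, columns = course scores, NaN = missing grade (illustrative)
scores = np.array([[7.5, 6.0, np.nan],
                   [8.0, np.nan, 9.0],
                   [5.5, 6.5, 7.0]])

# each column with missing values is regressed on the others, iteratively
imputed = IterativeImputer(max_iter=10, random_state=0).fit_transform(scores)
```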

    Missing data imputation of solar radiation data under different atmospheric conditions

    Global solar broadband irradiance on a planar surface is measured at weather stations by pyranometers. In the present research, solar radiation values from nine meteorological stations of the MeteoGalicia real-time observational network, captured and stored every ten minutes, are considered. In this kind of record, the lack of data and/or the presence of wrong values adversely affects any time series study. Consequently, when this occurs, a data imputation process must be performed to replace the missing data with estimated values. This paper evaluates the multivariate imputation of ten-minute data by means of the chained equations method (MICE). This method allows the network itself to impute the missing or wrong data of a solar radiation sensor, using either all or just a group of the measurements from the remaining sensors. Very good results have been obtained with the MICE method in comparison with other methods employed in this field, such as Inverse Distance Weighting (IDW) and Multiple Linear Regression (MLR): the average RMSE of the predictions was 13.37% for the MICE algorithm, compared with 28.19% for MLR and 31.68% for IDW.
    Ministerio de Economía y Competitividad; AYA2010-1851
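    A sketch of the station-network imputation described above, with scikit-learn's IterativeImputer standing in for MICE; the function and its arguments are illustrative, not MeteoGalicia's actual pipeline.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def impute_and_score(radiation, truth, held_out_mask):
    """radiation: (n_times, n_stations) array with NaNs at held_out_mask;
    each station's gaps are estimated from the remaining stations, then
    scored by RMSE against the held-out true values."""
    filled = IterativeImputer(max_iter=10, random_state=0).fit_transform(radiation)
    err = filled[held_out_mask] - truth[held_out_mask]
    return filled, np.sqrt(np.mean(err ** 2))
```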