115 research outputs found

    The human ECG - nonlinear deterministic versus stochastic aspects

    We discuss aspects of randomness and of determinism in electrocardiographic signals. In particular, we take a critical look at attempts to apply methods of nonlinear time series analysis derived from the theory of deterministic dynamical systems. We will argue that deterministic chaos is not a likely explanation for the short-time variability of the inter-beat interval times, except for certain pathologies. Conversely, densely sampled full ECG recordings possess properties typical of deterministic signals. In the latter case, methods of deterministic nonlinear time series analysis can yield new insights. Comment: 6 pages, 9 PS figures

    Extracting Fetal Electrocardiogram from Being Pregnancy Based on Nonlinear Projection

    Fetal heart rate extraction from the abdominal ECG is of great importance because of the information it carries for appropriately assessing fetal well-being during pregnancy. In this paper, we describe a method that suppresses the maternal signal and noise contamination in order to recover the fetal signal in single-lead fetal ECG recordings. We use a locally linear phase space projection technique which has previously been used for noise reduction in deterministically chaotic signals. Hence, this method is capable of extracting the fetal signal even when the noise and the fetal component are of comparable amplitude. The results are much better when the noise is much smaller (the P wave and T wave can then be recovered).
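
    A minimal sketch of the kind of locally linear phase space projection mentioned above is given below, assuming a delay embedding of the single-lead recording followed by a local PCA projection onto a low-dimensional subspace. The embedding dimension, neighbourhood size and subspace rank are illustrative placeholders, not values taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_projection_filter(x, dim=10, rank=2, k=30, passes=1):
    """Nonlinear noise reduction by locally linear phase-space projection.

    x      : 1-D signal (e.g. a single-lead abdominal ECG)
    dim    : embedding dimension of the delay reconstruction
    rank   : dimension of the local subspace kept as 'signal'
    k      : number of nearest neighbours forming each local patch
    passes : number of times the projection is repeated
    """
    x = np.asarray(x, dtype=float).copy()
    for _ in range(passes):
        # Delay embedding: each row is a window of `dim` consecutive samples.
        emb = np.lib.stride_tricks.sliding_window_view(x, dim).copy()
        n = emb.shape[0]

        tree = cKDTree(emb)
        corrected = np.zeros_like(x)
        counts = np.zeros_like(x)

        for i in range(n):
            _, idx = tree.query(emb[i], k=k)
            patch = emb[idx]
            centre = patch.mean(axis=0)
            # Local PCA: keep only the leading `rank` directions as signal.
            _, _, vt = np.linalg.svd(patch - centre, full_matrices=False)
            basis = vt[:rank]
            cleaned = centre + (emb[i] - centre) @ basis.T @ basis
            # Scatter the corrected window back onto the time series.
            corrected[i:i + dim] += cleaned
            counts[i:i + dim] += 1

        x = corrected / counts
    return x
```

    In the fetal setting one would typically treat the dominant maternal component as the low-dimensional 'signal' to be estimated and subtracted, then filter the residual to enhance the fetal complexes; that two-stage use is an assumption here, not a detail stated in the abstract.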

    Least Dependent Component Analysis Based on Mutual Information

    We propose to use precise estimators of mutual information (MI) to find the least dependent components in a linearly mixed signal. On the one hand, this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand, it has the advantage, compared to other implementations of 'independent' component analysis (ICA), some of which are based on crude approximations of MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based Least dependent Component Analysis) algorithm to a real-world dataset, the ECG of a pregnant woman. The software implementation of the MILCA algorithm is freely available at http://www.fz-juelich.de/nic/cs/software Comment: 18 pages, 20 figures, Phys. Rev. E (in press)
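
    The k-nearest neighbor MI estimator referred to above is, in spirit, the Kraskov-Stögbauer-Grassberger (KSG) estimator. The following is a minimal sketch of that estimator for two scalar signals using SciPy; it illustrates the quantity MILCA minimises, not the MILCA implementation linked in the abstract.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def knn_mutual_information(x, y, k=4):
    """KSG estimate (first variant) of the mutual information I(X;Y) in nats.

    x, y : 1-D samples of equal length (assumed free of exact duplicates)
    k    : number of nearest neighbours (small k -> lower bias, higher variance)
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    n = len(x)
    xy = np.hstack([x, y])

    # Distance to the k-th neighbour of each point in the joint space (max-norm).
    dists, _ = cKDTree(xy).query(xy, k=k + 1, p=np.inf)
    eps = dists[:, -1]

    # Count, for each point, the marginal neighbours strictly closer than eps
    # (the point itself is always returned, hence the "- 1").
    nx = [len(p) - 1 for p in cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf)]
    ny = [len(p) - 1 for p in cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf)]

    return digamma(k) + digamma(n) - np.mean(
        digamma(np.array(nx) + 1) + digamma(np.array(ny) + 1)
    )
```

    MILCA itself then searches over rotations of pairs of prewhitened (and, for time series, delay-embedded) components so as to minimise the sum of such pairwise MIs; that optimisation loop is omitted here.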

    Novel neural approaches to data topology analysis and telemedicine

    The abstract is provided in the attached document. (Subject area: Ingegneria Elettrica / Electrical Engineering; author: Randazzo, Vincenzo.)

    Separability between signal and noise components using the distribution of scaled Hankel matrix eigenvalues with application in biomedical signals.

    Biomedical signals are records from human and animal bodies. These records are treated as nonlinear time series, which hold important information about the physiological activities of organisms and are of interest in many fields. However, biomedical signals are often corrupted by artifacts and noise, which require separation or signal extraction before any statistical evaluation. Another challenge in analysing biomedical signals is that the data are often non-stationary, particularly when an abnormal event, such as an epileptic seizure, is observed within the signal, and may also exhibit chaotic behaviour. The literature suggests that distinguishing chaos from noise remains a highly contentious issue, as it has been historically. This is because chaos and noise share common properties, which in turn make them difficult to distinguish. We seek to provide a viable solution to this problem by presenting a novel approach for the separability between signal and noise components and the differentiation of noise from chaos. Several methods have been used for the analysis of and discrimination between different categories of biomedical signals, but many of these are based on restrictive assumptions of normality, stationarity and linearity of the observed data. Therefore, an improved technique which is robust in its analysis of non-stationary time series is of paramount importance for the accurate diagnosis of human diseases. The SSA (Singular Spectrum Analysis) technique does not depend on these assumptions, which could be very helpful for analysing and modelling biomedical data. Therefore, the main aim of the thesis is to provide a novel approach for developing the SSA technique, and then apply it to the analysis of biomedical signals.
SSA is a reliable technique for separating an arbitrary signal from a noisy time series (signal+noise). It is based upon two main selections: the window length, L, and the number of eigenvalues, r. These values play an important role in the reconstruction and forecasting stages. However, the main issue in extracting signals using the SSA procedure lies in identifying the optimal values of L and r required for signal reconstruction. The aim of this thesis is to develop theoretical and methodological aspects of the SSA technique, to present a novel approach to distinguishing between deterministic and stochastic processes, and to present an algorithm for identifying the eigenvalues corresponding to the noise component, thereby choosing the optimal value of r relating to the desired signal for separability between signal and noise. The algorithm can be considered an enhanced version of the SSA method, which decomposes a noisy signal into the sum of a signal and noise. Although the main focus of this thesis is on the selection of the optimal value of r, we also provide some results and recommendations on the choice of L for separability. Several criteria are introduced which characterise this separability. The proposed approach is based on the distribution of the eigenvalues of a scaled Hankel matrix, and on dynamical systems, the embedding theorem, matrix algebra and statistical theory. The research demonstrates that the proposed approach can be considered an alternative and promising technique for choosing the optimal values of r and L in SSA, especially for biomedical signals and genetic time series.
For the theoretical development of the approach, we present new theoretical results on the eigenvalues of a scaled Hankel matrix, provide some properties of these eigenvalues, and show the effect of the window length and the rank of the Hankel matrix on the eigenvalues. The new theoretical results are examined using simulated and real time series. Furthermore, the effect of the window length on the distribution of the largest and smallest eigenvalues of the scaled Hankel matrix is also considered for the white noise process. The results indicate that the distribution of the largest eigenvalue for the white noise process is positively skewed for different series lengths and different values of the window length, whereas the distribution of the smallest eigenvalue shows a different pattern with L: the distribution shifts from left to right as L increases. These results, together with other results obtained by the different criteria introduced and used in this research, are very promising for the identification of the signal subspace.
For the practical aspect and empirical results, various biomedical signals and genetic time series are used. First, to achieve the objectives of the thesis, a comprehensive study has been made of the distribution, pattern and behaviour of the scaled Hankel matrix eigenvalues. Furthermore, the normal distribution with different parameters is considered and the effects of the scale and shape parameters are evaluated. The correlation between eigenvalues is also assessed, using parametric and non-parametric association criteria. In addition, the distribution of eigenvalues for synthetic time series generated from some well-known low-dimensional chaotic systems is analysed in depth. The results yield several important properties with broad application, enabling the distinction between chaos and noise in time series analysis. At this stage, the main result of the simulation study is that the findings related to the series generated from a normal distribution with mean zero (the white noise process) are totally different from those obtained for the other series considered in this research, which makes a novel contribution to the area of signal processing and noise reduction. Second, the proposed approach and its criteria are applied to a number of simulated and real datasets with different levels of noise and different structures. Our results are compared with those obtained by common and well-known criteria in order to evaluate, enhance and confirm the accuracy of the approach and its criteria. The results indicate that the proposed approach has the potential to split the eigenvalues into two groups: the first corresponding to the signal and the second to the noise component. In addition, based on the results, the optimal value of L needed for the reconstruction of a noise-free signal from a noisy series should be the median of the series length. The results confirm that the proposed approach can improve the quality of the reconstruction step for signal extraction. Finally, the thesis explores the applicability of the proposed approach for discriminating between normal and epileptic seizure electroencephalography (EEG) signals, and for filtering the signal segments to make them free from noise. Various criteria based on the largest eigenvalue are also presented and used as features to distinguish between normal and epileptic EEG segments. These features can be considered useful information for classifying brain signals.
In addition, the approach is applied to the removal of nonspecific noise from Drosophila segmentation genes. Our findings indicate that, when extracting the signal from different genes, a different number of eigenvalues needs to be chosen for each gene to achieve optimal separation of signal and noise.
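
As a rough illustration of the basic SSA decomposition underlying the approach described above (embedding into a Hankel trajectory matrix, SVD, grouping the leading r eigentriples, and diagonal averaging), a minimal NumPy sketch follows. The automatic choice of r from the distribution of the scaled Hankel matrix eigenvalues, which is the thesis's contribution, is not reproduced; L and r are placeholders, and the trace normalisation used for the scaled eigenvalues is an assumption.

```python
import numpy as np

def ssa_reconstruct(x, L, r):
    """Basic SSA: keep the leading r eigentriples of the L-lagged trajectory
    (Hankel) matrix and return the reconstructed series plus the scaled
    eigenvalue spectrum used for diagnostics."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    K = N - L + 1

    # Step 1: embedding -- L x K trajectory matrix with Hankel structure.
    X = np.column_stack([x[i:i + L] for i in range(K)])

    # Step 2: SVD of the trajectory matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    # Eigenvalues of X X^T, normalised by their sum so they add up to one
    # (one common way to define "scaled" Hankel matrix eigenvalues).
    scaled_eigs = s**2 / np.sum(s**2)

    # Step 3: grouping -- rank-r approximation from the leading eigentriples.
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]

    # Step 4: diagonal averaging (Hankelisation) back to a series of length N.
    recon = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        recon[j:j + L] += Xr[:, j]
        counts[j:j + L] += 1
    return recon / counts, scaled_eigs

# Example with the recommendation L ~ half (the median of) the series length:
# t = np.linspace(0, 10, 500)
# noisy = np.sin(2 * np.pi * t) + 0.3 * np.random.randn(t.size)
# clean, eigs = ssa_reconstruct(noisy, L=250, r=2)
```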

    Intelligent Biosignal Processing in Wearable and Implantable Sensors

    This reprint provides a collection of papers illustrating the state of the art in smart processing of data coming from wearable, implantable or portable sensors. Each paper presents the design, the databases used, the methodological background, the obtained results, and their interpretation for biomedical applications. Representative examples include brain–machine interfaces for medical rehabilitation, the evaluation of sympathetic nerve activity, a novel automated diagnostic tool based on ECG data to diagnose COVID-19, machine learning-based hypertension risk assessment by means of photoplethysmography and electrocardiography signals, Parkinsonian gait assessment using machine learning tools, a thorough analysis of compressive sensing of ECG signals, development of a nanotechnology application for decoding vagus-nerve activity, detection of liver dysfunction using a wearable electronic nose system, prosthetic hand control using surface electromyography, epileptic seizure detection using a CNN, and premature ventricular contraction detection using deep metric learning. Thus, this reprint presents significant clinical applications as well as valuable new research issues, providing current illustrations of this new field of research by addressing the promises, challenges, and hurdles associated with the synergy of biosignal processing and AI through 16 different pertinent studies. Covering a wide range of research and application areas, this book is an excellent resource for researchers, physicians, academics, and PhD or master's students working on (bio)signal and image processing, AI, biomaterials, biomechanics, and biotechnology with applications in medicine.

    An improved classification approach for echocardiograms embedding temporal information

    Cardiovascular disease is an umbrella term for all diseases of the heart. At present, computer-aided echocardiogram diagnosis is becoming increasingly beneficial. For echocardiography, different cardiac views can be acquired depending on the location and angulation of the ultrasound transducer. Hence, automatic echocardiogram view classification is the first step for echocardiogram diagnosis, especially for computer-aided systems and, in the future, fully automatic diagnosis. In addition, heart view classification makes it possible to label images, especially in large-scale echo video collections, and facilitates database management and curation. This thesis presents a framework for automatic cardiac viewpoint classification of echocardiogram video data. In this research, we aim to overcome the challenges of analyzing, recognizing and classifying echocardiogram videos in 3D (2D spatial plus 1D temporal) space. Specifically, we extend the 2D KAZE approach into 3D space for feature detection and propose a histogram of acceleration as the feature descriptor. Feature encoding then follows, before an SVM is applied to classify the echo videos. In addition, we compare against state-of-the-art methodologies, including 2D SIFT, 3D SIFT, and the optical flow technique, to extract the temporal information contained in the video images. As a result, 2D KAZE, 2D KAZE with optical flow, 3D KAZE, optical flow, 2D SIFT and 3D SIFT deliver accuracy rates of 89.4%, 84.3%, 87.9%, 79.4%, 83.8% and 73.8%, respectively, for the eight view classes of echo videos.
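
    The classification pipeline described above (spatio-temporal feature detection, feature encoding, then an SVM over view classes) can be outlined with off-the-shelf tools. The sketch below substitutes per-frame 2D KAZE features from OpenCV and a simple bag-of-visual-words encoding for the thesis's 3D KAZE detector and histogram-of-acceleration descriptor, which are not available in standard libraries; it is a structural sketch under those substitutions, not a reimplementation of the thesis method.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def video_descriptors(path, step=5):
    """Collect 2D KAZE descriptors from every `step`-th frame of a video."""
    kaze = cv2.KAZE_create()
    cap = cv2.VideoCapture(path)
    descs, i = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % step == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            _, d = kaze.detectAndCompute(gray, None)
            if d is not None:
                descs.append(d)
        i += 1
    cap.release()
    # KAZE descriptors are 64-dimensional floats.
    return np.vstack(descs) if descs else np.empty((0, 64))

def bovw_histogram(descs, codebook):
    """Encode a set of descriptors as a normalised bag-of-visual-words histogram."""
    words = codebook.predict(descs)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / (hist.sum() + 1e-12)

# Hypothetical usage: `train_videos`, `train_labels`, `test_videos` are assumed.
# all_train_descs = [video_descriptors(v) for v in train_videos]
# codebook = KMeans(n_clusters=200, n_init=10).fit(np.vstack(all_train_descs))
# X_train = np.array([bovw_histogram(d, codebook) for d in all_train_descs])
# clf = SVC(kernel="rbf").fit(X_train, train_labels)
# predictions = clf.predict(
#     np.array([bovw_histogram(video_descriptors(v), codebook) for v in test_videos]))
```

    A histogram-intersection or chi-squared kernel is often preferred over RBF for bag-of-visual-words histograms; the kernel choice here is illustrative only.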