
    Solving the inverse problem of electrocardiography in a realistic environment

    Heart disease is a leading cause of death worldwide. Direct information about cardiac electrophysiology can help to improve the quality of diagnosis of heart disease. The inverse problem of electrocardiography and intracardiac catheter measurement are two ways to gain access to the electrophysiology of the heart. This thesis comprises six research topics related to these two techniques.

    A CCBM-based generalized GKB iterative regularizing algorithm for inverse Cauchy problems

    This paper examines inverse Cauchy problems governed by a class of elliptic partial differential equations. The inverse problems involve recovering missing data on an inaccessible boundary from measured data on an accessible boundary, which is severely ill-posed. Using the coupled complex boundary method (CCBM), which integrates both Dirichlet and Neumann data into a single Robin boundary condition, we reformulate the underlying problem as an operator equation. Based on this new formulation, we prove the existence of a unique solution, even in cases with noisy data. A Golub-Kahan bidiagonalization (GKB) process together with Givens rotation is employed to iteratively solve the proposed operator equation. The regularizing property of the developed method, called CCBM-GKB, and its convergence rate results are proved under an a posteriori stopping rule. Finally, a linear finite element method is used for the numerical realization of CCBM-GKB. Various numerical experiments demonstrate that CCBM-GKB is an accelerated iterative regularization method, as it is much faster than the classic Landweber method.
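As a point of reference for the Landweber baseline mentioned in the abstract, the classic iteration x_{k+1} = x_k + omega * A^T (b - A x_k) can be sketched in a few lines. This is a minimal NumPy illustration on a toy ill-posed linear system of our own choosing, not the paper's CCBM operator equation:

```python
import numpy as np

def landweber(A, b, omega, n_iter):
    """Classic Landweber iteration x_{k+1} = x_k + omega * A^T (b - A x_k)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + omega * A.T @ (b - A @ x)
    return x

# Toy ill-posed problem: a smoothing (Gaussian-kernel) operator whose
# small singular values make naive inversion unstable.
n = 50
A = np.array([[np.exp(-0.1 * (i - j) ** 2) for j in range(n)] for i in range(n)])
x_true = np.sin(np.linspace(0, np.pi, n))
b = A @ x_true

omega = 1.0 / np.linalg.norm(A, 2) ** 2  # step size ensuring convergence
x_rec = landweber(A, b, omega, 500)
```

In practice the iteration count acts as the regularization parameter (stopped early under a discrepancy rule when the data are noisy), which is exactly the slow convergence that accelerated schemes such as CCBM-GKB aim to avoid.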

    Theoretical Interpretations and Applications of Radial Basis Function Networks

    Medical applications have usually treated Radial Basis Function Networks (RBFNs) simply as Artificial Neural Networks. However, RBFNs are knowledge-based networks that can be interpreted in several ways: as Artificial Neural Networks, Regularization Networks, Support Vector Machines, Wavelet Networks, Fuzzy Controllers, Kernel Estimators, or Instance-Based Learners. A survey of these interpretations and of their corresponding learning algorithms is provided, as well as a brief survey of dynamic learning algorithms. These interpretations can suggest applications that are particularly interesting in medical domains.
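The "Regularization Network" reading of an RBFN can be made concrete with a small sketch: a Gaussian radial basis expansion fitted by ridge-regularized least squares. The data, centers and width below are illustrative assumptions, not taken from the survey:

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian radial basis design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 w^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Fit an RBFN to noisy 1-D data; the Tikhonov penalty lam is what makes
# this an instance of the "Regularization Network" interpretation.
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 40)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * rng.standard_normal(40)

centers = np.linspace(0, 1, 10)[:, None]
Phi = rbf_design(X, centers, width=0.15)
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(10), Phi.T @ y)
y_hat = Phi @ w  # smoothed reconstruction of the underlying sine
```

Swapping the quadratic penalty for a margin criterion, or the fixed centers for learned prototypes, yields the SVM and instance-based readings of the same architecture.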

    Multimodal Integration: fMRI, MRI, EEG, MEG

    This chapter provides a comprehensive survey of the motivations, assumptions and pitfalls associated with combining signals such as fMRI with EEG or MEG. Our initial focus in the chapter concerns mathematical approaches for solving the localization problem in EEG and MEG. Next we document the most recent and promising ways in which these signals can be combined with fMRI. Specifically, we look at correlative analysis, decomposition techniques, equivalent dipole fitting, distributed source modeling, beamforming, and Bayesian methods. Due to difficulties in assessing the ground truth of a combined signal in any realistic experiment, a difficulty further confounded by the lack of accurate biophysical models of the BOLD signal, we are cautious about being optimistic about multimodal integration. Nonetheless, as we highlight and explore the technical and methodological difficulties of fusing heterogeneous signals, it seems likely that correct fusion of multimodal data will allow previously inaccessible spatiotemporal structures to be visualized and formalized, and thus eventually become a useful tool in brain imaging research.
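Of the localization techniques listed, beamforming admits a particularly compact sketch. Below is a minimal LCMV (linearly constrained minimum variance) beamformer for a single source with a known lead field; the toy sensor data, dimensions and variable names are our illustrative assumptions, not the chapter's:

```python
import numpy as np

def lcmv_weights(C, L):
    """LCMV weights for one source: w = C^{-1} l / (l^T C^{-1} l).
    Passes activity from the modeled source with unit gain while
    minimizing output variance from everything else."""
    Ci_L = np.linalg.solve(C, L)
    return Ci_L / (L @ Ci_L)

# Toy M/EEG scenario: 8 sensors, one source with fixed orientation.
rng = np.random.default_rng(1)
n_sensors, n_samples = 8, 2000
L = rng.standard_normal(n_sensors)           # lead field of the source of interest
s = rng.standard_normal(n_samples)           # source time course
noise = 0.5 * rng.standard_normal((n_sensors, n_samples))
data = np.outer(L, s) + noise
C = data @ data.T / n_samples                # sensor covariance estimate

w = lcmv_weights(C, L)
s_hat = w @ data                             # reconstructed source time course
```

The unit-gain constraint (w @ L == 1) is what distinguishes the beamformer from a plain least-squares spatial filter; scanning L over a source grid yields the usual volumetric beamformer maps.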

    The Application of Computer Techniques to ECG Interpretation

    This book presents some of the latest available information on automated ECG analysis, written by many of the leading researchers in the field. It contains a historical introduction, an outline of the latest international standards for signal processing and communications, and then an exciting variety of studies on electrophysiological modelling, ECG imaging, artificial intelligence applied to resting and ambulatory ECGs, body surface mapping, big data in ECG-based prediction, enhanced reliability of patient monitoring, and atrial abnormalities on the ECG. It provides an extremely valuable contribution to the field.

    From Nano to Macro: Overview of the IEEE Bio Image and Signal Processing Technical Committee

    The Bio Image and Signal Processing (BISP) Technical Committee (TC) of the IEEE Signal Processing Society (SPS) promotes activities within the broad technical field of biomedical image and signal processing. Areas of interest include medical and biological imaging, digital pathology, molecular imaging, microscopy, and associated computational imaging, image analysis, and image-guided treatment, alongside physiological signal processing, computational biology, and bioinformatics. BISP has 40 members and covers a wide range of EDICS, including CIS-MI: Medical Imaging, BIO-MIA: Medical Image Analysis, BIO-BI: Biological Imaging, BIO: Biomedical Signal Processing, BIO-BCI: Brain/Human-Computer Interfaces, and BIO-INFR: Bioinformatics. BISP plays a central role in the organization of the IEEE International Symposium on Biomedical Imaging (ISBI) and contributes to the technical sessions at the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) and the IEEE International Conference on Image Processing (ICIP). In this paper, we provide a brief history of the TC, review the technological and methodological contributions its community has delivered, and highlight promising new directions we anticipate.

    Identifying Humans by the Shape of Their Heartbeats and Materials by Their X-Ray Scattering Profiles

    The need for security at access control points presents itself in the form of human identification and/or material identification. The field of biometrics deals with the problem of identifying individuals based on signals measured from them. One approach to material identification involves matching x-ray scattering profiles against a database of known materials. Classical biometric traits such as fingerprints, facial images, speech, iris and retinal scans are plagued by the potential for circumvention: they could be copied and later used by an impostor. To address this problem, other bodily traits such as the electrical signals acquired from the brain (electroencephalogram) or the heart (electrocardiogram) and the mechanical signals acquired from the heart (heart sounds, laser Doppler vibrometry measures of the carotid pulse) have been investigated. These signals depend on the physiology of the body and require the individual to be alive and present during acquisition, potentially overcoming circumvention. We investigate the use of the electrocardiogram (ECG) and the carotid laser Doppler vibrometry (LDV) signal, both individually and in unison, for biometric identity recognition. A parametric modeling approach to system design is employed, where the system parameters are estimated from training data; the estimated model is then validated using testing data. A typical identity recognition system can operate in either the authentication (verification) or identification mode. The performance of biometric identity recognition systems is evaluated using receiver operating characteristic (ROC) or detection error tradeoff (DET) curves in the authentication mode, and cumulative match characteristic (CMC) curves in the identification mode. The performance of the ECG- and LDV-based identity recognition systems is comparable, but is worse than that of classical biometric systems. Authentication performance below 1% equal error rate (EER) can be attained when the training and testing data are obtained from a single measurement session. When the training and testing data are obtained from different measurement sessions, allowing for a potential short-term or long-term change in the physiology, the authentication EER degrades to about 6 to 7%. Leveraging both the electrical (ECG) and mechanical (LDV) aspects of the heart, we obtain a performance gain of over 50% relative to each individual ECG-based or LDV-based identity recognition system, bringing us closer to the performance of classical biometrics, with the added advantage of anti-circumvention.

    We also consider the problem of designing combined x-ray attenuation and scatter systems, and the algorithms to reconstruct images from those systems. As is typical within a computational imaging framework, we tackle the problem by taking a joint system and algorithm design approach. Accurate modeling of the attenuation of incident and scattered photons within a scatter imaging setup ultimately leads to more accurate estimates of the scatter densities of an illuminated object; such scatter densities can then be used in material classification. In x-ray scatter imaging, tomographic measurements of the forward scatter distribution are used to infer scatter densities within a volume. A mask placed between the object and the detector array provides information about scatter angles. An efficient computational implementation of the forward and backward models facilitates iterative algorithms based upon a Poisson log-likelihood. The design of the scatter imaging system influences the algorithmic choices we make; in turn, the need for efficient algorithms guides the system design. We begin by analyzing an x-ray scatter system fitted with a fan-beam source distribution and flat-panel energy-integrating detectors. Efficient algorithms for reconstructing object scatter densities from scatter measurements made on this system are developed. Building on the fan-beam source, energy-integrating flat-panel detection model, we develop a pencil-beam model and an energy-sensitive detection model. The scatter forward models and reconstruction algorithms are validated on simulated, Monte Carlo, and real data. We describe a prototype x-ray attenuation scanner, co-registered with the scatter system, which was built to provide complementary attenuation information to the scatter reconstruction, and present results of applying alternating minimization reconstruction algorithms to measurements from the scanner.
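The authentication metrics used above (ROC/DET curves and the equal error rate) are straightforward to compute from match scores. A minimal sketch, assuming higher scores mean better matches and using synthetic score distributions of our own (the thesis's actual score data are not available here):

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Sweep a decision threshold over all observed scores and return the
    operating point where false accept rate (FAR) ~= false reject rate (FRR)."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, best_eer = np.inf, None
    for t in thresholds:
        far = np.mean(impostor >= t)   # impostors wrongly accepted
        frr = np.mean(genuine < t)     # genuine users wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, best_eer = abs(far - frr), (far + frr) / 2.0
    return best_eer

# Hypothetical score distributions for illustration only.
rng = np.random.default_rng(2)
genuine = rng.normal(2.0, 1.0, 1000)   # matches of a user against themselves
impostor = rng.normal(0.0, 1.0, 1000)  # matches against other users
eer = equal_error_rate(genuine, impostor)
```

Plotting FAR against FRR over the same threshold sweep yields the DET curve; the EER is simply the point where that curve crosses the diagonal.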

    Uncertainty Quantification in Machine Learning for Biosignal Applications -- A Review

    Uncertainty Quantification (UQ) has gained traction as an attempt to address the black-box nature of deep learning. In particular, (medical) biosignals such as electroencephalography (EEG), electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) could benefit from good UQ, since these signals suffer from a poor signal-to-noise ratio, and good human interpretability is pivotal for medical applications and brain-computer interfaces. In this paper, we review the state of the art at the intersection of uncertainty quantification and machine learning for biosignals. We present the methods, shortcomings, uncertainty measures and theoretical frameworks that currently exist in this application domain. Overall, it can be concluded that promising UQ methods are available, but that research is needed on how people and systems may interact with an uncertainty model in a (clinical) environment.
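Two uncertainty measures common in this literature, predictive entropy and its epistemic component (the mutual information between prediction and model), can be sketched for an ensemble of softmax classifiers. The toy two-class ensembles below are illustrative assumptions, not data from the review:

```python
import numpy as np

def predictive_entropy(probs):
    """Total uncertainty: entropy of the ensemble-averaged class distribution.
    probs has shape (n_members, n_classes)."""
    p_mean = probs.mean(axis=0)
    return -np.sum(p_mean * np.log(p_mean + 1e-12))

def mutual_information(probs):
    """Epistemic uncertainty: predictive entropy minus the mean
    per-member entropy (the aleatoric part)."""
    member_entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1).mean()
    return predictive_entropy(probs) - member_entropy

# Members agree -> uncertainty is mostly aleatoric, mutual information is low.
agree = np.array([[0.9, 0.1], [0.88, 0.12], [0.92, 0.08]])
# Members disagree -> the model itself is uncertain, mutual information is high.
disagree = np.array([[0.95, 0.05], [0.5, 0.5], [0.05, 0.95]])
```

The same decomposition applies whether the "members" come from a deep ensemble, Monte Carlo dropout passes, or posterior samples of a Bayesian network, which is why it recurs across the methods surveyed.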