88 research outputs found

    Applications

    Model Order Reduction

    The increasing complexity of models used to predict real-world systems creates a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This three-volume handbook covers methods as well as applications. The third volume focuses on applications in engineering, biomedical engineering, computational physics and computer science.
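
    As a concrete illustration of the kind of algorithm such handbooks cover, the following is a minimal sketch of proper orthogonal decomposition (POD), one classic reduction technique; the toy snapshot matrix and the choice of rank are illustrative assumptions, not examples taken from the book.

```python
# A minimal sketch of proper orthogonal decomposition (POD).
# The snapshot data and rank below are illustrative assumptions.
import numpy as np

def pod_basis(snapshots: np.ndarray, rank: int) -> np.ndarray:
    """Return the leading `rank` POD modes of a snapshot matrix
    (rows = degrees of freedom, columns = snapshots)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :rank]

# Toy full-order model: states sampled from a smooth low-rank process.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
X = np.outer(np.sin(2 * np.pi * t), rng.normal(size=50)).T  # 50 dof x 200 snapshots
X += 0.01 * rng.normal(size=X.shape)                        # small noise

V = pod_basis(X, rank=2)          # reduced basis
X_red = V @ (V.T @ X)             # project full states onto the basis
err = np.linalg.norm(X - X_red) / np.linalg.norm(X)
print(f"relative reconstruction error with 2 modes: {err:.3e}")
```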

    Dynamic Complexity and Causality Analysis of Scalp EEG for Detection of Cognitive Deficits

    This dissertation explores the potential of scalp electroencephalography (EEG) for the detection and evaluation of neurological deficits due to moderate/severe traumatic brain injury (TBI), mild cognitive impairment (MCI), and early Alzheimer’s disease (AD). Neurological disorders often cannot be accurately diagnosed without the use of advanced imaging modalities such as computed tomography (CT), magnetic resonance imaging (MRI), and positron emission tomography (PET). Non-quantitative task-based examinations are also used. None of these techniques, however, is typically performed in the primary care setting. Furthermore, the time and expense involved often deter physicians from performing them, leading to potentially worse prognoses for patients. If feasible, screening for cognitive deficits using scalp EEG would provide a fast, inexpensive, and less invasive alternative for evaluation of TBI post injury and detection of MCI and early AD. In this work, various measures of EEG complexity and causality are explored as means of detecting cognitive deficits. Complexity measures include event-related Tsallis entropy, multiscale entropy, inter-regional transfer entropy delays, regional variation in common spectral features, and graphical analysis of EEG inter-channel coherence. Causality analysis based on nonlinear state space reconstruction is explored in case studies of intensive care unit (ICU) signal reconstruction and detection of cognitive deficits via EEG reconstruction models. Significant contributions of this work include: (1) innovative entropy-based methods for analyzing event-related EEG data; (2) recommendations regarding how common spectral and complexity features differ in MCI/AD across scalp regions and protocol conditions; (3) development of novel artificial neural network techniques for multivariate signal reconstruction; and (4) novel EEG biomarkers for detection of dementia.
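
    As an illustration of one of the complexity measures named above, the following is a minimal sketch of multiscale entropy. The parameter choices (m = 2, tolerance r = 0.15 times the standard deviation) are common defaults and are assumptions here, not values taken from the dissertation.

```python
# A minimal sketch of multiscale entropy: coarse-grain the signal at
# several scales, then compute sample entropy at each scale.
import numpy as np

def sample_entropy(x: np.ndarray, m: int = 2, r: float = None) -> float:
    """SampEn = -ln(A/B), where B counts template matches of length m
    and A matches of length m+1, under the Chebyshev distance."""
    if r is None:
        r = 0.15 * np.std(x)
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += np.sum(d <= r)
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6), m=2):
    """Coarse-grain the signal at each scale, then compute SampEn."""
    out = []
    for s in scales:
        n = len(x) // s
        cg = x[:n * s].reshape(n, s).mean(axis=1)   # coarse-graining
        out.append(sample_entropy(cg, m))
    return out

rng = np.random.default_rng(1)
eeg_like = rng.normal(size=2000)   # stand-in for one EEG channel
print(multiscale_entropy(eeg_like))
```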

    Essays on the nonlinear and nonstochastic nature of stock market data

    The nature and structure of stock-market price dynamics is an area of ongoing and rigorous scientific debate. For almost three decades, most emphasis has been placed on upholding the concepts of Market Efficiency and rational investment behaviour. Such an approach has favoured the development of numerous linear and nonlinear models, mainly of stochastic foundations. Advances in mathematics have shown that nonlinear deterministic processes, i.e. "chaos", can produce sequences that appear random to linear statistical techniques. Until recently, investment finance has been a science based on linearity and stochasticity. Hence it is important that studies of Market Efficiency include investigations of chaotic determinism and power laws. As far as chaos is concerned, research results are mixed or inconclusive and fraught with controversy. This inconclusiveness is attributed to two things: the nature of stock market time series, which are highly volatile and contaminated with a substantial amount of noise of largely unknown structure, and the lack of appropriate robust statistical testing procedures. To overcome these difficulties, this thesis shows empirically, and for the first time, how novel techniques from the recent chaotic and signal analysis literature can be combined under a univariate time series analysis framework. Three basic methodologies are investigated: Recurrence Analysis, Surrogate Data, and Wavelet Transforms. Recurrence Analysis is used to reveal qualitative and quantitative evidence of nonlinearity and nonstochasticity for a number of stock markets. It is then demonstrated how Surrogate Data, under a statistical hypothesis testing framework, can be simulated to provide similar evidence. Finally, it is shown how wavelet transforms can be applied to reveal various salient features of the market data and provide a platform for nonparametric regression and denoising. The results indicate that, without invoking any parametric model-based assumptions, one can readily deduce that there is more to the data than linearity and stochastic randomness. Moreover, substantial evidence of recurrent patterns and aperiodicities is discovered, which can be attributed to chaotic dynamics. These results are therefore consistent with existing research indicating some types of nonlinear dependence in financial data. In conclusion, the value of this thesis lies in its contribution to the overall evidence on Market Efficiency and chaotic determinism in financial markets. The main implication is that the theory of equilibrium pricing in financial markets may need reconsideration in order to accommodate the structures revealed.
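
    To make the Surrogate Data methodology concrete, the following is a minimal sketch of a phase-randomization surrogate test: the surrogates preserve the power spectrum (a linear stochastic null hypothesis), and a nonlinearity statistic computed on the original series is compared against the surrogate distribution. The simulated return series and the particular test statistic are illustrative assumptions, not the thesis's exact procedure.

```python
# A minimal sketch of a surrogate data test with phase-randomized
# surrogates (linear stochastic null) and a simple nonlinearity statistic.
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Randomize Fourier phases while keeping the amplitude spectrum."""
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0                 # keep the DC component real
    if len(x) % 2 == 0:
        phases[-1] = 0.0            # keep the Nyquist component real
    surr = np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))
    return surr + x.mean()

def time_reversal_asymmetry(x, lag=1):
    """Zero in expectation for linear Gaussian processes, but not,
    in general, for nonlinear ones."""
    return np.mean((x[lag:] - x[:-lag]) ** 3)

rng = np.random.default_rng(2)
returns = rng.standard_t(df=4, size=1024)   # stand-in for market returns
t0 = time_reversal_asymmetry(returns)
t_surr = [time_reversal_asymmetry(phase_randomized_surrogate(returns, rng))
          for _ in range(99)]
# Rank-based p-value under the linear-stochastic null.
p = (1 + sum(abs(t) >= abs(t0) for t in t_surr)) / (1 + len(t_surr))
print(f"statistic={t0:.4f}, surrogate p-value={p:.2f}")
```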

    Reconstruction of electric fields and source distributions in EEG brain imaging

    In this thesis, three different approaches are developed for the estimation of focal brain activity using EEG measurements. The proposed approaches have been tested and found feasible using simulated data. First, we develop a robust solver for the recovery of focal dipole sources. The solver uses a weighted dipole strength penalty term (also called a weighted L1,2 norm) as prior information in order to ensure that the sources are sparse and focal, and that both the source orientation and depth bias are reduced. The solver is based on the truncated Newton interior point method combined with a logarithmic barrier method for the approximation of the penalty term. In addition, we use a Bayesian framework to derive the depth weights in the prior that are used to reduce the tendency of the solver to favor superficial sources. In the second approach, vector field tomography (VFT) is used for the estimation of the underlying electric fields inside the brain from external EEG measurements. The electric field is reconstructed using a set of line integrals. This is the first time that VFT has been used for the recovery of fields when the dipole source lies inside the domain of reconstruction. The benefit of this approach is that no mathematical model for the sources is needed. The test cases indicated that the approach can accurately localize the source activity. In the last part of the thesis, we show that, by using the Bayesian approximation error approach (AEA), precise knowledge of the tissue conductivities and head geometry is not always needed. We deliberately use a coarse head model and take the typical variations in the head geometry and tissue conductivities into account statistically in the inverse model. We demonstrate that the AEA results are comparable to those obtained with an accurate head model.
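
    The following is a minimal sketch of the role depth weights play in a linear EEG inverse solve. Note that the thesis itself uses a weighted L1,2 penalty with a truncated Newton interior-point solver; for brevity this sketch swaps in a depth-weighted minimum-norm (L2) estimate, which still illustrates how weighting counteracts the bias toward superficial sources. The random lead field is purely illustrative.

```python
# A minimal sketch of depth weighting in a linear (minimum-norm) EEG
# inverse solve. The lead field below is random, not a real head model.
import numpy as np

rng = np.random.default_rng(3)
n_sensors, n_sources = 32, 200
L = rng.normal(size=(n_sensors, n_sources))   # stand-in lead field

# Depth weights: deep sources have weak lead-field columns, so weighting
# by the column norms keeps the solver from favoring superficial ones.
w = np.linalg.norm(L, axis=0)                 # column norms
W_inv = np.diag(1.0 / w)

x_true = np.zeros(n_sources)
x_true[57] = 1.0                              # one focal source
b = L @ x_true + 0.01 * rng.normal(size=n_sensors)

lam = 1e-2                                    # regularization strength
Lw = L @ W_inv
# Weighted minimum-norm estimate: x = W^-1 Lw^T (Lw Lw^T + lam I)^-1 b
x_hat = W_inv @ Lw.T @ np.linalg.solve(Lw @ Lw.T + lam * np.eye(n_sensors), b)
print("strongest estimated source:", np.argmax(np.abs(x_hat)))
```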

    Registration Methods for Quantitative Imaging

    At the core of most image registration problems is determining a spatial transformation that relates the physical coordinates of two or more images. Registration methods have become ubiquitous in many quantitative imaging applications and represent an essential step in many biomedical and bioengineering applications. For example, image registration is a necessary step for removing motion- and distortion-related artifacts in serial images, and for studying the variation of biological tissue properties, such as shape and composition, across different populations, among many other applications. Here, fully automatic intensity-based methods for image registration are reviewed within a global energy minimization framework. A linear, shift-invariant, stochastic model of the image formation process is used to describe several important aspects of typical implementations of image registration methods. In particular, we show that, due to the stochastic nature of the image formation process, most methods for automatic image registration produce answers biased towards 'blurred' images. In addition, we show how the image approximation and interpolation procedures necessary to compute the registered images can have undesirable effects on subsequent quantitative image analysis methods. We describe the exact sources of such artifacts and propose methods through which they can be mitigated. The newly proposed methodology is tested using both simulated and real image data. Case studies using three-dimensional diffusion weighted magnetic resonance images, diffusion tensor images, and two-dimensional optical images are presented. Though the specific examples shown relate exclusively to the fields of biomedical imaging and biomedical engineering, the methods described are general and should be applicable to a wide variety of imaging problems.
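
    To make the energy minimization framework concrete, the following is a minimal sketch of intensity-based registration with the sum of squared differences (SSD) as the energy and the transform restricted to integer 2-D translations. This is an illustrative simplification, not the methodology proposed in the work.

```python
# A minimal sketch of intensity-based registration as energy minimization:
# exhaustively search integer translations for the one minimizing SSD.
import numpy as np

def ssd(a, b):
    return float(np.sum((a - b) ** 2))

def register_translation(fixed, moving, max_shift=5):
    """Return the integer (dy, dx) shift of `moving` minimizing SSD."""
    best, best_shift = np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            e = ssd(fixed, shifted)
            if e < best:
                best, best_shift = e, (dy, dx)
    return best_shift

rng = np.random.default_rng(4)
img = rng.normal(size=(64, 64))
moved = np.roll(np.roll(img, -3, axis=0), 2, axis=1)   # known displacement
print(register_translation(img, moved))                # expect (3, -2)
```

    A practical method would of course allow sub-pixel and more general transformations, which is exactly where the interpolation step, and the artifacts the abstract attributes to it, enters the picture.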