
    Towards Predictive Rendering in Virtual Reality

    The quest to generate predictive images, i.e., images representing radiometrically correct renditions of reality, has been a longstanding problem in computer graphics. The exactness of such images is extremely important for Virtual Reality applications like Virtual Prototyping, where users need to make decisions impacting large investments based on the simulated images. Unfortunately, the generation of predictive imagery remains an unsolved problem for several reasons, especially when real-time constraints apply. First, existing scenes used for rendering are not modeled accurately enough to create predictive images. Second, even with huge computational effort, existing rendering algorithms are not able to produce radiometrically correct images. Third, current display devices need to convert rendered images into some low-dimensional color space, which prohibits the display of radiometrically correct images. Overcoming these limitations is the focus of current state-of-the-art research, and this thesis contributes to this task. First, it briefly introduces the necessary background and identifies the steps required for real-time predictive image generation. Then, existing techniques targeting these steps are presented and their limitations are pointed out. To solve some of the remaining problems, novel techniques are proposed. They cover various steps in the predictive image generation process, ranging from accurate scene modeling through efficient data representation to high-quality, real-time rendering. A special focus of this thesis lies on the real-time generation of predictive images using bidirectional texture functions (BTFs), i.e., very accurate representations of spatially varying surface materials. The techniques proposed in this thesis enable efficient handling of BTFs by compressing the huge amount of data contained in this material representation, applying BTFs to geometric surfaces using texture and BTF synthesis techniques, and rendering BTF-covered objects in real time. Further approaches proposed in this thesis target the inclusion of real-time global illumination effects and more efficient rendering using novel level-of-detail representations for geometric objects. Finally, this thesis assesses the rendering quality achievable with BTF materials, indicating a significant increase in realism but also confirming that problems remain to be solved before truly predictive image generation is achieved.
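
To illustrate the kind of data reduction involved in BTF compression, the following is a minimal sketch, assuming the BTF has already been resampled into a matrix with one row per texel and one column per (view, light) direction pair, and using a truncated SVD as a stand-in factorization; the array shapes, component count and function names are illustrative assumptions, not the thesis's specific compression scheme.

```python
# Hedged sketch: BTF compression via truncated SVD (illustrative only).
# Assumption: the BTF is stored as a (num_texels x num_view_light_samples) matrix.
import numpy as np

def compress_btf(btf_matrix: np.ndarray, n_components: int = 16):
    """Factor the BTF matrix into a mean, per-texel weights and a shared basis."""
    mean = btf_matrix.mean(axis=0)                       # average appearance over texels
    u, s, vt = np.linalg.svd(btf_matrix - mean, full_matrices=False)
    weights = u[:, :n_components] * s[:n_components]     # per-texel coefficients
    basis = vt[:n_components]                            # shared appearance basis
    return mean, weights, basis

def reconstruct_texel(mean, weights, basis, texel_index: int):
    """Approximate one texel's reflectance for all (view, light) samples."""
    return mean + weights[texel_index] @ basis

# Small synthetic stand-in data: 64x64 texels, 256 (view, light) samples.
btf = np.random.rand(64 * 64, 256).astype(np.float32)
mean, w, b = compress_btf(btf, n_components=16)
print(reconstruct_texel(mean, w, b, texel_index=0).shape)  # (256,)
```

In a renderer, only the compact weights and basis would be kept on the GPU, and per-pixel appearance would be reconstructed on the fly from a few components.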

    Statistical methods for sparse image time series of remote-sensing lake environmental measurements

    Remote-sensing technology is widely used in Earth observation, from everyday weather forecasting to long-term monitoring of the air, sea and land. The remarkable coverage and resolution of remote-sensing data are extremely beneficial to the investigation of environmental problems, such as the state and function of lakes under climate change. However, the attractive features of remote-sensing data bring new challenges to statistical analysis. The wide coverage and high resolution mean that the data are usually of large volume. The orbit track of the satellite and the occasional obscuring of the instruments due to atmospheric factors can result in substantial missing observations. Applying conventional statistical methods to this type of data can be ineffective and computationally intensive due to its volume and dimensionality, and modifications to existing methods are often required to accommodate the missingness. There is therefore a great need for novel statistical approaches to tackle these challenges. This thesis aims to investigate and develop statistical approaches that can be used in the analysis of sparse remote-sensing image time series of environmental data. Specifically, three aspects of the data are considered: (a) the high dimensionality, associated with the volume and the dimension of the data; (b) the sparsity, in the sense of high missing percentages; and (c) the spatial/temporal structures, including the patterns and the correlations. Initially, methods for temporal and spatial modelling are explored and implemented with care, e.g. harmonic regression and bivariate spline regression with residual correlation structures. In recognition of the drawbacks of these methods, functional data analysis is employed as the general approach in this thesis. Specifically, functional principal component analysis (FPCA) is used to achieve dimension reduction. Bivariate basis functions are proposed to transform the satellite image data, which typically consist of thousands or millions of pixels, into functional data with low-dimensional representations. This approach has the advantage of identifying spatial variation patterns through the principal component (PC) loadings, i.e. eigenfunctions. To overcome the high missing percentages that might invalidate the standard implementation of FPCA, the mixed-model FPCA (MM-FPCA) is investigated in Chapter 3. By estimating the PCs using a mixed-effect model, the influence of sparsity can be accounted for appropriately. Data imputation can be obtained from the fitted model using the (truncated) Karhunen-Loève expansion. The method's applicability to sparse image series is examined through a simulation study. To incorporate temporal dependence into the MM-FPCA, a novel spatio-temporal model consisting of a state space component and an FPCA component is proposed in Chapter 4. The model, referred to as SS-FPCA in the thesis, is developed within the dynamic spatio-temporal model framework. The SS-FPCA exploits a flexible hierarchical design with (a) a data model consisting of a time-varying mean function and a random component for the common spatial variation patterns formulated as the FPCA, (b) a process model specifying the type of temporal dynamics of the mean function and (c) a parameter model ensuring the identifiability of the model components. A 2-cycle alternating expectation-conditional maximization (AECM) algorithm is proposed to estimate the SS-FPCA model. The AECM algorithm allows different data augmentations and parameter combinations in the various cycles within an iteration, which in this case results in analytical solutions for all the MLEs of the model parameters. The algorithm uses the Kalman filter/smoother to update the system states according to the data model and the process model. Model investigations are carried out in Chapter 5, including a simulation study on a 1-dimensional space to assess the performance of the model and the algorithm. This is accompanied by a brief summary of the asymptotic results of the EM-type algorithm, some of which can be used to approximate the standard errors of the model estimates. Applications of the MM-FPCA and SS-FPCA to the remote-sensing lake surface water temperature and Chlorophyll data of Lake Victoria (obtained from the European Space Agency's Envisat mission) are presented at the end of Chapters 3 and 5. Remarks on the implications and limitations of these two methods are provided in Chapter 6, along with potential future extensions of both methods. The Appendices provide additional theorems, computation and derivation details of the methods investigated in the thesis.
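
For reference, the (truncated) Karhunen-Loève expansion underlying the FPCA-based imputation described above can be written as below; the notation (mean function mu, eigenfunctions phi_k, scores xi_{ik}, truncation level K) is generic rather than the thesis's exact symbols.

```latex
% Truncated Karhunen-Loeve expansion of the i-th image Y_i at spatial location s:
% mean surface mu, spatial eigenfunctions phi_k and image-specific scores xi_{ik}.
Y_i(s) \;\approx\; \mu(s) + \sum_{k=1}^{K} \xi_{ik}\, \phi_k(s),
\qquad
\xi_{ik} = \int_{\mathcal{S}} \bigl( Y_i(s) - \mu(s) \bigr)\, \phi_k(s)\, \mathrm{d}s .
```

Under high missingness the score integral cannot be evaluated directly, which is why the MM-FPCA estimates the scores through a mixed-effect model instead; imputation then amounts to evaluating the truncated expansion at the unobserved pixels.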

    A survey of the application of soft computing to investment and financial trading


    Generalized averaged Gaussian quadrature and applications

    A simple numerical method for constructing the optimal generalized averaged Gaussian quadrature formulas will be presented. These formulas exist in many cases in which real positive Gauss-Kronrod formulas do not exist, and can be used as an adequate alternative in order to estimate the error of a Gaussian rule. We also investigate the conditions under which the optimal averaged Gaussian quadrature formulas and their truncated variants are internal.
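
To make the error-estimation idea concrete, here is a minimal Python sketch that estimates the error of an n-point Gauss-Legendre rule by comparing it against a higher-order reference rule; the reference rule is only a crude surrogate for the optimal generalized averaged formulas of the abstract, whose actual construction from the Jacobi matrix of the weight function is not reproduced here.

```python
# Hedged sketch: error estimation for an n-point Gauss-Legendre rule.
# The abstract's averaged rules would replace the "larger rule" surrogate below.
import numpy as np

def gauss_legendre(f, n):
    """Approximate the integral of f over [-1, 1] with n Gauss-Legendre nodes."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    return np.dot(weights, f(nodes))

def estimate_error(f, n):
    """Error estimate for the n-point rule via a reference rule of higher degree."""
    coarse = gauss_legendre(f, n)
    reference = gauss_legendre(f, 2 * n + 1)   # plays the role of the averaged rule
    return coarse, abs(reference - coarse)

f = lambda x: np.exp(-x**2) * np.cos(3 * x)
value, err = estimate_error(f, n=8)
print(f"Q_8 = {value:.10f}, estimated error = {err:.2e}")
```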

    MS FT-2-2 7 Orthogonal polynomials and quadrature: Theory, computation, and applications

    Quadrature rules find many applications in science and engineering. Their analysis is a classical area of applied mathematics and continues to attract considerable attention. This seminar brings together speakers with expertise in a large variety of quadrature rules. It is the aim of the seminar to provide an overview of recent developments in the analysis of quadrature rules. The computation of error estimates and novel applications are also described.

    Neurotransmitter profiling with high and ultra-high field magnetic resonance spectroscopy: optimization for clinical and translational studies in schizophrenia

    Growing interest has been shown in the clinical neuroscience research community in assessing neurotransmitter profiles both in healthy and diseased subjects. A large body of research in this field focuses on schizophrenia, to characterise its glutamatergic level according to the most recent hypothesis of NMDA (N-methyl-D-aspartic acid) receptor hypofunction. Magnetic Resonance Spectroscopy (MRS) is able to detect some of the most common neurotransmitters, but a number of issues, such as low signal-to-noise ratio (SNR), spectral overlap and line broadening, prevent MRS from being clinically relevant for neuropsychiatry. Four aims were considered relevant for this work. Firstly, we aimed to compare the reliability of conventional and timing-optimized sequences for the detection and measurement of most of the visible metabolites and, in particular, of glutamate (Glu), glutamine (Gln) and gamma-aminobutyric acid (GABA), in order to identify the best available sequence for a study in schizophrenia. Secondly, we intended to investigate whether glutamatergic activity might predict oscillatory activity and whether this link survives in schizophrenia. Thirdly, we wanted to study whether the well-known animal model of schizophrenia, the rearing-in-isolation model, exacerbates the effect of ketamine and produces more profound changes in the neurotransmitter profile of rats. Fourthly, a further goal focused on improved data acquisition and data processing to reliably resolve GABA and to quantify a wider range of metabolites. To address those points, five studies were performed. The first work (Chapter 3) describes a reproducibility study of sequences reported in the literature to be capable of detecting Glu and Gln. The study was performed on 14 healthy subjects, each scanned twice with repositioning between the two scans. The absolute percentage difference was then computed to assess the accuracy per sequence and metabolite. A good compromise was found in the PRESS sequence (TE = 80 ms), which was subsequently exploited for the following study on schizophrenic patients (Chapter 4). Twenty-seven early-stage schizophrenic patients and twenty-three age-matched controls were recruited to undergo a protocol including, in two separate sessions, MRS and electroencephalography (EEG). Anterior cingulate cortex Glu was found to predict the induced theta activity in healthy controls but not in patients. Furthermore, NAA values were also found to be reduced in schizophrenia and linked to the N100, an event-related potential (ERP) that is well known to be decreased in schizophrenia. Following on from the findings of the study on the early stage of schizophrenia, further investigations were undertaken to study the psychotic state occurring in the disease via functional MRS, in which an injection of 25 mg/kg of ketamine (an NMDA antagonist) was administered to two groups of rats, one group-housed and one reared in isolation. This work showed an increase of prefrontal Gln levels in both groups but a selective GABA decrease only in the isolated rats. It would have been very interesting to detect GABA changes in the 3T study, but the protocol used did not allow its accurate quantification. Simulations and reliability tests (Chapter 6) were then utilized to optimize a standard sequence to obtain an accurate and reliable GABA concentration. The optimized sequence reproduces the quantification with 12% accuracy. The preliminary results of the last study (Chapter 7) give evidence of the potential of the combined use of Monte Carlo, Levenberg-Marquardt and NNLS methods embedded in a novel fitting approach for two-dimensional spectra. The three appendices at the end of this work illustrate the details of some of the algorithms and software used throughout the studies.
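
As a rough illustration of the model-based spectral fitting mentioned above, the following Python sketch fits a sum of Lorentzian lines to a 1D spectrum with a Levenberg-Marquardt least-squares solver; the peak model, parameterization and synthetic data are illustrative assumptions and do not reproduce the thesis's two-dimensional Monte Carlo/LM/NNLS fitting approach.

```python
# Hedged sketch: Levenberg-Marquardt fit of a sum of Lorentzian lines to a
# 1D spectrum (illustrative only; not the thesis's 2D fitting scheme).
import numpy as np
from scipy.optimize import least_squares

def lorentzian(ppm, amplitude, center, width):
    """Single Lorentzian line in the chemical-shift (ppm) domain."""
    return amplitude * width**2 / ((ppm - center)**2 + width**2)

def model(params, ppm):
    """Sum of Lorentzians; params is a flat array of (amplitude, center, width)."""
    spectrum = np.zeros_like(ppm)
    for amp, cen, wid in params.reshape(-1, 3):
        spectrum += lorentzian(ppm, amp, cen, wid)
    return spectrum

def residuals(params, ppm, data):
    return model(params, ppm) - data

# Synthetic two-peak spectrum with noise (stand-ins for metabolite resonances).
ppm = np.linspace(1.5, 4.0, 512)
true = np.array([1.0, 2.01, 0.03, 0.6, 3.75, 0.04])
data = model(true, ppm) + 0.01 * np.random.randn(ppm.size)

# Levenberg-Marquardt ("lm") needs an unbounded problem and a reasonable start.
start = np.array([0.8, 2.0, 0.05, 0.5, 3.7, 0.05])
fit = least_squares(residuals, start, args=(ppm, data), method="lm")
print(fit.x.reshape(-1, 3))  # fitted (amplitude, center, width) per peak
```

In practice the fitted amplitudes would be converted to metabolite concentrations relative to a reference signal; that calibration step is omitted here.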
