3,106 research outputs found

    Computerized Analysis of Magnetic Resonance Images to Study Cerebral Anatomy in Developing Neonates

    The study of cerebral anatomy in developing neonates is of great importance for understanding brain development during the early period of life. This dissertation therefore focuses on three challenges in the modelling of cerebral anatomy in neonates during brain development. The methods that have been developed all use Magnetic Resonance Images (MRI) as source data. To facilitate the study of vascular development in the neonatal period, a set of image analysis algorithms is developed to automatically extract and model cerebral vessel trees. The whole process consists of cerebral vessel tracking from automatically placed seed points, vessel tree generation, and vasculature registration and matching. These algorithms have been tested on clinical Time-of-Flight (TOF) MR angiographic datasets. To facilitate the study of the neonatal cortex, a complete cerebral cortex segmentation and reconstruction pipeline has been developed. Segmentation of the neonatal cortex is not handled effectively by existing algorithms designed for the adult brain because the contrast between grey and white matter is reversed, which causes voxels containing tissue mixtures to be incorrectly labelled by conventional methods. The neonatal cortical segmentation method that has been developed is based on a novel expectation-maximization (EM) method with explicit correction for mislabelled partial volume voxels. Based on the resulting cortical segmentation, an implicit surface evolution technique is adopted for the reconstruction of the cortex in neonates. The method is evaluated in a detailed landmark study. To facilitate the study of cortical development, a cortical surface registration algorithm for aligning cortical surfaces is developed. The method first inflates extracted cortical surfaces and then performs a non-rigid surface registration using free-form deformations (FFDs) to remove residual misalignment. Validation experiments using data labelled by an expert observer demonstrate that the method can capture local changes and follow the growth of specific sulci.
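
    The cortical segmentation step can be illustrated with a short sketch. The following is a minimal, hypothetical example of an EM fit for grey/white-matter intensities with an explicit partial-volume class whose mean is tied to the two pure-class means; it is not the dissertation's actual pipeline, and the function name, initialisation and iteration count are illustrative assumptions.

```python
# Minimal sketch (not the dissertation's pipeline): EM for a grey-matter (GM) /
# white-matter (WM) intensity mixture with an explicit partial-volume (PV)
# class whose mean is tied to the average of the two pure-class means.
import numpy as np

def em_gm_wm_pv(intensities, n_iter=50):
    x = np.asarray(intensities, dtype=float)
    # Illustrative initialisation: pure-class means one std either side of the mean.
    mu = np.array([x.mean() - x.std(), x.mean() + x.std()])  # [GM, WM]
    sigma = np.array([x.std(), x.std(), x.std()])            # [GM, WM, PV]
    pi = np.array([0.4, 0.4, 0.2])                           # mixing weights

    def gauss(v, m, s):
        return np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

    for _ in range(n_iter):
        means = np.array([mu[0], mu[1], 0.5 * (mu[0] + mu[1])])  # PV mean is tied
        # E-step: responsibilities of GM, WM and PV for every voxel intensity.
        resp = np.stack([pi[k] * gauss(x, means[k], sigma[k]) for k in range(3)])
        resp /= resp.sum(axis=0, keepdims=True)
        # M-step: update mixing weights, pure-class means and all variances.
        nk = resp.sum(axis=1)
        pi = nk / nk.sum()
        mu[0] = (resp[0] * x).sum() / nk[0]
        mu[1] = (resp[1] * x).sum() / nk[1]
        sigma = np.maximum(np.sqrt((resp * (x - means[:, None]) ** 2).sum(axis=1) / nk), 1e-6)
    # Hard labels: 0 = GM, 1 = WM, 2 = partial volume
    return resp.argmax(axis=0), mu, sigma
```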

    Field theoretic formulation and empirical tracking of spatial processes

    Spatial processes are attacked on two fronts. On the one hand, tools from theoretical and statistical physics can be used to understand behaviour in complex, spatially-extended multi-body systems. On the other hand, computer vision and statistical analysis can be used to study 4D microscopy data to observe and understand real spatial processes in vivo. On the first of these fronts, analytical models are developed for abstract processes, which can be simulated on graphs and lattices before considering real-world applications in fields such as biology, epidemiology or ecology. In the field theoretic formulation of spatial processes, techniques originating in quantum field theory such as canonical quantisation and the renormalization group are applied to reaction-diffusion processes by analogy. These techniques are combined in the study of critical phenomena or critical dynamics. At this level, one is often interested in the scaling behaviour: how the correlation functions scale for different dimensions in geometric space. This can lead to a better understanding of how macroscopic patterns relate to microscopic interactions. In this vein, the trace of a branching random walk on various graphs is studied. In the thesis, a distinctly abstract approach is emphasised in order to support an algorithmic approach to parts of the formalism. A model of self-organised criticality, the Abelian sandpile model, is also considered. By exploiting a bijection between recurrent configurations and spanning trees, an efficient Monte Carlo algorithm is developed to simulate sandpile processes on large lattices. On the second front, two case studies are considered: migratory patterns of leukaemia cells and mitotic events in Arabidopsis roots. In the first case, tools from statistical physics are used to study the spatial dynamics of different leukaemia cell lineages before and after a treatment. One key result is that we can discriminate between migratory patterns in response to treatment, classifying cell motility in terms of sub-/super-/diffusive regimes. For the second case study, a novel algorithm is developed to process a 4D light-sheet microscopy dataset. The combination of transient fluorescent markers and a poorly localised specimen in the field of view leads to a challenging tracking problem. A fuzzy registration-tracking algorithm is developed to track mitotic events so as to understand their spatiotemporal dynamics under normal conditions and after tissue damage.
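
    The sandpile part of the abstract admits a brief illustration. Below is a minimal sketch of the direct toppling simulation of the Abelian (BTW) sandpile on a finite grid with open boundaries; the thesis's efficient algorithm instead exploits the bijection between recurrent configurations and spanning trees, which this sketch does not implement, and the grid size and grain count are illustrative assumptions.

```python
# Minimal sketch: direct toppling simulation of the Abelian sandpile on an
# L x L grid with open boundaries, recording avalanche sizes (topplings).
import numpy as np

def sandpile_avalanches(L=64, n_grains=10_000, seed=0):
    rng = np.random.default_rng(seed)
    z = np.zeros((L, L), dtype=int)           # heights; a site topples at z >= 4
    sizes = []
    for _ in range(n_grains):
        i, j = rng.integers(0, L, size=2)
        z[i, j] += 1                          # drop a grain on a random site
        size = 0
        stack = [(i, j)] if z[i, j] >= 4 else []
        while stack:
            a, b = stack.pop()
            if z[a, b] < 4:
                continue
            z[a, b] -= 4                      # topple: send one grain to each neighbour
            size += 1
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                na, nb = a + da, b + db
                if 0 <= na < L and 0 <= nb < L:   # grains fall off the open boundary
                    z[na, nb] += 1
                    if z[na, nb] >= 4:
                        stack.append((na, nb))
        sizes.append(size)
    return np.array(sizes)
```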

    The Distribution of the Size of Price Changes

    Different theories of price stickiness have distinct implications for the number of modes in the distribution of price changes. We formally test for the number of modes in the price change distributions of 36 supermarkets, spanning 22 countries and 5 continents. We present results for three modality tests: the two best-known tests in the statistical literature, Hartigan's Dip and Silverman's Bandwidth, and a test designed in this paper, called the Proportional Mass (PM) test. Three main results are uncovered. First, when the traditional tests are used, unimodality is rejected for about 90 percent of the retailers. When we use the PM test, which reduces the impact of smaller modes in the distribution and can be applied to test for modality around zero percent, we still reject unimodality in two thirds of the supermarkets. Second, category-level heterogeneity can account for about half of the PM test's rejections of unimodality. Finally, a simulation of the model in Alvarez, Lippi, and Paciello (2010) shows that the data is consistent with a combination of both time- and state-dependent pricing behaviors.
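
    The modality testing can be sketched in a few lines. The following is a minimal, hypothetical example of Silverman-style mode counting: it counts the local maxima of a Gaussian kernel density estimate over a grid and searches for the smallest bandwidth that leaves the estimate unimodal. It is not the paper's PM test, and the grid size and bandwidth range are illustrative assumptions.

```python
# Minimal sketch: count KDE modes and find the smallest bandwidth giving <= k modes.
import numpy as np
from scipy.stats import gaussian_kde

def count_modes(sample, bandwidth, grid_size=512):
    x = np.asarray(sample, dtype=float)
    kde = gaussian_kde(x, bw_method=bandwidth)     # scalar bw_method = KDE factor
    grid = np.linspace(x.min(), x.max(), grid_size)
    d = kde(grid)
    # A mode is an interior grid point higher than both of its neighbours.
    return int(np.sum((d[1:-1] > d[:-2]) & (d[1:-1] > d[2:])))

def critical_bandwidth(sample, k=1, candidates=np.linspace(0.01, 1.0, 100)):
    # Smallest candidate bandwidth factor for which the KDE has at most k modes.
    for h in candidates:
        if count_modes(sample, h) <= k:
            return h
    return None
```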

    Online Spectral Clustering on Network Streams

    A graph is an extremely useful representation of a wide variety of practical systems in data analysis. Recently, with the fast accumulation of stream data from various types of networks, significant research interest has arisen in spectral clustering for network streams (or evolving networks). Compared with the general spectral clustering problem, the analysis of this new type of data may have additional requirements, such as short processing time, scalability in distributed computing environments, and temporal variation tracking. However, designing a spectral clustering method that satisfies these requirements presents non-trivial challenges. There are three major challenges for the new algorithm design. The first challenge is online clustering computation. Most of the existing spectral methods for evolving networks are off-line methods that use standard eigensystem solvers such as the Lanczos method and must recompute solutions from scratch at each time point. The second challenge is the parallelization of such algorithms, which is non-trivial since standard eigensolvers are iterative and the number of iterations cannot be predetermined. The third challenge is the very limited existing work. In addition, the existing methods have multiple limitations, such as computational inefficiency on large similarity changes, the lack of a sound theoretical basis, and the lack of an effective way to handle accumulated approximation errors and large data variations over time. In this thesis, we propose a new online spectral graph clustering approach with a family of three novel spectrum approximation algorithms. Our algorithms incrementally update the eigenpairs in an online manner to improve computational performance. Our approaches outperform the existing method in computational efficiency and scalability while retaining competitive or even better clustering accuracy. We derive our spectrum approximation techniques, GEPT and EEPT, through formal theoretical analysis. The well-established matrix perturbation theory forms a solid theoretical foundation for our online clustering method. We complement our clustering method with a new metric to track accumulated approximation errors and measure short-term temporal variation. The metric not only provides a balance between computational efficiency and clustering accuracy, but also offers a useful tool to adapt the online algorithm to conditions of unexpected drastic noise. In addition, we discuss our preliminary work on approximate graph mining with evolutionary processes, non-stationary Bayesian network structure learning from non-stationary time series data, and Bayesian network structure learning with text priors imposed by non-parametric hierarchical topic modeling.
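
    The incremental eigenpair update can be illustrated with first-order matrix perturbation theory, which the abstract cites as the theoretical foundation. The sketch below is not the thesis's GEPT/EEPT algorithms; the degeneracy tolerance and the restriction of the correction to the tracked subspace are illustrative simplifications.

```python
# Minimal sketch: first-order update of tracked eigenpairs of a symmetric
# matrix A after a symmetric perturbation dA (matrix perturbation theory).
import numpy as np

def update_eigenpairs(eigvals, eigvecs, dA):
    """eigvals: (k,) current eigenvalues; eigvecs: (n, k) current eigenvectors."""
    k = eigvals.shape[0]
    proj = eigvecs.T @ dA @ eigvecs               # project dA onto the tracked basis
    new_vals = eigvals + np.diag(proj)            # lambda_i' ~ lambda_i + v_i^T dA v_i
    new_vecs = eigvecs.copy()
    for i in range(k):
        for j in range(k):                        # correction uses only tracked vectors
            if i == j:
                continue
            gap = eigvals[i] - eigvals[j]
            if abs(gap) > 1e-8:                   # skip near-degenerate pairs
                new_vecs[:, i] += (proj[j, i] / gap) * eigvecs[:, j]
    new_vecs /= np.linalg.norm(new_vecs, axis=0, keepdims=True)
    return new_vals, new_vecs
```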

    Forecasting interest rates: A comparative assessment of some second generation non-linear models

    Modelling and forecasting of interest rates has traditionally proceeded in the framework of linear stationary models such as ARMA and VAR, but only with moderate success. We examine here four models which account for several specific features of real-world asset prices such as non-stationarity and non-linearity. Our four candidate models are based respectively on wavelet analysis, mixed spectrum analysis, non-linear ARMA models with Fourier coefficients, and the Kalman filter. These models are applied to weekly data on interest rates in India, and their forecasting performance is evaluated vis-à-vis three GARCH models (GARCH(1,1), GARCH-M(1,1) and EGARCH(1,1)) as well as the random walk model. The Kalman filter model emerges at the top, with the wavelet and mixed spectrum models also showing considerable promise.
    Keywords: interest rates, wavelets, mixed spectra, non-linear ARMA, Kalman filter, GARCH, forecast encompassing
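
    The Kalman filter candidate can be sketched as a simple state-space recursion. Below is a minimal, hypothetical local-level model filtered with the standard Kalman equations to produce one-step-ahead forecasts of a weekly rate series; the noise variances q and r are illustrative assumptions rather than estimated parameters, and this is not the paper's exact specification.

```python
# Minimal sketch: one-step-ahead forecasts from a local-level (random-walk plus
# noise) state-space model using the standard Kalman filter recursions.
import numpy as np

def kalman_local_level(y, q=1e-4, r=1e-2):
    """y_t = x_t + eps_t (var r), x_t = x_{t-1} + eta_t (var q)."""
    y = np.asarray(y, dtype=float)
    x, p = y[0], 1.0                      # state estimate and variance, initialised crudely
    forecasts = np.empty(len(y))
    for t in range(len(y)):
        x_pred, p_pred = x, p + q         # predict: random-walk state
        forecasts[t] = x_pred             # one-step-ahead forecast of y[t]
        k_gain = p_pred / (p_pred + r)    # update with the observed rate
        x = x_pred + k_gain * (y[t] - x_pred)
        p = (1.0 - k_gain) * p_pred
    return forecasts
```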
