    Recent advances in directional statistics

    Mainstream statistical methodology is generally applicable to data observed in Euclidean space. There are, however, numerous contexts of considerable scientific interest in which the natural supports for the data under consideration are Riemannian manifolds like the unit circle, torus, sphere and their extensions. Typically, such data can be represented using one or more directions, and directional statistics is the branch of statistics that deals with their analysis. In this paper we provide a review of the many recent developments in the field since the publication of Mardia and Jupp (1999), still the most comprehensive text on directional statistics. Many of those developments have been stimulated by interesting applications in fields as diverse as astronomy, medicine, genetics, neurology, aeronautics, acoustics, image analysis, text mining, environmetrics, and machine learning. We begin by considering developments for the exploratory analysis of directional data before progressing to distributional models, general approaches to inference, hypothesis testing, regression, nonparametric curve estimation, methods for dimension reduction, classification and clustering, and the modelling of time series, spatial and spatio-temporal data. An overview of currently available software for analysing directional data is also provided, and potential future developments are discussed.
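
    The exploratory summaries the review refers to start from the fact that angles cannot simply be averaged. Below is a minimal sketch, in Python, of the circular mean, the mean resultant length, and a von Mises fit via scipy; the wind-direction sample is hypothetical and purely illustrative, not data from the paper.

    ```python
    # Basic circular summaries for directional data (angles in radians).
    # The sample below is hypothetical; scipy's vonmises.fit gives a maximum-likelihood
    # estimate of the von Mises location and concentration.
    import numpy as np
    from scipy.stats import vonmises

    angles = np.deg2rad([350.0, 10.0, 25.0, 340.0, 5.0, 15.0])  # hypothetical directions

    # Average the unit vectors, not the raw angles.
    C, S = np.cos(angles).mean(), np.sin(angles).mean()
    mean_direction = np.arctan2(S, C)   # circular mean, avoids the 350/10 degree pitfall
    R = np.hypot(C, S)                  # mean resultant length in [0, 1]

    # Fit a von Mises distribution, the circular analogue of the normal distribution.
    kappa, loc, _ = vonmises.fit(angles, fscale=1)

    print(np.rad2deg(mean_direction), R, kappa)
    ```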

    Quantifying Nonequilibrium Thermodynamics of Proteins Using Single Molecule Spectroscopy

    Proteins are dynamic molecules, and their structural fluctuations play a crucial role in their function. Quantification of these motions is hence central to our understanding of a protein's function. The timescales of these fluctuations span over 12 orders of magnitude, from ultrafast picosecond vibrational motions to slow, seconds-long global changes. While the ultrafast and slow motions have been well characterized, the intermediate conformational changes occurring at microsecond to millisecond timescales remain to be clearly understood. This thesis explores the use of single-molecule multidimensional fluorescence spectroscopy to investigate protein conformational changes occurring at microsecond to millisecond timescales. Knowledge of conformational transition rates gives insight into proteins' operating principles and the mechanisms behind their biological functions. Single-molecule Two-Dimensional Fluorescence Lifetime Correlation Spectroscopy (sm-2DFLCS) is used to resolve distinct conformations of protein molecules based on different fluorescence lifetimes. Time-resolved fluorescence photon traces are converted to two-dimensional emission delay correlation maps, which are then converted to lifetime correlation spectra using a nonlinear optimization algorithm. Key advancements in the technique were achieved by adapting analytical strategies from 2D NMR analysis protocols, which convert the underlying optimization problem from constrained to unconstrained optimization, resulting in a fast and robust 2D Inverse Laplace Transformation algorithm (2D-ILT). This enables forward and reverse conformational transitions in a protein to be monitored independently. The nonequilibrium thermodynamics of the pump protein bacteriorhodopsin is characterized using the improved 2D-ILT algorithm. Independent measurement of forward and reverse conformational transitions reveals microscopically irreversible transitions in the reaction cycle of the proton pump. Nonequilibrium thermodynamic properties such as entropy production, flux and affinity are quantified through portions of the reaction cycle. It is shown that the rate of irreversible transitions has an inverse dependence on temperature, and fitting the trend with the Gibbs-Helmholtz relation yields an experimentally determined enthalpy of transition. The coupling between a protein's internal reaction coordinate and the surrounding solvent is investigated by monitoring microsecond equilibrium fluctuations in the chromophore pocket of eGFP. It is observed that the dynamics of local rearrangements around the chromophore are coupled to the bulk viscosity of the solvent. The dependence is observed to deviate from Kramers' scaling, and the deviation is attributed to the protein's internal friction.
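
    The central numerical step described above, turning an emission-delay correlation map into a lifetime correlation spectrum, is a two-dimensional inverse Laplace transform. The sketch below only illustrates that idea on a fixed lifetime grid using nonnegative least squares; it is not the thesis's 2D-ILT algorithm, and the grid, noise level and synthetic map are made up.

    ```python
    # Illustrative sketch (not the thesis's 2D-ILT implementation): recover a 2D lifetime
    # correlation spectrum A from an emission-delay correlation map M, assuming
    # M(t1, t2) ~ sum_ij A_ij * exp(-t1/tau_i) * exp(-t2/tau_j) on a fixed lifetime grid.
    import numpy as np
    from scipy.optimize import nnls

    t = np.linspace(0.1, 10.0, 40)          # emission delay axis (arbitrary units)
    tau = np.logspace(-0.5, 1.0, 8)         # hypothetical lifetime grid

    K = np.exp(-t[:, None] / tau[None, :])  # 1D Laplace kernel, shape (40, 8)

    # Synthetic "measured" map built from two lifetime components plus noise.
    A_true = np.zeros((8, 8)); A_true[2, 2] = 1.0; A_true[5, 5] = 0.5
    M = K @ A_true @ K.T + 0.01 * np.random.randn(40, 40)

    # Vectorize: M.ravel() = (K kron K) @ A.ravel(); solve with a nonnegativity constraint.
    G = np.kron(K, K)
    a, _ = nnls(G, M.ravel())
    A_hat = a.reshape(8, 8)                 # estimated lifetime correlation spectrum
    ```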

    Advances in Evolutionary Algorithms

    With the recent trends towards massive data sets and significant computational power, combined with advances in evolutionary algorithms, evolutionary computation is becoming much more relevant to practice. The aim of this book is to present recent improvements, innovative ideas and concepts from a part of the broad field of evolutionary algorithms.

    Identifying Structure Transitions Using Machine Learning Methods

    Methodologies from data science and machine learning, both new and old, provide an exciting opportunity to investigate physical systems using extremely expressive statistical modeling techniques. Physical transitions are of particular interest, as they are accompanied by pattern changes in the configurations of the systems. Detecting and characterizing pattern changes in data happens to be a particular strength of statistical modeling in data science, especially with the highly expressive and flexible neural network models that have become increasingly computationally accessible in recent years through performance improvements in both hardware and algorithmic implementations. Conceptually, the machine learning approach can be regarded as one that employs algorithms which eschew explicit instructions in favor of strategies based on pattern extraction and inference driven by statistical analysis of large, complex data sets. This allows physical systems to be investigated using only raw configurational information, instead of relying on physical information obtained from a priori knowledge of the system. This work focuses on extracting useful compressed representations of physical configurations from systems of interest to automate phase classification tasks, in addition to identifying critical points and crossover regions.
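
    As a toy illustration of extracting a compressed representation from raw configurations, the sketch below projects synthetic Ising-like spin configurations onto their leading principal component; the magnitude of that component separates ordered from disordered samples. The data are crude stand-ins, not configurations or models from this work.

    ```python
    # Toy sketch of a compressed representation learned from raw configurations:
    # PCA on synthetic Ising-like spin configurations. The first principal component is
    # essentially the magnetization, so its magnitude separates ordered from disordered
    # samples; the configurations below are crude stand-ins, not real simulation data.
    import numpy as np

    rng = np.random.default_rng(0)
    L = 16                                              # lattice size
    up = np.sign(rng.normal(0.9, 0.3, (100, L * L)))    # mostly +1 spins (ordered, up)
    ordered = np.vstack([up, -up])                      # include the spin-flipped phase
    disordered = rng.choice([-1.0, 1.0], (200, L * L))  # random spins (high temperature)
    X = np.vstack([ordered, disordered])

    # PCA via SVD of the mean-centred data matrix.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    z = Xc @ Vt[0]                                      # score on the first component

    # |z| is large for ordered configurations and near zero for disordered ones, so a
    # simple threshold acts as an unsupervised phase classifier.
    labels = (np.abs(z) > np.abs(z).mean()).astype(int)
    ```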

    Neuroengineering of Clustering Algorithms

    Cluster analysis can be broadly divided into multivariate data visualization, clustering algorithms, and cluster validation. This dissertation contributes neural network-based techniques to perform all three unsupervised learning tasks. In particular, the first paper provides a comprehensive review of adaptive resonance theory (ART) models for engineering applications and provides context for the four subsequent papers. These papers are devoted to enhancements of ART-based clustering algorithms from (a) a practical perspective, by exploiting the visual assessment of cluster tendency (VAT) sorting algorithm as a preprocessor for ART offline training, thus mitigating ordering effects; and (b) an engineering perspective, by designing a family of multi-criteria ART models: dual vigilance fuzzy ART and distributed dual vigilance fuzzy ART (both of which are capable of detecting complex cluster structures), merge ART (which aggregates partitions and lessens ordering effects in online learning), and cluster validity index vigilance in fuzzy ART (which features robust vigilance parameter selection and alleviates ordering effects in offline learning). The sixth paper enhances data visualization using self-organizing maps (SOMs) by depicting information-theoretic similarity measures between neighboring neurons on the dimension-reduced, topology-preserving SOM grid. The parameters of this visualization are estimated using samples selected via a single-linkage procedure, thereby generating heatmaps that portray more homogeneous within-cluster similarities and crisper between-cluster boundaries. The seventh paper presents incremental cluster validity indices (iCVIs), realized by (a) incorporating existing formulations of online computations for clusters' descriptors, or (b) modifying an existing ART-based model and incrementally updating local density counts between prototypes. Moreover, this last paper provides the first comprehensive comparison of iCVIs in the computational intelligence literature.
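
    For context, the sketch below shows a minimal version of the standard fuzzy ART loop (complement coding, winner-take-all category choice, vigilance test, fast learning) that the dual vigilance, merge and CVI-vigilance variants described above build on; the parameter values are arbitrary.

    ```python
    # Minimal sketch of a standard fuzzy ART clusterer (complement coding, winner-take-all
    # choice, vigilance test, learning rule). The dissertation's dual-vigilance and merge
    # variants extend this loop; the parameter values here are arbitrary.
    import numpy as np

    def fuzzy_art(X, rho=0.75, alpha=0.001, beta=1.0):
        """X: samples scaled to [0, 1]. Returns cluster labels and category weights."""
        X = np.hstack([X, 1.0 - X])                # complement coding keeps |I| constant
        W, labels = [], []
        for I in X:
            # Choice function for every existing category.
            T = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in W]
            for j in np.argsort(T)[::-1]:          # try categories from best to worst
                match = np.minimum(I, W[j]).sum() / I.sum()
                if match >= rho:                   # vigilance passed: resonate and learn
                    W[j] = beta * np.minimum(I, W[j]) + (1 - beta) * W[j]
                    labels.append(j)
                    break
            else:                                  # no category passed: create a new one
                W.append(I.copy())
                labels.append(len(W) - 1)
        return np.array(labels), W
    ```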

    Sequential optimal design of neurophysiology experiments

    For well over 200 years, scientists and doctors have been poking and prodding brains in every which way in an effort to understand how they work. The earliest pokes were quite crude, often involving permanent forms of brain damage. Though neural injury continues to be an active area of research within neuroscience, technology has given neuroscientists a number of tools for stimulating and observing the brain in very subtle ways. Nonetheless, the basic experimental paradigm remains the same: poke the brain and see what happens. For example, neuroscientists studying the visual or auditory system can easily generate any image or sound they can imagine to see how an organism or neuron will respond. Since neuroscientists can now easily design more pokes than they could ever deliver, a fundamental question is "What pokes should they actually use?" The complexity of the brain means that only a small number of the pokes scientists can deliver will produce any information about the brain. One of the fundamental challenges of experimental neuroscience is finding the right stimulus parameters to produce an informative response in the system being studied. This thesis addresses this problem by developing algorithms to sequentially optimize neurophysiology experiments. Every experiment we conduct contains information about how the brain works. Before conducting the next experiment, we should use what we have already learned to decide which experiment to perform next. In particular, we should design the experiment that will reveal the most information about the brain. At a high level, neuroscientists already perform this type of sequential, optimal experimental design; for example, crude experiments which knock out entire regions of the brain have given rise to modern experimental techniques which probe the responses of individual neurons using finely tuned stimuli. The goal of this thesis is to develop automated and rigorous methods for optimizing neurophysiology experiments efficiently and at a much finer time scale. In particular, we present methods for near-instantaneous optimization of the stimulus being used to drive a neuron.
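
    As a toy illustration of the sequential, information-driven stimulus selection described above (not the specific algorithms developed in the thesis), the sketch below keeps a Gaussian-style posterior over the filter of a logistic spiking model and, on each trial, delivers the candidate stimulus with the largest expected gain in posterior precision; the model and all numbers are hypothetical.

    ```python
    # Toy sketch of infomax stimulus selection for a logistic (Bernoulli) neuron model:
    # maintain a Gaussian posterior over the neuron's linear filter and, on each trial,
    # choose the candidate stimulus whose expected observation most increases the
    # posterior precision (largest log-determinant gain). Illustration only.
    import numpy as np

    rng = np.random.default_rng(1)
    d = 5
    w_true = rng.normal(size=d)                 # unknown neuron filter (simulation only)
    mu, P = np.zeros(d), np.eye(d)              # posterior mean and precision

    def spike_prob(x, w):
        return 1.0 / (1.0 + np.exp(-x @ w))

    for trial in range(50):
        candidates = rng.normal(size=(100, d))  # stimuli we could deliver this trial
        p = spike_prob(candidates, mu)          # predicted firing probability at the mean
        # Expected Fisher information of each candidate under the logistic model is
        # p(1-p) x x^T; score each by the log-det of the updated precision.
        scores = [np.linalg.slogdet(P + pi * (1 - pi) * np.outer(x, x))[1]
                  for x, pi in zip(candidates, p)]
        x = candidates[int(np.argmax(scores))]  # most informative stimulus

        y = rng.random() < spike_prob(x, w_true)   # "deliver the poke", observe a spike
        # Crude online update: one Newton-style step on the log posterior.
        pi = spike_prob(x, mu)
        P = P + pi * (1 - pi) * np.outer(x, x)
        mu = mu + np.linalg.solve(P, (float(y) - pi) * x)
    ```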

    Subject index volumes 1–92

    Across chiral and achiral worlds: statistical validation in VCD and explorations in momentum space

    This work covers two very different topics, which are distinguished by one main characteristic: chirality. Chirality is the geometric property an object possesses when its mirror image is not superimposable on the object itself. It is highly important in chemistry and is the source of Vibrational Circular Dichroism (VCD), the first of the two subjects. As with many spectroscopic techniques, comparing theory with experiment can be a non-trivial task; the aim of this work is to aid that process using a statistical scheme. The second topic covers the abstract realm of momentum-space electron densities, which, in contrast to the former topic, are achiral distributions. They can be probed experimentally using Compton scattering experiments and Electron Momentum Spectroscopy, offering a fundamentally different point of view from which to approach chemistry.
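
    The abstract does not spell out the statistical scheme used to compare computed and experimental VCD spectra. One common starting point, sketched below purely as an illustration, is a normalized overlap (cosine-type similarity) between the two spectra on a shared frequency grid, which is sign-sensitive and therefore distinguishes enantiomers; the function and grid choices are assumptions, not the method of the thesis.

    ```python
    # Hedged sketch of one common way to quantify agreement between a computed and an
    # experimental VCD spectrum: interpolate both onto a shared frequency grid and take a
    # normalized overlap (cosine similarity, sensitive to sign and hence to chirality).
    import numpy as np

    def vcd_overlap(freq_calc, calc, freq_exp, exp, grid=None):
        # Frequency axes are assumed to be in increasing order (required by np.interp).
        if grid is None:
            grid = np.linspace(max(freq_calc.min(), freq_exp.min()),
                               min(freq_calc.max(), freq_exp.max()), 2000)
        f = np.interp(grid, freq_calc, calc)
        g = np.interp(grid, freq_exp, exp)
        return np.trapz(f * g, grid) / np.sqrt(np.trapz(f * f, grid) *
                                               np.trapz(g * g, grid))

    # A value near +1 supports the assigned absolute configuration; a value near -1
    # suggests the enantiomer, since mirroring the molecule flips the sign of the
    # computed VCD spectrum.
    ```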

    Blind restoration of images with penalty-based decision making: a consensus approach

    In this thesis we show a relationship between fuzzy decision making and image processing. Various applications of the consensus methodology to image noise reduction are introduced. A new approach is also presented to deal with non-stationary Gaussian noise and spatially non-stationary noise in MRI.
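
    As a hedged illustration of the consensus idea only (not the penalty-based scheme developed in the thesis), the sketch below runs several standard denoising filters and fuses them pixel-wise, down-weighting each filter by a simple penalty measuring its disagreement with the ensemble median.

    ```python
    # Toy consensus-style restoration step: run several standard denoisers and fuse them
    # pixel-wise, with each filter down-weighted by a penalty measuring how far it strays
    # from the ensemble median. Illustrates the fusion idea only.
    import numpy as np
    from scipy import ndimage

    def consensus_denoise(img, sigma=1.0):
        experts = np.stack([
            ndimage.median_filter(img, size=3),
            ndimage.gaussian_filter(img, sigma=sigma),
            ndimage.uniform_filter(img, size=3),
        ])
        ref = np.median(experts, axis=0)              # ensemble "consensus" reference
        penalty = np.abs(experts - ref)               # per-pixel disagreement penalty
        weights = 1.0 / (1.0 + penalty)               # small penalty -> large weight
        weights /= weights.sum(axis=0, keepdims=True)
        return (weights * experts).sum(axis=0)
    ```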

    Error propagation in pattern recognition systems: Impact of quality on fingerprint categorization

    The aspect of quality in pattern classification has recently been explored in the context of biometric identification and authentication systems. The results presented in the literature indicate that incorporating information about the quality of the input pattern leads to improved classification performance. Quality itself, however, can be defined in a number of ways, and its role in the various stages of pattern classification is often ambiguous or ad hoc. In this dissertation, a more systematic approach to the incorporation of localized quality metrics into the pattern recognition process is developed for the specific task of fingerprint categorization. Quality is defined not as an intrinsic property of the image, but rather in terms of a set of defects introduced to it. A number of fingerprint images have been examined, and the important quality defects have been identified and modeled in a mathematically tractable way. The models are flexible and can be used to generate synthetic images that facilitate algorithm development and large-scale, less time-consuming performance testing. The effects of quality defects on the various stages of the fingerprint recognition process are examined both analytically and empirically. For these defect models, it is shown that the uncertainty of parameter estimates, i.e. extracted fingerprint features, is the key quantity that can be calculated and propagated forward through the stages of the fingerprint classification process. Modified image processing techniques that explicitly utilize local quality metrics in the extraction of features useful in fingerprint classification, such as the ridge orientation flow field, are presented and their performance is investigated.
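
    The ridge orientation flow field mentioned above is usually estimated with a structure-tensor computation whose coherence doubles as a local quality measure. The sketch below shows that standard estimate, as a point of reference for the modified, quality-aware techniques the dissertation develops.

    ```python
    # Sketch of gradient-based ridge orientation estimation with a local coherence measure
    # that can serve as a quality weight (the standard structure-tensor approach).
    import numpy as np
    from scipy import ndimage

    def ridge_orientation(img, block=16):
        gx = ndimage.sobel(img.astype(float), axis=1)
        gy = ndimage.sobel(img.astype(float), axis=0)
        # Smooth the structure-tensor entries over each local neighbourhood.
        gxx = ndimage.uniform_filter(gx * gx, block)
        gyy = ndimage.uniform_filter(gy * gy, block)
        gxy = ndimage.uniform_filter(gx * gy, block)
        # Ridge orientation is perpendicular to the dominant gradient direction.
        theta = 0.5 * np.arctan2(2.0 * gxy, gxx - gyy) + np.pi / 2.0
        # Coherence in [0, 1]: ~1 for clean parallel ridges, ~0 in noisy or flat regions,
        # so it is a natural local quality weight to propagate to later stages.
        coherence = np.sqrt((gxx - gyy) ** 2 + 4.0 * gxy ** 2) / (gxx + gyy + 1e-9)
        return theta, coherence
    ```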