
    Functional Sensory Representations of Natural Stimuli: the Case of Spatial Hearing

    In this thesis I attempt to explain mechanisms of neuronal coding in the auditory system as a form of adaptation to the statistics of natural stereo sounds. To this end I analyse recordings of real-world auditory environments and construct novel statistical models of these data. I further compare regularities present in natural stimuli with known, experimentally observed neuronal mechanisms of spatial hearing. From a more general perspective, I use the binaural auditory system as a starting point to consider the notion of function implemented by sensory neurons. In particular I argue for two closely related tenets: 1. The function of sensory neurons cannot be fully elucidated without understanding the statistics of the natural stimuli they process. 2. The function of sensory representations is determined by redundancies present in the natural sensory environment. I present evidence in support of the first tenet by describing and analysing marginal statistics of natural binaural sound. I compare the observed, empirical distributions with knowledge from reductionist experiments. This comparison allows me to argue that the complexity of the spatial hearing task in the natural environment is much higher than analytic, physics-based predictions suggest. I discuss the possibility that early brainstem circuits such as the LSO and MSO do not "compute sound localization", as is often claimed in the experimental literature. I propose that they instead perform a signal transformation, which constitutes the first step of a complex inference process. To support the second tenet I develop a hierarchical statistical model which learns a joint sparse representation of amplitude and phase information from natural stereo sounds. I demonstrate that the learned higher-order features reproduce properties of auditory cortical neurons when probed with spatial sounds. These aspects had been hypothesized to be a manifestation of a fine-tuned computation specific to the sound-localization task; here it is demonstrated that they rather reflect redundancies present in the natural stimulus. Taken together, the results presented in this thesis suggest that efficient coding is a useful strategy for discovering structures (redundancies) in the input data; their meaning has to be determined by the organism via environmental feedback.
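    As a concrete illustration of the efficient-coding idea behind the second tenet, the following minimal sketch learns a sparse dictionary from stacked left/right sound windows with plain ISTA and a gradient dictionary update. It is not the hierarchical amplitude/phase model developed in the thesis; the random data standing in for natural recordings and all variable names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for windowed natural stereo sound: each column stacks a short
# left-channel and right-channel snippet (2 * win samples per column).
win, n_samples, n_atoms = 64, 2000, 96
X = rng.standard_normal((2 * win, n_samples))
X -= X.mean(axis=0, keepdims=True)

# Random initial dictionary with unit-norm atoms.
D = rng.standard_normal((2 * win, n_atoms))
D /= np.linalg.norm(D, axis=0, keepdims=True)

lam, n_outer, n_ista = 0.1, 30, 50

def ista(D, X, lam, steps):
    """Sparse codes A minimising 0.5 * ||X - D A||^2 + lam * ||A||_1."""
    L = np.linalg.norm(D, 2) ** 2                 # Lipschitz constant of the gradient
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(steps):
        A -= D.T @ (D @ A - X) / L                # gradient step on the codes
        A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)  # soft threshold
    return A

for _ in range(n_outer):
    A = ista(D, X, lam, n_ista)
    D += 0.1 * (X - D @ A) @ A.T / n_samples      # gradient step on the dictionary
    D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12
```

    On real binaural recordings the learned atoms, rather than fits to random noise, would expose the interaural redundancies that the thesis argues shape early spatial-hearing representations.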

    BEYOND THE RECEPTIVE FIELD: AN ANALYSIS OF NATURAL SCENES AND A GEOMETRIC INTERPRETATION OF EFFICIENT CODING STRATEGIES BY THE MAMMALIAN VISUAL SYSTEM

    In biological and artificial neural networks the response properties of a visual neuron are often described in terms of a two-dimensional response map called the receptive field. The receptive field is intended to capture the basic behavior of a neuron and predict how that neuron will respond to a novel stimulus. However, the receptive field provides a good description of the neuron's behavior only if the neurons in the network are linear. Neurons in an organism are in fact highly nonlinear, which means their responses are not completely described by their receptive fields. A number of studies have attempted to explain the properties of these neurons in terms of an efficient representation of natural scenes. In this thesis I demonstrate the hidden computations and interactions a network of neurons performs that are not described by its receptive fields. In the first study (Chapter 2), I address an aspect of natural scenes that is rarely considered in discussions of efficient coding: how the structural properties of an edge relate to the cause of the edge. I show that neurons at the earliest stages of the visual system, rather than just detecting edges (as depicted by their receptive fields), could potentially use these structural properties to identify the causes of an edge. In the next three studies (Chapters 3, 4, and 5), I explore the nonlinear responses of neurons. Most neurons in the visual pathway are nonlinear, and accounting for their behavior requires an approach that goes beyond the classic receptive field. A variety of approaches have attempted to explain this behavior. I present a geometric framework that attempts to provide a better description of the nonlinear response properties of neurons in the sparse coding network. I explore the geometric characterization of neurons under efficient-coding mechanisms such as gain control, a "fan equation" model for optimal sparsity, and a cascaded linear-nonlinear model. This geometric approach provides a deeper understanding of why sparse representations (including those of cortical visual neurons) give rise to nonlinear responses. The nonlinearities in artificial neurons are visualized and quantified in terms of the curvature of iso-response surfaces. I show that the magnitude of the nonlinearities increases as the overcompleteness of the network increases, even though the linear receptive fields appear to remain similar. In the next study (Chapter 6), I explore and define two forms of selectivity based on the curvature of the iso-response surfaces. The first form is "classic selectivity", the stimulus that produces the optimal response from a neuron. The second form is "hyperselectivity", defined by the drop-off in response around the optimal stimulus due to the curvature of the iso-response surfaces. I show that hyperselectivity is unrelated to classic selectivity; for example, it is possible for a neuron to be narrowly tuned (hyperselective) to a broadband stimulus. Further, I show that hyperselectivity in a neuron's response profile breaks the Gabor-Heisenberg limit. Finally (Chapter 7), I show the effect of different learning rules, enforced by the various cost functions used in the sparse coding network, on the response geometry of neurons. I demonstrate how different learning rules affect the interactions between neurons in three-dimensional networks, and what these findings imply for a better representation of natural scene data in higher-dimensional image state spaces.
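    To make the link between iso-response curvature and selectivity concrete, the toy comparison below probes a purely linear unit against a divisively normalised unit sharing the same preferred direction; the normalised unit's curved iso-response surfaces produce a sharper drop-off away from the optimal stimulus (hyperselectivity). This is a generic gain-control toy, not the sparse coding network analysed in the thesis, and all parameter values are illustrative.

```python
import numpy as np

# Two model units with the same "receptive field" w: one linear, one with
# divisive normalisation, which bends the iso-response surfaces.
w = np.array([1.0, 0.0])
sigma2 = 0.5

def linear(x):
    return x @ w

def normalised(x):
    drive = np.clip(x @ w, 0.0, None)
    return drive ** 2 / (sigma2 + np.sum(x ** 2, axis=-1))

# Probe with unit-norm stimuli rotated away from the preferred direction;
# a faster fall-off at matched peak response indicates hyperselectivity.
angles = np.deg2rad(np.linspace(0.0, 90.0, 10))
stims = np.stack([np.cos(angles), np.sin(angles)], axis=1)

lin = linear(stims) / linear(stims).max()
nrm = normalised(stims) / normalised(stims).max()
for a, l, n in zip(np.rad2deg(angles), lin, nrm):
    print(f"{a:5.1f} deg   linear {l:.3f}   normalised {n:.3f}")
```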

    Disease Ecology and Adaptive Management of Brucellosis in Greater Yellowstone Elk

    Brucellosis is a bacterial infection that primarily affects livestock and can also be transmitted to humans. In the Greater Yellowstone Ecosystem (GYE), elk (Cervus canadensis) and bison (Bison bison) are habitual carriers of Brucella abortus, which arrived in the region with cattle over a century ago. The disease was eliminated from cattle in the United States through widespread control efforts, but it is now periodically transmitted back to cattle on open rangelands, where cattle can come into contact with fetal tissues and fluids from disease-induced abortions that occur among elk during the late winter and spring. In Wyoming, south of Yellowstone National Park, there are 23 supplemental feedgrounds that operate annually and feed the majority of the region's elk during a portion of the winter. The feedgrounds are controversial because of their association with brucellosis and may be shuttered in the future, in part due to the arrival of chronic wasting disease. Using data collected at these feedgrounds, this study investigates the role of winter feedgrounds in the ecology of this host-pathogen relationship: it evaluates the full reproductive costs of the disease to affected elk, examines how herd demography influences pathogen transmission, and assesses management strategies aimed at reducing pathogen spread among elk. Using blood tests for pregnancy status and brucellosis exposure in female elk, I demonstrated a previously undocumented fertility cost associated with the pathogen that is not due to abortions but that nearly doubles the estimated fertility cost to affected individuals. I also built mechanistic transmission models using time-series disease and count data from the feedgrounds. Within that framework, I assessed various management actions, including test-and-slaughter of test-positive elk, which I found to be counterproductive due to rapid recovery times and the protective effects of herd immunity. The overall picture that emerges of winter feedgrounds is one of imperfect practicality driven by social and political considerations, not pathogen control. These results illustrate the underappreciated importance of recruitment and population turnover for the transmission dynamics of brucellosis in elk, a pathogen which itself flourishes in the reproductive tracts of individual animals and thus impacts vital rates at the population level. Together, these analyses contribute to the field of disease ecology using a unique long-term disease data set from free-ranging wild ungulates.
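    As a toy illustration of the kind of mechanistic transmission model described above, the sketch below runs a discrete-time susceptible-infectious-recovered model with annual recruitment into the susceptible class and frequency-dependent transmission. The parameter values are invented for illustration and are not those estimated in the study.

```python
# A deliberately simple S-I-R sketch of brucellosis-like dynamics on a
# feedground; recruitment of seronegative calves dilutes seroprevalence,
# while recovered animals remain seropositive but immune.
years = 30
beta, recovery, recruit, mortality = 1.2, 0.5, 0.25, 0.2

S, I, R = 400.0, 20.0, 80.0
for t in range(years):
    N = S + I + R
    new_inf = min(beta * S * I / N, S)   # frequency-dependent transmission
    recov = recovery * I
    births = recruit * N                 # recruitment enters the susceptible class
    S, I, R = (S + births - new_inf - mortality * S,
               I + new_inf - recov - mortality * I,
               R + recov - mortality * R)
    print(f"year {t + 1:2d}  seroprevalence {(I + R) / (S + I + R):.2f}")
```

    Varying the recruitment and mortality rates in this sketch changes both herd immunity and equilibrium seroprevalence, which is the kind of demographic effect the study quantifies with models fit to feedground time-series data.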

    Big Data Analytics and Information Science for Business and Biomedical Applications

    The analysis of Big Data in biomedical research, as well as in business and financial research, has drawn much attention from researchers worldwide. This book provides a platform for in-depth discussion of state-of-the-art statistical methods developed for the analysis of Big Data in these areas. Both applied and theoretical contributions are showcased.

    Bayesian inference in neural circuits and synapses

    Bayesian inference describes how to reason optimally under uncertainty. As the brain faces considerable uncertainty, it may be possible to understand aspects of neural computation using Bayesian inference. In this thesis, I address several questions within this broad theme. First, I show that confidence reports may, in some circumstances, be Bayes-optimal, by taking a "doubly Bayesian" strategy: computing the Bayesian model evidence for several different models of participants' behaviour, one of which is itself Bayesian. Second, I address a related question concerning features of the probability distributions realised by neural activity. In particular, it has been shown that neural activity obeys Zipf's law, as do many other statistical distributions. We show that the emergence of Zipf's law is in fact unsurprising, as it follows from the existence of an underlying latent variable: firing rate. Third, I show that synaptic plasticity can be formulated as a Bayesian inference problem, and I give neural evidence in support of this proposition, based on the hypothesis that neurons sample from the resulting posterior distributions. Fourth, I consider how oscillatory excitatory-inhibitory circuits might perform inference by relating these circuits to a highly effective method for probabilistic inference: Hamiltonian Monte Carlo.
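    Since the final point hinges on Hamiltonian Monte Carlo, a minimal HMC sampler on a correlated two-dimensional Gaussian is sketched below purely to make the reference concrete; it is not the oscillatory excitatory-inhibitory circuit model from the thesis, and the target distribution and step sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: zero-mean Gaussian with strong correlation between the two dimensions.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
prec = np.linalg.inv(cov)

def neg_log_p(x):
    return 0.5 * x @ prec @ x

def grad(x):
    return prec @ x

def hmc_step(x, step=0.2, n_leap=20):
    p = rng.standard_normal(x.shape)             # resample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new -= 0.5 * step * grad(x_new)            # half step on momentum
    for _ in range(n_leap - 1):                  # leapfrog integration
        x_new += step * p_new
        p_new -= step * grad(x_new)
    x_new += step * p_new
    p_new -= 0.5 * step * grad(x_new)
    h_old = neg_log_p(x) + 0.5 * p @ p
    h_new = neg_log_p(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.uniform()) < h_old - h_new else x  # Metropolis accept

x, samples = np.zeros(2), []
for _ in range(2000):
    x = hmc_step(x)
    samples.append(x)
print(np.cov(np.array(samples).T))               # should roughly recover cov
```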

    Logic and lineage impacts on functional transcription factor deployment for T-cell fate commitment

    Transcription factors are the major agents that read the regulatory sequence information in the genome to initiate changes in the expression of specific genes, both in development and in physiological activation responses. Their actions depend on site-specific DNA binding and are largely guided by their individual DNA target sequence specificities. However, their action is far more conditional in a real developmental context than would be expected from simple reading of local genomic DNA sequence, which is common to all cells in the organism. They are constrained by slow-changing chromatin states and by interactions with other transcription factors, which affect their occupancy patterns at potential sites across the genome. These mechanisms lead to emergent discontinuities in function even for transcription factors with minimally changing expression. This is well revealed by the diverse lineages of blood cells developing throughout life from hematopoietic stem cells, which use overlapping combinations of transcription factors to drive strongly divergent gene regulation programs. Here, using the development of T lymphocytes from hematopoietic multipotent progenitor cells as a focus, recent evidence is reviewed on how binding specificity and dynamics, transcription factor cooperativity, and chromatin state changes impact the effective regulatory functions of key transcription factors including PU.1, Runx1, Notch/RBPJ, and Bcl11b.

    Conditional covariance estimation for dimension reduction and sensitivity analysis

    This thesis focuses on the estimation of conditional covariance matrices and their applications, in particular to dimension reduction and sensitivity analysis. In Chapter 2, we consider a high-dimensional nonlinear regression model for which we want to use the sliced inverse regression methodology. Using a functional operator depending on the joint density, we apply a Taylor expansion around a preliminary estimator of that density. We prove two things: our estimator is asymptotically normal with a variance that depends only on the linear part, and this variance is efficient from the Cramér-Rao point of view. In Chapter 3, we study the estimation of conditional covariance matrices, first coordinate-wise; these parameters depend on the unknown joint density, which we replace by a kernel estimator. We prove that the mean squared error of the nonparametric estimator converges at a parametric rate if the joint distribution belongs to a class of smooth functions, and otherwise at a slower rate that depends on the regularity of the joint density. For the estimator of the full matrix, we apply a "banding" regularization. Finally, in Chapter 4, we use our results to estimate Sobol indices, which are used in sensitivity analysis to measure the influence of the inputs on the output of complex models. The advantage of our approach is that the Sobol indices can be estimated without resorting to computationally expensive Monte Carlo methods. Illustrations presented in the chapter demonstrate the capabilities of our estimator.
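    As a rough illustration of how a conditional-expectation estimate can give Sobol indices from a single sample (without pick-freeze Monte Carlo designs), the sketch below uses a Nadaraya-Watson kernel smoother for E[Y | X_i] on an Ishigami-type test function. It is only in the spirit of the chapter; the thesis's estimator is built on conditional covariance matrices with its own bandwidth and regularization choices, and the test function, sample size, and bandwidth here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# Ishigami-type test function: strong dependence on X1 and X2, weak on X3.
n = 2000
X = rng.uniform(-np.pi, np.pi, size=(n, 3))
Y = np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2 + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])

def first_order_sobol(xi, y, bandwidth=0.3):
    """Estimate Var(E[Y | X_i]) / Var(Y) with a Nadaraya-Watson smoother."""
    d = (xi[:, None] - xi[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)                    # Gaussian kernel weights
    cond_mean = (w @ y) / w.sum(axis=1)          # smoothed E[Y | X_i = x_ij]
    return cond_mean.var() / y.var()

for i in range(3):
    print(f"S_{i + 1} ~= {first_order_sobol(X[:, i], Y):.2f}")
```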

    Radar Technology

    In this book, “Radar Technology”, the chapters are divided into four main topic areas. Topic area 1, “Radar Systems”, consists of chapters that treat whole radar systems, the environment, and the target functional chain. Topic area 2, “Radar Applications”, presents various applications of radar systems, including meteorological radars, ground-penetrating radars, and glaciology. Topic area 3, “Radar Functional Chain and Signal Processing”, describes several aspects of radar signal processing, from parameter extraction and target detection to tracking and classification technologies. Topic area 4, “Radar Subsystems and Components”, covers the design of radar subsystem components, such as antennas and waveforms.