12,758 research outputs found

    Computational aspects of DNA mixture analysis

    Statistical analysis of DNA mixtures is known to pose computational challenges due to the enormous state space of possible DNA profiles. We propose a Bayesian network representation for genotypes, allowing computations to be performed locally involving only a few alleles at each step. In addition, we describe a general method for computing the expectation of a product of discrete random variables using auxiliary variables and probability propagation in a Bayesian network, which in combination with the genotype network allows efficient computation of the likelihood function and various other quantities relevant to the inference. Lastly, we introduce a set of diagnostic tools for assessing the adequacy of the model for describing a particular dataset.
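    One ingredient of the approach is that quantities such as the mixture likelihood can be evaluated by local computations involving only a few alleles at a time, rather than by enumerating the full genotype space. Below is a minimal illustrative sketch in Python, not the authors' implementation: it assumes a single marker with two alleles, Hardy-Weinberg genotype probabilities, and no dropout or drop-in, and it propagates the allele-count distribution one contributor at a time to obtain the likelihood that both alleles appear in the mixture.

        # Illustrative sketch only: local propagation of allele counts for one
        # marker with alleles A and B and k unknown contributors.
        # Assumptions (not from the paper's model): Hardy-Weinberg genotype
        # probabilities, no dropout, no drop-in.
        from collections import defaultdict

        def count_dist_one_contributor(p_a):
            """Distribution of the number of copies of allele A carried by one contributor."""
            q = 1.0 - p_a
            return {0: q * q, 1: 2 * p_a * q, 2: p_a * p_a}

        def convolve(d1, d2):
            """Distribution of the sum of two independent count variables."""
            out = defaultdict(float)
            for n1, pr1 in d1.items():
                for n2, pr2 in d2.items():
                    out[n1 + n2] += pr1 * pr2
            return dict(out)

        def mixture_likelihood(p_a, k):
            """P(a k-person mixture shows both alleles A and B)."""
            total = {0: 1.0}
            single = count_dist_one_contributor(p_a)
            for _ in range(k):                 # each step only involves allele-A counts
                total = convolve(total, single)
            # both alleles present  <=>  1 <= (#copies of A) <= 2k - 1
            return sum(pr for n, pr in total.items() if 1 <= n <= 2 * k - 1)

        print(mixture_likelihood(p_a=0.6, k=2))   # approx. 0.8448

    The same convolution-style propagation extends to more alleles and more contributors without ever forming the joint genotype table, which is the source of the computational savings described above.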

    Computers from plants we never made. Speculations

    We discuss possible designs and prototypes of computing systems that could be based on the morphological development of roots, the interaction of roots, and analog electrical computation with plants and plant-derived electronic components. In morphological plant processors, data are represented by the initial configuration of roots and the configuration of sources of attractants and repellents; results of computation are represented by the topology of the root network. Computation is implemented by the roots following gradients of attractants and repellents, as well as by the roots interacting with each other. Problems solvable by plant roots, in principle, include the shortest path, minimum spanning tree, Voronoi diagram, α-shapes, and convex subdivision of concave polygons. The electrical properties of plants can be modified by loading the plants with functional nanoparticles or by coating parts of the plants with conductive polymers. Thus, we are in a position to make living variable resistors, capacitors, operational amplifiers, multipliers, potentiometers and fixed-function generators. The electrically modified plants can implement summation, integration with respect to time, inversion, multiplication, exponentiation, logarithm, and division. Mathematical and engineering problems to be solved can be represented in plant root networks of resistive or reaction elements. Developments in plant-based computing architectures will trigger the emergence of a unique community of biologists, electronic engineers and computer scientists working together to produce the living electronic devices that future green computers will be made of.
    Comment: The chapter will be published in "Inspired by Nature. Computing inspired by physics, chemistry and biology. Essays presented to Julian Miller on the occasion of his 60th birthday", Editors: Susan Stepney and Andrew Adamatzky (Springer, 2017).
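    As a toy illustration of the analog-summation idea, an inverting summing amplifier computes a weighted sum of its input voltages, Vout = -Rf * (V1/R1 + V2/R2 + ...). The Python snippet below is purely hypothetical: the resistor values are placeholders, not measurements from nanoparticle-loaded or polymer-coated plants.

        # Hypothetical illustration of analog summation with an inverting
        # summing amplifier: Vout = -Rf * sum_i(Vi / Ri).
        # Treating modified plant tissue as the input resistors R1, R2 is an
        # assumption made only for the sake of the example.
        def summing_amplifier(v_in, r_in, r_feedback):
            return -r_feedback * sum(v / r for v, r in zip(v_in, r_in))

        # two inputs weighted Rf/R1 = 1.0 and Rf/R2 = 0.5
        print(summing_amplifier(v_in=[0.2, 0.4], r_in=[10e3, 20e3], r_feedback=10e3))
        # -> -(0.2 * 1.0 + 0.4 * 0.5) = -0.4 volts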

    Gaussian Approximation of Collective Graphical Models

    The Collective Graphical Model (CGM) models a population of independent and identically distributed individuals when only collective statistics (i.e., counts of individuals) are observed. Exact inference in CGMs is intractable, and previous work has explored Markov Chain Monte Carlo (MCMC) and MAP approximations for learning and inference. This paper studies Gaussian approximations to the CGM. As the population grows large, we show that the CGM distribution converges to a multivariate Gaussian distribution (GCGM) that maintains the conditional independence properties of the original CGM. If the observations are exact marginals of the CGM, or marginals corrupted by Gaussian noise, inference in the GCGM approximation can be computed efficiently in closed form. If the observations follow a different noise model (e.g., Poisson), then expectation propagation provides efficient and accurate approximate inference. The accuracy and speed of GCGM inference are compared to the MCMC and MAP methods on a simulated bird migration problem. The GCGM matches or exceeds the accuracy of the MAP method while being significantly faster.
    Comment: Accepted by ICML 2014; 10-page version with appendix.
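    When observed marginal counts carry Gaussian noise, GCGM inference reduces to conditioning a multivariate Gaussian on a noisy linear observation. The NumPy sketch below shows that closed-form conditioning step for a single multinomial variable; it is illustrative only, with made-up numbers, and is not the paper's full message-passing algorithm.

        # Minimal sketch: Gaussian approximation of multinomial counts,
        # conditioned on a noisy observation of one count (closed form).
        import numpy as np

        N = 1000                                    # population size
        p = np.array([0.5, 0.3, 0.2])               # cell probabilities
        mu = N * p                                  # multinomial mean
        Sigma = N * (np.diag(p) - np.outer(p, p))   # multinomial covariance

        A = np.array([[1.0, 0.0, 0.0]])             # observe the first count...
        sigma_obs = 5.0                             # ...with Gaussian noise
        y = np.array([520.0])                       # noisy observed count

        S = A @ Sigma @ A.T + sigma_obs ** 2 * np.eye(len(y))
        K = Sigma @ A.T @ np.linalg.inv(S)          # gain matrix
        mu_post = mu + K @ (y - A @ mu)             # posterior mean of all counts
        Sigma_post = Sigma - K @ A @ Sigma          # posterior covariance

        print(mu_post)

    Conditioning on exact marginals corresponds to the zero-noise limit of the same formula, which is why both observation models admit closed-form inference in the Gaussian approximation.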

    Exploiting sparsity and sharing in probabilistic sensor data models

    Probabilistic sensor models defined as dynamic Bayesian networks can possess an inherent sparsity that is not reflected in the structure of the network. Classical inference algorithms such as variable elimination and junction tree propagation cannot exploit this sparsity, nor do they exploit the opportunities for sharing calculations among different time slices of the model. We show that, using a relational representation, inference expressions for these sensor models can be rewritten to make efficient use of sparsity and sharing.
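    The kind of saving involved can be sketched with a toy example in Python (this is not the paper's relational representation, just an illustration of the two ideas): storing only the nonzero entries of a sensor model's conditional probability table lets a variable-elimination step skip the zero cells, and a factor that is identical in every time slice can be computed once and reused.

        # Illustrative sketch only. A sparse CPT P(z | x) is stored as a dict of
        # its nonzero entries, so summing out z only touches nonzero cells; the
        # resulting factor is computed once and shared across time slices when
        # the sensor model and evidence pattern do not change over time.
        from collections import defaultdict

        # nonzero entries of P(sensor reading z | state x), keyed by (z, x)
        sparse_cpt = {(0, 0): 0.9, (1, 0): 0.1, (1, 1): 0.8, (2, 1): 0.2}

        def eliminate_z(cpt, weight):
            """Compute m(x) = sum_z P(z | x) * weight(z), visiting only nonzero cells."""
            message = defaultdict(float)
            for (z, x), prob in cpt.items():
                message[x] += prob * weight.get(z, 0.0)
            return dict(message)

        # computed once, then reused in every time slice of the DBN
        shared_message = eliminate_z(sparse_cpt, weight={1: 1.0})   # evidence: z = 1
        print(shared_message)   # {0: 0.1, 1: 0.8}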