41,379 research outputs found

    Robust phase retrieval with the swept approximate message passing (prSAMP) algorithm

    In phase retrieval, the goal is to recover a complex signal from the magnitude of its linear measurements. While many well-known algorithms guarantee deterministic recovery of the unknown signal using i.i.d. random measurement matrices, they suffer serious convergence issues with some ill-conditioned matrices. As an example, this happens in optical imagers using binary intensity-only spatial light modulators to shape the input wavefront. The problem of ill-conditioned measurement matrices has also been a topic of interest for compressed sensing researchers during the past decade. In this paper, using recent advances in generic compressed sensing, we propose a new phase retrieval algorithm that adapts well to both Gaussian i.i.d. and binary matrices, and to both sparse and dense input signals. This algorithm is also robust to the strong noise levels found in some imaging applications.
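    The abstract does not spell out prSAMP itself; as a point of reference only, the sketch below illustrates the magnitude-only measurement model y = |Ax| with a plain alternating-projections (Gerchberg-Saxton style) baseline on a Gaussian i.i.d. matrix. The problem sizes, the random initialization, and the baseline algorithm are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of the phase retrieval setup: recover x from y = |A x|.
# This is a simple alternating-projections (Gerchberg-Saxton style) baseline,
# NOT the prSAMP algorithm from the paper; sizes and constants are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 256                                   # signal length, number of measurements
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.abs(A @ x_true)                           # magnitude-only measurements

A_pinv = np.linalg.pinv(A)
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # random initialization
for _ in range(500):
    z = A @ x
    z = y * np.exp(1j * np.angle(z))             # enforce the measured magnitudes
    x = A_pinv @ z                               # least-squares update of x

# Recovery is only determined up to a global phase, so align before measuring error.
phase = np.vdot(x, x_true) / abs(np.vdot(x, x_true))
print("relative error:", np.linalg.norm(x * phase - x_true) / np.linalg.norm(x_true))
```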

    The Extended Parameter Filter

    The parameters of temporal models, such as dynamic Bayesian networks, may be modelled in a Bayesian context as static or atemporal variables that influence transition probabilities at every time step. Particle filters fail for models that include such variables, while methods that use Gibbs sampling of parameter variables may incur a per-sample cost that grows linearly with the length of the observation sequence. Storvik devised a method for incremental computation of exact sufficient statistics that, for some cases, reduces the per-sample cost to a constant. In this paper, we demonstrate a connection between Storvik's filter and a Kalman filter in parameter space and establish more general conditions under which Storvik's filter works. Drawing on an analogy to the extended Kalman filter, we develop and analyze, both theoretically and experimentally, a Taylor approximation to the parameter posterior that allows Storvik's method to be applied to a broader class of models. Our experiments on both synthetic examples and real applications show improvement over existing methods.
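    As a rough illustration of the incremental-sufficient-statistics idea attributed to Storvik, the sketch below runs a particle filter for a toy random-walk model x_t = x_{t-1} + theta + noise with a conjugate Gaussian prior on the static parameter theta; each particle carries only a transition count and a running sum of increments, so the per-step cost does not grow with the sequence length. The model, prior, and constants are illustrative assumptions, not the paper's experiments.

```python
# A minimal sketch of a Storvik-style particle filter for a static parameter
# theta, assuming the toy model x_t = x_{t-1} + theta + N(0, q), y_t = x_t + N(0, r).
# Each particle carries the sufficient statistics (n, S) of theta's conjugate
# Gaussian posterior, so the per-step cost stays constant in t. All constants
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
q, r = 0.1, 0.5                   # transition and observation noise variances
m0, C0 = 0.0, 1.0                 # Gaussian prior on theta: N(m0, C0)
theta_true, T, N = 0.7, 100, 500  # true drift, time steps, number of particles

# Simulate data from the toy model (x_0 = 0).
x = np.cumsum(theta_true + np.sqrt(q) * rng.standard_normal(T))
y = x + np.sqrt(r) * rng.standard_normal(T)

particles = np.zeros(N)           # current state of each particle
n_stat = np.zeros(N)              # sufficient statistic: number of transitions
S_stat = np.zeros(N)              # sufficient statistic: sum of state increments

for t in range(T):
    # Sample theta for each particle from its conditional (conjugate) posterior.
    prec = 1.0 / C0 + n_stat / q
    mean = (m0 / C0 + S_stat / q) / prec
    theta = mean + rng.standard_normal(N) / np.sqrt(prec)

    # Propagate states and weight particles by the new observation.
    new = particles + theta + np.sqrt(q) * rng.standard_normal(N)
    logw = -0.5 * (y[t] - new) ** 2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()

    # Resample and update sufficient statistics in O(1) per particle.
    idx = rng.choice(N, size=N, p=w)
    S_stat = S_stat[idx] + (new[idx] - particles[idx])
    n_stat = n_stat[idx] + 1
    particles = new[idx]

post_mean = np.mean((m0 / C0 + S_stat / q) / (1.0 / C0 + n_stat / q))
print("approximate posterior mean of theta:", post_mean)
```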

    Structural Drift: The Population Dynamics of Sequential Learning

    We introduce a theory of sequential causal inference in which learners in a chain estimate a structural model from their upstream teacher and then pass samples from the model to their downstream student. It extends the population dynamics of genetic drift, recasting Kimura's selectively neutral theory as a special case of a generalized drift process using structured populations with memory. We examine the diffusion and fixation properties of several drift processes and propose applications to learning, inference, and evolution. We also demonstrate how the organization of drift process space controls fidelity, facilitates innovations, and leads to information loss in sequential learning with and without memory. Comment: 15 pages, 9 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/sdrift.ht
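    A toy version of the drift picture described above: each learner in the chain fits a single Bernoulli parameter to N samples from its teacher and then passes N fresh samples downstream. With no memory or structure this reduces to neutral Wright-Fisher drift and eventually fixes at 0 or 1; the structured, memoryful populations analyzed in the paper are not modeled here, and all parameters are illustrative assumptions.

```python
# Toy illustration of drift in sequential learning: each "learner" estimates a
# Bernoulli parameter from N samples drawn by its teacher, then generates N
# samples for its student. This is neutral Wright-Fisher drift, not the paper's
# structured-population model; parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
N, generations = 50, 200    # samples passed per link, length of the learner chain
p = 0.5                     # initial probability of the "1" symbol

trajectory = [p]
for _ in range(generations):
    samples = rng.random(N) < p        # teacher draws N samples from its model
    p = samples.mean()                 # student's maximum-likelihood estimate
    trajectory.append(p)
    if p in (0.0, 1.0):                # fixation: information about p is lost
        break

print(f"fixed at p={p} after {len(trajectory) - 1} generations")
```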

    Inference via low-dimensional couplings

    We investigate the low-dimensional structure of deterministic transformations between random variables, i.e., transport maps between probability measures. In the context of statistics and machine learning, these transformations can be used to couple a tractable "reference" measure (e.g., a standard Gaussian) with a target measure of interest. Direct simulation from the desired measure can then be achieved by pushing forward reference samples through the map. Yet characterizing such a map---e.g., representing and evaluating it---grows challenging in high dimensions. The central contribution of this paper is to establish a link between the Markov properties of the target measure and the existence of low-dimensional couplings, induced by transport maps that are sparse and/or decomposable. Our analysis not only facilitates the construction of transformations in high-dimensional settings, but also suggests new inference methodologies for continuous non-Gaussian graphical models. For instance, in the context of nonlinear state-space models, we describe new variational algorithms for filtering, smoothing, and sequential parameter inference. These algorithms can be understood as the natural generalization---to the non-Gaussian case---of the square-root Rauch-Tung-Striebel Gaussian smoother. Comment: 78 pages, 25 figures
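    To make the push-forward idea concrete, the sketch below uses a hand-written lower-triangular (Knothe-Rosenblatt style) map sending a 2-D standard Gaussian reference to a banana-shaped target; each map component depends on only a few reference variables, which is the kind of sparse coupling structure the paper characterizes. The specific target and map are assumptions for illustration, not the paper's construction.

```python
# Minimal sketch of sampling by push-forward through a transport map: a
# lower-triangular map T sends a standard Gaussian reference to a "banana"
# target. The target and the map are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)

def T(z):
    """Push reference samples z ~ N(0, I_2) to the target measure."""
    x1 = z[:, 0]
    x2 = 0.5 * z[:, 1] + (x1 ** 2 - 1.0)    # curvature couples x2 to x1 only
    return np.column_stack([x1, x2])

z = rng.standard_normal((10_000, 2))        # draw from the tractable reference
x = T(z)                                    # push forward through the map

print("target sample mean:", x.mean(axis=0))
print("target sample covariance:\n", np.cov(x.T))
```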