    Active Self-Assembly of Algorithmic Shapes and Patterns in Polylogarithmic Time

    We describe a computational model for studying the complexity of self-assembled structures with active molecular components. Our model captures notions of growth and movement ubiquitous in biological systems. The model is inspired by biology's fantastic ability to assemble biomolecules that form systems with complicated structure and dynamics, from molecular motors that walk on rigid tracks and proteins that dynamically alter the structure of the cell during mitosis, to embryonic development where large-scale complicated organisms efficiently grow from a single cell. Using this active self-assembly model, we show how to efficiently self-assemble shapes and patterns from simple monomers. For example, we show how to grow a line of monomers in time and number of monomer states that is merely logarithmic in the length of the line. Our main results show how to grow arbitrary connected two-dimensional geometric shapes and patterns in expected time that is polylogarithmic in the size of the shape, plus roughly the time required to run a Turing machine deciding whether or not a given pixel is in the shape. We do this while keeping the number of monomer types logarithmic in shape size, plus those monomers required by the Kolmogorov complexity of the shape or pattern. This work thus highlights the efficiency advantages of active self-assembly over passive self-assembly and motivates experimental effort to construct general-purpose active molecular self-assembly systems.
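    The logarithmic bound for line growth can be read as a doubling argument: if every active monomer concurrently inserts one new monomer beside itself each round, the line doubles in length per round, so length n is reached in about log2(n) rounds while each monomer only needs a countdown of about log2(n) states. The Python sketch below is purely an abstraction of that counting argument (the state countdown and the doubling), not the paper's actual monomer rule set or geometry.

```python
# Hypothetical abstraction of logarithmic-time line growth by parallel insertion.
# Each monomer carries a countdown state; while its state is positive, it inserts
# one copy of itself (with the state decremented) beside itself every round, so
# the line roughly doubles per round.
import math

def grow_line(target_len):
    line = [target_len.bit_length()]   # seed monomer holds a ~log2(n) countdown
    rounds = 0
    while len(line) < target_len:
        new_line = []
        for state in line:
            new_line.append(max(state - 1, 0))
            if state > 0:              # active monomer inserts a neighbor
                new_line.append(state - 1)
        line = new_line
        rounds += 1
    return rounds, len(line)

rounds, length = grow_line(1024)
print(rounds, length, math.log2(1024))  # about 10 rounds to reach length 1024
```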

    Automatic Differentiation Variational Inference

    Probabilistic modeling is iterative. A scientist posits a simple model, fits it to her data, refines it according to her analysis, and repeats. However, fitting complex models to large data is a bottleneck in this process. Deriving algorithms for new models can be both mathematically and computationally challenging, which makes it difficult to efficiently cycle through the steps. To this end, we develop automatic differentiation variational inference (ADVI). Using our method, the scientist only provides a probabilistic model and a dataset, nothing else. ADVI automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models. ADVI supports a broad class of models; no conjugacy assumptions are required. We study ADVI across ten different models and apply it to a dataset with millions of observations. ADVI is integrated into Stan, a probabilistic programming system; it is available for immediate use.
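    As a rough illustration of the recipe behind ADVI (a Gaussian variational family over unconstrained parameters, reparameterized Monte Carlo estimates of the ELBO gradient, and stochastic optimization by automatic differentiation), here is a minimal mean-field sketch for a toy model of normal data with unknown mean and log-scale. It assumes PyTorch is available and is not the Stan implementation; ADVI's constraint-transformation step is skipped because the toy parameters are already unconstrained.

```python
# Minimal ADVI-style mean-field variational inference sketch (assumes PyTorch).
# Toy model: y ~ Normal(mu, sigma) with broad Normal(0, 10) priors on mu and
# log(sigma); the variational family is a diagonal Gaussian over (mu, log sigma).
import torch

torch.manual_seed(0)
data = torch.randn(100) * 2.0 + 3.0             # synthetic observations

q_mean = torch.zeros(2, requires_grad=True)     # variational means
q_log_std = torch.zeros(2, requires_grad=True)  # variational log std devs
opt = torch.optim.Adam([q_mean, q_log_std], lr=0.05)

def log_joint(mu, log_sigma):
    sigma = log_sigma.exp()
    log_prior = -0.5 * (mu / 10.0) ** 2 - 0.5 * (log_sigma / 10.0) ** 2
    log_lik = (-0.5 * ((data - mu) / sigma) ** 2 - log_sigma).sum()
    return log_prior + log_lik

for step in range(2000):
    opt.zero_grad()
    eps = torch.randn(2)                        # reparameterization trick
    z = q_mean + q_log_std.exp() * eps          # draw (mu, log sigma) from q
    elbo = log_joint(z[0], z[1]) + q_log_std.sum()  # + Gaussian entropy (up to const.)
    (-elbo).backward()                          # autodiff supplies the ELBO gradient
    opt.step()

print("variational posterior mean of (mu, log sigma):", q_mean.detach())
```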

    Output-Feedback Control of Nonlinear Systems using Control Contraction Metrics and Convex Optimization

    Control contraction metrics (CCMs) are a new approach to nonlinear control design based on contraction theory. The resulting design problems are expressed as pointwise linear matrix inequalities and are well-suited to solution via convex optimization. In this paper, we extend the theory on CCMs by showing that a pair of "dual" observer and controller problems can be solved using pointwise linear matrix inequalities, and that when a solution exists a separation principle holds. That is, a stabilizing output-feedback controller can be found. The procedure is demonstrated using a benchmark problem of nonlinear control: the Moore-Greitzer jet engine compressor model.
    Comment: Conference submission
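    To make the "pointwise linear matrix inequality" phrasing concrete, the sketch below searches for a constant dual metric W satisfying a sampled version of a CCM-style stabilizability condition, B_perp^T (A(x) W + W A(x)^T + 2*lambda*W) B_perp < 0, using cvxpy. The constant (rather than state-dependent) metric, the toy two-state dynamics loosely modeled on a simplified Moore-Greitzer surge model, and the finite sampling grid are simplifying assumptions made here for illustration, not the paper's construction.

```python
# Minimal sketch: a sampled pointwise-LMI feasibility problem for a constant
# contraction metric (assumes cvxpy with an SDP-capable solver such as SCS).
import numpy as np
import cvxpy as cp

lam = 0.5                                # desired contraction rate
B = np.array([[0.0], [1.0]])             # control enters the second state only
B_perp = np.array([[1.0], [0.0]])        # annihilator of B^T

def jacobian(x1):
    # df/dx of the toy dynamics dx1 = -x2 - 1.5*x1**2 - 0.5*x1**3, dx2 = u
    return np.array([[-3.0 * x1 - 1.5 * x1 ** 2, -1.0],
                     [0.0, 0.0]])

W = cp.Variable((2, 2), symmetric=True)  # dual metric (inverse of the CCM)
constraints = [W >> 0.1 * np.eye(2)]
for x1 in np.linspace(-1.0, 1.0, 21):    # grid of states stands in for "for all x"
    A = jacobian(x1)
    M = A @ W + W @ A.T + 2 * lam * W
    constraints.append(B_perp.T @ M @ B_perp <= -0.1)

cp.Problem(cp.Minimize(cp.trace(W)), constraints).solve()
print("feasible dual metric W:\n", W.value)
```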