436 research outputs found

    The Parameter-Less Self-Organizing Map algorithm

    The Parameter-Less Self-Organizing Map (PLSOM) is a new neural network algorithm based on the Self-Organizing Map (SOM). It eliminates the need for a learning rate and for annealing schemes for the learning rate and neighbourhood size. We discuss the relative performance of the PLSOM and the SOM and demonstrate some tasks in which the SOM fails but the PLSOM performs satisfactorily. Finally, we discuss some example applications of the PLSOM and present a proof of ordering under certain limited conditions. Comment: 29 pages, 27 figures. Based on publication in IEEE Trans. on Neural Networks.
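The core idea described in the abstract, replacing the annealed learning rate with a step size driven by the map's current fitting error, can be sketched as follows. This is an illustrative reconstruction, not the authors' exact formulation: the scaling variable `rho` (running maximum quantization error), the error ratio `eps`, and the neighbourhood width `sigma_max * eps` are plausible choices in the spirit of the PLSOM, and all parameter names are ours.

```python
import numpy as np

def plsom_step(weights, x, rho, sigma_max=2.0):
    """One PLSOM-style update on a 1-D map of shape (n_nodes, dim).

    The learning step is scaled by eps = r / rho, where r is the distance
    from the input to the winning node and rho is the largest such
    distance seen so far -- no learning-rate or annealing schedule needed.
    """
    dists = np.linalg.norm(weights - x, axis=1)
    winner = int(np.argmin(dists))
    r = dists[winner]
    rho = max(rho, r)                      # running maximum fitting error
    eps = r / rho if rho > 0 else 0.0      # parameter-less "learning rate"
    sigma = sigma_max * eps + 1e-9         # neighbourhood shrinks as fit improves
    idx = np.arange(len(weights))
    h = np.exp(-((idx - winner) ** 2) / (2 * sigma ** 2))
    # convex move of each node toward the input, weighted by neighbourhood
    weights = weights + eps * h[:, None] * (x - weights)
    return weights, rho

rng = np.random.default_rng(0)
w = rng.random((10, 2))        # 10-node map in 2-D input space
rho = 0.0
for _ in range(500):
    w, rho = plsom_step(w, rng.random(2), rho)
```

Because each update is a convex combination of the old weight and the input, the nodes stay inside the data's bounding box as the map unfolds.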

    Universal dynamical properties preclude standard clustering in a large class of biochemical data

    Motivation: Clustering of chemical and biochemical data based on observed features is a central cognitive step in the analysis of chemical substances, in particular in combinatorial chemistry, or of complex biochemical reaction networks. Often, for reasons unknown to the researcher, this step produces disappointing results. Once the sources of the problem are known, improved clustering methods might revitalize the statistical approach of compound and reaction search and analysis. Here, we present a generic mechanism that may be at the origin of many clustering difficulties. Results: The variety of dynamical behaviors that complex biochemical reactions can exhibit under variation of the system parameters is a fundamental system fingerprint. In parameter space, shrimp-like or swallow-tail structures separate parameter sets that lead to stable periodic dynamical behavior from those leading to irregular behavior. We work out the genericity of this phenomenon and demonstrate novel examples of its occurrence in realistic models of biophysics. Although we elucidate the phenomenon by considering the emergence of periodicity in dependence on system parameters in a low-dimensional parameter space, the conclusions from our simple setting are shown to continue to be valid for features in a higher-dimensional feature space, as long as the feature-generating mechanism is not too extreme and the dimension of this space is not too high compared with the amount of available data. Availability and implementation: For online versions of super-paramagnetic clustering see http://stoop.ini.uzh.ch/research/clustering. Contact: [email protected] Supplementary information: Supplementary data are available at Bioinformatics online.
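The periodic-versus-irregular partition of parameter space that the abstract describes can be illustrated with a toy classifier. This sketch is ours, not the paper's: it uses the Hénon map as a stand-in dynamical system and classifies a parameter point by counting the distinct states an orbit visits after a transient; scanned over an (a, b) grid, the "periodic" regions form the shrimp-like structures referred to above. The thresholds and parameter values are illustrative.

```python
# Classify a parameter point of the Henon map
#   x' = 1 - a*x^2 + y,   y' = b*x
# as "periodic" or "irregular" by counting distinct orbit states.

def behaviour(a, b, n_trans=500, n_obs=200, tol=4):
    x, y = 0.1, 0.1
    for _ in range(n_trans):                 # discard the transient
        x, y = 1.0 - a * x * x + y, b * x
        if abs(x) > 1e6:
            return "diverged"
    seen = set()
    for _ in range(n_obs):                   # count distinct rounded states
        x, y = 1.0 - a * x * x + y, b * x
        seen.add((round(x, tol), round(y, tol)))
    # a low-period cycle revisits few states; chaos keeps producing new ones
    return "periodic" if len(seen) < n_obs // 4 else "irregular"

print(behaviour(1.4, 0.3))   # the classic chaotic Henon parameters
print(behaviour(0.5, 0.3))   # inside a period-2 window
```

A full picture would evaluate `behaviour` on a dense (a, b) grid and color the plane by the result, which is exactly the kind of parameter-space map in which shrimps appear.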

    Memristors for the Curious Outsiders

    We present both an overview and a perspective of recent experimental advances and proposed new approaches to performing computation using memristors. A memristor is a 2-terminal passive component with a dynamic resistance depending on an internal parameter. We provide a brief historical introduction, as well as an overview of the physical mechanisms that lead to memristive behavior. This review is meant to guide nonpractitioners in the field of memristive circuits and their connection to machine learning and neural computation. Comment: Perspective paper for MDPI Technologies; 43 pages.
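The definition given above, a two-terminal element whose resistance depends on an internal state variable, can be made concrete with a minimal simulation. This is a loose sketch in the spirit of the HP TiO2 model, not a model taken from the review: the state `w` interpolates the resistance between `R_on` and `R_off` and drifts with the current through the device, and all parameter values are illustrative.

```python
import numpy as np

def simulate(freq=50.0, dt=1e-5, T=0.04, R_on=100.0, R_off=16e3, k=1e4):
    """Drive a memristor sketch with a sine voltage; return (v, i) arrays."""
    w = 0.1                                   # internal state in [0, 1]
    vs, cs = [], []
    for t in np.arange(0.0, T, dt):
        v = np.sin(2 * np.pi * freq * t)
        R = R_on * w + R_off * (1.0 - w)      # state-dependent resistance
        i = v / R
        w = min(1.0, max(0.0, w + k * i * dt))  # state drifts with current
        vs.append(v)
        cs.append(i)
    return np.array(vs), np.array(cs)

v, i = simulate()
```

Plotting `i` against `v` would show the pinched hysteresis loop characteristic of memristive elements: because `i = v / R`, the current is exactly zero whenever the voltage is, no matter what state the device is in.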

    Generative learning for nonlinear dynamics

    Modern generative machine learning models demonstrate surprising ability to create realistic outputs far beyond their training data, such as photorealistic artwork, accurate protein structures, or conversational text. These successes suggest that generative models learn to effectively parametrize and sample arbitrarily complex distributions. Beginning half a century ago, foundational works in nonlinear dynamics used tools from information theory to infer properties of chaotic attractors from time series, motivating the development of algorithms for parametrizing chaos in real datasets. In this perspective, we aim to connect these classical works to emerging themes in large-scale generative statistical learning. We first consider classical attractor reconstruction, which mirrors constraints on latent representations learned by state space models of time series. We next revisit early efforts to use symbolic approximations to compare minimal discrete generators underlying complex processes, a problem relevant to modern efforts to distill and interpret black-box statistical models. Emerging interdisciplinary works bridge nonlinear dynamics and learning theory, such as operator-theoretic methods for complex fluid flows, or detection of broken detailed balance in biological datasets. We anticipate that future machine learning techniques may revisit other classical concepts from nonlinear dynamics, such as transinformation decay and complexity-entropy tradeoffs. Comment: 23 pages, 4 figures.
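The "classical attractor reconstruction" the abstract refers to is delay-coordinate (Takens) embedding: stacking lagged copies of a single scalar observable to recover a proxy for the full state space. The sketch below applies it to the x-component of the Lorenz system; the Euler integration scheme, step size, and delay are illustrative choices, not values from the paper.

```python
import numpy as np

def lorenz_x(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Euler-integrate the Lorenz system and record only the scalar x(t)."""
    x, y, z = 1.0, 1.0, 1.0
    xs = []
    for _ in range(n):
        dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs.append(x)
    return np.array(xs)

def delay_embed(series, dim=3, tau=10):
    """Stack lagged copies: row t is (s[t], s[t+tau], ..., s[t+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

s = lorenz_x(5000)
emb = delay_embed(s, dim=3, tau=10)   # 3-D reconstruction from x(t) alone
```

Plotted in three dimensions, `emb` traces out a diffeomorphic copy of the butterfly attractor, which is what lets geometric and information-theoretic quantities be estimated from a single measured channel.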

    Data-driven discovery of the mechanism of systems described by partial differential equations

    Turing patterns, a phenomenon introduced by mathematician and computer scientist Alan Turing, are intricate spatial patterns that emerge in reaction-diffusion systems, reflecting the dynamic interplay between chemical reactions and diffusion. These patterns, ranging from spots to stripes, offer physicists a captivating playground to explore the fundamental principles governing self-organization in complex systems. Studying Turing patterns not only unveils the underlying mechanisms of pattern formation but also provides valuable insights into the universal principles guiding the spontaneous emergence of order in nature. This master thesis explores the application of a sparse regression framework to analyze synthetic Turing patterns and experimental data from Drosophila embryos. Focusing on the Brussellator model, the algorithm aims to identify coefficients of reaction-diffusion equations based on stationary state and oscillation data. In the analysis of synthetic Turing patterns, we observe that the algorithm's success is intricately linked to parameter selection, particularly the regularization strength and sparsity threshold. The delicate balance between the model's error and its complexity, as measured by interaction sparsity, is illustrated by a complexity-error tradeoff. Despite promising results in low to moderate noise scenarios, the algorithm's sensitivity to noise, especially in Laplacian computation, remains a limitation. Extending the study to synthetic oscillation data reveals the algorithm's improved robustness to noise, with successful identification of diffusion in the second-best sparse model. However, challenges persist in accurate diffusion constant estimation and sensitivity to noise (mainly in derivative calculations).
Applying the sparse regression framework to Drosophila Turing patterns uncovers additional challenges, primarily related to the high noise level in Laplacian computation, limiting the algorithm's accuracy in identifying reaction terms. The conclusions highlight the need for future developments, including the exploration of noise-robust methods, data augmentation strategies, integration with biological models, and advanced parameter optimization techniques to enhance the framework's robustness and applicability to real-world cases. Our work has thus contributed to a deeper understanding of complex biological phenomena governed by reaction-diffusion dynamics.
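The sparse-regression machinery described in this abstract, fitting equation coefficients against a library of candidate terms and pruning small ones with a sparsity threshold, can be sketched with sequentially thresholded least squares (the approach popularized by SINDy-style identification). This toy version is ours, not the thesis's implementation: it recovers du/dt = u - 0.5u³ from noiseless data, and the library, threshold, and coefficients are illustrative.

```python
import numpy as np

def stlsq(Theta, dudt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: fit, zero small
    coefficients, refit on the surviving library columns."""
    xi = np.linalg.lstsq(Theta, dudt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold      # prune below sparsity threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(Theta[:, big], dudt, rcond=None)[0]
    return xi

u = np.linspace(-2, 2, 201)
dudt = 1.0 * u - 0.5 * u ** 3                               # "measured" derivative
Theta = np.column_stack([np.ones_like(u), u, u ** 2, u ** 3])  # candidate library
xi = stlsq(Theta, dudt)                                     # recovered coefficients
```

The noise sensitivity the abstract reports enters exactly here: `dudt` (and, for spatial terms, the Laplacian) must be estimated numerically from data, and derivative estimation amplifies measurement noise before the regression ever sees it.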

    The Capabilities of Chaos and Complexity

    To what degree could chaos and complexity have organized a Peptide or RNA World of crude yet necessarily integrated protometabolism? How far could such protolife evolve in the absence of a heritable linear digital symbol system that could mutate, instruct, regulate, optimize and maintain metabolic homeostasis? To address these questions, chaos, complexity, self-ordered states, and organization must all be carefully defined and distinguished. In addition, their cause-and-effect relationships and mechanisms of action must be delineated. Are there any formal (non-physical, abstract, conceptual, algorithmic) components to chaos, complexity, self-ordering and organization, or are they entirely physicodynamic (physical, mass/energy interaction alone)? Chaos and complexity can produce some fascinating self-ordered phenomena. But can spontaneous chaos and complexity steer events and processes toward pragmatic benefit, select function over non-function, optimize algorithms, integrate circuits, produce computational halting, organize processes into formal systems, control and regulate existing systems toward greater efficiency? We pursue the question of whether there might be some yet-to-be-discovered new law of biology that will elucidate the derivation of prescriptive information and control. “System” will be rigorously defined. Can a low-informational rapid succession of Prigogine’s dissipative structures self-order into bona fide organization?