75 research outputs found

    Optimization by adaptive stochastic descent

    Published: March 16, 2018

    When standard optimization methods fail to find a satisfactory solution for a parameter fitting problem, a tempting recourse is to adjust parameters manually. While tedious, this approach can be surprisingly powerful in terms of achieving optimal or near-optimal solutions. This paper outlines an optimization algorithm, Adaptive Stochastic Descent (ASD), designed to replicate the essential aspects of manual parameter fitting in an automated way. Specifically, ASD uses simple principles to form probabilistic assumptions about (a) which parameters have the greatest effect on the objective function, and (b) optimal step sizes for each parameter. We show that for a certain class of optimization problems (namely, those with a moderate to large number of scalar parameter dimensions, especially if some dimensions are more important than others), ASD is capable of minimizing the objective function with far fewer function evaluations than classic optimization methods, such as the Nelder-Mead nonlinear simplex, Levenberg-Marquardt gradient descent, simulated annealing, and genetic algorithms. As a case study, we show that ASD outperforms standard algorithms when used to determine how resources should be allocated in order to minimize new HIV infections in Swaziland.

    Cliff C. Kerr, Salvador Dura-Bernal, Tomasz G. Smolinski, George L. Chadderdon, David P. Wilson
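The two adaptive mechanisms described in the abstract (probabilistic selection of which parameter to perturb, and per-parameter step sizes that grow on success and shrink on failure) can be sketched as follows. This is a minimal illustrative implementation of the general idea, not the authors' reference code; the growth/shrink factors and the symmetric +/- candidate moves are assumptions.

```python
import numpy as np

def asd(f, x0, step=0.1, grow=2.0, shrink=0.5, maxiter=500, seed=0):
    """Minimal ASD-style optimizer sketch (illustrative, not the paper's
    reference implementation). Each parameter has two candidate moves
    (+/- its own step size); moves that improve the objective are sampled
    more often and their steps grow, while failed moves shrink."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = len(x)
    # one step size and one selection probability per (parameter, direction)
    steps = np.full(2 * n, step)
    probs = np.full(2 * n, 1.0 / (2 * n))
    best = f(x)
    for _ in range(maxiter):
        k = rng.choice(2 * n, p=probs)
        i, sign = k % n, (1 if k < n else -1)
        trial = x.copy()
        trial[i] += sign * steps[k]
        val = f(trial)
        if val < best:            # success: accept, grow step, boost prob
            x, best = trial, val
            steps[k] *= grow
            probs[k] *= grow
        else:                     # failure: shrink step, lower prob
            steps[k] *= shrink
            probs[k] *= shrink
        probs /= probs.sum()      # renormalize selection probabilities
    return x, best
```

On a simple quadratic objective this sketch concentrates its sampling on whichever dimensions still yield improvements, which is the behavior the paper argues matters when some parameter dimensions are more important than others.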

    Integrating Machine Learning and Multiscale Modeling: Perspectives, Challenges, and Opportunities in the Biological, Biomedical, and Behavioral Sciences

    Fueled by breakthrough technology developments, the biological, biomedical, and behavioral sciences are now collecting more data than ever before. There is a critical need for time- and cost-efficient strategies to analyze and interpret these data to advance human health. The recent rise of machine learning as a powerful technique to integrate multimodality, multifidelity data, and reveal correlations between intertwined phenomena presents a special opportunity in this regard. However, classical machine learning techniques often ignore the fundamental laws of physics and result in ill-posed problems or non-physical solutions. Multiscale modeling is a successful strategy to integrate multiscale, multiphysics data and uncover mechanisms that explain the emergence of function. However, multiscale modeling alone often fails to efficiently combine large data sets from different sources and different levels of resolution. We show how machine learning and multiscale modeling can complement each other to create robust predictive models that integrate the underlying physics to manage ill-posed problems and explore massive design spaces. We critically review the current literature, highlight applications and opportunities, address open questions, and discuss potential challenges and limitations in four overarching topical areas: ordinary differential equations, partial differential equations, data-driven approaches, and theory-driven approaches. Towards these goals, we leverage expertise in applied mathematics, computer science, computational biology, biophysics, biomechanics, engineering mechanics, experimentation, and medicine. Our multidisciplinary perspective suggests that integrating machine learning and multiscale modeling can provide new insights into disease mechanisms, help identify new targets and treatment strategies, and inform decision making for the benefit of human health.
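One common way to "integrate the underlying physics" into a data-driven fit, in the spirit of the ordinary-differential-equation topic area mentioned above, is to add a physics-residual penalty to the data-mismatch loss. The toy below is entirely hypothetical (not from the paper): it fits the two parameters of an exponential-decay model y(t) = a·exp(-k·t) while penalizing deviation from the governing law dy/dt = -k·y.

```python
import numpy as np

# Toy physics-informed fit (illustrative only, not from the paper):
# the loss combines a data-mismatch term with a penalty on the ODE
# residual dy/dt + k*y, which vanishes for the exact physical law.
t = np.linspace(0.0, 2.0, 21)
y_obs = 2.0 * np.exp(-1.5 * t)          # synthetic noiseless observations

def loss(a, k, w_phys=1.0):
    y = a * np.exp(-k * t)
    data_term = np.mean((y - y_obs) ** 2)
    dydt = np.gradient(y, t)            # numerical derivative of the model
    phys_term = np.mean((dydt + k * y) ** 2)
    return data_term + w_phys * phys_term

# crude grid search over the two parameters (a, k)
grid = np.linspace(0.5, 3.0, 26)        # 0.5, 0.6, ..., 3.0
a_best, k_best = min(((a, k) for a in grid for k in grid),
                     key=lambda p: loss(*p))
```

The physics term acts as a regularizer: with noisy or sparse data it steers the fit toward solutions consistent with the governing equation, which is the kind of ill-posedness management the abstract describes.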

    NetPyNE, a tool for data-driven multiscale modeling of brain circuits.

    Biophysical modeling of neuronal networks helps to integrate and interpret rapidly growing and disparate experimental datasets at multiple scales. The NetPyNE tool (www.netpyne.org) provides both programmatic and graphical interfaces to develop data-driven multiscale network models in NEURON. NetPyNE clearly separates model parameters from implementation code. Users provide specifications at a high level via a standardized declarative language, for example connectivity rules, to create millions of cell-to-cell connections. NetPyNE then enables users to generate the NEURON network, run efficiently parallelized simulations, optimize and explore network parameters through automated batch runs, and use built-in functions for visualization and analysis: connectivity matrices, voltage traces, spike raster plots, local field potentials, and information theoretic measures. NetPyNE also facilitates model sharing by exporting and importing standardized formats (NeuroML and SONATA). NetPyNE is already being used to teach computational neuroscience students and by modelers to investigate brain regions and phenomena.
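The key idea of the declarative approach, a single high-level connectivity rule expanding into many individual cell-to-cell connections, can be sketched in plain Python. This is a schematic illustration of the concept, not NetPyNE's actual syntax or internals; the dictionary keys and population sizes are invented for the example.

```python
import random

# Schematic sketch (not NetPyNE's real implementation) of how one
# declarative connectivity rule expands into many cell-to-cell connections.
cells = [{'id': i, 'pop': 'E' if i < 80 else 'I'} for i in range(100)]

rule = {'preConds': {'pop': 'E'},    # which cells may be sources
        'postConds': {'pop': 'I'},   # which cells may be targets
        'probability': 0.1,          # Bernoulli connection probability
        'weight': 0.005}             # synaptic weight per connection

def expand(rule, cells, seed=1):
    """Expand one rule into a list of (pre_id, post_id, weight) tuples."""
    rng = random.Random(seed)
    pre = [c for c in cells
           if all(c[k] == v for k, v in rule['preConds'].items())]
    post = [c for c in cells
            if all(c[k] == v for k, v in rule['postConds'].items())]
    return [(p['id'], q['id'], rule['weight'])
            for p in pre for q in post
            if rng.random() < rule['probability']]

conns = expand(rule, cells)
```

Because the rule, not the enumerated connection list, is what the user writes, the same short specification scales from this 100-cell example to the millions of connections mentioned in the abstract.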

    Geppetto: a reusable modular open platform for exploring neuroscience data and models

    Geppetto is an open-source platform that provides generic middleware infrastructure for building both online and desktop tools for visualizing neuroscience models and data and managing simulations. Geppetto underpins a number of neuroscience applications, including Open Source Brain (OSB), Virtual Fly Brain (VFB), NEURON-UI and NetPyNE-UI. OSB is used by researchers to create and visualize computational neuroscience models described in NeuroML and simulate them through the browser. VFB is the reference hub for Drosophila melanogaster neural anatomy and imaging data including neuropil, segmented neurons, microscopy stacks and gene expression pattern data. Geppetto is also being used to build a new user interface for NEURON, a widely used neuronal simulation environment, and for NetPyNE, a Python package for network modelling using NEURON. Geppetto defines domain agnostic abstractions used by all these applications to represent their models and data and offers a set of modules and components to integrate, visualize and control simulations in a highly accessible way. The platform comprises a backend which can connect to external data sources, model repositories and simulators, together with a highly customizable frontend. This article is part of a discussion meeting issue 'Connectome to behaviour: modelling C. elegans at cellular resolution'.

    Open Source Brain: A Collaborative Resource for Visualizing, Analyzing, Simulating, and Developing Standardized Models of Neurons and Circuits

    Computational models are powerful tools for exploring the properties of complex biological systems. In neuroscience, data-driven models of neural circuits that span multiple scales are increasingly being used to understand brain function in health and disease. But their adoption and reuse has been limited by the specialist knowledge required to evaluate and use them. To address this, we have developed Open Source Brain, a platform for sharing, viewing, analyzing, and simulating standardized models from different brain regions and species. Model structure and parameters can be automatically visualized and their dynamical properties explored through browser-based simulations. Infrastructure and tools for collaborative interaction, development, and testing are also provided. We demonstrate how existing components can be reused by constructing new models of inhibition-stabilized cortical networks that match recent experimental results. These features of Open Source Brain improve the accessibility, transparency, and reproducibility of models and facilitate their reuse by the wider community.

    26th Annual Computational Neuroscience Meeting (CNS*2017): Part 1
