
    Computer-aided whole-cell design: taking a holistic approach by integrating synthetic with systems biology

    Computer-aided design for synthetic biology promises to accelerate the rational and robust engineering of biological systems; it requires both detailed, quantitative mathematical and experimental models of the processes to (re)design, and software and tools for genetic engineering and DNA assembly. Ultimately, increased precision in the design phase will have a dramatic impact on the production of designer cells and organisms with bespoke functions and increased modularity. Computer-aided design strategies require quantitative representations of cells that can capture multiscale processes and link genotypes to phenotypes. Here, we present a perspective on how whole-cell, multiscale models could transform design-build-test-learn cycles in synthetic biology. We show how these models could significantly aid the design and learn phases while reducing experimental testing, presenting case studies spanning genome minimization to cell-free systems, and we discuss several challenges to realizing this vision. The ability to describe and build whole cells in silico offers an opportunity to develop increasingly automated, precise and accessible computer-aided design tools and strategies through novel interdisciplinary collaborations.
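
As a toy illustration of the kind of quantitative, genotype-to-phenotype representation the abstract calls for, the sketch below integrates a single-gene expression model (mRNA and protein ODEs) with SciPy. The model structure and all rate constants are illustrative assumptions, not material from the paper; a whole-cell model would couple thousands of such submodels.

```python
# A toy "genotype -> phenotype" model: one gene expressed as mRNA and
# protein, integrated as ODEs. Rate constants are hypothetical.
from scipy.integrate import solve_ivp

def gene_expression(t, y, k_tx, k_tl, d_m, d_p):
    m, p = y                     # mRNA and protein levels
    dm = k_tx - d_m * m          # transcription minus mRNA decay
    dp = k_tl * m - d_p * p      # translation minus protein decay
    return [dm, dp]

params = (2.0, 10.0, 0.2, 0.05)  # hypothetical rate constants
sol = solve_ivp(gene_expression, (0, 200), [0.0, 0.0], args=params)

# The phenotype (steady-state protein level) follows from the
# parameters: m* = k_tx/d_m = 10, p* = k_tl*m*/d_p = 2000 here.
print("final mRNA, protein:", sol.y[:, -1])
```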

    Agent-based modeling: a systematic assessment of use cases and requirements for enhancing pharmaceutical research and development productivity.

    A crisis continues to brew within the pharmaceutical research and development (R&D) enterprise: productivity continues to decline as costs rise, despite ongoing, often dramatic scientific and technical advances. To reverse this trend, we offer suggestions for both the expansion and the broader adoption of modeling and simulation (M&S) methods. We suggest strategies and scenarios intended to enable new M&S use cases that directly engage R&D knowledge generation and build actionable mechanistic insight, thereby opening the door to enhanced productivity. What M&S requirements must be satisfied to open that door and begin reversing the productivity decline? Can current methods and tools fulfill the requirements, or are new methods necessary? We draw on the relevant, recent literature to provide and explore answers. In so doing, we identify essential roles for agent-based and other methods. We assemble a list of requirements that M&S must meet to address the diverse needs distilled from a collection of research, review, and opinion articles. We argue that to realize its full potential, M&S should be actualized within a larger information technology framework, a dynamic knowledge repository, wherein models of various types execute, evolve, and increase in accuracy over time. We offer some details of the issues that must be addressed for such a repository to accrue the capabilities needed to reverse the productivity decline.
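
To make the modeling style concrete, here is a minimal agent-based simulation in the spirit the authors advocate: agents carrying internal mechanistic state are perturbed by a simulated compound, and a dose-response relationship emerges from rerunning the same model. The agent rules, rates, and thresholds are invented for illustration and are not from the article.

```python
# Minimal agent-based dose-response sketch: each "cell" agent accrues
# stochastic damage from a simulated compound and dies past a threshold.
import random

class CellAgent:
    def __init__(self):
        self.alive = True
        self.damage = 0.0

    def step(self, dose):
        if not self.alive:
            return
        self.damage += dose * random.random()       # dose-driven damage
        self.damage = max(0.0, self.damage - 0.02)  # partial repair
        if self.damage > 1.0:                       # lethal threshold
            self.alive = False

def survival(dose, n_agents=1000, n_steps=50, seed=0):
    random.seed(seed)
    agents = [CellAgent() for _ in range(n_agents)]
    for _ in range(n_steps):
        for agent in agents:
            agent.step(dose)
    return sum(agent.alive for agent in agents) / n_agents

# An in-silico "experiment": rerun the same mechanistic model per dose.
for dose in (0.05, 0.1, 0.2):
    print(dose, survival(dose))
```

The point of the pattern is that the dose-response curve is an output of an explicit mechanism rather than a fitted empirical curve, so hypotheses about the mechanism can be revised and re-executed, which is the knowledge-generation loop the authors describe.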

    Generating realistic scaled complex networks

    Research on generative models is a central project in the emerging field of network science; it studies how statistical patterns found in real networks could be generated by formal rules. Output from these generative models then serves as the basis for designing and evaluating computational methods on networks, and for verification and simulation studies. Over the last two decades, a variety of models have been proposed with the ultimate goal of achieving comprehensive realism for the generated networks. In this study, we (a) introduce a new generator, termed ReCoN; (b) explore how ReCoN and some existing models can be fitted to an original network to produce a structurally similar replica; (c) use ReCoN to produce networks much larger than the original exemplar; and finally (d) discuss open problems and promising research directions. In a comparative experimental study, we find that ReCoN is often superior to many other state-of-the-art network generation methods. We argue that ReCoN is a scalable and effective tool for modeling a given network while preserving important properties at both micro- and macroscopic scales, and for scaling the exemplar data by orders of magnitude in size.
    Comment: 26 pages, 13 figures, extended version; a preliminary version of the paper was presented at the 5th International Workshop on Complex Networks and their Applications
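
ReCoN itself couples community detection with an LFR-style generator and is not reproduced here. As a simpler stand-in, the sketch below shows the generic fit-and-scale idea with NetworkX: replicate an exemplar's degree sequence k times and regenerate a graph with the configuration model. This preserves the degree distribution at scale but, unlike ReCoN, not the community structure.

```python
# Generic fit-and-scale sketch (NOT ReCoN): replicate the exemplar's
# degree sequence k times and regenerate with the configuration model.
import networkx as nx

def scaled_replica(g, k):
    degrees = [d for _, d in g.degree()] * k   # k-fold degree sequence
    if sum(degrees) % 2:                       # sum must be even
        degrees[0] += 1
    h = nx.configuration_model(degrees, seed=42)
    h = nx.Graph(h)                            # collapse parallel edges
    h.remove_edges_from(list(nx.selfloop_edges(h)))
    return h

exemplar = nx.karate_club_graph()
replica = scaled_replica(exemplar, k=4)
print(exemplar.number_of_nodes(), "->", replica.number_of_nodes())
```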

    Partial differential equations for self-organization in cellular and developmental biology

    Understanding the mechanisms governing and regulating the emergence of structure and heterogeneity within cellular systems, such as the developing embryo, represents a multiscale challenge typifying current integrative biology research, namely, explaining the macroscale behaviour of a system from microscale dynamics. This review focuses on modelling how cell-based dynamics orchestrate the emergence of higher-level structure. After surveying representative biological examples and the models used to describe them, we assess how developments at the scale of molecular biology have impacted current theoretical frameworks, and the new modelling opportunities that are emerging as a result. We restrict our survey of mathematical approaches to partial differential equations and the tools required for their analysis. We discuss the gap between modelling abstraction and biological reality and the challenges this presents, and we highlight some open problems in the field.
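
For a concrete starting point, the following finite-difference sketch simulates one of the classic self-organizing PDE systems such reviews cover: the Schnakenberg reaction-diffusion model on a 1-D periodic domain. The grid, time step, and kinetic parameters are illustrative choices (inside the standard Turing-instability regime), not values from the review.

```python
# Schnakenberg reaction-diffusion system, explicit finite differences.
import numpy as np

n, L, dt = 200, 1.0, 1e-3
dx = L / n
Du, Dv = 1e-5, 1e-3        # unequal diffusivities drive patterning
a, b = 0.1, 0.9            # Schnakenberg kinetic parameters

rng = np.random.default_rng(1)
u = (a + b) + 0.01 * rng.standard_normal(n)      # perturbed steady state
v = b / (a + b) ** 2 + 0.01 * rng.standard_normal(n)

def lap(f):                # periodic second difference
    return (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2

for _ in range(50_000):    # total time 50, well past pattern onset
    fu = a - u + u**2 * v  # activator kinetics
    fv = b - u**2 * v      # substrate kinetics
    u = u + dt * (Du * lap(u) + fu)
    v = v + dt * (Dv * lap(v) + fv)

# A spatially varying u signals a self-organized (Turing) pattern
# grown from near-uniform initial data, the macroscale-from-microscale
# phenomenon the review is about.
print("pattern amplitude:", float(u.max() - u.min()))
```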

    Particle detection and tracking in fluorescence time-lapse imaging: a contrario approach

    This paper proposes a probabilistic approach for the detection and tracking of particles in fluorescence time-lapse imaging. In the presence of very noisy, poor-quality data, particles and trajectories can be characterized by an a contrario model, which estimates the probability of observing the structures of interest in random data. This approach, first introduced in the modeling of human visual perception and since applied successfully in many image processing tasks, leads to algorithms that require neither a prior learning stage nor tedious parameter tuning, and that are very robust to noise. Comparative evaluations against a well-established baseline show that the proposed approach outperforms the state of the art.
    Comment: Published in Machine Vision and Applications
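
The paper's particle and trajectory models are not reproduced here, but the core a contrario computation can be sketched: a structure is declared meaningful when its Number of False Alarms (NFA), the expected count of equally strong structures arising in pure noise, falls below a threshold eps (eps = 1 means at most one false detection is expected per image of noise). The disc-shaped test regions, the radius, and the synthetic data below are illustrative assumptions.

```python
# A contrario point-cluster detection via the NFA criterion.
import numpy as np
from scipy.stats import binom

def nfa_detections(points, radius, eps=1.0):
    n = len(points)
    p = np.pi * radius**2        # P(uniform point lands in one disc)
    hits = []
    for i, centre in enumerate(points):
        # Neighbours of this point within the disc (excluding itself).
        k = int(np.sum(np.linalg.norm(points - centre, axis=1) <= radius)) - 1
        # NFA = (number of tests) * P(Binomial(n-1, p) >= k) in noise.
        nfa = n * binom.sf(k - 1, n - 1, p)
        if nfa < eps:
            hits.append((i, nfa))
    return hits

rng = np.random.default_rng(0)
noise = rng.uniform(size=(300, 2))                    # background clutter
cluster = 0.5 + 0.01 * rng.standard_normal((15, 2))   # dense structure
points = np.vstack([noise, cluster])
print(nfa_detections(points, radius=0.05)[:5])        # flags the cluster
```

Note the properties the abstract claims: there is no training stage, and the only user-facing parameter (eps) has a direct probabilistic meaning, so it needs no tuning.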