53 research outputs found

    Output Stability and Semilinear Sets in Chemical Reaction Networks and Deciders

    We study the set of output stable configurations of chemical reaction deciders (CRDs). It turns out that CRDs with only bimolecular reactions (which are almost equivalent to population protocols) have a special structure that allows for an algorithm to efficiently calculate the (finite) set of minimal output stable configurations. As a consequence, a relatively large sequence of configurations may be efficiently checked for output stability. We also provide a number of observations regarding the semilinearity result of Angluin et al. [Distrib. Comput., 2007] from the context of population protocols (which is a central result for output stable CRDs). In particular, we observe that the computation-friendly class of totally stable CRDs has the same expressive power as the larger class of output stable CRDs.
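
    As a point of contrast with the efficient algorithm the abstract refers to (whose details are in the paper), the brute-force notion is easy to state: a configuration is output stable if every configuration reachable from it yields the same output. A minimal sketch follows, using a toy detection decider whose species and reactions are purely illustrative, not taken from the paper:

```python
from collections import deque

# A configuration is a multiset of species, encoded as a sorted tuple of
# (species, count) pairs so it is hashable.

def fire(config, reactants, products):
    """Apply one reaction if its reactants are available; else return None."""
    c = dict(config)
    if any(c.get(s, 0) < k for s, k in reactants.items()):
        return None
    for s, k in reactants.items():
        c[s] -= k
    for s, k in products.items():
        c[s] = c.get(s, 0) + k
    return tuple(sorted((s, n) for s, n in c.items() if n > 0))

def output(config, yes_voters):
    """All-voters-agree convention: True/False if unanimous, None if mixed."""
    votes = {s in yes_voters for s, _ in config}
    return votes.pop() if len(votes) == 1 else None

def output_stable(config, reactions, yes_voters):
    """Brute-force check that every reachable configuration keeps the same
    defined output. Exponential in general; the paper's point is that
    bimolecular CRDs admit an efficient check via the finite set of
    minimal output stable configurations."""
    target = output(config, yes_voters)
    if target is None:
        return False
    seen, queue = {config}, deque([config])
    while queue:
        cur = queue.popleft()
        if output(cur, yes_voters) != target:
            return False
        for reactants, products in reactions:
            nxt = fire(cur, reactants, products)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# Toy detection decider (illustrative): does species D exist? D flips 'n'
# voters to 'y'. A fully converted configuration is output stable.
reactions = [({"D": 1, "n": 1}, {"D": 1, "y": 1})]
settled = tuple(sorted({"D": 1, "y": 3}.items()))
mixed = tuple(sorted({"D": 1, "n": 3}.items()))
print(output_stable(settled, reactions, {"D", "y"}))  # True
print(output_stable(mixed, reactions, {"D", "y"}))    # False
```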

    On the Runtime of Chemical Reaction Networks Beyond Idealized Conditions

    This paper studies the (discrete) chemical reaction network (CRN) computational model that emerged in the last two decades as an abstraction for molecular programming. The correctness of CRN protocols is typically established under one of two possible schedulers that determine how the execution advances: (1) a stochastic scheduler that obeys the (continuous time) Markov process dictated by the standard model of stochastic chemical kinetics; or (2) an adversarial scheduler whose only commitment is to maintain a certain fairness condition. The latter scheduler is justified by the fact that the former one crucially assumes "idealized conditions" that, more often than not, do not hold in real wet-lab experiments. However, when it comes to analyzing the runtime of CRN protocols, the existing literature focuses strictly on the stochastic scheduler, thus raising the research question that drives this work: Is there a meaningful way to quantify the runtime of CRNs without the idealized conditions assumption? The main conceptual contribution of the current paper is to answer this question in the affirmative, formulating a new runtime measure for CRN protocols that does not rely on idealized conditions. This runtime measure is based on an adapted (weaker) fairness condition as well as a novel scheme that enables partitioning the execution into short rounds and charging the runtime for each round individually (inspired by definitions for the runtime of asynchronous distributed algorithms). Following that, we turn to investigate various fundamental computational tasks and establish (often tight) bounds on the runtime of the corresponding CRN protocols operating under the adversarial scheduler. This includes an almost complete chart of the runtime complexity landscape of predicate decidability tasks.
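
    To make the contrast between the two schedulers concrete, here is a minimal discrete-CRN stepper. The network, the unit rate constants, and the adversary interface are all illustrative assumptions, not anything defined in the paper:

```python
import random

def enabled(state, reactions):
    """Indices of reactions whose reactants are all available."""
    return [i for i, (react, _) in enumerate(reactions)
            if all(state.get(s, 0) >= k for s, k in react.items())]

def propensity(state, react):
    """Mass-action propensity with unit rate constants."""
    a = 1.0
    for s, k in react.items():
        for j in range(k):
            a *= state.get(s, 0) - j
    return a

def step(state, reactions, scheduler="stochastic", adversary=None):
    """Fire one reaction, or return None if nothing is enabled.
    'stochastic': sample proportionally to mass-action propensities (the
    idealized-conditions model). 'adversarial': let an arbitrary policy
    pick any enabled reaction; the fairness condition constrains the whole
    execution rather than a single step, so it is not enforced here."""
    idx = enabled(state, reactions)
    if not idx:
        return None
    if scheduler == "stochastic":
        weights = [propensity(state, reactions[i][0]) for i in idx]
        i = random.choices(idx, weights=weights)[0]
    else:
        i = adversary(state, idx)  # adversary: (state, enabled ids) -> id
    react, prod = reactions[i]
    nxt = dict(state)
    for s, k in react.items():
        nxt[s] -= k
    for s, k in prod.items():
        nxt[s] = nxt.get(s, 0) + k
    return nxt

# Toy epidemic X + Y -> 2X under the stochastic scheduler.
reactions = [({"X": 1, "Y": 1}, {"X": 2})]
state = {"X": 1, "Y": 9}
steps = 0
while (nxt := step(state, reactions)) is not None:
    state, steps = nxt, steps + 1
print(state, steps)  # {'X': 10, 'Y': 0} after 9 firings
```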

    Evolution from the ground up with Amee – From basic concepts to explorative modeling

    Evolutionary theory has been the foundation of biological research for about a century now, yet over the past few decades, new discoveries and theoretical advances have rapidly transformed our understanding of the evolutionary process. Foremost among them are evolutionary developmental biology, epigenetic inheritance, and various forms of evolutionarily relevant phenotypic plasticity, as well as cultural evolution, which ultimately led to the conceptualization of an extended evolutionary synthesis. Starting from abstract principles rooted in complexity theory, this thesis aims to provide a unified conceptual understanding of any kind of evolution, biological or otherwise. This is used in the second part to develop Amee, an agent-based model that unifies development, niche construction, and phenotypic plasticity with natural selection based on a simulated ecology. Amee is implemented in Utopia, which allows performant, integrated implementation and simulation of arbitrary agent-based models. A phenomenological overview of Amee’s capabilities is provided, ranging from the evolution of ecospecies down to the evolution of metabolic networks and up to beyond-species-level biological organization, all of which emerges autonomously from the basic dynamics. The interaction of development, plasticity, and niche construction has been investigated, and it has been shown that while expected natural phenomena can, in principle, arise, the accessible simulation time and system size are too small to produce natural evo-devo phenomena and structures. Amee thus can be used to simulate the evolution of a wide variety of processes.
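
    Amee itself is a Utopia (C++) model; purely to illustrate the generic mutation-selection core that any such agent-based model builds on (development, plasticity, and niche construction are what Amee layers on top), here is a toy loop with hypothetical parameters:

```python
import math
import random

# Toy mutation-selection loop on a single continuous trait under Gaussian
# stabilizing selection. All parameters are hypothetical illustrations.
POP_SIZE, GENERATIONS, MUT_SD, OPTIMUM = 100, 200, 0.05, 1.0

def fitness(trait):
    """Gaussian stabilizing selection around an environmental optimum."""
    return math.exp(-((trait - OPTIMUM) ** 2) / 0.5)

population = [0.0] * POP_SIZE  # everyone starts far from the optimum
for _ in range(GENERATIONS):
    weights = [fitness(t) for t in population]
    # Fitness-proportional reproduction with Gaussian mutation.
    population = [random.choices(population, weights=weights)[0]
                  + random.gauss(0.0, MUT_SD)
                  for _ in range(POP_SIZE)]

print(f"mean trait: {sum(population) / POP_SIZE:.2f}")  # climbs towards 1.0
```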

    Brain-Computer Interfaces using Electrocorticography and Surface Stimulation

    The brain connects to, modulates, and receives information from every organ in the body. As such, brain-computer interfaces (BCIs) have vast potential for diagnostics, medical therapies, and even augmentation or enhancement of normal functions. BCIs provide a means to explore the furthest corners of what it means to think, to feel, and to act—to experience the world and to be who you are. This work focuses on the development of a chronic bi-directional BCI for sensorimotor restoration through the use of separable frequency bands for recording motor intent and providing sensory feedback via electrocortical stimulation. Epidural cortical surface electrodes are used to both record electrocorticographic (ECoG) signals and provide stimulation without the adverse effects associated with penetration through the protective dural barrier of the brain. Chronic changes in electrode properties and signal characteristics are discussed, which inform optimal electrode designs and co-adaptive algorithms for decoding high-dimensional information. Additionally, a multi-layered approach to artifact suppression is presented, which includes a systems-level design of electronics, signal processing, and stimulus waveforms. The results of this work are relevant to a wider range of applications beyond ECoG and BCIs that involve closed-loop recording and stimulation throughout the body. By enabling simultaneous recording and stimulation through the techniques described here, responsive therapies can be developed that are tuned to individual patients and provide precision therapies at exactly the right place and time. This has the potential to improve targeted therapeutic outcomes while reducing undesirable side effects.
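
    The separable-bands idea can be sketched with standard filtering: record in one band while notching out the stimulation frequency. The band edges, sampling rate, and stimulation frequency below are illustrative assumptions, not the dissertation's values:

```python
import numpy as np
from scipy import signal

FS = 1000                 # sampling rate, Hz (illustrative)
RECORD_BAND = (70, 170)   # e.g. a high-gamma band carrying motor intent
STIM_FREQ = 50            # hypothetical stimulation frequency, Hz

def extract_motor_band(ecog, fs=FS, band=RECORD_BAND):
    """Zero-phase band-pass isolating the recording band from raw ECoG."""
    sos = signal.butter(4, band, btype="bandpass", fs=fs, output="sos")
    return signal.sosfiltfilt(sos, ecog)

def suppress_stim_artifact(ecog, fs=FS, f0=STIM_FREQ, q=30):
    """Narrow notch at the stimulation frequency; a full artifact-suppression
    stack, as the dissertation describes, would also shape the stimulus
    waveform and blank the amplifier around pulses."""
    b, a = signal.iirnotch(f0, q, fs=fs)
    return signal.filtfilt(b, a, ecog)

# Synthetic demo: a 100 Hz "neural" component plus a large 50 Hz artifact.
t = np.arange(0, 2, 1 / FS)
raw = np.sin(2 * np.pi * 100 * t) + 3 * np.sin(2 * np.pi * STIM_FREQ * t)
clean = extract_motor_band(suppress_stim_artifact(raw))
print(f"std before {raw.std():.2f}, after {clean.std():.2f}")  # artifact removed
```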

    Building the knowledge base for environmental action and sustainability


    A robust machine learning approach for the prediction of allosteric binding sites

    Allosteric regulatory sites are highly prized targets in drug discovery. They remain difficult to detect by conventional methods, with the vast majority of known examples being found serendipitously. Herein, a rigorous, wholly computational protocol is presented for the prediction of allosteric sites. Previous attempts to predict the location of allosteric sites by computational means drew on only a small amount of data. Moreover, no attempt was made to modify the initial crystal structure beyond the in silico deletion of the allosteric ligand. This behaviour can leave behind a conformation with a significant structural deformation, often betraying the location of the allosteric binding site. Despite this artificial advantage, modest success rates are observed at best. This work addresses both of these issues. A set of 60 protein crystal structures with known allosteric modulators was collected. To remove the imprint on protein structure caused by the presence of bound modulators, molecular dynamics was performed on each protein prior to analysis. A wide variety of analytical techniques were then employed to extract meaningful data from the trajectories. Upon fusing them into a single, coherent dataset, random forest, a machine learning algorithm, was applied to train a high-performance classification model. After successive rounds of optimisation, the final model presented in this work correctly identified the allosteric site for 72% of the proteins tested. This is not only an improvement over alternative strategies in the literature; crucially, this method is unique among site prediction tools in that it does not exploit crystal structures containing imprints of bound ligands, which is of key importance when making live predictions, where no allosteric regulatory sites are known.
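
    The classification stage maps naturally onto a standard random-forest workflow. The sketch below uses synthetic data and hypothetical feature names as stand-ins for the thesis's MD-derived descriptors:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# One row per candidate pocket; label 1 marks the known allosteric site.
# Feature names and data are synthetic placeholders for the MD-derived
# descriptors used in the thesis.
FEATURES = ["pocket_volume_mean", "pocket_volume_var",
            "residue_flexibility", "conservation_score", "hydrophobicity"]

rng = np.random.default_rng(0)
X = rng.normal(size=(600, len(FEATURES)))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=600)) > 1.0

# Allosteric sites are a small minority of pockets, hence class weighting.
clf = RandomForestClassifier(n_estimators=500, class_weight="balanced",
                             random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"5-fold ROC AUC: {auc.mean():.2f}")
```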

    Multiscale modelling of delayed hydride cracking

    A mechanistic model of delayed hydride cracking (DHC) is crucial to the nuclear industry as a predictive tool for understanding the structural failure of zirconium alloy components that are used to clad fuel pins in water-cooled reactors. Such a model of DHC failure must be both physically accurate and computationally efficient so that it can inform and guide nuclear safety assessments. However, this endeavour has so far proved to be an insurmountable challenge because of the seemingly intractable multiscale complexity of the DHC phenomenon, which is a manifestation of hydrogen embrittlement that involves the interplay and repetition of three constituent processes: atomic-scale diffusion, microscale precipitation and continuum-scale fracture. This investigation aims to blueprint a novel multiscale modelling strategy to simulate the early stages of DHC initiation: stress-driven, hydrogen-diffusion-controlled precipitation of hydrides near loaded flaws in polycrystalline zirconium. Following a careful review of the experimental observations in the literature as well as the standard modelling techniques that are commonplace in nuclear fuel performance codes in the first part of this dissertation, the second and third parts introduce a hybrid multiscale modelling strategy that integrates concepts across a spectrum of length and time scales into one self-consistent framework whilst accounting for the complicated nuances of the zirconium-hydrogen system. In particular, this strategy dissects the DHC mechanism into three interconnected modules: (i) stress analysis, which performs defect micromechanics in hexagonal close-packed zirconium through the application of the mathematical theory of planar elasticity to anisotropic continua; (ii) stress-diffusion analysis, which bridges the classical long-range elastochemical transport with the quantum structure of the hydrogen interstitialcy in the trigonal environment of the tetrahedral site; and (iii) diffusion-precipitation analysis, which translates empirical findings into an optimised algorithm that emulates the thermodynamically favourable spatial assembly of the microscopic hydride needles into macroscopic hydride colonies at prospective nucleation sites. Each module explores several unique mechanistic modelling considerations, including a multipolar expansion of the forces exerted by hydrogen interstitials, a distributed dislocation representation of the hydride platelets, and a stoichiometric hydrogen mass conservation criterion that dictates the lifecycle of hydrides. The investigation proceeds to amalgamate the stress, stress-diffusion and diffusion-precipitation analyses into a unified theory of the mesoscale mechanics that underpin the early stages of DHC failure and a comprehensive simulation of the flaw-tip hydrogen profiles and hydride microstructures. The multiscale theory and simulation are realised within a bespoke software package which incorporates computer vision to generate mesoscale micrographs that depict the geometries, morphologies and contours of key metallographic entities: cracks and notches, grains, intergranular and intragranular nucleation sites, as well as regions of hydrogen enhancement and complex networks of hydride features. Computer vision mediates the balance between simulation accuracy and simulation efficiency, which is completely novel in the context of DHC research, as a paradigm at the intersection of computational science and computer science.
    Preliminary tests show that the simulation environment of the hybrid model is significantly more accurate and efficient in comparison with the traditional finite element and phase field methodologies. Due to this unprecedented simulation accuracy-efficiency balance, realistic flaw-tip hydrogen profiles and hydride microstructures can be simulated within seconds, which naturally facilitates statistical averaging over ensembles. Such statistical capabilities are highly relevant to nuclear safety assessments and, therefore, a systematic breakdown of the model formulation is presented in the style of a code specification manual so that the bespoke software can be readily adapted within an industrial setting. As the main contribution to DHC research, the proposed multiscale model comprises a state-of-the-art microstructural solver whose unrivalled versatility is demonstrated by showcasing a series of simulated micrographs that are parametrised by flaw acuity, grain size, texture, alloy composition, and histories of thermomechanical cycles. Direct comparisons with experimental micrographs indicate good quantitative agreement and provide some justification for the known qualitative trends. Furthermore, the overall hybrid methodology is proven to scale linearly with the number of hydrides, which is computationally advantageous in its own right because it allows the bespoke software to be extended without compromising its speed. Several possible extensions are outlined which would improve the phenomenological accuracy of the multiscale model whilst retaining its efficiency. In its current form, however, this hybrid multiscale model of the early stages of DHC goes far beyond existing methodologies in terms of simulation scope.
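
    For a flavour of module (ii), stress-driven diffusion can be sketched as a 1D drift-diffusion step in which hydrogen migrates up gradients of hydrostatic tension toward the flaw tip. All parameter values below are nominal placeholders, not the dissertation's:

```python
import numpy as np

# 1D drift-diffusion sketch: J = -D*dC/dx + (D*VH/(R*T)) * C * dSigma/dx,
# so hydrogen drifts up hydrostatic-tension gradients toward the flaw tip.
D = 1e-12            # hydrogen diffusivity in alpha-Zr, m^2/s (order of magnitude)
VH = 1.7e-6          # partial molar volume of hydrogen, m^3/mol
R, T = 8.314, 500.0  # gas constant, temperature (K)

L, N = 1e-3, 200
dx = L / (N - 1)
x = np.linspace(0.0, L, N)
sigma = 300e6 * np.exp(-x / 1e-4)   # hydrostatic stress peaked at the flaw tip
dsig = np.gradient(sigma, dx)
C = np.full(N, 1.0)                 # uniform initial concentration, mol/m^3

dt = 0.4 * dx**2 / D                # explicit stability limit
for _ in range(20000):
    flux = -D * np.gradient(C, dx) + (D * VH / (R * T)) * C * dsig
    flux[0] = flux[-1] = 0.0        # closed boundaries
    C -= dt * np.gradient(flux, dx)

# Equilibrium enhancement ~ exp(VH*sigma/(R*T)) ~ 1.13 at the tip here.
print(f"tip enhancement C(tip)/C(far) = {C[0] / C[-1]:.2f}")
```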

    10th HyMeX Workshop
