
    Synthesis of Biological and Mathematical Methods for Gene Network Control

    Synthetic biology is an emerging field which melds genetics, molecular biology, network theory, and mathematical systems to understand, build, and predict gene network behavior. As synthetic biology is an engineering discipline, developing a mathematical understanding of the genetic circuits under study is of fundamental importance. In this dissertation, mathematical concepts for understanding, predicting, and controlling gene transcriptional networks are presented and applied to two synthetic gene network contexts. First, this engineering approach is used to improve the function of guide ribonucleic acid (gRNA)-targeted, dCas9-regulated transcriptional cascades through analysis and targeted modification of the RNA transcript. In so doing, a fluorescent guide RNA (fgRNA) is developed to more clearly observe gRNA dynamics and aid design. It is shown that, with careful optimization, RNA Polymerase II (Pol II)-driven gRNA transcripts can be strong enough to exhibit measurable cascading behavior, previously shown only in RNA Polymerase III (Pol III) circuits. Second, inherent gene expression noise is used to achieve precise fractional differentiation of a population. Mathematical methods are employed to predict and understand the observed behavior, and metrics for analyzing and quantifying similar differentiation kinetics are presented. Through careful mathematical analysis and simulation, coupled with experimental data, two methods for achieving ratio control are presented, with the optimal scheme for any application depending on the noisiness of the system under study. Together, these studies push the boundaries of gene network control, with potential applications in stem cell differentiation, therapeutics, and bio-production.
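    The second study's mechanism lends itself to a quick illustration. Below is a minimal sketch, not the dissertation's actual model: each cell's regulator level follows a noisy birth-death (chemical Langevin) process, and cells whose level crosses a threshold commit irreversibly, so the committed fraction is set by where the threshold sits relative to the expression noise. All parameter names and values (k_prod, k_deg, noise, threshold) are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch, not the dissertation's model: intrinsic expression
# noise drives fractional differentiation of a cell population.
rng = np.random.default_rng(0)
n_cells, T, dt = 10_000, 200.0, 0.1
k_prod, k_deg, noise = 1.0, 0.1, 0.6   # production, degradation, noise scale (assumed)
threshold = 12.0                        # commitment threshold (assumed)

x = np.zeros(n_cells)                   # regulator level per cell
committed = np.zeros(n_cells, dtype=bool)
for _ in range(int(T / dt)):
    # Chemical Langevin step: deterministic drift plus demographic noise.
    drift = (k_prod - k_deg * x) * dt
    diffusion = noise * np.sqrt((k_prod + k_deg * x) * dt)
    x = np.maximum(x + drift + diffusion * rng.standard_normal(n_cells), 0.0)
    committed |= x > threshold          # irreversible commitment on first crossing

print(f"differentiated fraction: {committed.mean():.2%}")
```

    Raising the threshold or lowering the noise shrinks the committed fraction, which is the sense in which the noisiness of the system dictates the appropriate ratio-control scheme.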

    Coordinating visualizations of polysemous action: Values added for grounding proportion

    We contribute to research on visualization as an epistemic learning tool by inquiring into the didactical potential of having students visualize one phenomenon in accord with two different partial meanings of the same concept. Twenty-two Grade 4-6 students participated in a design study that investigated the emergence of proportional-equivalence notions from mediated perceptuomotor schemas. Working as individuals or in pairs in tutorial clinical interviews, students solved non-symbolic interaction problems that utilized remote-sensing technology. Next, they used symbolic artifacts interpolated into the problem space as semiotic means to objectify, in a mathematical register, a variety of both additive and multiplicative solution strategies. Finally, they reflected on tensions between these competing visualizations of the space. Micro-ethnographic analyses of episodes from three paradigmatic case studies suggest that students reconciled semiotic conflicts by generating heuristic logico-mathematical inferences that integrated competing meanings into cohesive conceptual networks. These inferences hinged on revisualizing additive elements multiplicatively. Implications are drawn for rethinking didactical design for proportions.
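    To make the two partial meanings concrete, here is a toy contrast of our own devising, not software from the study: an additive reading of two covarying quantities preserves their difference, while a multiplicative reading preserves their ratio, and only the latter yields proportional equivalence.

```python
# Toy contrast (our illustration, not the study's materials) between the
# additive and multiplicative readings of a pair of covarying quantities.

def additive_next(left, right, step=1):
    # Additive reading: keep the difference right - left constant.
    return left + step, right + step

def multiplicative_next(left, right, step=1):
    # Multiplicative reading: keep the ratio right / left constant.
    k = right / left
    return left + step, (left + step) * k

print(additive_next(2, 4))        # (3, 5): difference stays 2, ratio drifts
print(multiplicative_next(2, 4))  # (3, 6.0): ratio stays 2, difference grows
```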

    Using the general link transmission model in a dynamic traffic assignment to simulate congestion on urban networks

    This article presents two new models of Dynamic User Equilibrium that are particularly suited for intelligent transportation system (ITS) applications, where the evolution of vehicle flows and travel times must be simulated on large road networks, possibly in real time. The key feature of the proposed models is the detailed representation of the main congestion phenomena occurring at the nodes of urban networks, such as vehicle queues and their spillback, as well as flow conflicts at merges and diversions. Compared to the simple world of static assignment, where only the congestion along the arc is typically reproduced through a separable relation between vehicle flow and travel time, this type of dynamic traffic assignment (DTA) model is much more complex, as the above relation becomes non-separable, both in time and in space. Traffic simulation is here attained through a macroscopic flow model that extends the theory of kinematic waves to urban networks and non-linear fundamental diagrams: the General Link Transmission Model (GLTM). The sub-models of the GLTM, namely the Node Intersection Model, the Forward Propagation Model of vehicles, and the Backward Propagation Model of spaces, can be combined in two different ways to produce arc travel times starting from turn flows. The first approach is to consider short time intervals of a few seconds and to process all nodes for each temporal layer in chronological order. The second approach allows one to consider long time intervals of a few minutes and requires, for each sub-model, processing the whole temporal profile of the involved variables. The two resulting DTA models are analyzed and compared here with the aim of identifying their possible use cases. A rigorous mathematical formulation is beyond the scope of this paper, as is a detailed explanation of the solution algorithm. The dynamic equilibrium is nonetheless sought through a new method based on Gradient Projection, which is capable of solving both proposed models to any desired precision in a reasonable number of iterations. Its fast convergence is essential to show that the two proposed models of network congestion actually converge at equilibrium to nearly identical solutions in terms of arc flows and travel times, despite their diametrically opposed treatments of the dynamic nature of the problem, as the numerical tests presented here show.
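    As a rough illustration of the flavor of computation involved, here is a minimal single-link Link Transmission Model sketch in the style the abstract describes, not the paper's full GLTM: cumulative vehicle counts at the link's two ends are updated per short time step, with the sending flow delayed by the forward (free-flow) wave and the receiving flow by the backward (congestion) wave, so queues and spillback emerge from the counts. All parameter names and values are illustrative assumptions.

```python
import numpy as np

# Minimal single-link Link Transmission Model sketch with a triangular
# fundamental diagram; not the paper's full GLTM.
dt = 1.0          # time step [s] (the "short intervals" approach)
L = 500.0         # link length [m]
vf = 12.5         # free-flow speed [m/s]
w = 5.0           # backward wave speed [m/s]
kj = 0.15         # jam density [veh/m]
qmax = 0.5        # capacity [veh/s]
T = 600           # horizon [steps]

tau_f = int(round(L / vf / dt))   # forward propagation delay [steps]
tau_b = int(round(L / w / dt))    # backward propagation delay [steps]

N_up = np.zeros(T + 1)            # cumulative vehicles entering the link
N_dn = np.zeros(T + 1)            # cumulative vehicles leaving the link

demand = lambda t: 0.5 if t < 300 else 0.0             # upstream demand [veh/s]
supply_dn = lambda t: 0.2 if 100 <= t < 400 else qmax  # downstream bottleneck [veh/s]

for t in range(T):
    # Sending flow: vehicles that entered at least tau_f steps ago, capped at capacity.
    sending = min(N_up[max(t - tau_f, 0)] - N_dn[t], qmax * dt)
    # Receiving flow: free space signaled back from the downstream end after tau_b.
    receiving = min(N_dn[max(t - tau_b, 0)] + kj * L - N_up[t], qmax * dt)
    inflow = min(demand(t) * dt, receiving)    # node model at the upstream end
    outflow = min(sending, supply_dn(t) * dt)  # node model at the downstream end
    N_up[t + 1] = N_up[t] + inflow
    N_dn[t + 1] = N_dn[t] + outflow

queue = N_up - N_dn   # vehicles on the link; spillback binds when inflow hits `receiving`
print(f"max vehicles on link: {queue.max():.1f} (storage capacity {kj * L:.0f})")
```

    The alternative "long intervals" approach mentioned in the abstract would instead pass each sub-model the whole temporal profile of its input variables at once, rather than advancing all nodes one short step at a time.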

    AutoBayes: A System for Generating Data Analysis Programs from Statistical Models

    Data analysis is an important scientific task that is required whenever information needs to be extracted from raw data. Statistical approaches to data analysis, which use methods from probability theory and numerical analysis, are well-founded but difficult to implement: developing a statistical data analysis program for any given application is time-consuming and requires substantial knowledge and experience in several areas. In this paper, we describe AutoBayes, a program synthesis system for the generation of data analysis programs from statistical models. A statistical model specifies the properties of each problem variable (i.e., observation or parameter) and its dependencies in the form of a probability distribution. It is a fully declarative problem description, similar in spirit to a set of differential equations. From such a model, AutoBayes generates optimized and fully commented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Code is produced by a schema-guided deductive synthesis process. A schema consists of a code template and applicability constraints which are checked against the model during synthesis using theorem-proving technology. AutoBayes augments schema-guided synthesis with symbolic-algebraic computation and can thus derive closed-form solutions for many problems. It is well suited for tasks like estimating best-fitting model parameters for given data. Here, we describe AutoBayes's system architecture, in particular the schema-guided synthesis kernel. Its capabilities are illustrated by a number of advanced textbook examples and benchmarks.
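    To give a feel for schema-guided synthesis with symbolic-algebraic support, here is a minimal sketch under our own assumptions; the Schema container, the quadratic guard, and the example model are ours, not AutoBayes's internals. A schema pairs an applicability constraint, here a decidable symbolic check standing in for theorem proving, with a code template; instantiating it on a Gaussian-mean log-likelihood derives the closed-form sample-mean estimator and emits it as C code.

```python
import sympy as sp
from dataclasses import dataclass
from typing import Callable

# Hypothetical schema container (assumption: AutoBayes's real schemas are richer).
@dataclass
class Schema:
    name: str
    applicable: Callable[[sp.Expr, sp.Symbol], bool]   # applicability constraint
    instantiate: Callable[[sp.Expr, sp.Symbol], str]   # code template

def quadratic_guard(log_lik, theta):
    # Stand-in for theorem proving: a quadratic log-likelihood in the
    # parameter guarantees a unique closed-form stationary point.
    return sp.degree(log_lik, theta) == 2

def closed_form_template(log_lik, theta):
    # Symbolic-algebraic step: solve the score equation in closed form,
    # then emit C code, mirroring AutoBayes's C/C++ output.
    sol = sp.solve(sp.diff(log_lik, theta), theta)[0]
    return f"theta_hat = {sp.ccode(sol)};"

schema = Schema("quadratic-log-likelihood MLE", quadratic_guard, closed_form_template)

# Example model: Gaussian mean, three symbolic observations, up to constants.
mu = sp.Symbol('mu')
x1, x2, x3 = sp.symbols('x1 x2 x3')
log_lik = sum(-(xi - mu)**2 for xi in (x1, x2, x3))
if schema.applicable(log_lik, mu):
    print(schema.instantiate(log_lik, mu))  # the sample mean, e.g. (x1 + x2 + x3)/3
```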