128 research outputs found

    Rule-based Modeling of Transcriptional Attenuation at the Tryptophan Operon

    Transcriptional attenuation at E. coli's tryptophan operon is a prime example of RNA-mediated gene regulation. In this paper, we present a discrete stochastic model for this phenomenon based on chemical reactions. Our model is compact and intelligible, due to n-ary reactions (which preclude object-centric approaches) and to rule schemas that define finite sets of chemical reactions. Stochastic simulations with our model confirm results that were previously obtained by master equations or differential equations. In addition, our approach allows us to reflect mutation experiments by simple model modifications, and to reuse model components for transcriptional attenuation in other genes and organisms.
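    The kind of discrete stochastic simulation such a model relies on can be illustrated with a minimal Gillespie-style sketch. The toy species, reactions, and rate constants below are invented for illustration and are not the authors' rule schemas for the trp operon.

```python
import random

# Toy reaction set (invented): RNAP transcribes DNA into mRNA, and mRNA decays.
# Each reaction has a rate constant, its reactant species, and a state update.
reactions = [
    {"rate": 0.05, "reactants": {"RNAP": 1, "DNA": 1}, "change": {"mRNA": +1}},
    {"rate": 0.10, "reactants": {"mRNA": 1},           "change": {"mRNA": -1}},
]

def propensity(rxn, state):
    # Simplified mass-action propensity: rate constant times reactant counts
    # (combinatorial factors for stoichiometries > 1 are ignored in this sketch).
    a = rxn["rate"]
    for species, n in rxn["reactants"].items():
        a *= state.get(species, 0) ** n
    return a

def gillespie(state, t_end=100.0):
    # Stochastic simulation algorithm: draw the waiting time from an exponential
    # distribution and pick the next reaction proportionally to its propensity.
    t, trajectory = 0.0, [(0.0, dict(state))]
    while t < t_end:
        props = [propensity(r, state) for r in reactions]
        total = sum(props)
        if total == 0:
            break
        t += random.expovariate(total)
        fired = random.choices(reactions, weights=props)[0]
        for species, delta in fired["change"].items():
            state[species] = state.get(species, 0) + delta
        trajectory.append((t, dict(state)))
    return trajectory

print(gillespie({"RNAP": 10, "DNA": 1, "mRNA": 0})[-1])
```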

    A Lightweight Multilevel Markup Language for Connecting Software Requirements and Simulations

    [Context] Simulation is a powerful tool to validate specified requirements, especially for complex systems that constantly monitor and react to characteristics of their environment. The simulators for such systems are complex themselves, as they simulate multiple actors with multiple interacting functions in a number of different scenarios. To validate requirements in such simulations, the requirements must be related to the simulation runs. [Problem] In practice, engineers are reluctant to state their requirements in terms of structured languages or models that would allow for a straightforward relation of requirements to simulation runs. Instead, the requirements are expressed as unstructured natural language text that is hard to assess in a set of complex simulation runs. Therefore, the feedback loop between requirements and simulation is very long or even non-existent. [Principal idea] We aim to close the gap between requirements specifications and simulation by proposing a lightweight markup language for requirements. Our markup language provides a set of annotations on different levels that can be applied to natural language requirements. The annotations are mapped to simulation events. As a result, meaningful information from a set of simulation runs is shown directly in the requirements specification. [Contribution] Instead of forcing the engineer to write requirements in a specific way just for the purpose of relating them to a simulator, the markup language allows annotating the already specified requirements up to the level that is of interest to the engineer. We evaluate our approach by analyzing 8 original requirements of an automotive system in a set of 100 simulation runs.
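    The abstract does not reproduce the markup syntax itself, so the sketch below uses a hypothetical annotation format: a natural-language requirement is tagged with the names of simulation events, and a small checker reports whether each run satisfies the annotated timing constraint. All identifiers and data are invented for this sketch.

```python
# Hypothetical annotation format: the paper's actual markup language is not
# reproduced here.  A natural-language requirement is tagged with simulation
# event names; a checker reports whether each run satisfies the constraint.
requirement = {
    "id": "REQ-42",
    "text": "When an obstacle is detected, the vehicle shall brake within 200 ms.",
    "annotations": {"trigger": "obstacle_detected", "reaction": "brake_applied",
                    "max_delay_ms": 200},
}

# Each simulation run is a list of (timestamp_ms, event_name) pairs (invented data).
runs = [
    [(0, "obstacle_detected"), (150, "brake_applied")],
    [(0, "obstacle_detected"), (320, "brake_applied")],
]

def satisfied(req, run):
    ann = req["annotations"]
    for t, event in run:
        if event == ann["trigger"]:
            # The reaction event must occur within the annotated delay.
            if not any(e == ann["reaction"] and 0 <= t2 - t <= ann["max_delay_ms"]
                       for t2, e in run):
                return False
    return True

for i, run in enumerate(runs):
    print(requirement["id"], "run", i,
          "satisfied" if satisfied(requirement, run) else "violated")
```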

    Efficient Algorithms on the Family Associated to an Implicational System

    An implication system (IS) on a finite set S is a set of rules called Σ-implications of the form A →_Σ B, with A, B ⊆ S. A subset X ⊆ S satisfies A →_Σ B when "A ⊆ X implies B ⊆ X" holds, so ISs can be used to describe constraints on sets of elements, such as dependency or causality. ISs are formally closely linked to the well-known notions of closure operators and Moore families. This paper focuses on their algorithmic aspects. A number of problems arising from an IS Σ (e.g. is it minimal, is a given implication entailed by the system) can be reduced to the computation of closures φ_Σ(X), where φ_Σ is the closure operator associated with Σ. We propose a new approach to compute such closures, based on the characterization of the direct-optimal IS Σ_do, which has the following properties: (1) it is equivalent to Σ; (2) φ_Σ_do(X) (and thus φ_Σ(X)) can be computed by a single scan of the Σ_do-implications; (3) it is of minimal size among the ISs satisfying (1) and (2). We give algorithms that compute Σ_do and, from Σ_do, the closures φ_Σ(X) and the Moore family associated with φ_Σ.
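    For concreteness, here is a minimal sketch of the naive fixpoint computation of a closure φ_Σ(X), the operation that the direct-optimal system Σ_do reduces to a single scan; the implications are toy examples and the code is not the paper's algorithm.

```python
def closure(X, implications):
    # Naive fixpoint computation of phi_Sigma(X): repeatedly apply every rule
    # A -> B whose premise is contained in the current set until nothing changes.
    # (The paper's direct-optimal system allows a single scan instead.)
    closed = set(X)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in implications:
            if premise <= closed and not conclusion <= closed:
                closed |= conclusion
                changed = True
    return closed

# Toy implicational system on S = {a, b, c, d}:  a -> b  and  bc -> d.
sigma = [
    (frozenset("a"), frozenset("b")),
    (frozenset("bc"), frozenset("d")),
]
print(sorted(closure({"a", "c"}, sigma)))  # ['a', 'b', 'c', 'd']
```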

    Learning Model Transformations from Examples using FCA: One for All or All for One?

    In Model-Driven Engineering (MDE), model transformations are basic and essential entities. An efficient way to assist the definition of these transformations consists in completely or partially learning them. MTBE (Model Transformation By-Example) is an approach that aims at learning a model transformation from a set of examples, i.e. pairs of source and target models. To implement this approach, we use Formal Concept Analysis as a learning mechanism in order to extract executable rules. In this paper, we investigate two learning strategies. In the first strategy, transformation rules are learned independently from each example and then gathered into a single set of rules. In the second strategy, we learn the set of rules from all the examples at once. A comparison of the two strategies on the well-known transformation of class diagrams to relational schemas shows that both yield interesting rules: the first produces rules that are more specific to their examples and apply well, whereas the second builds more detailed rules that are larger and harder to analyze and to apply.
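    The FCA machinery underlying such rule extraction can be illustrated by naively enumerating the formal concepts of a tiny binary context; the context below (source-model elements versus observed properties) is invented for the sketch and is not the authors' MTBE implementation.

```python
from itertools import combinations

# Toy formal context: objects are source-model elements, attributes are properties
# observed in the transformation examples (all names invented for this sketch).
context = {
    "ClassA": {"is_class", "is_persistent"},
    "ClassB": {"is_class"},
    "attr_x": {"is_attribute", "is_persistent"},
}
all_attributes = set().union(*context.values())

def common_attributes(objects):
    # Intent of a set of objects: the attributes they all share.
    return set.intersection(*(context[o] for o in objects)) if objects else set(all_attributes)

def objects_having(attributes):
    # Extent of a set of attributes: the objects that carry them all.
    return {o for o, attrs in context.items() if attributes <= attrs}

def formal_concepts():
    # Naive enumeration: close every subset of objects both ways and keep the
    # distinct (extent, intent) pairs.  Exponential, but fine for tiny contexts.
    concepts = set()
    for r in range(len(context) + 1):
        for combo in combinations(context, r):
            intent = common_attributes(combo)
            extent = objects_having(intent)
            concepts.add((frozenset(extent), frozenset(intent)))
    return concepts

for extent, intent in sorted(formal_concepts(), key=lambda c: len(c[0])):
    print(sorted(extent), "->", sorted(intent))
```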

    New methods to analyse fragmentation mechanisms of precipitated silicas

    Precipitated silica is traditionally used as a reinforcing filler in rubber applications. In tyre manufacture, it offers several advantages over carbon black: in tyre treads, precipitated silica can yield a lower rolling resistance and a better wet grip than carbon black at equal wear resistance. We investigate the development of a new method to study the fragmentation mechanisms of precipitated silica. Usually, the dispersion of granules is carried out in highly viscous media (elastomers or oils). In this paper, the study is carried out in a simpler medium (air or water).

    Relational Data Exploration by Relational Concept Analysis

    Relational Concept Analysis [4] is an extension of FCA that considers several contexts with relations between them. Often used to extend the knowledge that can be learned with FCA, RCA also faces the issue of combinatorial explosion. The initial specification of RCA implies a monotonic growth in the number of concepts and an exhaustive enumeration of all the concepts obtainable once a fixed point is reached. In this position paper, we propose a different specification of RCA that permits an interactive exploration of the data by letting the user choose at each step. This change makes it possible to handle richer relational data in a more flexible way by restricting the relations explored at each step, hence reducing the number of created concepts.

    Class Model Normalization Outperforming Formal Concept Analysis approaches with AOC-posets

    Designing or reengineering class models in the domain of programming or modeling involves capturing technical and domain concepts, finding the right abstractions, and avoiding duplications. Carrying out this last task systematically corresponds to a kind of model normalization. Several approaches have been proposed, all of which converge towards the use of Formal Concept Analysis (FCA). Relational Concept Analysis (RCA), an extension of FCA to linked data, helps to mine better reusable abstractions. But RCA relies on iteratively building concept lattices, which may cause a combinatorial explosion in the number of built artifacts. In this paper, we investigate the use of an alternative RCA process, relying on a specific sub-order of the concept lattice (the AOC-poset) that preserves the most relevant part of the normal form. We measure, on case studies of models extracted from Java code and of UML models, the practical reduction that AOC-posets bring to the normal form of the class model.
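    The idea behind the AOC-poset can be sketched as follows: instead of building the whole concept lattice, keep only the concepts that introduce an object or an attribute. The context and names below are invented, and the code is a sketch of the general construction rather than the tooling used in the paper.

```python
# Sketch of the AOC-poset construction: keep only the concepts that introduce an
# object (object-concepts) or an attribute (attribute-concepts), instead of the
# full concept lattice.  The context and names are invented for this sketch.
context = {
    "Square":    {"has_width", "has_height", "is_regular"},
    "Rectangle": {"has_width", "has_height"},
    "Circle":    {"has_radius", "is_regular"},
}
all_attributes = set().union(*context.values())

def extent(attrs):
    # Objects carrying all the given attributes.
    return frozenset(o for o, a in context.items() if attrs <= a)

def intent(objs):
    # Attributes shared by all the given objects.
    return frozenset(set.intersection(*(context[o] for o in objs))) if objs else frozenset(all_attributes)

# Object-concept of o: ({o}'', {o}');  attribute-concept of a: ({a}', {a}'').
object_concepts    = {(extent(intent({o})), intent({o})) for o in context}
attribute_concepts = {(extent({a}), intent(extent({a}))) for a in all_attributes}

aoc_poset = object_concepts | attribute_concepts   # usually far smaller than the full lattice
for ext, itt in sorted(aoc_poset, key=lambda c: (len(c[0]), sorted(c[1]))):
    print(sorted(ext), "->", sorted(itt))
```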

    A Traceability Framework for Imperative Transformations

    This article falls within the scope of model-driven engineering and contributes to the problem of the traceability of modeling artifacts along a chain of transformations written in an imperative language. The approach we propose requires little user intervention. We introduce a generic trace metamodel that, among other things, brings a multi-scale dimension to the traces through the application of the composite design pattern. The principle of our approach is to monitor certain categories of operations that are relevant for generating meaningful traces. These categories are defined using the types of the objects manipulated by the operations. Once the categories are defined, the trace is generated by dedicated code that is automatically injected into the transformation, around the operations characterized by the defined categories. A prototype has been implemented for model transformations written in Java on the EMF framework. The injection of the trace-dedicated code is performed using aspect-oriented programming.
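    The paper injects the trace-generation code with aspect-oriented programming around Java/EMF operations; as a simplified, language-neutral analogy (not the authors' framework), the sketch below wraps operations with a decorator and records a trace entry whenever an argument's type falls into a monitored category. All names are invented.

```python
import functools

# Monitored categories, identified here by the Python types of the objects the
# operations manipulate (stand-ins for the metamodel types used in the paper).
MONITORED_TYPES = (dict, list)

trace = []  # flat list of trace records; the paper uses a richer multi-scale trace metamodel

def traced(func):
    # Wrap an operation and record a trace entry whenever it manipulates a
    # monitored kind of object (a crude stand-in for aspect-oriented injection).
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        watched = [a for a in args if isinstance(a, MONITORED_TYPES)]
        result = func(*args, **kwargs)
        if watched:
            trace.append({"operation": func.__name__,
                          "inputs": [type(a).__name__ for a in watched],
                          "output": type(result).__name__})
        return result
    return wrapper

@traced
def transform_class(source_element):
    # Toy transformation step: turn a "class" element into a "table" element.
    return {"table": source_element.get("name", "unnamed")}

transform_class({"name": "Customer"})
print(trace)
```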

    Models of Tet-On System with Epigenetic Effects

    We present the first results of ongoing work investigating two models of the artificial inducible promoter Tet-On that include epigenetic regulation. We consider chromatin states and 1D diffusion of transcription factors that reveal, respectively, stochastic noise and a memory effect.

    Using Formal Concept Analysis to Extract the Largest Common Model

    The development of information systems follows a long and complex process in which various actors are involved. We report an experiment in which we observe the evolution of the analysis model of an information system through 15 successive versions. We use indicators on the underlying concept lattices, built by applying Relational Concept Analysis (RCA) to each version. RCA is an extension of FCA which groups entities based on characteristics they share, including links to other entities; here it helps in analyzing their evolution. From this experiment, we establish recommendations for monitoring and verifying the proper evolution of the analysis process.