
    A Logic Programming Approach to Reaction Systems

    Reaction systems (RS) are a computational framework inspired by the functioning of living cells, suitable for modelling the main mechanisms of biochemical reactions. RS have also been shown to be useful for computer science applications, e.g. to model circuits or transition systems. Since their introduction about 10 years ago, RS have matured into a fruitful and dynamically evolving research area and have become a popular model of interactive computation. RS can be seen as rewriting systems interacting with an environment represented by the context. RS pose some implementation problems, as they are a relatively recent computation model and several extensions of the basic model have been designed. In this paper we present some preliminary work on how to implement this formalism in a logic programming language (Prolog). To the best of our knowledge this is the first approach to RS in logic programming. Our prototypical implementation does not aim to be highly performant, but it has the advantage of being high level and easily modifiable, so it is suitable as a rapid prototyping tool for implementing the extensions of reaction systems found in the literature as well as new ones. We also provide a preliminary implementation of a memoization mechanism for stopping potentially infinite and repetitive computations. We then show how to implement in our interpreter an extension of RS for modelling a nondeterministic context and the interaction between components of a (biological) system, and we present a further extension of the interpreter implementing the recently introduced networks of RS.
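
    A minimal sketch of the basic RS semantics may help orient the reader: reactions are reactant/inhibitor/product triples, and an interactive process is driven by a sequence of context sets. The sketch below is written in Python with names of my own choosing; it illustrates the plain model only and is not the Prolog interpreter described in the paper.

        # Minimal sketch of the basic reaction-systems model (assumed reading,
        # not the paper's Prolog interpreter; all names are my own).

        def enabled(reaction, state):
            """A reaction (reactants, inhibitors, products) is enabled on a state
            when all reactants are present and no inhibitor is."""
            reactants, inhibitors, _ = reaction
            return reactants <= state and not (inhibitors & state)

        def result(reactions, state):
            """The result is the union of the products of the enabled reactions
            (RS have no persistence: anything not produced disappears)."""
            return set().union(*(p for r, i, p in reactions
                                 if enabled((r, i, p), state)))

        def interactive_process(reactions, contexts):
            """State sequence of an interactive process driven by a context sequence."""
            produced, states = set(), []
            for context in contexts:
                current = context | produced
                states.append(current)
                produced = result(reactions, current)
            return states

        # toy example: x sustains itself unless the context supplies the inhibitor k
        rs = [({"x"}, {"k"}, {"x"})]
        print(interactive_process(rs, [{"x"}, set(), set(), {"k"}, set()]))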

    Preface


    A Coverage Criterion for Spaced Seeds and its Applications to Support Vector Machine String Kernels and k-Mer Distances

    Spaced seeds have recently been shown not only to detect more alignments, but also to give a more accurate measure of phylogenetic distances (Boden et al., 2013; Horwege et al., 2014; Leimeister et al., 2014) and to provide a lower misclassification rate when used with Support Vector Machines (SVMs) (Onodera and Shibuya, 2013). We confirm these two results by independent experiments, and propose in this article to use a coverage criterion (Benson and Mak, 2008; Martin, 2013; Martin and Noé, 2014) to measure seed efficiency in both cases, in order to design better seed patterns. We first show how this coverage criterion can be directly measured by a full automaton-based approach. We then illustrate how this criterion performs when compared with two other frequently used criteria, namely the single-hit and multiple-hit criteria, through correlation coefficients with the correct classification/the true distance. Finally, for alignment-free distances, we propose an extension adopting the coverage criterion, show how it performs, and indicate how it can be efficiently computed. Comment: http://online.liebertpub.com/doi/abs/10.1089/cmb.2014.017
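
    To give an intuition for the coverage criterion, the following simplified Python sketch counts, for a 0/1 alignment string and a '#'/'-' seed pattern, both the number of hits and the number of alignment positions covered by at least one hit. This is an assumed, illustrative reading of the criterion, not the automaton-based computation of the paper.

        # Illustrative sketch only: hits and coverage of a spaced seed over a 0/1
        # alignment string (1 = match); not the paper's automaton construction.

        def hits(seed, alignment):
            """Positions where every '#' of the seed falls on a match."""
            k = len(seed)
            return [i for i in range(len(alignment) - k + 1)
                    if all(alignment[i + j] == "1"
                           for j, c in enumerate(seed) if c == "#")]

        def coverage(seed, alignment):
            """Number of alignment positions covered by a '#' of at least one hit."""
            covered = set()
            for i in hits(seed, alignment):
                covered.update(i + j for j, c in enumerate(seed) if c == "#")
            return len(covered)

        alignment = "1110111011"
        for seed in ("####", "##-#"):   # contiguous seed vs. spaced seed
            print(seed, "hits:", len(hits(seed, alignment)),
                  "coverage:", coverage(seed, alignment))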

    A map of dependencies among three-valued logics

    Three-valued logics arise in several fields of computer science, inspired both by concrete problems (such as the management of the null value in databases) and by theoretical considerations. Several three-valued logics have been defined; they differ in their choice of basic connectives, hence also from a syntactic and proof-theoretic point of view. Different interpretations of the third truth value have also been suggested, often carrying an epistemic flavor. In this work, relationships between logical connectives on three-valued functions are explored. Existing functional completeness theorems have laid bare some of these links, based on specific connectives; here we try to draw a map of such relationships between the conjunctions, negations and implications that extend the Boolean ones. It turns out that all reasonable connectives can be defined from a few of them, so that all known three-valued logics appear as fragments of a single logic. These results can be instrumental when choosing, for each application context, the appropriate fragment in which the basic connectives make full sense, based on the intended meaning of the third truth value.
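
    As a concrete illustration of how three-valued connectives can be derived from a small set of primitives, the sketch below encodes Kleene's strong connectives and the Lukasiewicz implication over the ordered values F < U < T. It is a generic textbook example, not the particular map of fragments established in the paper.

        # Generic example only (Kleene's strong connectives, Lukasiewicz implication);
        # not the specific dependency map of the paper.

        F, U, T = 0, 1, 2                        # false, unknown (third value), true

        def neg(a):            return 2 - a                      # involutive negation
        def conj(a, b):        return min(a, b)                  # strong conjunction
        def disj(a, b):        return neg(conj(neg(a), neg(b)))  # derived by De Morgan
        def impl_kleene(a, b): return disj(neg(a), b)            # material implication
        def impl_luk(a, b):    return min(T, T - a + b)          # Lukasiewicz implication

        # the two implications differ only on (U, U), which is where the choice matters
        print(impl_kleene(U, U), impl_luk(U, U))                 # 1 (unknown) vs 2 (true)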

    Analysis of Petri Nets and Transition Systems

    This paper describes a stand-alone, no-frills tool supporting the analysis of (labelled) place/transition Petri nets and the synthesis of labelled transition systems into Petri nets. It is implemented as a collection of independent, dedicated algorithms which have been designed to operate modularly, portably, extensibly, and efficiently. Comment: In Proceedings ICE 2015, arXiv:1508.0459
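
    For readers unfamiliar with the model such a tool analyses, the following is a generic sketch of the place/transition firing rule (markings as token counts, transitions as pre/post multisets). It is assumed background only, not the tool's actual code or input format.

        # Generic place/transition firing rule (assumed background; not the tool's code).

        def enabled(marking, pre):
            """A transition is enabled if every input place holds enough tokens."""
            return all(marking.get(p, 0) >= w for p, w in pre.items())

        def fire(marking, pre, post):
            """Firing consumes the pre-set tokens and produces the post-set tokens."""
            m = dict(marking)
            for p, w in pre.items():
                m[p] = m[p] - w
            for p, w in post.items():
                m[p] = m.get(p, 0) + w
            return m

        m0 = {"p1": 1, "p2": 0}
        t = ({"p1": 1}, {"p2": 1})      # pre- and post-multiset of one transition
        if enabled(m0, t[0]):
            print(fire(m0, *t))         # {'p1': 0, 'p2': 1}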

    Automaták, fixpontok, és logika = Automata, fixed points, and logic

    We have proved that the behavior of finite automata (tree automata, weighted automata, etc.) has a finite description with respect to the general properties of fixed point operations. We have obtained complete axiomatizations of rational power series and tree series, and of the bisimulation-based behavior of finite automata. As an intermediate step of the completeness proofs, we have shown that Kleene's fundamental theorem and its generalizations follow from the equational properties of fixed point operations. We have developed an algebraic framework for describing the expressive power of branching-time temporal logics and of fragments of monadic second-order logic on trees; our main results establish a bijective correspondence between these logics and certain pseudo-varieties of finite algebras and finitary preclones. We have characterized the lexicographic orderings of the regular and context-free languages, generalized the notion of context-free languages to infinite words, and established several of their algorithmic properties.
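
    To make the kind of equational fixed-point property referred to above concrete, a standard illustration (not taken from the report itself) is the characterization of the Kleene star as a least fixed point, together with the induction rule it satisfies:

        % Standard background, not the report's own formulas: star as a least fixed
        % point, the resulting fixed point identity, and the induction rule.
        \[
          a^{*} = \mu x.\,(1 + a\cdot x), \qquad\text{hence}\qquad a^{*} = 1 + a\cdot a^{*},
        \]
        \[
          a\cdot x + b \le x \;\Longrightarrow\; a^{*}\cdot b \le x .
        \]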

    Nondeterministic State Complexity for Suffix-Free Regular Languages

    We investigate the nondeterministic state complexity of basic operations for suffix-free regular languages. The nondeterministic state complexity of an operation is the number of states that are necessary and sufficient in the worst case for a minimal nondeterministic finite-state automaton that accepts the language obtained from the operation. We consider basic operations (catenation, union, intersection, Kleene star, reversal and complementation) and establish matching upper and lower bounds for each operation. In the case of complementation the upper and lower bounds differ by an additive constant of two. Comment: In Proceedings DCFS 2010, arXiv:1008.127
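
    The flavour of such upper bounds can be seen in the generic construction for union, sketched below in Python: a fresh initial state branching into two NFAs with disjoint state sets gives the familiar m + n + 1 bound for arbitrary regular languages. This is assumed background only; the paper's contribution is the matching bounds that hold specifically in the suffix-free case.

        # Generic union construction (assumed background, not the paper's constructions).
        # An NFA is (states, initial, finals, delta), delta: (state, symbol) -> set of
        # states, and the two argument NFAs are assumed to have disjoint state sets.

        def nfa_union(a, b):
            sa, ia, fa, da = a
            sb, ib, fb, db = b
            init = object()                      # fresh initial state, m + n + 1 in total
            delta = {**da, **db}
            for x in {x for (_, x) in delta}:    # first step: behave like either initial state
                delta[(init, x)] = da.get((ia, x), set()) | db.get((ib, x), set())
            finals = fa | fb
            if ia in fa or ib in fb:             # accept the empty word iff one of them does
                finals = finals | {init}
            return sa | sb | {init}, init, finals, delta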

    Metric Semantics and Full Abstractness for Action Refinement and Probabilistic Choice

    This paper provides a case study in the field of metric semantics for probabilistic programming. Both an operational and a denotational semantics are presented for an abstract process language L_pr, which features action refinement and probabilistic choice. The two models are constructed in the setting of complete ultrametric spaces, here based on probability measures of compact support over sequences of actions. It is shown that the standard toolkit for metric semantics works well in the probabilistic context of L_pr, e.g. in establishing the correctness of the denotational semantics with respect to the operational one. In addition, it is shown how the method of proving full abstraction, as recently proposed by the authors for a nondeterministic language with action refinement, can be adapted to deal with the probabilistic language L_pr as well.
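
    As assumed background (not specific to L_pr), the metric approach rests on the strong triangle inequality of an ultrametric and on Banach's theorem for contractive maps, which yields the unique fixed points used to define such semantics:

        % Standard definitions assumed by this style of metric semantics, not formulas
        % taken from the paper itself.
        \[
          d(x,z) \;\le\; \max\{\, d(x,y),\; d(y,z) \,\}
        \]
        \[
          d(F(x),F(y)) \;\le\; \tfrac{1}{2}\, d(x,y)
          \quad\Longrightarrow\quad
          F \text{ has a unique fixed point in a non-empty complete metric space.}
        \]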

    Corecursive featherweight Java revisited

    We describe a Java-like calculus which supports cyclic data structures and offers a mechanism of flexible regular corecursion for their manipulation. The calculus enhances an earlier proposal with a more sophisticated reduction semantics, which filters out, by an additional check, some spurious results that were obtained in the previous model.
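
    A rough impression of regular corecursion on cyclic data can be given in a few lines of Python: a recursive predicate over a cyclic list terminates by detecting that a call has looped back to a pending node, and the cycle is closed with a default answer. This is an informal analogy of my own, not the Java-like calculus of the paper.

        # Informal analogy only (not the paper's calculus): membership in a cyclic list,
        # where a detected cycle closes the pending call with False.

        class Node:
            def __init__(self, head, tail=None):
                self.head, self.tail = head, tail

        def contains(node, x, pending=None):
            pending = pending if pending is not None else set()
            if node is None or id(node) in pending:   # end of list, or the call loops back
                return False
            return node.head == x or contains(node.tail, x, pending | {id(node)})

        # build the cyclic list 1 :: 2 :: 1 :: 2 :: ...
        one = Node(1); two = Node(2, one); one.tail = two
        print(contains(one, 2), contains(one, 3))     # True False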