
    Needed Computations Shortcutting Needed Steps

    We define a compilation scheme for a constructor-based, strongly-sequential, graph rewriting system which shortcuts some needed steps. The object code is another constructor-based graph rewriting system. This system is normalizing for the original system when using an innermost strategy. Consequently, the object code can be easily implemented by eager functions in a variety of programming languages. We modify this object code in a way that avoids total or partial construction of the contracta of some needed steps of a computation. When computing normal forms in this way, both memory consumption and execution time are reduced compared to ordinary rewriting computations in the original system.

    Comment: In Proceedings TERMGRAPH 2014, arXiv:1505.0681
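    As a rough illustration only (this is not the paper's compilation scheme, and all names below are hypothetical), the following Haskell sketch contrasts a rule-by-rule evaluation of a toy constructor-based system, Peano addition, with an eager accumulator-based variant that reaches the same normal form without materialising the intermediate contracta.

        -- Toy constructor-based system: Peano addition.
        data Nat = Z | S Nat deriving Show

        -- Read as the rewrite rules  add(Z, y) -> y  and  add(S(x), y) -> S(add(x, y)).
        -- Each needed step builds one S-node around a fresh redex.
        addBySteps :: Nat -> Nat -> Nat
        addBySteps Z     y = y
        addBySteps (S x) y = S (addBySteps x y)

        -- Eager "object code" variant: an accumulator shortcuts the chain of
        -- needed steps, so the intermediate contracta S(add(..)), S(S(add(..))), ...
        -- are never constructed as separate terms.
        addShortcut :: Nat -> Nat -> Nat
        addShortcut Z     acc = acc
        addShortcut (S x) acc = addShortcut x (S acc)

        main :: IO ()
        main = do
          let two   = S (S Z)
              three = S (S (S Z))
          print (addBySteps two three)   -- S (S (S (S (S Z))))
          print (addShortcut two three)  -- same normal form, fewer intermediate terms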

    Exploring Conditional Rewriting Logic Computations

    Trace exploration is concerned with techniques that allow computation traces to be dynamically searched for specific contents. Depending on whether the exploration is carried backward or forward, trace exploration techniques support provenance tracking or impact tracking. The aim of provenance tracking is to show how (parts of) a program output depends on (parts of) its input and to help estimate which input data need to be modified to accomplish a change in the outcome. The aim of impact tracking is to identify the scope and potential consequences of changing the program input. Rewriting Logic (RWL) is a logic of change that supplements (an extension of) equational logic with rewrite rules that describe (nondeterministic) transitions between states. In this paper, we present a rich and highly dynamic, parameterized technique for the forward inspection of RWL computations that allows the nondeterministic execution of a given conditional rewrite theory to be followed up in different ways. With this technique, an analyst can browse, slice, filter, or search the traces as they come to life during the program execution. The navigation of the trace is driven by a user-defined inspection criterion that specifies the required exploration mode. By selecting different inspection criteria, one can automatically derive a family of practical algorithms such as program steppers and more sophisticated dynamic trace slicers.

    This work has been partially supported by the EU (FEDER) and the Spanish MEC project Ref. TIN2010-21062-C02-02, the Spanish MICINN complementary action Ref. TIN2009-07495-E, and by Generalitat Valenciana Ref. PROMETEO2011/052. This work was carried out during the tenure of D. Ballis' ERCIM "Alain Bensoussan" Postdoctoral Fellowship. The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement N. 246016. F. Frechina was supported by FPU-ME grant AP2010-5681, and J. Sapiña was supported by FPI-UPV grant SP2013-0083.

    Alpuente Frasnedo, M.; Ballis, D.; Frechina Navarro, F.; Sapiña Sanchis, J. (2015). Exploring Conditional Rewriting Logic Computations. Journal of Symbolic Computation. 69:3-39. https://doi.org/10.1016/j.jsc.2014.09.028
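    The following Haskell fragment is a minimal sketch of what such criterion-driven forward exploration can look like in general; it is not the technique of the paper, and the step relation, criterion type, and toy system are all invented for illustration.

        -- A one-step rewrite relation: each state may have several successors.
        type Step s = s -> [s]

        -- An inspection criterion decides which successors are worth following;
        -- filtering, searching, and simple slicing are all instances of this shape.
        type Criterion s = s -> Bool

        -- Unfold the computation tree up to a depth bound, pruning with the criterion.
        explore :: Step s -> Criterion s -> Int -> s -> [s]
        explore _    _    0 s = [s]
        explore step keep d s =
          s : concatMap (explore step keep (d - 1)) (filter keep (step s))

        -- Toy nondeterministic system: from n one can move to n + 1 or to 2 * n.
        toyStep :: Step Int
        toyStep n = [n + 1, 2 * n]

        main :: IO ()
        main = do
          print (explore toyStep (const True) 2 1)  -- plain stepper over the whole tree
          print (explore toyStep even 2 1)          -- only even successors are followed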

    Term graph rewriting and garbage collection using opfibrations

    The categorical semantics of (an abstract version of) the general term graph rewriting language DACTL is investigated. The operational semantics is reformulated in order to reveal its universal properties. The technical dissonance between the matchings of left-hand sides of rules to redexes, and the properties of rewrite rules themselves, is taken as the impetus for expressing the core of the model as a Grothendieck opfibration of a category of general rewrites over a base of general rewrite rules. Garbage collection is examined in this framework in order to reconcile the treatment with earlier approaches. It is shown that term rewriting has particularly good garbage-theoretic properties that do not generalise to all cases of graph rewriting, and that this has been a stumbling block for aspects of some earlier models for graph rewriting.
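    For concrete intuition only (the paper works at the categorical level of opfibrations, not at the level of data structures), the following Haskell sketch shows a term graph with sharing together with a reachability-based garbage collector; the representation is an assumption made for this example.

        import qualified Data.Map as M
        import qualified Data.Set as S

        type NodeId    = Int
        data Node      = Node { label :: String, args :: [NodeId] } deriving Show
        type TermGraph = M.Map NodeId Node

        -- Nodes reachable from the given roots.
        reachable :: TermGraph -> [NodeId] -> S.Set NodeId
        reachable g = go S.empty
          where
            go seen []       = seen
            go seen (n : ns)
              | n `S.member` seen = go seen ns
              | otherwise         = go (S.insert n seen) (maybe [] args (M.lookup n g) ++ ns)

        -- Garbage collection: keep exactly the nodes reachable from the roots.
        collect :: [NodeId] -> TermGraph -> TermGraph
        collect roots g = M.filterWithKey (\k _ -> k `S.member` live) g
          where live = reachable g roots

        main :: IO ()
        main = do
          -- f(x, x) with a shared argument x = a, plus an unreachable node g(a).
          let g = M.fromList [ (0, Node "f" [1, 1])
                             , (1, Node "a" [])
                             , (2, Node "g" [1]) ]
          print (collect [0] g)   -- node 2 is garbage and disappears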

    Inspecting rewriting logic computations (in a parametric and stepwise way)

    The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-642-54624-2_12

    Trace inspection is concerned with techniques that allow the trace content to be searched for specific components. This paper presents a rich and highly dynamic, parameterized technique for the trace inspection of Rewriting Logic theories that allows the non-deterministic execution of a given unconditional rewrite theory to be followed up in different ways. Using this technique, an analyst can browse, slice, filter, or search the traces as they come to life during the program execution. Starting from a selected state in the computation tree, the navigation of the trace is driven by a user-defined inspection criterion that specifies the required exploration mode. By selecting different inspection criteria, one can automatically derive a family of practical algorithms such as program steppers and more sophisticated dynamic trace slicers that facilitate the dynamic detection of control and data dependencies across the computation tree. Our methodology, which is implemented in the Anima graphical tool, allows users to capture the impact of a given criterion, thereby facilitating the detection of improper program behaviors.

    This work has been partially supported by the EU (FEDER), the Spanish MEC project ref. TIN2010-21062-C02-02, the Spanish MICINN complementary action ref. TIN2009-07495-E, and by Generalitat Valenciana ref. PROMETEO2011/052. This work was carried out during the tenure of D. Ballis' ERCIM "Alain Bensoussan" Postdoctoral Fellowship. The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement n. 246016. F. Frechina was supported by FPU-ME grant AP2010-5681.

    Alpuente Frasnedo, M.; Ballis, D.; Frechina, F.; Sapiña Sanchis, J. (2014). Inspecting rewriting logic computations (in a parametric and stepwise way). In: Specification, Algebra, and Software: Essays Dedicated to Kokichi Futatsugi. Springer Verlag (Germany). 229-255. https://doi.org/10.1007/978-3-642-54624-2_12

    Verificación de aplicaciones web dinámicas con Web-TLR (Verification of Dynamic Web Applications with Web-TLR)

    Web-TLR is a software tool based on rewriting logic that is designed for model-checking Web applications. Web applications are expressed as rewrite theories that can be formally verified by using the Maude built-in LTLR model-checker. Whenever a property is refuted, a counterexample trace is produced that underlies the failing model-checking computation. However, the analysis (or even the simple inspection) of large counterexamples may prove unfeasible due to the size and complexity of the traces under examination. This work aims to improve the understandability of the counterexamples generated by Web-TLR by developing an integrated framework for debugging Web applications that incorporates a trace-slicing technique for rewriting logic theories that is particularly tailored to Web-TLR. The verification environment is also provided with a user-friendly, graphical Web interface that shields the user from unnecessary information. Trace slicing is a widely used technique for execution trace analysis that is effectively used in program debugging, analysis, and comprehension. Our trace-slicing technique allows us to systematically trace back rewrite sequences modulo equational axioms (such as associativity and commutativity) by means of an algorithm that dynamically simplifies the traces by detecting control and data dependencies and dropping useless data that do not influence the final result. Our methodology is particularly suitable for analyzing complex, textually large system computations such as those delivered as counterexample traces by Maude model-checkers. The slicing facility implemented in Web-TLR allows the user to select the pieces of information that she is interested in by means of a suitable pattern-matching language that supports wildcards. The selected information is then traced back through inverse rewrite sequences. The slicing process drastically simplifies the computation trace by dropping useless data that do not influence the final result. By using this facility, the Web engineer can focus on the relevant fragments of the failing application, which greatly reduces the manual debugging effort and also decreases the number of iterative verifications.

    Espert Real, J. (2011). Verificación de aplicaciones web dinámicas con Web-TLR. http://hdl.handle.net/10251/11219
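    To give a feel for how a wildcard pattern can carve the relevant fragment out of a state, here is a small Haskell sketch; it is not the Web-TLR pattern-matching language (which is considerably richer), and the state and constructor names are invented.

        data Term = Con String [Term] deriving Show

        -- Patterns: a constructor pattern must match the shape of the term,
        -- while Wild ("_") marks a subterm the analyst is not interested in.
        data Pat = PCon String [Pat] | Wild

        -- Slice a term against a pattern: matched-but-irrelevant subterms are
        -- replaced by the hole "*", so only the selected information remains.
        slice :: Pat -> Term -> Maybe Term
        slice Wild _ = Just (Con "*" [])
        slice (PCon f ps) (Con g ts)
          | f == g && length ps == length ts =
              Con g <$> traverse (uncurry slice) (zip ps ts)
          | otherwise = Nothing

        main :: IO ()
        main = do
          let state = Con "session" [Con "user" [Con "alice" []], Con "page" [Con "cart" []]]
              pat   = PCon "session" [Wild, PCon "page" [PCon "cart" []]]
          print (slice pat state)
          -- Just (Con "session" [Con "*" [], Con "page" [Con "cart" []]])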

    Backward Trace Slicing for Rewriting Logic Theories (Technical Report)

    Trace slicing is a widely used technique for execution trace analysis that is effectively used in program debugging, analysis, and comprehension. In this paper, we present a backward trace slicing technique that can be used for the analysis of Rewriting Logic theories. Our trace slicing technique allows us to systematically trace back rewrite sequences modulo equational axioms (such as associativity and commutativity) by means of an algorithm that dynamically simplifies the traces by detecting control and data dependencies and dropping useless data that do not influence the final result. Our methodology is particularly suitable for analyzing complex, textually large system computations such as those delivered as counterexample traces by Maude model-checkers.

    Alpuente Frasnedo, M.; Ballis, D.; Espert, J.; Romero, D. (2011). Backward Trace Slicing for Rewriting Logic Theories (Technical Report). http://hdl.handle.net/10251/1077
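    As a toy illustration of the data-dependency idea behind backward slicing (deliberately simplified: a single root-level step, a linear left-hand side, and no equational axioms), the following Haskell sketch traces the relevant part of a rewrite result back to the redex; it is not the algorithm of this report, and the rule and term names are invented.

        import qualified Data.Map as M

        data Term = Var String | Con String [Term] deriving Show

        -- Syntactic matching of a linear left-hand side against a ground redex.
        match :: Term -> Term -> Maybe (M.Map String Term)
        match (Var x) t = Just (M.singleton x t)
        match (Con f ps) (Con g ts)
          | f == g && length ps == length ts =
              M.unions <$> sequence (zipWith match ps ts)
        match _ _ = Nothing

        -- Variables occurring in the relevant (non-hole) part of the sliced result,
        -- where the slice is the instantiated right-hand side with holes "*".
        relevantVars :: Term -> Term -> [String]
        relevantVars = go
          where
            go _ (Con "*" [])        = []
            go (Var x) _             = [x]
            go (Con _ rs) (Con _ ss) = concat (zipWith go rs ss)
            go _ _                   = []

        -- Backward slice of the redex: keep only the bindings of relevant variables.
        backSlice :: Term -> M.Map String Term -> [String] -> Term
        backSlice (Var x) sub keep
          | x `elem` keep = M.findWithDefault (Con "*" []) x sub
          | otherwise     = Con "*" []
        backSlice (Con f ps) sub keep = Con f (map (\p -> backSlice p sub keep) ps)

        main :: IO ()
        main = do
          let lhs   = Con "swap" [Con "pair" [Var "x", Var "y"]]       -- swap(pair(x, y))
              rhs   = Con "pair" [Var "y", Var "x"]                    -- -> pair(y, x)
              redex = Con "swap" [Con "pair" [Con "a" [], Con "b" []]]
              -- The analyst only cares about the first argument of the result:
              resultSlice = Con "pair" [Con "b" [], Con "*" []]
          case match lhs redex of
            Nothing  -> putStrLn "no match"
            Just sub -> print (backSlice lhs sub (relevantVars rhs resultSlice))
            -- Con "swap" [Con "pair" [Con "*" [], Con "b" []]]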

    On Computational Small Steps and Big Steps: Refocusing for Outermost Reduction

    We study the relationship between small-step semantics, big-step semantics, and abstract machines for programming languages that employ an outermost reduction strategy, i.e., languages where reductions near the root of the abstract syntax tree are performed before reductions near the leaves. In particular, we investigate how Biernacka and Danvy's syntactic correspondence and Reynolds's functional correspondence can be applied to inter-derive semantic specifications for such languages. The main contribution of this dissertation is threefold. First, we identify that backward-overlapping reduction rules in the small-step semantics cause the refocusing step of the syntactic correspondence to be inapplicable. Second, we propose two solutions to overcome this inapplicability: backtracking and rule generalization. Third, we show how these solutions affect the other transformations of the two correspondences. Other contributions include the application of the syntactic and functional correspondences to Boolean normalization. In particular, we show how to systematically derive a spectrum of normalization functions for negational and conjunctive normalization.
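    A minimal, self-contained sketch of the small-step/big-step contrast that refocusing mediates is given below in Haskell; it is not the dissertation's construction and does not exhibit the backward-overlap problem it studies, but it shows the two semantic styles for a toy Boolean language whose root rules are tried before the subterms.

        data B = T | F | Not B | And B B deriving Show

        -- Small-step, outermost-flavoured: rules at the root are tried first,
        -- and the surrounding context is rebuilt after every contraction.
        step :: B -> Maybe B
        step (Not T)   = Just F
        step (Not F)   = Just T
        step (And F _) = Just F          -- fires without looking at the right conjunct
        step (And T b) = Just b
        step (Not b)   = Not <$> step b
        step (And a b) = case step a of
                           Just a' -> Just (And a' b)
                           Nothing -> And a <$> step b
        step _         = Nothing

        -- Driver: iterate the small-step function until no rule applies.
        normalize :: B -> B
        normalize e = maybe e normalize (step e)

        -- Big-step: evaluate in one pass, continuing directly at each contractum
        -- instead of re-decomposing the whole term (the effect refocusing achieves).
        eval :: B -> B
        eval T         = T
        eval F         = F
        eval (Not b)   = case eval b of F -> T; _ -> F
        eval (And a b) = case eval a of F -> F; _ -> eval b

        main :: IO ()
        main = do
          let e = And (Not T) (Not (Not F))
          print (normalize e)  -- F, without ever normalising the right conjunct
          print (eval e)       -- F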