Parsing Expression Grammars Made Practical
Parsing Expression Grammars (PEGs) define languages by specifying a
recursive-descent parser that recognises them. The PEG formalism exhibits
desirable properties, such as closure under composition, built-in
disambiguation, unification of syntactic and lexical concerns, and closely
matching programmer intuition. Unfortunately, state-of-the-art PEG parsers
struggle with left-recursive grammar rules, which are not supported by the
original definition of the formalism and can lead to infinite recursion under
naive implementations. Likewise, support for associativity and explicit
precedence is spotty. To remedy these issues, we introduce Autumn, a
general-purpose PEG library that supports left recursion, left and right associativity
and precedence rules, and does so efficiently. Furthermore, we identify infix
and postfix operators as a major source of inefficiency in left-recursive PEG
parsers and show how to tackle this problem. We also explore the extensibility
of the PEG paradigm by showing how one can easily introduce new parsing
operators and how our parser accommodates custom memoization and error handling
strategies. We compare our parser to both state-of-the-art and battle-tested
PEG and CFG parsers, such as Rats!, Parboiled and ANTLR.
Comment: "Proceedings of the International Conference on Software Language Engineering (SLE 2015)", pp. 167-172 (ISBN: 978-1-4503-3686-4)
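To make the left-recursion issue concrete: a naive recursive-descent reading of a rule such as expr <- expr '-' num / num calls itself at the same input position and never terminates. The Python sketch below illustrates the well-known seed-growing workaround (memoize a failing "seed" for the rule, then grow it iteratively), in the style of Warth et al.; it is not Autumn's actual implementation, and the grammar, names and encoding are illustrative assumptions only.

```python
# Sketch: handling the left-recursive PEG rule  expr <- expr '-' num / num
# by seeding the memo table with failure and growing the result.

def parse_num(text, pos):
    """num: a single decimal digit; returns (value, next_pos) or None."""
    if pos < len(text) and text[pos].isdigit():
        return int(text[pos]), pos + 1
    return None

def parse_expr(text, pos, memo):
    """Left-recursive rule, evaluated by seed growing."""
    key = ('expr', pos)
    if key in memo:
        return memo[key]
    memo[key] = None                      # seed: the recursive call "fails" at first
    while True:
        res = expr_body(text, pos, memo)
        if res is None or (memo[key] is not None and res[1] <= memo[key][1]):
            break                         # stop once the parse no longer grows
        memo[key] = res
    return memo[key]

def expr_body(text, pos, memo):
    """One evaluation of the rule body; the recursive reference goes
    through parse_expr and therefore hits the memoized seed."""
    left = parse_expr(text, pos, memo)    # alternative 1: expr '-' num
    if left is not None:
        value, p = left
        if p < len(text) and text[p] == '-':
            rhs = parse_num(text, p + 1)
            if rhs is not None:
                return value - rhs[0], rhs[1]
    return parse_num(text, pos)           # alternative 2: num

print(parse_expr("7-2-1", 0, {}))         # (4, 5): parsed left-associatively
```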
CoInDiVinE: Parallel Distributed Model Checker for Component-Based Systems
CoInDiVinE is a tool for parallel distributed model checking of interactions
among components in hierarchical component-based systems. The tool extends the
DiVinE framework with a new input language (component-interaction automata) and
a property specification logic (CI-LTL). As the language differs from the input
language of DiVinE, our tool employs a new state space generation algorithm
that also supports partial order reduction. Experiments indicate that the tool
has good scaling properties when run in a parallel setting.
Comment: In Proceedings PDMC 2011, arXiv:1111.006
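For intuition about the kind of state space such a tool generates, here is a generic explicit-state sketch in Python: component automata synchronize on shared interactions, and the reachable product states are explored by BFS. This is not CoInDiVinE's algorithm (in particular, no partial order reduction is shown), and the data encoding is an assumption.

```python
# Generic explicit-state exploration of a composition of component automata.
from collections import deque

def explore(initial, components, interactions):
    """BFS over the product state space.

    components:   list of dicts {local_state: {action: next_local_state}}
    interactions: list of (action, [indices of components that synchronize])
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for action, parts in interactions:
            # the interaction is enabled only if every participant can move
            if all(action in components[i][state[i]] for i in parts):
                succ = list(state)
                for i in parts:
                    succ[i] = components[i][state[i]][action]
                succ = tuple(succ)
                if succ not in seen:
                    seen.add(succ)
                    queue.append(succ)
    return seen

# Two components handshaking on 'sync', then resetting independently.
c0 = {'a0': {'sync': 'a1'}, 'a1': {'reset': 'a0'}}
c1 = {'b0': {'sync': 'b1'}, 'b1': {'reset': 'b0'}}
inter = [('sync', [0, 1]), ('reset', [0]), ('reset', [1])]
print(sorted(explore(('a0', 'b0'), [c0, c1], inter)))   # 4 reachable states
```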
Monitoring-Oriented Programming: A Tool-Supported Methodology for Higher Quality Object-Oriented Software
This paper presents a tool-supported methodological paradigm for object-oriented software development, called monitoring-oriented programming and abbreviated MOP, in which runtime monitoring is a basic software design principle. The general idea underlying MOP is that software developers insert specifications into their code via annotations. Actual monitoring code is automatically synthesized from these annotations before compilation and integrated at appropriate places in the program, according to user-defined configuration attributes. This way, the specification is checked at runtime against the implementation. Moreover, violations and/or validations of specifications can trigger user-defined code at any point in the program, in particular code that performs recovery, outputs or sends messages, or raises exceptions.
The MOP paradigm does not promote or enforce any specific formalism to specify requirements: it allows users to plug in their favorite or domain-specific specification formalisms via logic plug-in modules. There are two major technical challenges that MOP-supporting tools unavoidably face: monitor synthesis and monitor integration. The former is heavily dependent on the specification formalism and comes as part of the corresponding logic plug-in, while the latter is uniform for all specification formalisms and depends only on the target programming language. An experimental prototype tool, called Java-MOP, is also discussed, which currently supports most but not all of the desired MOP features. MOP aims at reducing the gap between formal specification and implementation by integrating the two and allowing them together to form a system.
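As an analogy for the MOP workflow (a specification attached to code, a monitor woven in around it, and a handler run on violation), here is a minimal Python sketch that uses a decorator in place of Java-MOP's annotations and logic plug-ins; the names and the property format are illustrative assumptions, not part of the tool.

```python
# Sketch: a "specification" attached to a function becomes a runtime monitor,
# and a violation triggers user-defined handler code.
import functools

def monitor(spec, on_violation):
    """Wrap a function so that spec(result) is checked after every call."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            if not spec(result):
                on_violation(fn.__name__, args, result)   # recovery/handler code
            return result
        return wrapper
    return decorate

def warn(name, args, result):
    print(f"violation: {name}{args} returned {result}")

@monitor(spec=lambda r: r >= 0, on_violation=warn)
def withdraw(balance, amount):
    return balance - amount          # property: the balance must never go negative

withdraw(100, 30)    # fine
withdraw(100, 150)   # prints a violation message
```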
Weighted Modal Transition Systems
Specification theories as a tool in model-driven development processes of
component-based software systems have recently attracted considerable
attention. Current specification theories are, however, qualitative in nature,
and therefore fragile in the sense that the inevitable approximation of systems
by models, combined with the fundamental unpredictability of hardware
platforms, makes it difficult to transfer conclusions about the behavior, based
on models, to the actual system. Hence this approach is arguably unsuited for
modern software systems. We propose here the first specification theory which
allows one to capture quantitative aspects during the refinement and
implementation process, thus alleviating the problems of the qualitative setting.
Our proposed quantitative specification framework uses weighted modal
transition systems as a formal model of specifications. These are labeled
transition systems with the additional feature that they can model optional
behavior which may or may not be implemented by the system. Satisfaction and
refinement are lifted from the well-known qualitative to our quantitative
setting by introducing a notion of distance between weighted modal transition
systems. We show that quantitative versions of parallel composition as well as
quotient (the dual to parallel composition) inherit the properties from the
Boolean setting.
Comment: Submitted to Formal Methods in System Design
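For reference, the qualitative notion being lifted is modal refinement: every may-transition of the implementation must be allowed by the specification, and every must-transition of the specification must be matched by the implementation. Below is a minimal Python sketch of this Boolean check, computed as the usual greatest fixpoint; the data encoding is an assumption, and the paper's quantitative version replaces the yes/no answer with a distance on weights.

```python
# Sketch: qualitative modal refinement between two modal transition systems.
def refines(impl, spec, s0, t0):
    """Check s0 <= t0 via the standard greatest-fixpoint construction.

    impl/spec: {state: {'may': {(action, target), ...},
                        'must': {(action, target), ...}}}
    (every 'must' transition is also listed under 'may').
    """
    rel = {(s, t) for s in impl for t in spec}
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            # every may-step of the implementation is allowed by the spec ...
            ok = all(any(a == b and (s2, t2) in rel
                         for (b, t2) in spec[t]['may'])
                     for (a, s2) in impl[s]['may'])
            # ... and every must-step of the spec is matched by the implementation
            ok = ok and all(any(a == b and (s2, t2) in rel
                                for (b, s2) in impl[s]['must'])
                            for (a, t2) in spec[t]['must'])
            if not ok:
                rel.discard((s, t))
                changed = True
    return (s0, t0) in rel

spec = {'q0': {'may': {('req', 'q1'), ('log', 'q0')}, 'must': {('req', 'q1')}},
        'q1': {'may': {('ack', 'q0')}, 'must': {('ack', 'q0')}}}
impl = {'p0': {'may': {('req', 'p1')}, 'must': {('req', 'p1')}},
        'p1': {'may': {('ack', 'p0')}, 'must': {('ack', 'p0')}}}
print(refines(impl, spec, 'p0', 'q0'))   # True: impl drops only the optional 'log'
```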
Reductionism and the Universal Calculus
In his seminal essay, "The Unreasonable Effectiveness of Mathematics in the
Natural Sciences," physicist Eugene Wigner poses a fundamental
philosophical question concerning the relationship between a physical system
and our capacity to model its behavior with the symbolic language of
mathematics. In this essay, I examine an ambitious 16th- and 17th-century
intellectual agenda from the perspective of Wigner's question, namely, what
historian Paolo Rossi calls "the quest to create a universal language." While
many elite thinkers pursued related ideas, the most inspiring and forceful was
Gottfried Leibniz's effort to create a "universal calculus," a pictorial
language which would transparently represent the entirety of human knowledge,
as well as an associated symbolic calculus with which to model the behavior of
physical systems and derive new truths. I suggest that a deeper understanding
of why the efforts of Leibniz and others failed could shed light on Wigner's
original question. I argue that the notion of reductionism is crucial to
characterizing the failure of Leibniz's agenda, but that a decisive argument
for why the promises of this effort did not materialize is still lacking.
Comment: 11 pages, 1 figure
From Cbits to Qbits: Teaching computer scientists quantum mechanics
A strategy is suggested for teaching mathematically literate students, with
no background in physics, just enough quantum mechanics for them to understand
and develop algorithms in quantum computation and quantum information theory.
Although the article as a whole addresses teachers of physics, well versed in
quantum mechanics, the central pedagogical development is addressed directly to
computer scientists and mathematicians, with only occasional asides to their
teacher. Physicists uninterested in quantum pedagogy may be amused (or
irritated) by some of the views of standard quantum mechanics that arise
naturally from this unorthodox perspective.
Comment: 19 pages, no figures. Submitted to the American Journal of Physics
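As a flavor of the minimal linear-algebra toolkit such a course starts from (standard material, not taken from the article): a Qbit is a unit vector in C^2, gates are unitary matrices, and measurement statistics come from squared amplitudes. A small Python/NumPy sketch:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # classical bit 0 as a Qbit
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                                  # put the Qbit in superposition
print(np.abs(state) ** 2)                         # Born rule: [0.5 0.5]

# Two Qbits live in the tensor product C^2 (x) C^2; applying CNOT to
# (H|0>) (x) |0> yields the entangled Bell state (|00> + |11>)/sqrt(2).
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(state, ket0)
print(np.round(np.abs(bell) ** 2, 3))             # [0.5 0.  0.  0.5]
```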