Congruent Weak Conformance
This research addresses the problem of verifying implementations against specifications through an innovative logic approach. Congruent weak conformance, a formal relationship between agents and specifications, has been developed and proven to be a congruent partial order. This property arises from a set of relations called weak conformations. The largest, called weak conformance, is analogous to Milner's observational equivalence. Weak conformance is not an equivalence, however, but rather an ordering relation among processes. Weak conformance allows behaviors in the implementation that are unreachable in the specification. Furthermore, it exploits output concurrencies and allows interleaving of extraneous output actions in the implementation. Finally, reasonable restrictions in CCS syntax strengthen weak conformance to a congruence, called congruent weak conformance. At present, congruent weak conformance is the best known formal relation for verifying implementations against specifications. This precongruence derives maximal flexibility and embodies all weaknesses in input, output, and no-connect signals while retaining a fully replaceable conformance to the specification. Congruent weak conformance has additional utility in verifying transformations between systems of incompatible semantics. This dissertation describes a hypothetical translator from the informal simulation semantics of VHDL to the bisimulation semantics of CCS. A second translator is described from VHDL to a broadcast-communication version of CCS. By showing that they preserve congruent weak conformance, both translators are verified.
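The ordering-relation idea behind weak conformance can be illustrated with a toy weak-simulation check over labelled transition systems. This is only a minimal sketch of the general technique: the transition systems, the treatment of the silent action `tau`, and the plain simulation preorder below are illustrative assumptions, not the dissertation's CCS machinery or the full conformance relation (which additionally handles output concurrency and no-connect signals).

```python
def tau_closure(lts, state):
    # States reachable from `state` via zero or more internal (tau) moves.
    seen, stack = {state}, [state]
    while stack:
        s = stack.pop()
        for a, t in lts.get(s, []):
            if a == "tau" and t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def weak_moves(lts, state, action):
    # States reachable by  tau* . action . tau*  (a "weak" transition).
    result = set()
    for s in tau_closure(lts, state):
        for a, t in lts.get(s, []):
            if a == action:
                result |= tau_closure(lts, t)
    return result

def weakly_simulates(spec, impl, s0, i0):
    # Greatest-fixpoint check: every move of the implementation must be
    # matched, up to tau, by the specification. Start from the full
    # relation and delete pairs until stable.
    states_s = set(spec) | {t for v in spec.values() for _, t in v}
    states_i = set(impl) | {t for v in impl.values() for _, t in v}
    rel = {(i, s) for i in states_i for s in states_s}
    changed = True
    while changed:
        changed = False
        for i, s in list(rel):
            for a, i2 in impl.get(i, []):
                if a == "tau":
                    targets = tau_closure(spec, s)   # tau matched by tau*
                else:
                    targets = weak_moves(spec, s, a)
                if not any((i2, s2) in rel for s2 in targets):
                    rel.discard((i, s))
                    changed = True
                    break
    return (i0, s0) in rel

# Toy example: the implementation takes an internal step before `a`,
# which the specification performs directly; the weak relation ignores it.
spec = {"p0": [("a", "p1")]}
impl = {"q0": [("tau", "q1")], "q1": [("a", "q2")]}
```

Because this is a preorder rather than an equivalence, the check is directional: an implementation with extra states can still be related to a smaller specification, but not vice versa.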
Using Lightweight Formal Methods for JavaScript Security
The goal of this work was to apply lightweight formal methods to the study of the security of the JavaScript language. Previous work has shown that lightweight formal methods offer a new approach to the study of security in the context of the Java Virtual Machine (JVM). The current work has attempted to codify current best practices in the form of a security model for JavaScript. Such a model is a necessary component in analyzing browser actions for vulnerabilities, but it is not sufficient: actual browser event traces must also be captured and incorporated into the model. The work described herein demonstrates (a) that it is possible to construct a model for JavaScript security that captures important properties of current best practices within browsers; and (b) that the event translator written here captures the dynamic properties of browser site traversal in such a way that model analysis is tractable and yields important information about the satisfaction or refutation of the static security rules.
Building Specifications in the Event-B Institution
This paper describes a formal semantics for the Event-B specification language using the theory of institutions. We define an institution for Event-B, EVT, and prove that it meets the validity requirements for satisfaction preservation and model amalgamation. We also present a series of functions that show how the constructs of the Event-B specification language can be mapped into our institution. Our semantics sheds new light on the structure of the Event-B language, allowing us to clearly delineate three constituent sub-languages: the superstructure, infrastructure and mathematical languages. One of the principal goals of our semantics is to provide access to the generic modularisation constructs available in institutions, including specification-building operators for parameterisation and refinement. We demonstrate how these features subsume and enhance the corresponding features already present in Event-B through a detailed study of their use in a worked example. We have implemented our approach via a parser and translator for Event-B specifications, EBtoEVT, which also provides a gateway to the Hets toolkit for heterogeneous specification.
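The satisfaction preservation mentioned above is the central coherence requirement of institution theory. In standard institution-theoretic notation (generic, not EVT-specific): for every signature morphism $\sigma \colon \Sigma \to \Sigma'$, every $\Sigma$-sentence $\varphi$, and every $\Sigma'$-model $M'$,

```latex
M' \models_{\Sigma'} \mathrm{Sen}(\sigma)(\varphi)
\quad\iff\quad
\mathrm{Mod}(\sigma)(M') \models_{\Sigma} \varphi
```

that is, truth is invariant under change of notation: translating the sentence along $\sigma$ and reducing the model against $\sigma$ agree.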
Transforming ASN.1 Specifications into CafeOBJ to assist with Property Checking
The adoption of algebraic specification/formal method techniques by the networking research community is happening slowly but steadily. We work towards a software environment that can translate a protocol's specification from Abstract Syntax Notation One (ASN.1, a very popular specification language with many applications) into the powerful algebraic specification language CafeOBJ. The resulting code can be used to check, validate and falsify critical properties of systems at the pre-coding stage of development. In this paper, we introduce some key elements of ASN.1 and CafeOBJ and sketch some first steps towards the implementation of such a tool, including a case study.
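The flavour of such a translation can be sketched in a few lines. The mapping table, the field format, and the CafeOBJ-flavoured output below are hand-rolled assumptions for a toy ASN.1 SEQUENCE, not the environment described in the paper.

```python
# Toy illustration only: map a tiny ASN.1-like SEQUENCE description to a
# CafeOBJ-flavoured module with a constructor and one selector per field.
ASN1_TO_SORT = {"INTEGER": "Int", "BOOLEAN": "Bool", "IA5String": "String"}

def sequence_to_cafeobj(name, fields):
    """fields: list of (field_name, asn1_type) pairs."""
    lines = [f"mod! {name.upper()} {{", f"  [ {name} ]"]
    # Constructor taking one argument per SEQUENCE component.
    arg_sorts = " ".join(ASN1_TO_SORT[t] for _, t in fields)
    lines.append(f"  op mk-{name} : {arg_sorts} -> {name}")
    # One selector operation per component.
    for fname, ftype in fields:
        lines.append(f"  op {fname} : {name} -> {ASN1_TO_SORT[ftype]}")
    lines.append("}")
    return "\n".join(lines)

cafeobj_src = sequence_to_cafeobj("Msg", [("seqNo", "INTEGER"), ("ack", "BOOLEAN")])
print(cafeobj_src)
```

Properties of the protocol would then be stated and checked over the generated module's equational theory, which is where CafeOBJ's proof support enters at the pre-coding stage.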
The translatability of metaphor in LSP: application of a decision-making model
The pragmatic approach to translation implies the consideration of translation as a useful test case for understanding the role of language in social life. Under this view, this article analyses the decision-making stage translators go through in the course of formulating a TT. Hence this article contributes both to enhancing the status of translation theory and to explaining some of the decisions taken by the Spanish translators of three English Manuals of Economics. In short, we have argued that the use of a 'maximax' strategy for translating English metaphors as Spanish similarity-creating metaphors can be attributed to subjective factors, especially to the translators' cognitive system, their knowledge bases, the task specification, and the text-type-specific problem space. As a result, we have also claimed that proposals for translating microtextual problems, for example metaphors, can benefit from the study of the above-mentioned subjective factors, since they allow or inhibit the translators' choices in the decision-making stage of the translation process.
Z2SAL: a translation-based model checker for Z
Despite being widely known and accepted in industry, the Z formal specification language has not so far been well supported by automated verification tools, mostly because of the challenges in handling the abstraction of the language. In this paper we discuss a novel approach to building a model checker for Z, which involves implementing a translation from Z into SAL, the input language for the Symbolic Analysis Laboratory, a toolset which includes a number of model checkers and a simulator. The Z2SAL translation deals with a number of important issues, including: mapping unbounded, abstract specifications into bounded, finite models amenable to a BDD-based symbolic checker; converting a non-constructive and piecemeal style of functional specification into a deterministic, automaton-based style of specification; and supporting the rich set-based vocabulary of the Z mathematical toolkit. This paper discusses progress made towards implementing as complete and faithful a translation as possible, while highlighting certain assumptions, respecting certain limitations and making use of available optimisations. The translation is illustrated throughout with examples, and a complete working example is presented, together with performance data.
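The first issue listed, bounding an unbounded abstract state space so a finite checker can explore it, can be shown in miniature. The bound, the toy operation schema, and the explicit-state search below are illustrative assumptions; Z2SAL itself emits SAL for symbolic, BDD-based checking rather than enumerating states like this.

```python
# Illustrative only: substitute a finite bound for an unbounded Z type,
# then exhaustively check an invariant over the reachable states.
MAX = 7  # finite bound standing in for the unbounded type NAT

def ops(n):
    # Toy operation schema: increment (kept inside the bound) or reset.
    if n + 1 <= MAX:
        yield n + 1
    yield 0

def invariant_holds(init=0, inv=lambda n: 0 <= n <= MAX):
    # Explicit-state reachability: visit every state reachable from
    # `init` under `ops` and test the invariant at each one.
    seen, frontier = {init}, [init]
    while frontier:
        n = frontier.pop()
        if not inv(n):
            return False
        for m in ops(n):
            if m not in seen:
                seen.add(m)
                frontier.append(m)
    return True
```

The trade-off the abstract points at is visible even here: the bound makes checking decidable, but any verdict only holds for models within the chosen bound.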
A Survey on Service Composition Middleware in Pervasive Environments
The development of pervasive computing has brought to light a challenging problem: how can services be composed dynamically in heterogeneous and highly changing environments? We propose a survey that defines service composition as a sequence of four steps: translation, generation, evaluation, and finally execution. With this powerful yet simple model we describe the major service composition middleware. Then, a classification of these service composition middleware according to pervasive requirements - interoperability, discoverability, adaptability, context awareness, QoS management, security, spontaneous management, and autonomous management - is given. The classification highlights what has been done and what remains to be done to develop service composition in pervasive environments.
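The four-step model above can be sketched as a pipeline. The request format, the registry, and the ranking criterion are invented for illustration; they are not taken from any of the surveyed middleware.

```python
# Minimal sketch of the survey's four-step view of service composition.
def translate(request):
    # 1. Translation: user request -> abstract required capabilities.
    return request.lower().split("+")

def generate(capabilities, registry):
    # 2. Generation: candidate plans built from available concrete services.
    if all(c in registry for c in capabilities):
        return [[registry[c] for c in capabilities]]
    return []

def evaluate(plans):
    # 3. Evaluation: rank candidate plans (here, shortest plan wins;
    # real middleware would weigh QoS, context, security, ...).
    return min(plans, key=len) if plans else None

def execute(plan):
    # 4. Execution: invoke each selected service in order.
    return [service() for service in plan]

# Hypothetical registry of concrete services.
registry = {"print": lambda: "printed", "mail": lambda: "mailed"}
result = execute(evaluate(generate(translate("PRINT+MAIL"), registry)))
```

Most of the pervasive requirements in the classification attach to specific steps: discoverability feeds generation, context awareness and QoS management feed evaluation, and adaptability concerns re-running the pipeline when the environment changes.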