
    Checking-in on Network Functions

    When programming network functions, changes within a packet tend to have consequences: side effects which must be accounted for by network programmers or administrators via arbitrary logic and an innate understanding of dependencies. Examples include updating checksums when a packet's contents have been modified, or adjusting the payload length field of an IPv6 header if another header is added or updated within a packet. While static typing captures interface specifications and how packet contents should behave, it does not enforce precise invariants around runtime dependencies like the examples above. Instead, during the design phase of network functions, programmers should be given an easier way to specify checks up front, without having to account for and keep track of these consequences at each and every step of the development cycle. In keeping with this view, we present a unique approach for adding and generating both static checks and dynamic contracts for specifying and checking packet processing operations. We develop our technique within an existing framework called NetBricks and demonstrate how our approach simplifies and checks common dependent packet and header processing logic that other systems take for granted, all without adding much overhead during development.
    Comment: ANRW 2019 ~ https://irtf.org/anrw/2019/program.htm
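
    The payload-length example is easy to make concrete. Below is a minimal Python sketch of the kind of dependent update the paper's contracts are meant to capture; the paper's own system is built on NetBricks (in Rust), and the function names here are illustrative only:

    import struct

    def checksum16(data: bytes) -> int:
        """RFC 1071 ones'-complement checksum over 16-bit words."""
        if len(data) % 2:
            data += b"\x00"
        total = sum(struct.unpack(f"!{len(data) // 2}H", data))
        while total >> 16:
            total = (total & 0xFFFF) + (total >> 16)
        return ~total & 0xFFFF

    def insert_ipv6_ext_header(packet: bytearray, ext: bytes) -> bytearray:
        """Insert an extension header after the 40-byte fixed IPv6 header
        and patch the Payload Length field (bytes 4-5). A real stack must
        also re-chain the Next Header fields and recompute any transport
        checksum (see checksum16 above): exactly the cascading
        consequences the paper proposes to check automatically."""
        payload_len = struct.unpack_from("!H", packet, 4)[0]
        struct.pack_into("!H", packet, 4, payload_len + len(ext))
        return packet[:40] + ext + packet[40:]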

    The SURE Reliability Analysis Program

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The program is based on computational methods recently developed at the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values, directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
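
    The quantity SURE bounds can be pictured with a toy model. The Python sketch below is a Monte Carlo estimate, not SURE's algebraic bounds; the model, rates, and names are hypothetical (and far from ultrareliable scale). It also mimics the sensitivity-analysis feature by sweeping one parameter over a range of values:

    import random

    def death_probability(fault_rate, recovery_rate, mission_time=10.0,
                          trials=100_000, seed=0):
        """Monte Carlo estimate of the death-state probability of a toy
        model: OK moves to DEGRADED on a first fault, then a fast
        recovery transition races a second, fatal fault."""
        rng = random.Random(seed)
        dead = 0
        for _ in range(trials):
            t = rng.expovariate(fault_rate)      # time of first fault
            if t > mission_time:
                continue                         # no fault: survive
            recover = rng.expovariate(recovery_rate)
            second = rng.expovariate(fault_rate)
            if second < recover and t + second <= mission_time:
                dead += 1
        return dead / trials

    # sensitivity analysis: sweep the recovery rate over a range of values
    for mu in (1.0, 10.0, 100.0):
        p = death_probability(fault_rate=0.1, recovery_rate=mu)
        print(f"recovery rate {mu:g}: P(system failure) ~ {p:.3e}")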

    Robust Processing of Natural Language

    Previous approaches to robustness in natural language processing usually treat deviant input by relaxing grammatical constraints whenever a successful analysis cannot be provided by "normal" means. This schema implies that error detection always comes prior to error handling, a behaviour which can hardly compete with its human model, where many erroneous situations are handled without even being noticed. The paper analyses the necessary preconditions for achieving a higher degree of robustness in natural language processing and suggests a quite different approach based on a procedure for structural disambiguation. It not only offers the possibility to cope with robustness issues in a more natural way but might eventually be suited to accommodate quite different aspects of robust behaviour within a single framework.
    Comment: 16 pages, LaTeX, uses pstricks.sty, pstricks.tex, pstricks.pro, pst-node.sty, pst-node.tex, pst-node.pro. To appear in: Proc. KI-95, 19th German Conference on Artificial Intelligence, Bielefeld (Germany), Lecture Notes in Computer Science, Springer 199
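
    The contrast the abstract draws can be sketched in a few lines of Python. In the toy below (all names hypothetical; the paper's disambiguation procedure is far richer), constraints are graded rather than hard, so a deviant sentence still receives a best analysis and no separate error-detection phase ever runs:

    # graded constraints: (name, penalty, test over a candidate analysis)
    CONSTRAINTS = [
        ("subject-verb agreement", 2.0, lambda a: a["subj_num"] == a["verb_num"]),
        ("subject precedes verb",  0.5, lambda a: a["subj_pos"] < a["verb_pos"]),
    ]

    def score(analysis):
        """Sum the penalties of all violated constraints; lower is better."""
        return sum(w for _, w, ok in CONSTRAINTS if not ok(analysis))

    def disambiguate(candidates):
        """Pick the least-penalised analysis. A violated constraint only
        adds a penalty, so mildly ill-formed input is still parsed, with
        no prior error-detection step."""
        return min(candidates, key=score)

    # an ungrammatical sentence still gets its most plausible reading
    candidates = [
        {"subj_num": "sg", "verb_num": "pl", "subj_pos": 0, "verb_pos": 1},
        {"subj_num": "sg", "verb_num": "pl", "subj_pos": 2, "verb_pos": 1},
    ]
    print(disambiguate(candidates))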

    Real-time interactive speech technology at Threshold Technology, Incorporated

    Basic real-time isolated-word recognition techniques are reviewed. Industrial applications of voice technology are described in chronological order of their development. Future research efforts are also discussed.

    Formally based semi-automatic implementation of an open security protocol

    This paper presents an experiment in which an implementation of the client side of the SSH Transport Layer Protocol (SSH-TLP) was semi-automatically derived according to a model-driven development paradigm that leverages formal methods in order to obtain high correctness assurance. The approach used in the experiment starts with the formalization of the protocol at an abstract level. This model is then formally proved to fulfill the desired secrecy and authentication properties by using the ProVerif prover. Finally, a sound Java implementation is semi-automatically derived from the verified model using an enhanced version of the Spi2Java framework. The resulting implementation correctly interoperates with third-party servers, and its execution time is comparable with that of other manually developed Java SSH-TLP client implementations. This case study demonstrates that the adopted model-driven approach is viable even for a real security protocol, despite the complexity of the models needed in order to achieve an interoperable implementation.
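
    One way to picture what deriving an implementation from a verified model buys is the skeleton below: a hypothetical, drastically simplified Python rendering (the real pipeline derives Java from a ProVerif-verified spi calculus model via Spi2Java; the state and action names here are invented) in which the code cannot leave the proved transition relation:

    # verified transition relation of a much-simplified SSH-TLP client
    VERIFIED_TRANSITIONS = {
        ("START",        "send_version"):  "VERSION_SENT",
        ("VERSION_SENT", "recv_version"):  "KEX_INIT",
        ("KEX_INIT",     "exchange_keys"): "KEYS_READY",
        ("KEYS_READY",   "send_newkeys"):  "SECURE",
    }

    class DerivedClient:
        """Skeleton generated from the model: any action not allowed by
        the verified relation raises, so the implementation cannot
        diverge from the proved protocol flow."""
        def __init__(self):
            self.state = "START"

        def step(self, action):
            key = (self.state, action)
            if key not in VERIFIED_TRANSITIONS:
                raise RuntimeError(f"{action!r} illegal in state {self.state}")
            self.state = VERIFIED_TRANSITIONS[key]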

    Experimental Biological Protocols with Formal Semantics

    Both experimental and computational biology are becoming increasingly automated. Laboratory experiments are now performed automatically on high-throughput machinery, while computational models are synthesized or inferred automatically from data. However, integration between automated tasks in the process of biological discovery is still lacking, largely due to incompatible or missing formal representations. While theories are expressed formally as computational models, existing languages for encoding and automating experimental protocols often lack formal semantics. This makes it challenging to extract novel understanding by identifying when theory and experimental evidence disagree due to errors in the models or in the protocols used to validate them. To address this, we formalize the syntax of a core protocol language, which provides a unified description of the models of biochemical systems being experimented on, together with the discrete events representing the liquid-handling steps of biological protocols. We present both a deterministic and a stochastic semantics for this language, each defined in terms of hybrid processes. In particular, the stochastic semantics captures uncertainties in equipment tolerances, making it a suitable tool for both experimental and computational biologists. We illustrate how the proposed protocol language can be used for automated verification and synthesis of laboratory experiments on case studies from the fields of chemistry and molecular programming.
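
    A hypothetical miniature of such a protocol language, with both semantics, might look as follows in Python. The step names and the 2% tolerance are assumptions for illustration; the paper's actual syntax and hybrid-process semantics are richer:

    import random

    # a protocol is a list of liquid-handling steps
    PROTOCOL = [
        ("dispense", "A", 50.0),     # 50 uL into well A
        ("dispense", "B", 100.0),    # 100 uL into well B
        ("mix", ("A", "B"), "C"),    # combine A and B into C
    ]

    def run(protocol, tolerance=0.0, seed=None):
        """tolerance=0 gives the deterministic semantics (nominal volumes);
        tolerance>0 gives a stochastic semantics in which each dispensed
        volume is perturbed by the equipment's relative error."""
        rng = random.Random(seed)
        wells = {}
        for step in protocol:
            if step[0] == "dispense":
                _, well, vol = step
                actual = vol * (1 + rng.uniform(-tolerance, tolerance))
                wells[well] = wells.get(well, 0.0) + actual
            elif step[0] == "mix":
                _, sources, dest = step
                wells[dest] = sum(wells.pop(s) for s in sources)
        return wells

    print(run(PROTOCOL))                          # deterministic: {'C': 150.0}
    print(run(PROTOCOL, tolerance=0.02, seed=1))  # under 2% pipetting error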

    Proceedings of the 3rd Workshop on Domain-Specific Language Design and Implementation (DSLDI 2015)

    The goal of the DSLDI workshop is to bring together researchers and practitioners interested in sharing ideas on how DSLs should be designed, implemented, supported by tools, and applied in realistic application contexts. We are interested both in discovering how already-known domains such as graph processing or machine learning can best be supported by DSLs, and in exploring new domains that could be targeted by DSLs. More generally, we are interested in building a community that can drive forward the development of modern DSLs. These informal post-proceedings contain the talk abstracts submitted to the 3rd DSLDI workshop (DSLDI'15) and a summary of the panel discussion on Language Composition.

    CTGEN - a Unit Test Generator for C

    We present a new unit test generator for C code, CTGEN. It generates test data for C1 structural coverage and functional coverage based on pre-/post-condition specifications or internal assertions. The generator supports automated stub generation, and the data to be returned by a stub to the unit under test (UUT) may be specified by means of constraints. The typical application field for CTGEN is embedded systems testing; therefore the tool can cope with the aliasing problems typical of low-level C, including pointer arithmetic, structures and unions. CTGEN creates complete test procedures which are ready to be compiled and run against the UUT. In this paper we describe the main features of CTGEN and their technical realisation, and we elaborate on its performance in comparison to a list of competing test generation tools. Since 2011, CTGEN has been used in industrial-scale test campaigns for embedded systems code in the automotive domain.
    Comment: In Proceedings SSV 2012, arXiv:1211.587
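
    The generation task itself is easy to caricature. The Python sketch below uses a naive random search standing in for CTGEN's constraint solving, and a toy function standing in for real C: it finds inputs until every branch outcome of the UUT is covered and records the observed result as the expected value of each test case:

    import random

    def uut(x: int, y: int) -> int:
        """Toy unit under test with two branches; C1 coverage needs all
        four branch outcomes."""
        if x > 10:
            y += x
        if y % 2 == 0:
            return y // 2
        return y

    def generate_tests(trials=10_000, seed=0):
        """Search for inputs covering every branch outcome; each newly
        covered outcome yields one test case. CTGEN derives the branch
        conditions from the C source rather than restating them here."""
        rng = random.Random(seed)
        covered, tests = set(), []
        for _ in range(trials):
            x, y = rng.randint(-50, 50), rng.randint(-50, 50)
            b1 = x > 10
            b2 = ((y + x) if b1 else y) % 2 == 0
            new = {("b1", b1), ("b2", b2)} - covered
            if new:
                covered |= new
                tests.append((x, y, uut(x, y)))
            if len(covered) == 4:
                break
        return tests

    print(generate_tests())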

    SURE reliability analysis: Program and mathematics

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values, directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.