
    Attack-Resilient Supervisory Control of Discrete-Event Systems

    In this work, we study the problem of supervisory control of discrete-event systems (DES) in the presence of attacks that tamper with the inputs and outputs of the plant. We consider a very general system setup, as we focus on both deterministic and nondeterministic plants modeled as finite state transducers (FSTs); this also covers the conventional approach of modeling DES as deterministic finite automata. Furthermore, we cover a wide class of attacks that can nondeterministically add, remove, or rewrite a sensing and/or actuation word to any word from predefined regular languages, and show how such attacks can be modeled by nondeterministic FSTs; we also show how the use of FSTs facilitates modeling realistic (and very complex) attacks and provides the foundation for the design of attack-resilient supervisory controllers. Specifically, we first consider the supervisory control problem for deterministic plants with attacks (i) only on their sensors, (ii) only on their actuators, and (iii) on both their sensors and actuators. For each case, we develop new conditions for controllability in the presence of attacks, as well as synthesis algorithms to obtain FST-based descriptions of such attack-resilient supervisors. A derived resilient controller provides the set of all safe control words that keep the plant operating as desired even when observations are corrupted and/or the control words are subjected to actuation attacks. We then extend the controllability theorems and the supervisor synthesis algorithms to nondeterministic plants that satisfy a nonblocking condition. Finally, we illustrate the applicability of our methodology on several examples and numerical case studies.
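    As a rough illustration of how an attack can be captured by a nondeterministic FST (a toy sketch, not the construction used in the paper), the fragment below models a sensor attack that may pass, drop, or rewrite each observed event, and enumerates every observation word the supervisor could receive for a given true sensing word.

        # Minimal sketch (not the paper's construction): a nondeterministic FST that
        # models a sensor attack which may pass, drop, or rewrite each observed event.
        # Transitions: state -> {event: set of (next_state, emitted_string)}.
        # Emitting "" models event deletion; emitting a different symbol models rewriting.
        ATTACK_FST = {
            "q0": {
                "a": {("q0", "a"), ("q0", ""), ("q0", "b")},  # pass, drop, or rewrite 'a'
                "b": {("q0", "b")},                           # 'b' is passed through unchanged
            }
        }

        def corrupted_observations(word, fst=ATTACK_FST, start="q0"):
            """Return every observation word the attacker can make the supervisor see."""
            frontier = {(start, "")}
            for event in word:
                frontier = {
                    (nxt, out + emitted)
                    for (state, out) in frontier
                    for (nxt, emitted) in fst.get(state, {}).get(event, set())
                }
            return {out for (_state, out) in frontier}

        if __name__ == "__main__":
            # All observations the attack FST can produce from the true sensing word "aab".
            print(sorted(corrupted_observations("aab")))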

    Badger: Complexity Analysis with Fuzzing and Symbolic Execution

    Hybrid testing approaches that combine fuzz testing and symbolic execution have shown promising results in achieving high code coverage and uncovering subtle errors and vulnerabilities in a variety of software applications. In this paper, we describe Badger, a new hybrid approach for complexity analysis, with the goal of discovering vulnerabilities that occur when the worst-case time or space complexity of an application is significantly higher than the average case. Badger uses fuzz testing to generate a diverse set of inputs that aim to increase not only coverage but also a resource-related cost associated with each path. Since fuzzing may fail to execute deep program paths due to its limited knowledge of the conditions that influence those paths, we complement the analysis with symbolic execution, which is also customized to search for paths that increase the resource-related cost. Symbolic execution is particularly good at generating inputs that satisfy various program conditions, but by itself it suffers from path explosion. Therefore, Badger uses fuzzing and symbolic execution in tandem to leverage their benefits and overcome their weaknesses. We implemented our approach for the analysis of Java programs, based on Kelinci and Symbolic PathFinder. We evaluated Badger on Java applications, showing that our approach is significantly faster in generating worst-case executions than fuzzing or symbolic execution on their own.
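    The central idea of steering input generation toward expensive paths can be conveyed with a toy cost-guided mutation loop; the sketch below is a simplified stand-in for that idea (not Badger itself, which builds on Kelinci and Symbolic PathFinder for Java) and keeps a mutated input whenever it reaches new branches or raises an instrumented cost counter.

        # Toy cost-guided fuzzing loop (a stand-in for the idea, not Badger itself):
        # keep an input if it reaches new branches OR raises the observed path cost.
        import random

        def target(data):
            """Instrumented program under test: returns (covered_branches, cost)."""
            cost = 0
            covered = set()
            for i, b in enumerate(data):
                if b > 127:
                    covered.add(i)
                    cost += b  # pretend expensive work happens on this branch
            return covered, cost

        def fuzz(rounds=10_000, seed=b"\x00" * 8):
            corpus = [seed]
            global_coverage, best_cost = set(), 0
            for _ in range(rounds):
                parent = random.choice(corpus)
                child = bytearray(parent)
                child[random.randrange(len(child))] = random.randrange(256)  # 1-byte mutation
                covered, cost = target(bytes(child))
                if not covered <= global_coverage or cost > best_cost:
                    corpus.append(bytes(child))
                    global_coverage |= covered
                    best_cost = max(best_cost, cost)
            return best_cost

        if __name__ == "__main__":
            print("worst cost found:", fuzz())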

    On the Implementation of GNU Prolog

    GNU Prolog is a general-purpose implementation of the Prolog language, which distinguishes itself from most other systems by being, above all else, a native-code compiler that produces standalone executables that do not rely on any byte-code emulator or meta-interpreter. Other distinguishing aspects include the explicit organization of the Prolog system as a multipass compiler in the Unix compiler tradition, in which intermediate representations are materialized. GNU Prolog also includes an extensible, high-performance finite domain constraint solver, integrated with the Prolog language but implemented using independent lower-level mechanisms. This article discusses the main issues involved in designing and implementing GNU Prolog: requirements, system organization, performance and portability issues, as well as its position with respect to other Prolog implementations and the ISO standardization initiative.
    Comment: 30 pages, 3 figures. To appear in Theory and Practice of Logic Programming (TPLP). Keywords: Prolog, logic programming system, GNU, ISO, WAM, native code compilation, finite domain constraints.
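    To give a flavor of the kind of problem a finite domain constraint solver handles (a naive generate-and-test illustration, far simpler than GNU Prolog's propagation-based machinery), the sketch below enumerates assignments over finite integer domains and filters them through constraints.

        # Generic illustration of finite-domain constraint solving (not GNU Prolog's
        # low-level mechanisms): enumerate assignments over finite domains and keep
        # those that satisfy every constraint.
        from itertools import product

        def solve(domains, constraints):
            """domains: {var: set of ints}; constraints: callables over a full assignment."""
            names = list(domains)
            for values in product(*(sorted(domains[n]) for n in names)):
                assignment = dict(zip(names, values))
                if all(c(assignment) for c in constraints):
                    yield assignment

        if __name__ == "__main__":
            # X + Y = 10, X < Y, with X, Y in 0..9 (comparable to an FD goal in Prolog).
            domains = {"X": set(range(10)), "Y": set(range(10))}
            constraints = [lambda a: a["X"] + a["Y"] == 10, lambda a: a["X"] < a["Y"]]
            for sol in solve(domains, constraints):
                print(sol)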

    Improving the evaluation of regular expressions on reconfigurable hardware

    Since the Internet was born, the amount of data that systems must process has grown exponentially, which is why these systems need to be fast, flexible, and powerful. Nowadays, communications keep raising the speed requirements for data processing, and FPGAs are ideal for this task. In data processing, a large share of the time is spent on pattern matching, frequently involving regular expression matching. As the number of patterns to be checked grows, so does the complexity of the hardware dedicated to recognizing them, so the hardware needs to be flexible enough to adapt to the necessary changes with ease. This project presents a VHDL code generator implemented in Java. The generated code describes a recognizer for several sets of regular expressions given as parameters, which is then synthesized on an FPGA. The module takes several sets of regular expressions and generates the VHDL code describing the system that recognizes them. The code generator is flexible, thanks to the modularity and upgradeability that software offers. Thus, the main advantage of this model is the possibility of combining the flexibility of software with the speed of hardware in order to create fast, low-cost recognizers in a flexible and easy way.
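    The underlying idea, describing a recognizer as an automaton in software and emitting a hardware description from it, can be sketched very roughly. The fragment below is a hypothetical Python illustration (the project itself is written in Java) that hard-codes a DFA for the toy regular expression a b* c and prints a schematic fragment of VHDL next-state logic for it.

        # Minimal sketch of the generator idea (not the project's Java implementation):
        # describe a recognizer as a DFA in software, then emit a VHDL-style fragment.
        # The DFA below accepts the toy regular expression  a b* c.
        DFA = {
            "start":  {"a": "saw_a"},
            "saw_a":  {"b": "saw_a", "c": "accept"},
            "accept": {},
        }

        def emit_vhdl_case(dfa):
            """Emit the body of a VHDL case statement describing the DFA's next-state logic."""
            out = []
            for state, edges in dfa.items():
                out.append(f"      when {state} =>")
                out.append("        case symbol is")
                for sym, nxt in edges.items():
                    out.append(f"          when '{sym}' => next_state <= {nxt};")
                out.append("          when others => next_state <= start;")
                out.append("        end case;")
            return "\n".join(out)

        if __name__ == "__main__":
            print(emit_vhdl_case(DFA))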

    Lily: A parser generator for LL(1) languages

    This paper discusses the design and implementation of Lily, a language for generating LL(1) language parsers, originally designed by Dr. Thomas J. Sager of the University of Missouri--Rolla. A method for the automatic generation of parser tables is described that creates small, highly optimized tables suitable for conversion to minimal perfect hash functions. An implementation of Lily is discussed with attention to design goals, the implementation of parser table generation, and table optimization techniques. Proposals are made detailing possibilities for further augmentation of the system. Examples of Lily programs are given, as well as a manual for the system.
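    As a small illustration of what a generated LL(1) parse table drives (a hand-written toy, not output produced by Lily), the sketch below parses balanced parentheses with a table-driven predictive parser; Lily's contribution is generating and compressing such tables automatically, down to minimal perfect hash functions.

        # Toy table-driven LL(1) parser (illustrative only; Lily generates such tables
        # automatically).  Grammar: S -> '(' S ')' S | epsilon   (balanced parentheses).
        TABLE = {
            ("S", "("): ["(", "S", ")", "S"],
            ("S", ")"): [],  # derive epsilon
            ("S", "$"): [],  # derive epsilon
        }
        NONTERMINALS = {"S"}

        def parse(tokens):
            stack = ["$", "S"]
            tokens = list(tokens) + ["$"]
            pos = 0
            while stack:
                top = stack.pop()
                look = tokens[pos]
                if top in NONTERMINALS:
                    production = TABLE.get((top, look))
                    if production is None:
                        return False          # no table entry: syntax error
                    stack.extend(reversed(production))
                else:
                    if top != look:
                        return False          # terminal mismatch
                    pos += 1
            return pos == len(tokens)

        if __name__ == "__main__":
            print(parse("(())()"))  # True
            print(parse("(()"))     # False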