
    Protecting Systems From Exploits Using Language-Theoretic Security

    Any computer program processing input from the user or network must validate the input. Input-handling vulnerabilities occur in programs when the software component responsible for filtering malicious input (the parser) does not perform validation adequately. Consequently, parsers are among the most targeted components, since they defend the rest of the program from malicious input. This thesis adopts the Language-Theoretic Security (LangSec) principle to understand what tools and research are needed to prevent exploits that target parsers. LangSec proposes specifying the syntactic structure of the input format as a formal grammar. We then build a recognizer for this formal grammar to validate any input before the rest of the program acts on it. To ensure that these recognizers represent the data format, programmers often rely on parser generators or parser combinator tools to build the parsers. This thesis advances several sub-fields in LangSec by proposing new techniques to find bugs in implementations, novel categorizations of vulnerabilities, and new parsing algorithms and tools to handle practical data formats. To this end, this thesis comprises five parts that tackle various tenets of LangSec. First, I categorize various input-handling vulnerabilities and exploits using two frameworks. The first is the mismorphisms framework, which helps us reason about the root causes leading to various vulnerabilities. The second is a categorization framework we built from various LangSec anti-patterns, such as parser differentials and insufficient input validation. Finally, we built a catalog of more than 30 popular vulnerabilities to demonstrate the categorization frameworks. Second, I built parsers for various Internet of Things and power grid network protocols and the iccMAX file format using parser combinator libraries. 
The parsers I built for power grid protocols were deployed and tested on power grid substation networks as an intrusion detection tool. The parser I built for the iccMAX file format led to several corrections and modifications to the iccMAX specifications and reference implementations. Third, I present SPARTA, a novel tool I built that generates Rust code to type check Portable Document Format (PDF) files. The type checker I helped build strictly enforces the constraints in the PDF specification to find deviations. Our checker has contributed to at least four significant clarifications and corrections to the PDF 2.0 specification and various open-source PDF tools. In addition to our checker, we also built a practical tool, PDFFixer, to dynamically patch type errors in PDF files. Fourth, I present ParseSmith, a tool to build verified parsers for real-world data formats. Most parsing tools available for data formats are insufficient to handle practical formats or have not been verified for their correctness. I built a verified parsing tool in Dafny that builds on ideas from attribute grammars, data-dependent grammars, and parsing expression grammars to tackle various constructs commonly seen in network formats. I prove that our parsers run in linear time and always terminate for well-formed grammars. Finally, I provide the first systematic comparison of various data description languages (DDLs) and their parser generation tools. DDLs are used to describe and parse commonly used data formats, such as image formats. I conducted a qualitative expert-elicitation study to derive the metrics I use to compare the DDLs. I also systematically compare these DDLs based on sample data descriptions available with the DDLs, checking for correctness and resilience
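    The recognize-before-processing pattern at the heart of LangSec can be illustrated with a toy parser combinator: build a recognizer for the input grammar, and reject any input that does not match before the program acts on it. The combinators and the length-prefixed record format below are illustrative stand-ins, not the libraries or grammars used in the thesis.

```python
# Minimal parser-combinator sketch of the LangSec pattern: a recognizer for
# the input grammar validates input *before* the program acts on it.
# Combinator names and the toy format are illustrative, not from the thesis.

def literal(ch):
    def parse(s, i):
        return (i + 1, ch) if i < len(s) and s[i] == ch else None
    return parse

def digit():
    def parse(s, i):
        return (i + 1, s[i]) if i < len(s) and s[i].isdigit() else None
    return parse

def many1(p):
    def parse(s, i):
        out = []
        r = p(s, i)
        while r is not None:
            i, v = r
            out.append(v)
            r = p(s, i)
        return (i, out) if out else None
    return parse

def seq(*ps):
    def parse(s, i):
        vals = []
        for p in ps:
            r = p(s, i)
            if r is None:
                return None
            i, v = r
            vals.append(v)
        return (i, vals)
    return parse

# Toy format: "<number>:<payload>" where the number gives the payload length.
def record(s):
    header = seq(many1(digit()), literal(":"))
    r = header(s, 0)
    if r is None:
        return None
    i, (digits, _) = r
    n = int("".join(digits))
    payload = s[i:]
    # Accept only if the *entire* input matches the grammar (full recognition).
    return payload if len(payload) == n else None

assert record("5:hello") == "hello"
assert record("3:hello") is None      # length mismatch: rejected before use
assert record("xx:hi") is None        # malformed header: rejected
```

The point of the sketch is the shape of the defense, not the combinators themselves: downstream code only ever sees payloads that the recognizer has fully accepted.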

    Cyber Security of Critical Infrastructures

    Critical infrastructures are vital assets for public safety, economic welfare, and the national security of countries. The vulnerabilities of critical infrastructures have increased with the widespread use of information technologies. As Critical National Infrastructures are becoming more vulnerable to cyber-attacks, their protection becomes a significant issue for organizations as well as nations. The risks to continued operations, from failing to upgrade aging infrastructure or not meeting mandated regulatory regimes, are considered highly significant, given the demonstrable impact of such circumstances. Due to the rapid increase of sophisticated cyber threats targeting critical infrastructures with significant destructive effects, the cybersecurity of critical infrastructures has become an agenda item for academics, practitioners, and policy makers. A holistic view which covers technical, policy, human, and behavioural aspects is essential to handle cyber security of critical infrastructures effectively. Moreover, the ability to attribute crimes to criminals is a vital element of avoiding impunity in cyberspace. In this book, both research and practical aspects of cyber security considerations in critical infrastructures are presented. Aligned with the interdisciplinary nature of cyber security, authors from academia, government, and industry have contributed 13 chapters. The issues that are discussed and analysed include cybersecurity training, maturity assessment frameworks, malware analysis techniques, ransomware attacks, security solutions for industrial control systems, and privacy preservation methods

    Identity dissolved in isolation: the contrasting notions of density and ‘thin-ness’ in haunted places in the literature of the supernatural from the 18th century to the modern age, with particular reference to works by Shirley Jackson, Stephen King and John Langan, and the development of these themes in the writing of No Man.

    My aim is to illustrate the development of the inter-related themes of personal identity and isolation, in both physical and psychological senses, within the literature of the supernatural. I also aim to trace, through the introduction of ‘the thin place’ as a trope and its emergence as an explicit feature of stories within ‘horror fiction’, a treatment of the horror of physical and mental disintegration that is increasingly psychologically aware in both authors and readership. To begin this thesis, I will offer explanations of some key terms relating to the literature of the supernatural, incorporated within a necessarily brief historical review in Chapter 1, DEFINING THE INDEFINABLE: The Development of Themes of ‘Thin-ness’ Within Stories of the Supernatural From Early History to The Gothic. In literary works predating the arrival of ‘the Gothic’ as a distinctly identified form, I will show that there is no clear boundary demarcating the ‘natural’ from the ‘supernatural’. This boundary becomes more clearly defined in later literature, wherein the ‘supernatural’ is increasingly seen as a wholly separate, often inimical realm. I will demonstrate that the notion of ‘density’, which I will identify as emerging more fully and precisely in the later twentieth century, should be seen as representing a ‘consensual reality’ in contrast to the ‘indeterminacy’ which is one characteristic of the supernatural. In the course of this investigation, I will draw upon a number of different approaches, including definitions of the various associated genres in Section 1.2, with an exploration in sections 1.3 and 1.3.3, CRITICAL ENGAGEMENT: The Horrors of Isolation and The Dissolution of Identity, of some major critical currents shaping the treatment of these themes. 
This will be linked to the psychological insight which views irruption through ‘thin-ness’ as a transgressive motif, often including both metaphorical and literal ‘penetration of the boundaries’: metaphysical, as between planes of existence, and physical, as in penetration of the flesh. Thus a new understanding of a hitherto familiar literary trope in this stream of fiction was developed, combining the psychological horror of isolation, the physiological horror of ‘penetration’ (with concomitant death a likely outcome) and a third, metaphysical element of horror in the face of modes of existence wholly inimical to humanity. In Chapter 2, the works of Shirley Jackson, Stephen King and John Langan, along with material from other authors working within broadly similar traditions, will be examined and compared to reveal common threads in their treatment of isolation in ‘thin places’, along with the subsequent dissolution of the ‘density’ of identity suffered by their characters. I will reference the ways in which particular settings have been used in stories by these authors, namely the ‘haunted places’, increasingly described as ‘thin places’, where the boundary between natural and supernatural is easily traversed. In the course of this examination, I will demonstrate the continuing emergence, and import, of the notion of ‘density’ as a marker of normality, in contrast to the ‘thin’ nature of the boundary with the transgressive supernatural, and also show some of the ways in which this treatment manifests in modern stories of the supernatural. This trope, I contend, has developed following a conscious ‘psychologisation’ of the experience of writing and reading tales of the supernatural which suggests a blurring of traditional boundaries of inner and outer experience, and, by extension, of reality and fantasy. 
I will also demonstrate some of the ways in which this particular stream of literature of the transgressive has developed to reflect the concerns of readerships of the time. There will be a focus upon elements which became of central importance in attempts to define the genre: issues concerning setting as character; and of identity and ontology, the latter in the sense of exploring what there is. I will seek to show how dissolution of identity plays a key part in many related genre stories, and how this dissolution is reflected in the themes and language used in the texts as ‘thinning’. The notion of ‘density’ is taken as being of fundamental importance in modern novels of the supernatural placed within the tradition of ‘Contemporary Gothic’ (as opposed to the more thematic concerns of the ‘new Gothic’). In modern times, physical and mental injury have both become seen as methods for demonstrating the dissolution of identity, in which both body and psyche wear thin (as examined in section 2.2.5). In the third chapter, the thesis sets out a thematic record of the process of the creation of my novel of the supernatural, No Man, tracing influences, techniques and methodologies employed in two sections: looking at characters and settings in section 3.1, and at autobiographical influences on the roots of the story in section 3.2. I will identify where the novel draws upon the methodologies outlined previously, and where it consciously draws upon contrasts of exteriority and interiority and where the boundary which separates these contrasts becomes foregrounded. Finally, I will attempt to place my novel within the literary tradition of tales of the supernatural, and bring to bear an authorial analysis, with explanations of the ways in which elements described above are developed within the story. 
This element of the thesis will aim to illuminate ways in which traditional themes, tropes and motifs of isolation and the dissolution of identity have been incorporated in a modern novel of the supernatural, developing the contrasting notions of density and ‘thin-ness’ as major thematic concerns and plot elements

    Resilience of Timed Systems

    This paper addresses the reliability of timed systems in the setting of resilience, which considers the behaviors of a system when unspecified timing errors, such as missed deadlines, occur. Given a fault model that allows transitions to fire later than allowed by their guard, a system is universally resilient (or self-resilient) if, after a fault, it always returns to a timed behavior of the non-faulty system. It is existentially resilient if, after a fault, there exists a way to return to a timed behavior of the non-faulty system, that is, if there exists a controller which can guide the system back to a normal behavior. We show that universal resilience of timed automata is undecidable, while existential resilience is decidable in EXPSPACE. To obtain better complexity bounds and decidability of universal resilience, we consider untimed resilience, as well as subclasses of timed automata
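    The fault model above (a transition firing later than its guard allows) can be illustrated with a naive check: given a run as a sequence of timed actions and per-action guard intervals, find the first late firing and test whether every step after it satisfies the guards again. This is only a toy illustration of the resilience idea over a fixed run; the paper's actual results concern timed automata and controllers, not this simplified encoding.

```python
# Toy illustration of the fault model: a transition may fire later than its
# guard allows (a fault), and we ask whether the run *after* the fault again
# satisfies every guard, i.e. looks like a non-faulty timed behavior.
# The encoding (action, delay) pairs plus guard intervals is illustrative.

def classify_run(run, guards):
    """run: list of (action, delay); guards: action -> (lo, hi).
    Returns (index of first late firing or None, recovered: bool)."""
    fault = None
    for k, (a, d) in enumerate(run):
        lo, hi = guards[a]
        if d > hi:                 # late firing: the fault model
            fault = k
            break
        if d < lo:                 # firing too early is not a valid run at all
            return (k, False)
    if fault is None:
        return (None, True)       # no fault: trivially a non-faulty behavior
    # Does the suffix after the fault satisfy every guard again?
    ok = all(guards[a][0] <= d <= guards[a][1] for a, d in run[fault + 1:])
    return (fault, ok)

guards = {"send": (1.0, 2.0), "ack": (0.0, 1.0)}
# Fault at step 1 ("send" fires at delay 2.5 > 2.0); afterwards guards hold.
fault, recovered = classify_run(
    [("send", 1.5), ("send", 2.5), ("ack", 0.5), ("send", 1.0)], guards)
assert fault == 1 and recovered
```

In the paper's terms, existential resilience asks whether *some* continuation after the fault passes such a check; universal resilience asks whether *all* continuations do, which is what makes it the harder (undecidable) property.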

    Self-repair during continuous motion with modular robots

    Through the use of multiple modules with the ability to reconfigure to form different morphologies, modular robots provide a potential method to develop more adaptable and resilient robots. Robots operating in challenging and hard-to-reach environments, such as infrastructure inspection, post-disaster search-and-rescue under rubble and planetary surface exploration, could benefit from the capabilities modularity offers, especially the inherent fault tolerance which reconfigurability can provide. With self-reconfigurable modular robots, self-repair (removing failed modules from a larger structure and replacing them with operating modules) allows the functionality of the multi-robot organism as a whole to be recovered when modules are damaged. Previous self-repair work has, for the duration of self-repair procedures, paused the group tasks in which the multi-robot organism was engaged. This thesis investigates self-repair during continuous motion, "Dynamic Self-repair", as a way to allow repair and group tasks to proceed concurrently. In this thesis a new modular robotic platform, Omni-Pi-tent, with capabilities for Dynamic Self-repair is developed. This platform provides a unique combination of genderless docking, omnidirectional locomotion, 3D reconfiguration possibilities and onboard sensing and autonomy. The platform is used in a series of simulated experiments to compare the performance of newly developed dynamic strategies for self-repair and self-assembly to adaptations of previous work, and in hardware demonstrations to explore their practical feasibility. Novel data structures for defining modular robotic structures, and the algorithms to process them for self-repair, are explained. It is concluded that self-repair during continuous motion can allow modular robots to complete tasks faster and more effectively than self-repair strategies which require collective tasks to be halted. 
The hardware and strategies developed in this thesis should provide valuable lessons for bringing modular robots closer to real-world applications
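    As a rough illustration of what such data structures must capture, the sketch below models a multi-robot organism as a graph of docked modules and performs a highly simplified self-repair step: a failed module is swapped for a spare while its docking links are preserved. The encoding is hypothetical and far simpler than the representations the Omni-Pi-tent platform would actually need (docking ports, 3D poses, motion state).

```python
# Minimal sketch of a modular-robot structure as a graph of modules, with a
# self-repair step that swaps a failed module for a spare. The dictionary
# encoding and method names are illustrative, not the thesis's structures.

class Structure:
    def __init__(self, modules, links):
        self.alive = {m: True for m in modules}   # module id -> healthy?
        self.links = list(links)                  # (a, b) docked pairs
        self.spares = []

    def fail(self, m):
        self.alive[m] = False

    def neighbours(self, m):
        return [b for a, b in self.links if a == m] + \
               [a for a, b in self.links if b == m]

    def self_repair(self):
        """Replace each failed module with a spare, preserving its links."""
        for m, ok in list(self.alive.items()):
            if ok or not self.spares:
                continue
            s = self.spares.pop()
            self.links = [(s if a == m else a, s if b == m else b)
                          for a, b in self.links]
            del self.alive[m]
            self.alive[s] = True

org = Structure(["m1", "m2", "m3"], [("m1", "m2"), ("m2", "m3")])
org.spares = ["spare1"]
org.fail("m2")
org.self_repair()
assert org.alive == {"m1": True, "m3": True, "spare1": True}
assert sorted(org.neighbours("spare1")) == ["m1", "m3"]
```

The interesting part of the thesis is precisely what this sketch omits: performing the swap while the rest of the organism keeps moving, rather than between halted task phases.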

    Computer Aided Verification

    This open access two-volume set LNCS 11561 and 11562 constitutes the refereed proceedings of the 31st International Conference on Computer Aided Verification, CAV 2019, held in New York City, USA, in July 2019. The 52 full papers presented, together with 13 tool papers and 2 case studies, were carefully reviewed and selected from 258 submissions. The papers were organized in the following topical sections: Part I: automata and timed systems; security and hyperproperties; synthesis; model checking; cyber-physical systems and machine learning; probabilistic systems; runtime techniques; dynamical, hybrid, and reactive systems; Part II: logics, decision procedures, and solvers; numerical programs; verification; distributed systems and networks; verification and invariants; and concurrency

    Modeling and Intelligent Control for Spatial Processes and Spatially Distributed Systems

    Dynamical systems are often characterized by their time-dependent evolution, termed temporal dynamics. The space-dependent evolution of dynamical systems, termed spatial dynamics, is another important domain of interest for many engineering applications. By studying both the spatial and temporal evolution, novel modeling and control applications may be developed for many industrial processes. One process of special interest is additive manufacturing, where a three-dimensional object is manufactured in a layer-wise fashion via a numerically controlled process. The material is printed over a spatial domain in each layer, and subsequent layers are printed on top of each other. The spatial dynamics of the printing process over the layers are termed the layer-to-layer spatial dynamics. Additive manufacturing provides great flexibility in terms of material selection and design geometry for modern manufacturing applications, and has been hailed as a cornerstone technology for smart manufacturing, or Industry 4.0, applications in industry. However, due to issues with reliability and repeatability, the applicability of additive manufacturing in industry has been limited. Layer-to-layer spatial dynamics represent the dynamics of the printed part. Through the layer-to-layer spatial dynamics, it is possible to represent the physical properties of the part, such as the dimensional properties of each layer, in the form of a heightmap over a spatial domain. Thus, by considering the spatial dynamics, it is possible to develop models and controllers for the physical properties of a printed part. This dissertation develops control-oriented models to characterize the spatial dynamics, and layer-to-layer closed-loop controllers to improve the performance of the printed parts in the layer-to-layer spatial domain. In practice, additive manufacturing resources are often utilized as a fleet to improve the throughput and yield of a manufacturing system. 
An additive manufacturing fleet poses additional challenges in modeling, analysis, and control at the system level. An additive manufacturing fleet is an instance of the more general class of spatially distributed systems, where the resources in the system (e.g., additive manufacturing machines, robots) are spatially distributed within the system. The goal is to efficiently model, analyze, and control spatially distributed systems by considering the system-level interactions of the resources. This dissertation develops a centralized system-level modeling and control framework for additive manufacturing fleets. Many monitoring and control applications rely on the availability of up-to-date representations of the physical resources at run time (e.g., the spatial state of a process, connectivity and availability of resources in a fleet). Purpose-driven digital representations of the physical resources, known as digital twins, provide up-to-date digital representations of resources at run time for analysis and control. This dissertation develops an extensible digital twin framework for cyber-physical manufacturing systems. The proposed digital twin framework is demonstrated through experimental case studies on abnormality detection, cyber-security, and spatial monitoring for additive manufacturing processes. The results and the contributions presented in this dissertation improve the performance and reliability of additive manufacturing processes and fleets for industrial applications, which in turn enables next-generation manufacturing systems with enhanced control and analysis capabilities through intelligent controllers and digital twins.
    PhD, Mechanical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/169635/1/baltaefe_1.pd
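    The layer-to-layer view above can be made concrete with a toy model: the part state is a heightmap over a 1-D spatial grid, each layer deposits the commanded material blurred onto neighbouring cells, and a proportional layer-to-layer controller corrects deviations from the target profile. The dynamics, gains, and grid size are all illustrative, not the dissertation's identified models.

```python
# Toy layer-to-layer spatial model: heightmap h over a 1-D grid, layer input
# u blurred spatially, and P-control across layers. Purely illustrative.

def deposit(h, u, spread=0.2):
    """One layer: commanded deposition u, blurred onto neighbouring cells."""
    n = len(h)
    new = list(h)
    for i in range(n):
        new[i] += (1 - 2 * spread) * u[i]
        if i > 0:
            new[i] += spread * u[i - 1]
        if i < n - 1:
            new[i] += spread * u[i + 1]
    return new

def layer_controller(h, ref, nominal, gain=0.5):
    """Layer-to-layer P control: nominal layer height plus error feedback."""
    return [nominal + gain * (r - x) for r, x in zip(ref, h)]

h = [0.0] * 5
per_layer = 0.1
for layer in range(1, 21):
    ref = [per_layer * (layer - 1)] * 5   # where the part should be pre-layer
    u = layer_controller(h, ref, nominal=per_layer)
    h = deposit(h, u)

# After 20 layers the loop tracks the 2.0 target; P control leaves a small
# steady-state error at the edges, where blurred material is lost.
assert all(abs(x - 2.0) < 0.1 for x in h)
```

Even this crude model shows why spatial feedback matters: open-loop deposition would let the edge under-fill accumulate layer after layer, while the layer-to-layer controller bounds it.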

    Enhancing Computer Network Security through Improved Outlier Detection for Data Streams

    Over the past couple of years, machine learning methods, especially Outlier Detection (OD) ones, have become anchored in the cyber security field to detect network-based anomalies rooted in novel attack patterns. Due to the steady increase of high-volume, high-speed and high-dimensional Streaming Data (SD), for which ground truth information is not available, detecting anomalies in real-world computer networks has become a more and more challenging task. Efficient detection schemes applied to networked, embedded devices need to be fast and memory-constrained, and must be capable of dealing with concept drifts when they occur. The aim of this thesis is to enhance computer network security through improved OD for data streams, in particular SD, to achieve cyber resilience, which spans the detection and analysis of security-relevant incidents, e.g., novel malicious activity, as well as the reaction to them. To this end, four major contributions are proposed, which have been published or submitted as journal articles. First, a research gap in unsupervised Feature Selection (FS) for the improvement of off-the-shelf OD methods in data streams is filled by proposing Unsupervised Feature Selection for Streaming Outlier Detection, denoted as UFSSOD. A generic concept is then derived that shows two application scenarios of UFSSOD in conjunction with online OD algorithms. Extensive experiments have shown that UFSSOD, as an online-capable algorithm, achieves results comparable to a competitor tailored for OD. Second, a novel unsupervised online OD framework called Performance Counter-Based iForest (PCB-iForest) is introduced which, in general, is able to make any ensemble-based online OD method function on SD. Two variants based on the classic iForest are integrated. 
Extensive experiments, performed on 23 different multi-disciplinary and security-related real-world data sets, revealed that PCB-iForest clearly outperformed state-of-the-art competitors in 61% of cases and even achieved more promising results in terms of the trade-off between classification performance and computational cost. Third, a framework called Streaming Outlier Analysis and Attack Pattern Recognition, denoted as SOAAPR, is introduced that, in contrast to the state of the art, is able to process the output of various online unsupervised OD methods in a streaming fashion to extract information about novel attack patterns. Three different privacy-preserving, fingerprint-like signatures are computed from the clustered set of correlated alerts by SOAAPR, which characterize and represent the potential attack scenarios with respect to their communication relations, their manifestation in the data's features and their temporal behavior. The evaluation on two popular data sets shows that SOAAPR can compete with an offline competitor in terms of alert correlation and outperforms it significantly in terms of processing time. Moreover, in most cases all three types of signatures seem to reliably characterize attack scenarios to the effect that similar ones are grouped together. Fourth, an Uncoupled Message Authentication Code (Uncoupled MAC) algorithm is presented which builds a bridge between cryptographic protection and Intrusion Detection Systems (IDSs) for network security. It secures network communication (authenticity and integrity) through a cryptographic scheme with layer-2 support via uncoupled message authentication codes but, as a side effect, also provides IDS functionality, producing alarms based on the violation of Uncoupled MAC values. Through a novel self-regulation extension, the algorithm adapts its sampling parameters based on the detection of malicious actions on SD. 
The evaluation in a virtualized environment clearly shows that the detection rate increases over runtime for different attack scenarios. These even cover scenarios in which intelligent attackers try to exploit the downsides of sampling
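    The streaming constraints described above (bounded memory, no ground truth, adaptation to drift) can be illustrated with a detector far simpler than PCB-iForest: a sliding-window z-score monitor. The window size, warm-up length, and threshold below are illustrative choices, and the method itself is only a stand-in for the ensemble-based detectors the thesis actually studies.

```python
from collections import deque

# Sliding-window z-score outlier detector: O(window) memory, single pass,
# and the bounded window lets the model forget old data (concept drift).
# This is a pedagogical stand-in, not PCB-iForest or UFSSOD.

class StreamingZScore:
    def __init__(self, window=50, threshold=3.0):
        self.buf = deque(maxlen=window)   # bounded memory
        self.threshold = threshold

    def update(self, x):
        """Return True if x is an outlier w.r.t. the recent window."""
        outlier = False
        if len(self.buf) >= 10:           # warm-up before scoring
            mean = sum(self.buf) / len(self.buf)
            var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
            std = var ** 0.5 or 1e-9
            outlier = abs(x - mean) / std > self.threshold
        self.buf.append(x)                # old values age out of the window
        return outlier

det = StreamingZScore(window=50, threshold=3.0)
stream = [10.0, 10.5, 9.5] * 10 + [10.2, 50.0, 10.1]
alerts = [i for i, x in enumerate(stream) if det.update(x)]
assert alerts == [31]                     # only the 50.0 spike fires
```

Frameworks like SOAAPR sit downstream of detectors of this kind, correlating their alerts into attack-pattern signatures rather than scoring raw points.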

    Design and Optimization in Near-term Quantum Computation

    Quantum computers have come a long way since conception, and there is still a long way to go before the dream of universal, fault-tolerant computation is realized. In the near term, quantum computers will occupy a middle ground that is popularly known as the “Noisy, Intermediate-Scale Quantum” (or NISQ) regime. The NISQ era represents a transition in the nature of quantum devices from experimental to computational. There is significant interest in engineering NISQ devices and NISQ algorithms in a manner that will guide the development of quantum computation in this regime and into the era of fault-tolerant quantum computing. In this thesis, we study two aspects of near-term quantum computation. The first of these is the design of device architectures, covered in Chapters 2, 3, and 4. We examine different qubit connectivities on the basis of their graph properties, and present numerical and analytical results on the speed at which large entangled states can be created on nearest-neighbor grids and graphs with modular structure. Next, we discuss the problem of permuting qubits among the nodes of the connectivity graph using only local operations, also known as routing. Using a fast quantum primitive to reverse the qubits in a chain, we construct a hybrid, quantum/classical routing algorithm on the chain. We show via rigorous bounds that this approach is faster than any SWAP-based algorithm for the same problem. The second part, which spans the final three chapters, discusses variational algorithms, which are a class of algorithms particularly suited to near-term quantum computation. Two prototypical variational algorithms, quantum adiabatic optimization (QAO) and the quantum approximate optimization algorithm (QAOA), are studied with respect to the difference in their control strategies. We show that on certain crafted problem instances, bang-bang control (QAOA) can be as much as exponentially faster than quasistatic control (QAO). 
Next, we demonstrate the performance of variational state preparation on an analog quantum simulator based on trapped ions. We show that using classical heuristics that exploit structure in the variational parameter landscape, one can find circuit parameters efficiently in both system size and circuit depth. In the experiment, we approximate the ground state of a critical Ising model with long-ranged interactions on up to 40 spins. Finally, we study the performance of Local Tensor, a classical heuristic algorithm inspired by QAOA, on benchmark instances of the MaxCut problem, and suggest physically motivated choices for the algorithm hyperparameters that are found to perform well empirically. We also show that our implementation of Local Tensor mimics imaginary-time quantum evolution under the problem Hamiltonian
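    The SWAP-based routing baseline against which the hybrid router above is compared can be sketched with odd-even transposition sort, a classical result that routes any permutation on an n-qubit chain within n rounds of parallel nearest-neighbour SWAPs. The thesis's contribution is precisely that the reversal-based hybrid quantum/classical router provably beats any such SWAP-only scheme; the code below shows only the classical baseline.

```python
# Odd-even transposition sort as a SWAP-based routing baseline on a chain:
# alternating layers of disjoint nearest-neighbour SWAPs sort any
# permutation in at most n rounds. Illustrative baseline only.

def route_on_chain(perm):
    """perm[i] = destination index of the qubit at position i.
    Returns the number of parallel SWAP rounds used."""
    state = list(perm)
    rounds = 0
    while state != sorted(state):
        start = rounds % 2                  # alternate odd/even SWAP layers
        for i in range(start, len(state) - 1, 2):
            if state[i] > state[i + 1]:     # SWAP toward destination
                state[i], state[i + 1] = state[i + 1], state[i]
        rounds += 1
    return rounds

# Full reversal of the chain is the worst case: n rounds for n qubits.
assert route_on_chain([4, 3, 2, 1, 0]) == 5
assert route_on_chain([0, 1, 2, 3, 4]) == 0
```

Reversal being the worst case for SWAP routing is what makes a fast quantum reversal primitive a natural building block for beating this baseline.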