
    Behavioral validation in Cyber-physical systems: Safety violations and beyond

    Advances in software and hardware technologies over the last two decades have paved the way for the complex systems we observe around us. Avionics, automotive, power-grid, medical-device, and robotic systems are a few examples; they are usually termed Cyber-physical systems (CPS) because they combine physical and software components. Deploying a CPS in a safety-critical application mandates that the system operate reliably even in adverse scenarios. While effective at improving confidence in system functionality, testing cannot ascertain the absence of failures, whereas formal verification can be exhaustive but may not scale as system complexity grows. Simulation-driven analysis bridges this gap by extracting key system properties from simulations. Despite their differences, all of these analyses can provide system behaviors as evidence for the satisfaction or violation of a given performance specification. However, less attention has been paid to algorithmically validating and characterizing the different behaviors of a CPS. The focus of this thesis is behavioral validation of Cyber-physical systems, which can supplement an existing CPS analysis framework. The thesis develops algorithmic tools for validating verification artifacts by generating a variety of counterexamples for a safety violation in a linear hybrid system; these counterexamples can serve as performance metrics for evaluating different controllers during the design and testing phases. It introduces the notion of a complete characterization of a safety violation in a linear system with bounded inputs, and it proposes a sound technique to compute and efficiently represent these characterizations. The thesis further presents neural-network-based frameworks that perform systematic state-space exploration guided by sensitivity, or a gradient approximation of it, in learning-enabled control (LEC) systems. The presented technique is accompanied by convergence guarantees and yields considerable performance gains over a widely used falsification platform for a class of signal temporal logic (STL) specifications. Doctor of Philosophy
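
    To make the falsification idea above concrete, the sketch below shows the generic pattern of searching for an STL safety violation by minimizing a robustness metric over simulations. It is a minimal illustration, not the thesis's neural-network-guided algorithm: the toy plant, the property "always |x| < 1", and the names simulate, robustness, and falsify are all hypothetical stand-ins.

```python
# Minimal falsification sketch (illustrative only): sample initial states of a
# toy closed-loop system, compute the robustness of "always |x(t)| < 1", and
# keep the lowest value; a negative robustness is a counterexample trace.
import random

def simulate(x0, steps=50, dt=0.1):
    """Toy closed-loop dynamics: a lightly damped oscillator as a stand-in plant."""
    x, v, trace = x0, 0.0, []
    for _ in range(steps):
        v += dt * (-0.4 * v - x)   # combined "controller + plant" update
        x += dt * v
        trace.append(x)
    return trace

def robustness(trace, bound=1.0):
    """Robustness of G(|x| < bound): min over time of (bound - |x|); negative means violation."""
    return min(bound - abs(x) for x in trace)

def falsify(trials=1000, seed=0):
    rng = random.Random(seed)
    best = (float("inf"), None)
    for _ in range(trials):
        x0 = rng.uniform(-2.0, 2.0)          # sample an initial state
        rho = robustness(simulate(x0))
        if rho < best[0]:
            best = (rho, x0)
        if rho < 0:                          # counterexample found
            return best
    return best

if __name__ == "__main__":
    rho, x0 = falsify()
    print(f"lowest robustness {rho:.3f} at x0={x0:.3f}"
          + (" -> safety violated" if rho < 0 else " -> no violation found"))
```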

    Declarative symbolic pure-logic model checking

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005. Includes bibliographical references (p. 173-181). Model checking, a technique for finding errors in systems, involves building a formal model that describes possible system behaviors and correctness conditions, then using a tool to search for model behaviors that violate the correctness properties. Existing model checkers are well suited to analyzing control-intensive algorithms (e.g., network protocols with simple node state). Many important analyses, however, fall outside the capabilities of existing model checkers; examples include checking algorithms with complex state, distributed algorithms over all network topologies, and highly declarative models. This thesis addresses the problem of building an efficient model checker that overcomes these limitations. The work builds on Alloy, a relational modeling language. Previous work has defined the language and shown that it can be analyzed by translation to SAT. The primary contributions of this thesis are: a modeling paradigm for describing complex structures in Alloy; significant improvements in the scalability of the analyzer; and improvements in the usability of the analyzer through the addition of a debugger for overconstraints. Together, these changes make model checking practical for important new classes of analyses. While the work was done in the context of Alloy, some techniques generalize to other verification tools. by Ilya A. Shlyakhter. S.M.
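
    The essence of Alloy-style analysis is checking an assertion within a bounded scope and returning a concrete counterexample if one exists. The sketch below captures that idea in a heavily simplified form: instead of translating to SAT, it brute-forces every binary relation over a three-atom scope. The atoms, predicates, and the (deliberately false) assertion "symmetric and transitive implies reflexive" are illustrative choices, not material from the thesis.

```python
# Scope-bounded relational checking, simplified to exhaustive enumeration:
# search every binary relation on a 3-atom universe for a counterexample to
# the assertion "symmetric & transitive => reflexive".
from itertools import combinations

ATOMS = ("a0", "a1", "a2")                     # the bounded scope
PAIRS = [(x, y) for x in ATOMS for y in ATOMS]

def symmetric(r):  return all((y, x) in r for (x, y) in r)
def transitive(r): return all((x, z) in r for (x, y) in r for (w, z) in r if y == w)
def reflexive(r):  return all((x, x) in r for x in ATOMS)

def check_assertion():
    """Enumerate every relation within scope; return a violating scenario if any."""
    for k in range(len(PAIRS) + 1):
        for subset in combinations(PAIRS, k):
            r = set(subset)
            if symmetric(r) and transitive(r) and not reflexive(r):
                return r                       # concrete counterexample scenario
    return None

if __name__ == "__main__":
    cex = check_assertion()
    print("counterexample within scope 3:", cex if cex is not None else "none found")
```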

    First-Order Models for Configuration Analysis

    Our world teems with networked devices. Their configuration exerts an ever-expanding influence on our daily lives. Yet correctly configuring systems, networks, and access-control policies is notoriously difficult, even for trained professionals. Automated static analysis techniques provide a way to both verify a configuration's correctness and explore its implications. One such approach is scenario-finding: showing concrete scenarios that illustrate potential (mis-)behavior. Scenarios even benefit users without technical expertise, as concrete examples can both trigger and improve users' intuition about their system. This thesis describes a concerted research effort toward improving scenario-finding tools for configuration analysis. We developed Margrave, a scenario-finding tool with special features designed for security policies and configurations. Margrave is not tied to any one specific policy language; rather, it provides an intermediate input language as expressive as first-order logic. This flexibility allows Margrave to reason about many different types of policy. We show Margrave in action on Cisco IOS, a common language for configuring firewalls, demonstrating that scenario-finding with Margrave is useful for debugging and validating real-world configurations. This thesis also presents a theorem showing that, for a restricted subclass of first-order logic, if a sentence is satisfiable then there must exist a satisfying scenario no larger than a computable bound. For such sentences scenario-finding is complete: one can be certain that no scenarios are missed by the analysis, provided that one checks up to the computed bound. We demonstrate that many common configurations fall into this subclass and give algorithmic tests for both sentence membership and counting; we have implemented both in Margrave. Aluminum is a tool that eliminates superfluous information in scenarios and allows users' goals to guide which scenarios are displayed. We quantitatively show that our methods of scenario reduction and exploration are effective and quite efficient in practice. Our work on Aluminum is making its way into other scenario-finding tools. Finally, we describe FlowLog, a language for network programming that we created with analysis in mind. We show that FlowLog can express many common network programs, and we demonstrate that automated analysis and bug-finding for FlowLog are both feasible and complete.
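
    The sketch below illustrates scenario-finding in the spirit of change-impact analysis on a firewall-style policy, reduced to brute-force enumeration over tiny finite domains. The rule format, field domains, and the two policies are invented for illustration; this is not Cisco IOS syntax or Margrave's intermediate language.

```python
# Scenario-finding for policy change impact: enumerate every packet over small
# bounded domains and report the scenarios where two policy versions disagree.
from itertools import product

SRC  = ("lan", "dmz", "internet")
DST  = ("web", "db")
PORT = (22, 80, 443)

def decide(policy, packet):
    """First matching rule wins; default deny."""
    for action, pred in policy:
        if pred(packet):
            return action
    return "deny"

old_policy = [
    ("permit", lambda p: p["dst"] == "web" and p["port"] in (80, 443)),
    ("permit", lambda p: p["src"] == "lan" and p["port"] == 22),
]
new_policy = [
    ("permit", lambda p: p["dst"] == "web" and p["port"] in (80, 443)),
    ("permit", lambda p: p["port"] == 22),   # an edit that accidentally widens SSH access
]

def change_impact():
    """Yield every concrete packet scenario whose decision changed between versions."""
    for src, dst, port in product(SRC, DST, PORT):
        packet = {"src": src, "dst": dst, "port": port}
        if decide(old_policy, packet) != decide(new_policy, packet):
            yield packet

if __name__ == "__main__":
    for scenario in change_impact():
        print("behaviour changed for:", scenario)
```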

    Understanding cognitive differences in processing competing visualizations of complex systems

    Node-link diagrams are used to represent systems composed of different elements and the relationships among them. Representing such systems with visualizations like node-link diagrams gives individuals a cognitive aid for understanding and effectively managing them. Appropriate visual tools aid task completion by reducing the cognitive load of understanding and solving problems. However, currently developed visualizations lack any evaluation based on cognitive processing. Most evaluations, where they exist, are based on the results of tasks performed with these visualizations and therefore offer no perspective on the cognitive processing required to work with the visualization. This research focuses on understanding the effect of different visualization types and complexities on problem understanding and performance in a visual problem-solving task. Two informationally equivalent but visually different visualizations - geon diagrams based on structural object perception theory and UML diagrams based on object modeling - are investigated to understand the cognitive processes that underlie reasoning with different types of visualizations. Specifically, the two visualizations are used to represent interdependent critical infrastructures, and participants are asked to solve a problem using each. The effectiveness of task completion is measured in terms of the time taken to complete the task and the accuracy of its result. Differences in cognitive processing across the visualizations are measured in terms of the individual's search path and search steps. The results underscore the difference in the effectiveness of the diagrams for solving the same problem: the time taken to complete the task is significantly lower with geon diagrams, and the error rate is also significantly lower. The search path for UML diagrams is more node-dominant, whereas for geon diagrams it is distributed across nodes, links and components (combinations of nodes and links). Evaluation dominates the search steps in geon diagrams, whereas locating steps dominate in UML diagrams. The results also show that the differences in search path and search steps between visualizations increase as the complexity of the diagrams increases. This study helps to establish the importance of a cognitive-level understanding of diagrammatic representations of information for visual problem solving. The results also highlight that measures of the effectiveness of any visualization should include measuring individuals' cognitive processes during the visual task, in addition to the time and accuracy of its result.

    Use of Inferential Statistics to Design Effective Communication Protocols for Wireless Sensor Networks

    This thesis explores the issues and techniques associated with employing the principles of inferential statistics to design effective Medium Access Control (MAC), routing and duty cycle management strategies for multihop Wireless Sensor Networks (WSNs). The main objectives of these protocols are to maximise the throughput of the network, prolong the lifetime of nodes and reduce the end-to-end delay of packets over a general network scenario, without particular consideration of specific topology configurations, traffic patterns or routing policies. WSNs are a leading-edge technology that has received substantial research effort owing to its prominent role in many applications. However, designing effective communication protocols for WSNs is particularly challenging because of the scarce resources of these networks and the requirement for large-scale deployment. The MAC, routing and duty cycle management protocols are amongst the important strategies required to ensure correct operation of WSNs. This thesis draws on inferential statistics to design these protocols; the field was selected because it provides a rich design space with powerful approaches and methods. The MAC protocol proposed in this thesis exploits the statistical characteristics of the Gamma distribution to enable each node to adjust its contention parameters dynamically based on its inference of channel occupancy. This technique reduces the service time of packets and raises throughput by improving channel utilisation. Reducing the service time minimises the energy consumed in contending for the channel, which in turn prolongs the lifetime of nodes. The proposed duty cycle management scheme uses non-parametric Bayesian inference to enable each node to determine the best times and durations for its sleep periods without imposing overhead on the network. Hence the lifetime of a node is prolonged by reducing the energy wasted in overhearing and idle listening. Prolonging the lifetime of nodes increases the throughput of the network and reduces the end-to-end delay, as it allows nodes to route their packets over optimal paths for longer periods. The proposed routing protocol uses a state-of-the-art inference technique, spatial reasoning, which enables each node to determine the spatial relationships between nodes without overwhelming the network with control packets. As a result, the end-to-end delay is reduced while the throughput and lifetime are increased. Besides the proposed protocols, this thesis utilises the analytical aspects of statistics to develop rigorous analytical models that accurately predict queuing delay, medium-access delay and energy consumption over multihop networks. Moreover, the thesis provides a broader perspective on the design of communication protocols for WSNs by casting the operation of these networks in the domains of artificial chemistry and the harmony search optimisation algorithm.
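
    As a rough illustration of the Gamma-based contention idea, the sketch below has a node fit a Gamma distribution to its recently observed channel-busy durations (method-of-moments estimates) and draw its next backoff from that distribution, so contention tracks the inferred channel occupancy. This is a toy sketch, not the thesis's actual MAC scheme; the function names and sample data are invented, and the Gamma draw uses Python's standard-library random.gammavariate.

```python
# Illustrative Gamma-based backoff: estimate Gamma shape/scale from observed
# channel-busy durations and sample the next contention backoff from that fit.
import random
from statistics import mean, pvariance

def fit_gamma(samples):
    """Method-of-moments estimates: shape k = m^2 / v, scale theta = v / m."""
    m, v = mean(samples), pvariance(samples)
    if v == 0:                       # degenerate case: fall back to shape 1 (exponential-like)
        return 1.0, m
    return (m * m) / v, v / m

def next_backoff(busy_durations, rng=random):
    k, theta = fit_gamma(busy_durations)
    return rng.gammavariate(k, theta)

if __name__ == "__main__":
    observed_busy_ms = [3.1, 4.7, 2.9, 6.0, 3.8, 5.2]   # recent channel-busy periods (ms)
    random.seed(1)
    print("next backoff (ms): %.2f" % next_backoff(observed_busy_ms))
```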

    An approach to managing the complexity of knowledge intensive business processes

    Organisations face ever-growing complexity in the business environment and use processes to deliver value in a stable, sustainable and controllable way. However, complexity in the business environment is threatening the stability of processes and forcing their continuing evolution in ever-shorter time cycles, which creates significant management challenges. Addressing this complexity requires a change in management thinking about processes. The research explores the nature of complexity, how businesses respond to it, and the consequent impact on process complexity. It reviews the notion of complexity and its relevance to organisations, business processes and knowledge contexts. The research focuses on knowledge intensive firms, as these exhibit several of the relevant features and allow early application of the approach suggested by this thesis. It draws upon concepts from several fields, including complexity and complex systems, business process management, and knowledge management. This thesis addresses the question: "How can organisations address the complexity of knowledge intensive business processes?" In answering it, the thesis argues the need to integrate the multiple perspectives involved in managing such processes, proposes an approach to complex knowledge intensive business processes that reduces the management challenge, and argues the need to develop an agile shared knowledge context in support of the approach. The thesis develops a theoretical framework consisting of a set of hypotheses rooted in the literature, and then proposes an approach to addressing complex knowledge intensive business processes based upon these hypotheses. Then, through a series of QDS investigations and action research cycles, the thesis tests the hypotheses, further develops the approach and examines its application in different problem domains across multiple organisations. Finally, it discusses the process and outcomes of applying the approach, identifies its limitations, assesses its contribution to knowledge and suggests directions for further research.