2 research outputs found

    Which attacks lead to hazards? Combining safety and security analysis for cyber-physical systems

    Cyber-Physical Systems (CPS) are exposed to a plethora of attacks, and their attack surface is only increasing. However, whilst many attack paths are possible, only some can threaten the system's safety and potentially lead to loss of life; identifying them is essential. We propose a methodology and develop a tool-chain to systematically analyse and enumerate the attacks leading to safety violations. This is achieved by lazily combining threat modelling and safety analysis with formal verification and attack graph analysis. We also identify the minimum sets of privileges that must be protected to preserve safety. We demonstrate the effectiveness of our methodology in discovering threat scenarios by applying it to a Communication-Based Train Control system. Our design choices emphasise compatibility with existing safety and security frameworks, whilst remaining agnostic to specific tools or attack graph representations.
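    As a toy illustration of the last idea in the abstract (the minimum sets of privileges that must be protected): if each attack path is modelled as the set of privileges an attacker needs, then a protection set preserves safety exactly when it intersects every path, and the minimum such sets are the minimum-cardinality hitting sets. The sketch below is a brute-force version with made-up privilege names; it is not the paper's tool-chain, just an assumption-laden illustration of the concept.

    ```python
    from itertools import combinations

    # Hypothetical example: each attack path is the set of privileges an
    # attacker must obtain to cause a safety violation.
    attack_paths = [
        {"net_access", "plc_write"},
        {"net_access", "hmi_login"},
        {"usb_port", "plc_write"},
    ]

    def minimal_protection_sets(paths):
        """Enumerate minimum-cardinality sets of privileges that block
        every attack path (i.e. minimum hitting sets), by increasing size."""
        privileges = sorted(set().union(*paths))
        found = []
        for size in range(1, len(privileges) + 1):
            for combo in combinations(privileges, size):
                s = set(combo)
                if all(s & p for p in paths):  # s blocks every path
                    found.append(s)
            if found:
                break  # all sets of the minimum size have been collected
        return found

    print(minimal_protection_sets(attack_paths))
    ```

    Protecting any one of the returned sets severs every listed attack path; real attack graphs require smarter enumeration than this exponential search.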

    Deep Learning for Abstraction, Control and Monitoring of Complex Cyber-Physical Systems

    Cyber-Physical Systems (CPS) consist of digital devices that interact with physical components. Their popularity and complexity are growing exponentially, giving birth to new, previously unexplored, safety-critical application domains. As CPS permeate our daily lives, it becomes imperative to reason about their reliability. Formal methods provide rigorous techniques for the verification, control and synthesis of safe and reliable CPS. However, these methods do not scale with the complexity of the system, which limits their applicability to real-world problems. A promising strategy is to leverage deep learning techniques to tackle the scalability issue of formal methods, transforming infeasible problems into approximately solvable ones. The approximate models are trained over observations that are solutions of the formal problem. In this thesis, we focus on the following computationally challenging tasks: the modeling and simulation of a complex stochastic model, the design of a safe and robust control policy for a system acting in a highly uncertain environment, and the runtime verification problem under full or partial observability. Our deep-learning-based approaches are applicable to real-world complex and safety-critical systems acting under strict real-time constraints and in the presence of a significant amount of uncertainty.
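    A minimal sketch of the training strategy the abstract describes, with stand-in names throughout: suppose an expensive formal procedure `formal_oracle(x)` returns the probability that a stochastic model satisfies a property from initial state `x`. We can sample it offline and fit a cheap surrogate (here a polynomial least-squares fit rather than a deep network, purely to keep the example small) that answers queries in real time.

    ```python
    import numpy as np

    # Hypothetical stand-in for an expensive formal analysis (e.g. exact
    # probabilistic model checking): maps an initial state to the
    # probability that a safety property holds.
    def formal_oracle(x):
        return 1.0 / (1.0 + np.exp(-3.0 * (x - 0.5)))  # smooth, monotone

    # Offline phase: sample the oracle on a grid of initial states.
    xs = np.linspace(0.0, 1.0, 50)
    ys = formal_oracle(xs)

    # Training phase: fit a cheap approximate model (degree-5 polynomial)
    # to the sampled solutions of the formal problem.
    coeffs = np.polyfit(xs, ys, deg=5)
    surrogate = np.poly1d(coeffs)

    # Online phase: the surrogate answers queries in constant time.
    query = 0.73
    print(f"exact={formal_oracle(query):.4f}  approx={surrogate(query):.4f}")
    ```

    In the thesis setting the surrogate would be a neural network and the oracle a model checker or simulator, but the offline-sample / fit / online-query structure is the same.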