
    Bibliographical review on cyber attacks from a control oriented perspective

    This paper presents a bibliographical review of definitions, classifications, and applications concerning cyber attacks in networked control systems (NCSs) and cyber-physical systems (CPSs). The review tackles the topic from a control-oriented perspective, which is complementary to information- and communication-oriented ones. After motivating the importance of developing new methods for attack detection and secure control, the review presents security objectives, attack modeling, and a characterization of the considered attacks and threats, together with their detection mechanisms and remedial actions. To illustrate the properties of each attack, and to provide deeper insight into possible defense mechanisms, examples available in the literature are discussed. Finally, open research issues and directions are presented.

    The Complexity of Verifying Boolean Programs as Differentially Private

    We study the complexity of verifying differential privacy for while-like programs working over boolean values and making probabilistic choices. Programs in this class can be interpreted as finite-state discrete-time Markov chains (DTMCs). We show that the problem of deciding whether a program is differentially private for specific values of the privacy parameters is PSPACE-complete. To show that this problem is in PSPACE, we adapt classical results about computing hitting probabilities for DTMCs. To show PSPACE-hardness, we use a reduction from the problem of checking whether a program almost surely terminates. We also show that the problem of approximating the privacy parameters that a program provides is PSPACE-hard. Moreover, we investigate the complexity of similar problems for several relaxations of differential privacy: Rényi differential privacy, concentrated differential privacy, and truncated concentrated differential privacy. For these notions, we consider gap versions of the problem of deciding whether a program is private, and we show that all of them are PSPACE-complete.
    Comment: Appeared in CSF 202
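The PSPACE upper bound rests on a classical step: computing hitting probabilities in a finite DTMC by solving a linear system. As a minimal illustration of that classical computation (not the paper's construction), the sketch below solves h = A·h + b exactly over rationals for a toy chain; the state names and transition probabilities are invented for the example.

```python
from fractions import Fraction as F

def hitting_probs(P, target):
    """Probability of eventually reaching `target` from each transient state.

    P maps each transient state to its outgoing distribution (a dict over
    all states); states not in P are absorbing.  The probabilities h(s)
    satisfy h = A h + b, solved exactly here by Gauss-Jordan elimination."""
    trans = list(P)
    n = len(trans)
    idx = {s: k for k, s in enumerate(trans)}
    # Augmented matrix for (I - A) h = b.
    M = [[F(0)] * (n + 1) for _ in range(n)]
    for k, s in enumerate(trans):
        M[k][k] += F(1)
        for t, p in P[s].items():
            if t in idx:
                M[k][idx[t]] -= p   # entry of A (transition to a transient state)
            elif t == target:
                M[k][n] += p        # entry of b (one-step hit of the target)
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        pv = M[col][col]
        M[col] = [x / pv for x in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return {s: M[idx[s]][n] for s in trans}

# Two transient program states; "acc" and "rej" are absorbing outcomes.
chain = {
    0: {1: F(1, 2), "acc": F(1, 2)},
    1: {0: F(1, 4), "acc": F(1, 4), "rej": F(1, 2)},
}
print(hitting_probs(chain, "acc"))  # {0: Fraction(5, 7), 1: Fraction(3, 7)}
```

Exact rational arithmetic matters here: the decision problem asks about precise privacy-parameter values, so floating-point hitting probabilities would not suffice.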

    Secure and Privacy-Preserving Cyber-Physical Systems

    This thesis studies the problem of privacy-preserving estimator and control design in a multi-agent system composed of uncertain individual linear systems, and the problem of designing undetectable attacks and attack-resilient estimators for cyber-physical systems. Large-scale monitoring and control systems enabling a more intelligent infrastructure increasingly rely on sensitive data obtained from private agents, e.g., location traces collected from the users of an intelligent transportation system or medical records collected from patients for intelligent health monitoring. Nevertheless, privacy considerations can make agents reluctant to share the information necessary to improve the performance of an intelligent infrastructure. To encourage the participation of these agents, it becomes critical to design algorithms that process information in a privacy-preserving way.
    The first part of this thesis considers scenarios in which the individual agent systems are independent linear Gaussian systems. We revisit the Kalman filtering and Linear Quadratic Gaussian (LQG) control problems subject to privacy constraints. We aim to enforce differential privacy, a formal, state-of-the-art definition of privacy ensuring that the output of an algorithm is not too sensitive to the data collected from any single participating agent. We propose a two-stage architecture, which first aggregates and combines the individual agent signals before adding privacy-preserving noise and post-filtering the result to be published. We show that this architecture offers a significant performance improvement over input perturbation schemes as the number of input signals increases, and that an optimal static aggregation stage can be computed by solving a semidefinite program. The two-stage architecture, which we develop first for Kalman filtering, is then adapted to the LQG control problem by leveraging the separation principle. We provide numerical simulations that illustrate the performance improvements over differentially private algorithms without first-stage signal aggregation.
    The second part of this thesis considers the problem of privacy-preserving estimator design for a multi-agent system composed of individual linear time-invariant systems affected by uncertainties whose statistical properties are not available; only bounds on these uncertainties are given a priori. We propose a privacy-preserving interval estimator architecture, which publicly releases estimates of lower and upper bounds for an aggregate of the states of the individual systems. In particular, we add bounded privacy-preserving noise to each participant's data before sending it to the estimator. The estimates published by the observer guarantee differential privacy for the agents' data. We provide a numerical simulation that illustrates the behavior of the proposed architecture.
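The performance gap between input perturbation and aggregate-then-perturb can be seen in a scalar toy setting. The sketch below is only an illustration of the general principle under invented parameters (noise scale, signal bounds), not the thesis' two-stage Kalman/LQG architecture or its SDP-optimized aggregation stage: averaging n bounded signals first shrinks the sensitivity to any single agent by a factor of n, so the calibrated Gaussian noise on the release, and hence the error, shrinks accordingly.

```python
import random
import statistics

def release_input_perturbation(x, sigma, rng):
    # Noise each agent's signal before aggregating: the released mean
    # carries noise of variance sigma^2 / n.
    noisy = [v + rng.gauss(0, sigma) for v in x]
    return sum(noisy) / len(noisy)

def release_aggregate_then_perturb(x, sigma, rng):
    # Aggregate first: one agent can move the mean by at most 1/n of its
    # range, so the calibrated noise scale shrinks by n, giving noise of
    # variance (sigma / n)^2 on the released mean.
    n = len(x)
    return sum(x) / n + rng.gauss(0, sigma / n)

rng = random.Random(0)
x = [rng.uniform(0, 1) for _ in range(100)]  # 100 bounded agent signals
true_mean = sum(x) / len(x)
sigma = 1.0                                  # hypothetical noise scale

mse_ip = statistics.mean(
    (release_input_perturbation(x, sigma, rng) - true_mean) ** 2
    for _ in range(2000))
mse_ag = statistics.mean(
    (release_aggregate_then_perturb(x, sigma, rng) - true_mean) ** 2
    for _ in range(2000))
print(mse_ip, mse_ag)  # aggregation-first error is roughly n times smaller
```

Both releases add noise calibrated to the same per-agent sensitivity, so they offer comparable privacy to each agent; the aggregation-first scheme simply pays far less in accuracy, mirroring the improvement the thesis reports as the number of input signals grows.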