    Emerging privacy challenges and approaches in CAV systems

    The growth of Internet-connected devices, Internet-enabled services and Internet of Things systems continues at a rapid pace, and their application to transport systems is heralded as game-changing. Numerous developing CAV (Connected and Autonomous Vehicle) functions, such as traffic planning, optimisation, management, safety-critical and cooperative autonomous driving applications, rely on data from various sources. The efficacy of these functions is highly dependent on the dimensionality, amount and accuracy of the data being shared. In general, the greater the amount of data available, the greater the efficacy of the function. However, much of this data is privacy-sensitive, including personal, commercial and research data. Location data, and its correlation with identity and temporal data, can help infer other personal information, such as home/work locations, age, job, behavioural features, habits and social relationships. This work categorises the emerging privacy challenges and solutions for CAV systems and identifies knowledge gaps for future research aimed at minimising and mitigating privacy concerns without hampering the efficacy of these functions.
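    The inference risk described above can be made concrete with a toy example. The sketch below is a hypothetical illustration rather than anything from the paper: it guesses an individual's home and workplace simply by taking the most frequent night-time and office-hours locations from a timestamped trace.

```python
# Toy illustration of location/temporal inference: home and workplace are
# guessed as the most frequent night-time and office-hours locations.
# The records, cell identifiers and hour thresholds are hypothetical.
from collections import Counter

records = [  # (hour_of_day, coarse_location)
    (2, "cell_A"), (23, "cell_A"), (3, "cell_A"),    # night-time dwell -> likely home
    (10, "cell_B"), (11, "cell_B"), (15, "cell_B"),  # office hours -> likely workplace
    (18, "cell_C"),                                  # evening errand
]

night = Counter(loc for hour, loc in records if hour >= 22 or hour <= 6)
office = Counter(loc for hour, loc in records if 9 <= hour <= 17)

likely_home = night.most_common(1)[0][0]        # "cell_A"
likely_workplace = office.most_common(1)[0][0]  # "cell_B"
print(likely_home, likely_workplace)
```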

    Integrity and Privacy Protection for Cyber-physical Systems (CPS)

    Present-day interoperable and interconnected cyber-physical systems (CPS) provide significant value in our daily lives through the incorporation of advanced technologies, but they also increase exposure to many security and privacy risks, such as (1) maliciously manipulating CPS data and sensors to compromise the integrity of the system, (2) launching internal or external cyber-physical attacks on CPS that depend on a central controller, causing single-point-of-failure issues, and (3) running malicious data and query analytics on CPS data to extract internal insights and exploit them for financial gain. Moreover, protecting the privacy of CPS data during sharing, aggregation and publishing has also become challenging, because most existing CPS security and privacy solutions have drawbacks such as (a) the lack of a proper vulnerability characterization model to accurately identify where privacy protection is needed, (b) ignoring data providers' privacy preferences, and (c) applying uniform privacy protection, which may leave some providers with inadequate privacy while overprotecting others. To address these issues, the primary purpose of this thesis is to develop a decentralized, peer-to-peer (P2P)-connected data privacy preservation model that improves the integrity of CPS against malicious attacks. We adopt blockchain to provide a decentralized and highly secure system model for CPS with self-defensive capabilities. The proposed model mitigates data manipulation attacks from malicious entities by introducing Bloom filter-based fast CPS device identity validation and Merkle tree-based fast data verification, while the blockchain consensus maintains consistency and excludes malicious entities from the protection framework. Furthermore, to address the data privacy issues in CPS, we propose a personalized data privacy model that introduces a standard vulnerability profiling library (SVPL) to characterize and quantify CPS vulnerabilities and identify the necessary privacy requirements. Based on this model, we present our personalized data privacy (PDP) framework, in which Laplace noise is added according to each node's selected privacy preferences. Combining these two proposed methods, we demonstrate that the blockchain-based system model is scalable and fast enough for integrity verification of CPS data, and that the proposed PDP model can attain better data privacy by eliminating the trade-off between privacy, utility and the risk of losing information.
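    The integrity side of the proposed model rests on standard building blocks: Bloom filters for fast membership checks and Merkle trees for batch data verification. The sketch below is a minimal, generic Merkle-root check rather than the thesis's implementation; the SHA-256 hashing, the reading format and the idea of anchoring the root on the blockchain are assumptions made for illustration.

```python
# Minimal sketch of Merkle-tree-based data verification for a batch of CPS
# sensor readings. A verifier holding only the root (e.g., anchored on the
# blockchain) can detect any tampering with the batch.
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute the Merkle root over hashed leaves, duplicating the last node on odd levels."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

readings = [b"sensor-1:42.0", b"sensor-2:17.3", b"sensor-3:99.9"]
anchored_root = merkle_root(readings)

tampered = [b"sensor-1:42.0", b"sensor-2:99999.0", b"sensor-3:99.9"]
assert merkle_root(tampered) != anchored_root  # manipulation is detected
```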

    Differential Privacy for Industrial Internet of Things: Opportunities, Applications and Challenges

    The development of the Internet of Things (IoT) is bringing changes to many fields; in particular, the industrial Internet of Things (IIoT) is driving a new round of industrial revolution. As IIoT applications multiply, privacy protection issues are emerging. Notably, common algorithms in IIoT such as deep models rely strongly on data collection, which creates a risk of privacy disclosure. Recently, differential privacy has been used to protect user-terminal privacy in IIoT, so in-depth research on this topic is needed. In this paper, we conduct a comprehensive survey of the opportunities, applications and challenges of differential privacy in IIoT. We first review related work on IIoT and on privacy protection, respectively. We then focus on metrics for industrial data privacy and analyze the contradiction between data utilization for deep models and individual privacy protection. Several valuable open problems are summarized and new research ideas are put forward. In conclusion, this survey provides a comprehensive summary and lays a foundation for follow-up research on industrial differential privacy.
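    A common way deep models are trained under differential privacy, and a concrete instance of the utility-versus-privacy contradiction the survey analyzes, is per-example gradient clipping followed by noise addition. The sketch below is a generic illustration with NumPy; the parameter names and values (clip_norm, noise_multiplier) are assumptions, not taken from the paper.

```python
# Generic sketch of differentially private gradient averaging: clip each
# per-example gradient to bound sensitivity, then add Gaussian noise before
# averaging. All parameter values are illustrative.
import numpy as np

def dp_average_gradient(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, seed=None):
    rng = np.random.default_rng(seed)
    grads = np.asarray(per_example_grads, dtype=float)
    # Clip each example's gradient to an L2 norm of at most clip_norm.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    # Add Gaussian noise scaled to the clipping bound, then average.
    noisy_sum = clipped.sum(axis=0) + rng.normal(0.0, noise_multiplier * clip_norm,
                                                 size=grads.shape[1])
    return noisy_sum / grads.shape[0]

# Raising noise_multiplier strengthens privacy but degrades model utility --
# the tension between data utilization and individual privacy protection.
update = dp_average_gradient(np.random.randn(32, 10))
```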

    Analysis of Privacy-aware Data Sharing in Cyber-physical Energy Systems

    In this thesis, we determine the key factors and correlations among the privacy, security and utility requirements of grid networks to ensure effective inter- and intra-actions within physical-layer equipment (e.g., distributed energy resources (DERs) and intelligent electronic devices (IEDs)). We conduct a comprehensive analysis of the existing consensus mechanisms in blockchain-enabled smart grids and point out the remaining research gaps. We develop a practical and effective consensus mechanism for a private, permissioned blockchain-enabled supervisory control and data acquisition (SCADA) system. Moreover, we bridge a common and popular industrial control system (ICS) protocol, distributed network protocol 3 (DNP3), with the blockchain network to ensure smooth operation. In addition, we develop differential privacy (DP)-enabled strategies to achieve the data security, privacy and utility requirements of the power system network under an adversarial setting. Specifically, we analyze and develop a provable correlation between privacy loss and the other DP parameters, considering variations of attacks and their impacts along with DP constraints. This enables modern power grid designers to develop, design and employ DP-based fault-tolerant models in data-driven power grid operation and control. Furthermore, we conduct feasibility and quality-of-service (QoS) analyses of the DP mechanism and the grid to achieve certified robustness. The feasibility analysis of the privacy measure assesses the practicability of differential privacy in grid operation and warns operators about possible failures and incoming attacks on physical-layer operations. QoS is analyzed in the power grid in terms of data accuracy, computational overhead and resource utilization.
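    One way to make the relationship between privacy loss and the other DP parameters concrete is the Laplace mechanism with sequential composition: the noise scale is sensitivity/epsilon, and repeated releases spend epsilon additively. The sketch below is a generic illustration only; the sensitivity value, per-query epsilon and measurement series are assumptions, not results from the thesis.

```python
# Generic sketch: epsilon-DP release of a grid aggregate via the Laplace
# mechanism, a sequential-composition account of total privacy loss, and a
# basic accuracy (QoS) check. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng()

def laplace_release(true_value, sensitivity, epsilon):
    """Add Laplace noise of scale sensitivity/epsilon to achieve epsilon-DP."""
    return true_value + rng.laplace(0.0, sensitivity / epsilon)

sensitivity = 5.0        # assumed max influence of one DER on the aggregate (kW)
per_query_epsilon = 0.1  # privacy budget spent per released measurement
readings = [1020.0, 1034.5, 998.2]

released = [laplace_release(r, sensitivity, per_query_epsilon) for r in readings]
total_epsilon = per_query_epsilon * len(readings)   # sequential composition

# QoS in terms of data accuracy: mean absolute error of the noisy release.
mae = float(np.mean([abs(a - b) for a, b in zip(released, readings)]))
print(f"total epsilon = {total_epsilon:.2f}, MAE = {mae:.2f} kW")
```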

    Local Differential Privacy In Smart Manufacturing: Application Scenario, Mechanisms and Tools

    To utilize the potential of machine learning and deep learning, enormous amounts of data are required, and sharing and publishing data sets helps in finding optimal solutions. Because publicly released datasets have suffered privacy leaks and attackers have exposed individuals' sensitive information, the research field of differential privacy develops solutions to prevent such disclosures in the future. Compared to other domains, applying differential privacy in the manufacturing context is particularly challenging: manufacturing data contains sensitive information about companies and their process knowledge, products and orders, and data about the individuals operating machines could be exposed, allowing their performance to be evaluated. This paper describes scenarios for how differential privacy can be used in the manufacturing context. In particular, it addresses the potential threats that arise when sharing manufacturing data by identifying different manufacturing parameters and their variable types. Simplified examples show how differentially private mechanisms can be applied to binary, numeric and categorical variables as well as time series. Finally, libraries that enable the productive use of differential privacy are presented.
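    As one concrete instance of a local mechanism for a binary variable, the sketch below uses classic randomized response: each machine or operator perturbs its own bit before reporting, and only the aggregate rate is recovered by the collector. The quality-flag scenario and parameter values are hypothetical and not taken from the paper.

```python
# Randomized response: a standard local-DP mechanism for a binary attribute
# (e.g., whether an operator's cycle met a quality threshold). Each device
# reports a perturbed bit; the collector recovers only the aggregate rate.
import math
import random

def randomized_response(bit, epsilon):
    """Report the true bit with probability e^eps / (e^eps + 1), else flip it (epsilon-LDP)."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def estimate_true_rate(reports, epsilon):
    """Unbiased estimate of the true proportion of 1s from the perturbed reports."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed + p - 1.0) / (2.0 * p - 1.0)

true_bits = [1] * 700 + [0] * 300                      # 70% true rate, illustrative
reports = [randomized_response(b, epsilon=1.0) for b in true_bits]
print(estimate_true_rate(reports, epsilon=1.0))        # close to 0.7 on average
```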