
    Smart Meter Privacy with Renewable Energy and a Finite Capacity Battery

    We address the smart meter (SM) privacy problem by considering the availability of a renewable energy source (RES) and a battery which can be exploited by a consumer to partially hide the consumption pattern from the utility provider (UP). Privacy is measured by the mutual information rate between the consumer's energy consumption and the renewable energy generation process, and the energy received from the grid, where the latter is known by the UP through the SM readings, and the former two are to be kept private. By expressing the information leakage as an additive quantity, we cast the problem as a stochastic control problem, and formulate the corresponding Bellman equations. Comment: To appear in IEEE SPAWC 201
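Once the leakage is expressed as an additive per-step cost, the Bellman equations can be solved by standard dynamic programming. The sketch below is a toy value-iteration illustration only, not the paper's method: the battery model, the constants, and especially the cost function (a squared deviation of the grid draw from a constant target, standing in for the paper's information-leakage term, which would require belief states) are all my own assumptions.

```python
# Toy value iteration for a battery MDP with an additive per-step cost.
# State: battery charge level; action: energy drawn from the grid.
# cost() is an illustrative proxy for leakage, NOT the paper's MI-based cost.

B_MAX = 3    # battery capacity (toy units)
DEMAND = 1   # constant household demand per step (toy assumption)
TARGET = 1   # constant grid draw the consumer tries to "hide" behind
GAMMA = 0.9  # discount factor

states = range(B_MAX + 1)

def step(b, a):
    """Next battery level after drawing `a` from the grid and serving DEMAND."""
    return max(0, min(B_MAX, b + a - DEMAND))

def cost(a):
    """Proxy leakage cost: squared deviation of the grid draw from TARGET."""
    return (a - TARGET) ** 2

def value_iteration(iters=200):
    """Iterate the Bellman operator V(s) = min_a [cost(a) + GAMMA * V(s')]."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        V = {
            s: min(
                cost(a) + GAMMA * V[step(s, a)]
                for a in range(B_MAX + DEMAND + 1)
                if s + a >= DEMAND  # demand must be met from battery + grid
            )
            for s in states
        }
    return V
```

In this toy instance the optimal policy draws exactly TARGET from the grid every step, so the optimal value is zero in every state; with a genuine MI-based cost the state would also have to track the UP's belief, which is what makes the real problem hard.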

    Privacy-Cost Management in Smart Meters with Mutual Information-Based Reinforcement Learning

    The rapid development and expansion of the Internet of Things (IoT) paradigm has drastically increased the collection and exchange of data between sensors and systems, a phenomenon that raises serious privacy concerns in some domains. In particular, Smart Meters (SMs) share fine-grained electricity consumption of households with utility providers that can potentially violate users' privacy as sensitive information is leaked through the data. In order to enhance privacy, the electricity consumers can exploit the availability of physical resources such as a rechargeable battery (RB) to shape their power demand as dictated by a Privacy-Cost Management Unit (PCMU). In this paper, we present a novel method to learn the PCMU policy using Deep Reinforcement Learning (DRL). We adopt the mutual information (MI) between the user's demand load and the masked load seen by the power grid as a reliable and general privacy measure. Unlike previous studies, we model the whole temporal correlation in the data to learn the MI in its general form and use a neural network to estimate the MI-based reward signal to guide the PCMU learning process. This approach is combined with a model-free DRL algorithm known as the Deep Double Q-Learning (DDQL) method. The performance of the complete DDQL-MI algorithm is assessed empirically using an actual SMs dataset and compared with simpler privacy measures. Our results show significant improvements over state-of-the-art privacy-aware demand shaping methods.
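The Double Q-Learning bootstrap target at the core of DDQL is standard: the online network selects the next action and the target network evaluates it, which reduces the overestimation bias of vanilla Q-learning. The sketch below shows just that target computation with plain dictionary Q-tables; in the paper the reward would come from a neural MI estimator, whereas here it is simply a number, and all names are illustrative.

```python
# Double Q-Learning target for one transition (toy tabular sketch).
# Online table picks the greedy next action; target table evaluates it.

def ddql_target(reward, next_state, q_online, q_target, gamma=0.99):
    """Compute the double-Q bootstrap target r + gamma * Q_target(s', argmax_a Q_online(s', a))."""
    best_a = max(q_online[next_state], key=q_online[next_state].get)
    return reward + gamma * q_target[next_state][best_a]

# Illustrative tables: the online net prefers a1, but the target net
# values a1 lower, so the bootstrap stays conservative.
q_online = {"s1": {"a0": 1.0, "a1": 2.0}}
q_target = {"s1": {"a0": 0.5, "a1": 0.1}}
target = ddql_target(1.0, "s1", q_online, q_target, gamma=0.99)
```

Had the target network also done the action selection (vanilla DQN), the bootstrap here would use the larger value 0.5 instead of 0.1, illustrating the overestimation that the double-Q decoupling avoids.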

    Privacy-cost trade-offs in demand-side management with storage

    Demand-side energy management (EM) is studied from a privacy-cost trade-off perspective, considering time-of-use pricing and the presence of an energy storage unit. Privacy is measured as the variation of the power withdrawn from the grid from a fixed target value. Assuming non-causal knowledge of the household's aggregate power demand profile and the electricity prices at the energy management unit (EMU), the privacy-cost trade-off is formulated as a convex optimization problem, and a low-complexity backward water-filling algorithm is proposed to compute the optimal EM policy. The problem is studied also in the online setting assuming that the power demand profile is known to the EMU only causally, and the optimal EM policy is obtained numerically through dynamic programming (DP). Due to the high computational cost of DP, a low-complexity heuristic EM policy with a performance close to the optimal online solution is also proposed, exploiting the water-filling algorithm obtained in the offline setting. As an alternative, information theoretic leakage rate is also evaluated, and shown to follow a similar trend as the load variance, which supports the validity of the load variance as a measure of privacy. Finally, the privacy-cost trade-off, and the impact of the size of the storage unit on this trade-off, are studied through numerical simulations using real smart meter data in both the offline and online settings.
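The paper's backward water-filling algorithm additionally handles time-of-use prices, storage constraints, and the backward pass over time; as a rough illustration of the underlying idea only, here is the generic water-filling step, where a budget is poured over slots until a common "water level" is reached. The function name and the bisection setup are my own, not taken from the paper.

```python
# Generic water-filling via bisection on the water level mu:
# allocate `budget` across slots so each slot is filled up to a common
# level, with slots that already sit above the level receiving nothing.

def water_fill(levels, budget, tol=1e-9):
    """Return per-slot allocations max(0, mu - level) with sum == budget."""
    lo, hi = min(levels), max(levels) + budget
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - n) for n in levels)
        if used > budget:   # water level too high: too much poured in
            hi = mu
        else:               # water level too low: budget not exhausted
            lo = mu
    mu = (lo + hi) / 2
    return [max(0.0, mu - n) for n in levels]

# Two slots at heights 0 and 1 with 3 units of water: the common level
# settles at 2, giving allocations of 2 and 1 respectively.
alloc = water_fill([0.0, 1.0], 3.0)
```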

    On the Impact of Side Information on Smart Meter Privacy-Preserving Methods

    Smart meters (SMs) can pose privacy threats for consumers, an issue that has received significant attention in recent years. This paper studies the impact of Side Information (SI) on the performance of distortion-based real-time privacy-preserving algorithms for SMs. In particular, we consider a deep adversarial learning framework, in which the desired releaser (a recurrent neural network) is trained by fighting against an adversary network until convergence. To define the loss functions, two different approaches are considered: the Causal Adversarial Learning (CAL) and the Directed Information (DI)-based learning. The main difference between these approaches is in how the privacy term is measured during the training process. On the one hand, the releaser in the CAL method, by getting supervision from the actual values of the private variables and feedback from the adversary performance, tries to minimize the adversary log-likelihood. On the other hand, the releaser in the DI approach completely relies on the feedback received from the adversary and is optimized to maximize its uncertainty. The performance of these two algorithms is evaluated empirically using real-world SMs data, considering an attacker with access to SI (e.g., the day of the week) that tries to infer the occupancy status from the released SMs data. The results show that, although they perform similarly when the attacker does not exploit the SI, in general, the CAL method is less sensitive to the inclusion of SI. However, in both cases, privacy levels are significantly affected, particularly when multiple sources of SI are included.
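The CAL privacy term can be pictured as a scalar objective: the releaser is penalized in proportion to the adversary's average log-likelihood of the true private values, traded off against a utility distortion term. The sketch below is a loose illustration of that trade-off with made-up names; the actual loss in the paper operates on sequences through recurrent networks and a weighting that this toy version does not capture.

```python
import math

def cal_releaser_loss(distortion, adv_probs, lam=1.0):
    """Toy CAL-style releaser objective: utility distortion plus lam times
    the adversary's average log-likelihood of the true private labels.
    Minimizing this pushes the adversary's correct-class probabilities down."""
    adv_loglik = sum(math.log(p) for p in adv_probs) / len(adv_probs)
    return distortion + lam * adv_loglik

# A release that leaves the adversary at chance (p = 0.5) scores lower
# (better for privacy) than one the adversary predicts with p = 0.9.
confused = cal_releaser_loss(0.1, [0.5, 0.5])
leaky = cal_releaser_loss(0.1, [0.9, 0.9])
```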

    Privacy-aware smart metering: progress and challenges

    The next-generation energy network, the so-called smart grid (SG), promises tremendous increases in efficiency, safety, and flexibility in managing the electricity grid as compared to the legacy energy network. This is needed today more than ever, as global energy consumption is growing at an unprecedented rate and renewable energy sources (RESs) must be seamlessly integrated into the grid to assure a sustainable human development.

    Privacy and security in cyber-physical systems

    Data privacy has attracted increasing attention in the past decade due to the emerging technologies that require our data to provide utility. Service providers (SPs) encourage users to share their personal data in return for a better user experience. However, users' raw data usually contains implicit sensitive information that can be inferred by a third party. This raises great concern about users' privacy. In this dissertation, we develop novel techniques to achieve a better privacy-utility trade-off (PUT) in various applications. We first consider smart meter (SM) privacy and employ physical resources to minimize the information leakage to the SP through SM readings. We measure privacy using information-theoretic metrics and find private data release policies (PDRPs) by formulating the problem as a Markov decision process (MDP). We also propose noise injection techniques for time-series data privacy. We characterize optimal PDRPs measuring privacy via mutual information (MI) and utility loss via added distortion. Reformulating the problem as an MDP, we solve it using deep reinforcement learning (DRL) for real location trace data. We also consider a scenario for hiding an underlying "sensitive" variable and revealing a "useful" variable for utility by periodically selecting from among sensors to share the measurements with an SP. We formulate this as an optimal stopping problem and solve it using DRL. We then consider privacy-aware communication over a wiretap channel. We maximize the information delivered to the legitimate receiver, while minimizing the information leakage from the sensitive attribute to the eavesdropper. We propose using a variational autoencoder (VAE) and validate our approach with the colored and annotated MNIST dataset. Finally, we consider defenses against active adversaries in the context of security-critical applications. We propose an adversarial example (AE) generation method exploiting the data distribution. We perform adversarial training using the proposed AEs and evaluate the performance against real-world adversarial attacks.