Universal privacy guarantees for smart meters
Smart meters enable improvements in electricity distribution system efficiency at some cost in customer privacy. Users with home batteries can mitigate this privacy loss by applying charging policies that mask their underlying energy use. A battery charging policy is proposed and shown to provide universal privacy guarantees subject to a constraint on energy cost. The guarantee bounds our strategy's maximal information leakage from the user to the utility provider under general stochastic models of user energy consumption. The policy construction adapts coding strategies for non-probabilistic permuting channels to this privacy problem.
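The masking idea above can be illustrated with a toy battery policy. This is a hypothetical sketch, not the policy analyzed in the work: grid requests are restricted to a small set of levels and the battery absorbs the difference between the request and the true demand, so the request sequence reveals little about fine-grained usage. All names and parameter values below are illustrative assumptions.

```python
# Toy battery policy sketch (hypothetical, not the policy analyzed here):
# grid requests are restricted to a few fixed levels and the battery
# absorbs the difference, masking fine-grained usage.
CAPACITY = 5.0            # battery capacity in kWh (assumed)
LEVELS = [0.0, 2.0, 4.0]  # allowed grid-request levels in kWh (assumed)

def masked_requests(consumption, capacity=CAPACITY, levels=LEVELS):
    charge = capacity / 2  # start half full
    requests = []
    for demand in consumption:
        # choose the smallest request level that keeps the battery's
        # state of charge within [0, capacity]
        feasible = [r for r in levels
                    if 0.0 <= charge + r - demand <= capacity]
        r = min(feasible) if feasible else max(levels)
        charge = min(capacity, max(0.0, charge + r - demand))
        requests.append(r)
    return requests
```

Whatever the demand sequence, the utility provider observes only values drawn from the small set of allowed levels.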
A Distributed and Real-time Machine Learning Framework for Smart Meter Big Data
The advanced metering infrastructure allows smart meters to collect high-resolution consumption data, thereby enabling consumers and utilities to understand their energy usage at different levels, which has led to numerous smart grid applications. Smart meter data, however, poses challenges to developing machine learning frameworks that classic theoretical frameworks do not address, owing to its big data features and privacy limitations.
Therefore, in this work, we aim to address the challenges of building machine learning frameworks for smart meter big data. Specifically, our work comprises three parts: 1) We first analyze and compare different learning algorithms for multi-level smart meter big data. A daily activity pattern recognition model is developed based on non-intrusive load monitoring for appliance-level smart meter data, and a consensus-based load profiling and forecasting system is proposed for analysis at the individual building level and at higher aggregated levels; 2) Moving from this offline perspective, a universal online functional analysis model is proposed for multi-level real-time smart meter big data analysis. The proposed model consists of a multi-scale load dynamic profiling unit based on functional clustering and a multi-scale online load forecasting unit based on functional deep neural networks; together, the two units enable online tracking of dynamic cluster trajectories and online forecasting of daily multi-scale demand; 3) To enable smart meter data analysis in distributed environments, FederatedNILM is proposed and combined with differential privacy to provide privacy guarantees for the appliance-level distributed machine learning framework. Based on federated deep learning enhanced with two schemes, namely a utility optimization scheme and a privacy-preserving scheme, the proposed distributed and privacy-preserving machine learning framework enables electric utilities and service providers to offer smart meter services on a large scale.
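As a rough illustration of combining federated learning with differential privacy, one server round can clip each client's model update and add Gaussian noise to the aggregate. This is a generic sketch of that pattern, not the FederatedNILM implementation; the function name and the `clip` and `noise_std` parameters are hypothetical.

```python
import numpy as np

# One round of federated averaging with Gaussian-noise differential
# privacy -- a generic sketch, not the FederatedNILM implementation.
# `clip` (L2 clipping norm) and `noise_std` are hypothetical parameters.
def dp_fed_avg(client_updates, clip=1.0, noise_std=0.1, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)
    clipped = []
    for u in client_updates:
        norm = np.linalg.norm(u)
        scale = min(1.0, clip / norm) if norm > 0 else 1.0
        clipped.append(u * scale)  # bound each client's contribution
    avg = np.mean(clipped, axis=0)  # server-side average
    return avg + rng.normal(0.0, noise_std, size=avg.shape)  # DP noise
```

Clipping bounds any single client's influence on the average, which is what lets the added noise yield a differential privacy guarantee.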
Universal Privacy Guarantees for Smart Meters
Smart meters (SMs) provide advanced monitoring of consumer energy usage, thereby enabling optimized management and control of electricity distribution systems. Unfortunately, the data collected by SMs can reveal information about consumer activity, such as the times at which they run individual appliances. Two approaches have been proposed to tackle the privacy threat posed by such information leakage. One strategy involves manipulating user data before sending it to the utility provider (UP); this approach improves privacy at the cost of reducing the operational insight provided by the SM data to the UP. The alternative strategy employs rechargeable batteries or local energy sources at each consumer site to try to decouple energy usage from energy requests. This thesis investigates the latter approach.
Understanding the privacy implications of any strategy requires an appropriate privacy metric.
A variety of metrics have been used to study privacy in energy distribution systems, including statistical distance metrics, differential privacy, distortion metrics, maximal leakage, maximal α-leakage, and information measures such as mutual information. Here we use mutual information to measure privacy, both because of its well-understood fundamental properties and because it provides a useful bridge to adjacent fields such as hypothesis testing, estimation, and statistical and machine learning.
Privacy leakage under mutual information measures has been studied under a variety of assumptions on the energy consumption of the user, with a strong focus on i.i.d. models and some exploration of Markov processes. Since user energy consumption may be non-stationary, here we seek privacy guarantees that apply to general random process models of energy consumption. Moreover, we impose finite capacity bounds on batteries and account for the price of the energy requested from the grid, thus minimizing the information leakage subject to a bound on the resulting energy bill. To that end, we model the energy management unit (EMU) as a deterministic finite-state channel and adapt the Ahlswede-Kaspi coding strategy, originally proposed for permuting channels, to the SM privacy setting.
Within this setting, we derive battery policies providing privacy guarantees that hold for any bounded process modelling the energy consumption of the user, including non-ergodic and non-stationary processes. These guarantees are also presented for bounded processes with a known expected average consumption. The optimality of the battery policy is characterized by presenting the probability law of a random process that is tight with respect to the upper bound. Moreover, we derive single-letter bounds characterizing the privacy-cost trade-off in the presence of variable market prices. Finally, it is shown that the provided results hold for mutual information, maximal leakage, maximal α-leakage, and the Arimoto and Sibson channel capacities.
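The mutual-information leakage measure used in the entries above can be made concrete with a plug-in empirical estimator over paired samples of user demand and grid requests. This is an illustrative sketch only; the works above analyze the information measure itself, not this estimator, and the function name is hypothetical.

```python
from collections import Counter
from math import log2

# Plug-in empirical mutual information (in bits) between paired samples
# of user demand xs and grid requests ys -- an illustrative estimator of
# the leakage measure discussed above, not the thesis's analysis itself.
def mutual_information(xs, ys):
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint counts
    px, py = Counter(xs), Counter(ys)   # marginal counts
    return sum((c / n) * log2((c * n) / (px[x] * py[y]))
               for (x, y), c in pxy.items())
```

Independent sequences give an estimate of zero bits, while a request sequence that copies the demand leaks the full entropy of the demand.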
Privacy-Utility Management of Hypothesis Tests
The trade-off between hypothesis tests on the correlated privacy hypothesis and utility hypothesis is studied. The error exponent of the Bayesian composite hypothesis test on the privacy or utility hypothesis can be characterized by the corresponding minimal Chernoff information rate. An optimal management policy protects privacy by minimizing the error exponent of the privacy hypothesis test while guaranteeing the utility hypothesis testing performance through a lower bound on the corresponding minimal Chernoff information rate. The asymptotic minimum error exponent of the privacy hypothesis test is shown to be characterized by the infimum of the corresponding minimal Chernoff information rates subject to the utility guarantees.
Comment: accepted in IEEE Information Theory Workshop 201
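For intuition, the single-shot analogue of the Chernoff information rate above, the Chernoff information C(P, Q) between two fixed discrete distributions, can be computed by a simple grid search over the exponent. The grid search and the function name are illustrative assumptions, not the paper's method.

```python
from math import log

# Chernoff information
#   C(P, Q) = -min_{0 <= s <= 1} log sum_x P(x)^s * Q(x)^(1 - s)
# between two discrete distributions, found by grid search over s -- a
# toy single-shot analogue of the Chernoff information rate above.
def chernoff_information(p, q, grid=1000):
    best = float("inf")
    for i in range(grid + 1):
        s = i / grid
        best = min(best, sum(pi ** s * qi ** (1 - s)
                             for pi, qi in zip(p, q)))
    return -log(best)
```

The quantity is zero when the two distributions coincide and grows as they become easier to distinguish, matching its role as a best achievable error exponent.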
Energy Disaggregation via Adaptive Filtering
The energy disaggregation problem is that of recovering device-level power consumption signals from the aggregate power consumption signal of a building. We show in this paper how the disaggregation problem can be reformulated as an adaptive filtering problem. This gives both a novel disaggregation algorithm and a better theoretical understanding of disaggregation. In particular, we show how the disaggregation problem can be solved online using a filter bank and discuss its optimality.
Comment: Submitted to 51st Annual Allerton Conference on Communication, Control, and Computing
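The adaptive-filtering view above can be illustrated with a single-tap least-mean-squares (LMS) update, the simplest adaptive filter: given a device's (assumed known) signature sequence and the building's aggregate signal, one weight tracks the device's contribution online. This is a minimal sketch under strong simplifying assumptions; the paper's filter-bank formulation is more general, and the function name and step size are hypothetical.

```python
# Single-tap LMS sketch of the adaptive-filtering view above: given a
# device's (assumed known) signature sequence x and the building's
# aggregate signal d, one weight tracks the device's contribution online.
# The step size mu and the single-tap model are simplifying assumptions.
def lms_track(x, d, mu=0.05):
    w = 0.0
    estimates = []
    for xi, di in zip(x, d):
        y = w * xi        # current estimate of the device's contribution
        e = di - y        # error against the observed aggregate
        w += mu * e * xi  # LMS gradient step
        estimates.append(y)
    return w, estimates
```

Running one such filter per device yields the filter-bank structure the paper describes, with each filter updated online as new aggregate samples arrive.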
Privacy-enhancing Aggregation of Internet of Things Data via Sensors Grouping
Big data collection practices using Internet of Things (IoT) pervasive technologies are often privacy-intrusive and result in surveillance, profiling, and discriminatory actions against citizens, which in turn undermine citizens' participation in the development of sustainable smart cities. Nevertheless, real-time data analytics and aggregate information from IoT devices open up tremendous opportunities for managing smart city infrastructures. The privacy-enhancing aggregation of distributed sensor data, such as residential energy consumption or traffic information, is the research focus of this paper. Citizens have the option to choose their privacy level by reducing the quality of the shared data, at the cost of lower accuracy in data analytics services. A baseline scenario is considered in which IoT sensor data are shared directly with an untrustworthy central aggregator. A grouping mechanism is introduced that improves privacy by first aggregating data at a group level, as opposed to sharing data directly with the central aggregator. Group-level aggregation obfuscates the sensor data of individuals, in a similar fashion to differential privacy and homomorphic encryption schemes, so inferring privacy-sensitive information from single sensors becomes computationally harder than in the baseline scenario. The proposed system is evaluated using real-world data from two smart city pilot projects. Privacy under grouping increases while preserving the accuracy of the baseline scenario. Intra-group influences of one group member's privacy on the others' are measured, and fairness in privacy is found to be maximized between group members with similar privacy choices. Several grouping strategies are compared; grouping by proximity of privacy choices provides the highest privacy gains. The implications of this strategy for the design of incentive mechanisms are discussed.
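The core of the grouping mechanism can be sketched in a few lines: the central aggregator receives only group-level sums rather than individual readings. This is a minimal sketch; a real deployment would also handle group formation, privacy choices, and secure in-group communication, and the round-robin assignment below is a simplifying assumption.

```python
# Sketch of the grouping mechanism: the central aggregator receives only
# group-level sums instead of individual sensor readings. Round-robin
# group assignment is a simplifying assumption for illustration.
def group_aggregate(readings, group_size):
    groups = [readings[i:i + group_size]
              for i in range(0, len(readings), group_size)]
    return [sum(g) for g in groups]  # only these sums leave each group
```

The totals the aggregator needs for analytics are preserved exactly, while any individual reading is hidden inside its group's sum.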
Technofixing the Future: Ethical Side Effects of Using AI and Big Data to meet the SDGs
While the use of smart information systems (SIS, the combination of AI and Big Data) offers great potential for meeting many of the UN’s Sustainable Development Goals (SDGs), it also raises a number of ethical challenges in implementation. Through six empirical case studies, this paper examines potential ethical issues relating to the use of SIS to meet the challenges in six of the SDGs (2, 3, 7, 8, 11, and 12). The paper shows that a simple “technofix”, such as the use of SIS, is often not sufficient and may exacerbate, or create new, issues for the development community using SIS.