4 research outputs found
Optimal Privacy-Aware Dynamic Estimation
In this paper, we develop an information-theoretic framework for the optimal
privacy-aware estimation of the states of a (linear or nonlinear) system. In
our setup, a private process, modeled as a first-order Markov chain, drives
the states of the system, and the state estimates are shared with an untrusted
party who might attempt to infer the private process based on the state
estimates. As the privacy metric, we use the mutual information between the
private process and the state estimates. We first show that the privacy-aware
estimation is a closed-loop control problem wherein the estimator controls the
belief of the adversary about the private process. We also derive the Bellman
optimality principle for the optimal privacy-aware estimation problem, which is
used to study the structural properties of the optimal estimator. We next
develop a policy gradient algorithm for computing an optimal estimation
policy, based on a novel variational formulation of the mutual information. We
finally study the performance of the optimal estimator in a building automation
application.
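As a concrete illustration of the privacy metric used above, the following minimal sketch computes the mutual information between a discrete private process and a quantized state estimate from their joint distribution. The function name and the example joint pmf are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(S; X_hat) in bits from a joint pmf table.

    Rows index the private process S, columns the quantized estimate X_hat.
    """
    joint = np.asarray(joint, dtype=float)
    ps = joint.sum(axis=1, keepdims=True)   # marginal of the private process
    px = joint.sum(axis=0, keepdims=True)   # marginal of the state estimate
    mask = joint > 0                        # skip zero-probability cells
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (ps @ px)[mask])))

# Example: a binary private process and a binary quantized estimate.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
leakage = mutual_information(joint)  # leakage in bits, here about 0.278
```

In the paper's setting this quantity is what the estimator's policy trades off against estimation accuracy; smaller values mean the estimates reveal less about the private process.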
Minimum-norm Sparse Perturbations for Opacity in Linear Systems
Opacity is a notion that describes an eavesdropper's inability to estimate a
system's 'secret' states by observing the system's outputs. In this paper, we
propose algorithms to compute the minimum sparse perturbation to be added to a
system to make its initial states opaque. For these perturbations, we consider
two sparsity constraints: structured and affine. We develop an algorithm to
compute the global minimum-norm perturbation for the structured case. For the
affine case, we use the global minimum solution of the structured case as the
initial point to compute a local minimum. Empirically, this local minimum is
very close to the global minimum. We demonstrate our results via a running
example.
Comment: Submitted to Indian Control Conference, 2023 (6 pages).
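The structured case amounts to a least-norm problem over a restricted set of entries. As a hedged sketch (the function, constraint `A @ p = b`, and problem data are illustrative stand-ins, not the paper's actual formulation), a minimum 2-norm perturbation that is zero outside a fixed support and satisfies a linear constraint can be computed with the pseudoinverse:

```python
import numpy as np

def min_norm_structured(A, b, support):
    """Minimum 2-norm p with p zero outside `support` and A @ p = b.

    For a consistent underdetermined system, the pseudoinverse returns
    the least-norm solution on the allowed entries.
    """
    A_s = A[:, support]              # columns the perturbation may use
    p_s = np.linalg.pinv(A_s) @ b    # least-norm solution on the support
    p = np.zeros(A.shape[1])
    p[support] = p_s                 # embed back into the full vector
    return p

# Illustrative data: entry 2 is forced to stay zero (the sparsity structure).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])
b = np.array([1.0, 2.0])
p = min_norm_structured(A, b, support=[0, 1, 3])
```

The affine case in the paper is handled by warm-starting a local search from a solution of this kind.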
A Design Framework for Strongly χ²-Private Data Disclosure
In this paper, we study a stochastic disclosure control problem using
information-theoretic methods. The useful data to be disclosed depend on
private data that should be protected. Thus, we design a privacy mechanism to
produce new data which maximizes the disclosed information about the useful
data under a strong χ²-privacy criterion. For sufficiently small leakage,
the privacy mechanism design problem can be geometrically studied in the space
of probability distributions by a local approximation of the mutual
information. By using methods from Euclidean information geometry, the original
highly challenging optimization problem can be reduced to a problem of finding
the principal right-singular vector of a matrix, which characterizes the
optimal privacy mechanism. In two extensions, we first consider a scenario in
which an adversary receives a noisy version of the user's message, and we then
look for a mechanism that generates the disclosed data from this noisy
observation, maximizing the mutual information between the useful data and the
disclosed data while satisfying the privacy criterion on the private data under
the corresponding Markov chain.
Comment: 16 pages, 2 figures.
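Once the characterizing matrix has been formed, the reduction described above leaves only a principal right-singular-vector computation, which is a one-line SVD in practice. The matrix below is an arbitrary stand-in, not the paper's construction:

```python
import numpy as np

def principal_right_singular_vector(M):
    """Right-singular vector of M for its largest singular value.

    numpy's svd returns singular values in descending order, so the
    first row of Vt is the principal right-singular vector.
    """
    _, _, Vt = np.linalg.svd(M)
    return Vt[0]

# Illustrative symmetric matrix with eigenvalues 4 and 2.
M = np.array([[3.0, 1.0],
              [1.0, 3.0]])
v = principal_right_singular_vector(M)  # ±[1, 1]/sqrt(2) up to sign
```

The vector is determined only up to sign, so downstream code that builds the mechanism from it should fix a sign convention.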