
    Privacy In Multi-Agent And Dynamical Systems

    The use of private data is pivotal for numerous services, including location-based services, collaborative recommender systems, and social networks. Despite the utility these services provide, the use of private data raises privacy concerns for its owners. Noise-injecting techniques, such as differential privacy, address these concerns by adding artificial noise so that an adversary with access to the published response cannot confidently infer the private data. In particular, in multi-agent and dynamical environments, privacy-preserving techniques need to be expressive enough to capture time-varying privacy needs, multiple data owners, and multiple data users. Current work in differential privacy assumes that a single response is published and that a single predefined privacy guarantee is provided. This work relaxes these assumptions through several problem formulations and corresponding approaches. In the setting of a social network, a data owner has different privacy needs against different users. We design a coalition-free privacy-preserving mechanism that allows a data owner to diffuse their private data over a network. We also formulate the problem of multiple data owners providing their data to multiple data users. For time-varying privacy needs, we prove that, for a class of existing privacy-preserving mechanisms, it is possible to gradually relax privacy constraints. Additionally, we provide a privacy-aware mechanism for time-varying private data, where we wish to protect only its current value. Finally, in the context of location-based services, we provide a mechanism where the strength of the privacy guarantee varies with the local population density. These contributions increase the applicability of differential privacy and set future directions for more flexible and expressive privacy guarantees.
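    For readers unfamiliar with the noise-injection idea the abstract refers to, the sketch below shows the standard Laplace mechanism for differential privacy, which adds noise scaled to the query's sensitivity and the privacy budget epsilon. This is a generic illustration only; the function name, the sensitivity value, and the epsilon values are assumptions for the example and are not taken from the thesis itself.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return a noisy response satisfying epsilon-differential privacy.

    The noise is drawn from a Laplace distribution whose scale grows with the
    query's sensitivity and shrinks as the privacy budget epsilon increases,
    so smaller epsilon means stronger privacy and a noisier published value.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Illustrative use: publish a scalar reading (sensitivity 1.0) under a strict
# budget (epsilon = 0.1, noisier) and a relaxed one (epsilon = 1.0, less noisy).
strict_response = laplace_mechanism(42.0, sensitivity=1.0, epsilon=0.1)
relaxed_response = laplace_mechanism(42.0, sensitivity=1.0, epsilon=1.0)
```

    A time-varying or user-dependent privacy need, as studied in this work, would correspond to letting the budget epsilon (and hence the noise scale) change over time or differ per data user, rather than fixing it once as in this baseline sketch.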