2 research outputs found
More Than Privacy: Applying Differential Privacy in Key Areas of Artificial Intelligence
Artificial Intelligence (AI) has attracted a great deal of attention in
recent years. However, alongside all its advancements, problems have also
emerged, such as privacy violations, security issues and model fairness.
Differential privacy, as a promising mathematical model, has several attractive
properties that can help solve these problems, making it quite a valuable tool.
For this reason, differential privacy has been broadly applied in AI; to date,
however, no study has documented which differential privacy mechanisms can or
have been leveraged to overcome these issues, or the properties that make this
possible. In this paper, we show that differential privacy can do more than
just privacy preservation. It can also be used to improve security, stabilize
learning, build fair models, and impose composition in selected areas of AI.
With a focus on regular machine learning, distributed machine learning, deep
learning, and multi-agent systems, the purpose of this article is to deliver a
new view on many possibilities for improving AI performance with differential
privacy techniques.
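As a concrete illustration of the kind of mechanism surveyed here, a minimal sketch of the classic Laplace mechanism (this example is not drawn from the paper; the function name and parameters are illustrative):

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Return a differentially private answer to a numeric query.

    Adds Laplace noise with scale sensitivity/epsilon, which satisfies
    epsilon-differential privacy for a query with the given L1 sensitivity.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: privately release a count (sensitivity 1) with epsilon = 0.5.
noisy_count = laplace_mechanism(true_value=1000, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means larger noise and stronger privacy; the composition property the abstract mentions lets the epsilon costs of repeated queries be added up.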
Differential Privacy for Industrial Internet of Things: Opportunities, Applications and Challenges
The development of Internet of Things (IoT) brings new changes to various
fields. Particularly, industrial Internet of Things (IIoT) is promoting a new
round of industrial revolution. With more applications of IIoT, privacy
protection issues are emerging. In particular, some common algorithms in IIoT
technology such as deep models strongly rely on data collection, which leads to
the risk of privacy disclosure. Recently, differential privacy has been used to
protect user-terminal privacy in IIoT, so it is necessary to conduct in-depth
research on this topic. In this paper, we conduct a comprehensive survey on the
opportunities, applications and challenges of differential privacy in IIoT. We
first review related papers on IIoT and privacy protection, respectively.
Then we focus on the metrics of industrial data privacy, and analyze the
contradiction between data utilization for deep models and individual privacy
protection. Several valuable problems are summarized and new research ideas are
put forward. In conclusion, this survey is dedicated to providing a
comprehensive summary and laying a foundation for follow-up research on
industrial differential privacy.
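The user-terminal protection mentioned above is often achieved with local differential privacy, where each device perturbs its own reading before anything leaves the terminal. A minimal sketch using classic randomized response for a single sensitive bit (not from the paper; names and parameters are illustrative):

```python
import math
import random

def randomized_response(true_bit, epsilon, rng=random):
    """Report a sensitive bit with epsilon-local differential privacy.

    The device tells the truth with probability e^eps / (e^eps + 1)
    and flips the bit otherwise, so the collector never sees a raw reading.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if rng.random() < p_truth else 1 - true_bit

def estimate_frequency(reports, epsilon):
    """Debias the aggregated noisy reports to estimate the true 1-rate."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed + p - 1.0) / (2.0 * p - 1.0)
```

This captures the contradiction the survey analyzes: each individual report is noisy, yet an aggregator can still recover accurate population statistics for downstream models.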