The resilience of a voting system has been a central topic in computational
social choice. Many voting rules, such as plurality, have been shown to be
vulnerable: an attacker can target specific voters to manipulate the result. What if a
local differential privacy (LDP) mechanism is adopted such that the true
preference of a voter is never revealed in pre-election polls? In this case,
the attacker can only infer stochastic information about a voter's true
preference, and this may make manipulating the electoral result significantly
harder. The goal of this paper is to provide a quantitative study of the effect
of adopting LDP mechanisms on a voting system. We introduce the
metric PoLDP (power of LDP), which quantitatively measures the difference between
the attacker's manipulation cost under LDP mechanisms and that without them.
The larger PoLDP is, the more robustness LDP mechanisms add to
a voting system. We give a full characterization of PoLDP for the voting system
under the plurality rule and provide general guidance for the application of LDP
mechanisms.
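
The abstract leaves PoLDP informal. As a minimal sketch only, assuming a
price-of-anarchy-style ratio of the attacker's optimal manipulation costs (the
symbols $c_{\mathrm{LDP}}$ and $c_{\mathrm{plain}}$ are illustrative and not
taken from the paper), one possible formalization is
% Hypothetical formalization under the assumption stated above: c_LDP is the
% attacker's minimum cost to change the outcome when voters report through an
% LDP mechanism, and c_plain is the corresponding cost with non-private reports.
\[
  \mathrm{PoLDP} \;=\; \frac{c_{\mathrm{LDP}}}{c_{\mathrm{plain}}},
  \qquad
  \mathrm{PoLDP} > 1 \;\Longleftrightarrow\;
  \text{the LDP mechanism makes manipulation strictly more costly.}
\]
Under this reading, a larger PoLDP means the LDP mechanism raises the
attacker's cost proportionally more, matching the interpretation in the
abstract; the paper's actual definition may differ (e.g., an additive gap).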