We consider the problem of recovering a function over the space of
permutations (equivalently, the symmetric group) on n elements from partial
information; the information we consider is related to the group-theoretic
Fourier transform of the function. This problem arises naturally in
several settings such as ranked elections, multi-object tracking, ranking
systems, and recommendation systems. Inspired by the work of Donoho and Stark
in the context of discrete-time functions, we focus on non-negative functions
with sparse support (support size ≪ domain size). Our recovery method is
based on finding the sparsest solution (through ℓ0 optimization) that is
consistent with the available information. As the main result, we derive
sufficient conditions for functions that can be recovered exactly from partial
information through ℓ0 optimization. Under a natural random model for the
generation of functions, we quantify the recoverability conditions by deriving
bounds on the sparsity (support size) for which the function satisfies the
sufficient conditions with high probability as n → ∞. Since ℓ0
optimization is computationally hard, the popular compressive
sensing literature instead solves its convex relaxation, ℓ1
optimization, to find the sparsest solution. However, we show that ℓ1
optimization fails to recover a function (even one with constant sparsity)
generated under the random model, with high probability as n → ∞. In
order to overcome this problem, we propose a novel iterative algorithm for the
recovery of functions that satisfy the sufficient conditions. Finally, using an
information-theoretic framework, we study necessary conditions for exact
recovery to be possible.
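The ℓ1 relaxation mentioned above is the standard basis-pursuit program from compressive sensing: minimize ||x||_1 subject to the linear measurements Ax = b. The following sketch is illustrative only — it uses a generic random Gaussian measurement matrix, not the group-theoretic Fourier measurements of this paper (where, as stated above, ℓ1 recovery can fail); the function name and dimensions are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, b):
    """Basis pursuit: min ||x||_1 s.t. A x = b, solved as a linear program.

    Split x = u - v with u, v >= 0, so ||x||_1 = sum(u) + sum(v).
    """
    m, n = A.shape
    c = np.ones(2 * n)                      # objective: sum of u and v entries
    A_eq = np.hstack([A, -A])               # A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b,
                  bounds=[(0, None)] * (2 * n), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

# A 3-sparse signal in R^60 observed through 30 random Gaussian measurements;
# in this generic setting (unlike the group-Fourier setting of the paper),
# basis pursuit typically recovers the signal exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60))
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [1.5, -2.0, 0.7]
b = A @ x_true
x_hat = l1_recover(A, b)                    # should match x_true up to solver tolerance
```

The contrast with the paper's negative result is the point: success of ℓ1 minimization depends on the measurement ensemble, and the Fourier-type measurements over the symmetric group fall outside the regimes where this generic guarantee applies.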