
    Inferring Rankings Using Constrained Sensing

    We consider the problem of recovering a function over the space of permutations (the symmetric group) on n elements from given partial information; the partial information we consider is related to the group-theoretic Fourier transform of the function. This problem arises naturally in several settings, such as ranked elections, multi-object tracking, ranking systems, and recommendation systems. Inspired by the work of Donoho and Stark in the context of discrete-time functions, we focus on non-negative functions with sparse support (support size ≪ domain size). Our recovery method is based on finding the sparsest solution (through ℓ0 optimization) that is consistent with the available information. As the main result, we derive sufficient conditions under which a function can be recovered exactly from partial information through ℓ0 optimization. Under a natural random model for the generation of functions, we quantify the recoverability conditions by deriving bounds on the sparsity (support size) for which the function satisfies the sufficient conditions with high probability as n → ∞. Because ℓ0 optimization is computationally hard, the compressive sensing literature typically solves its convex relaxation, ℓ1 optimization, to find the sparsest solution. However, we show that ℓ1 optimization fails to recover a function (even one with constant sparsity) generated under the random model, with high probability as n → ∞. To overcome this problem, we propose a novel iterative algorithm for recovering functions that satisfy the sufficient conditions. Finally, using an information-theoretic framework, we study necessary conditions for exact recovery to be possible.
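    For intuition, the toy sketch below illustrates the generic principle of ℓ0 recovery from partial linear measurements: search supports of increasing size for the sparsest non-negative solution consistent with the data. This is a minimal Python illustration with a hypothetical Gaussian measurement matrix, not the paper's group-theoretic Fourier-domain algorithm, and its brute-force search is feasible only at toy sizes.

        # Toy l0 recovery by exhaustive support search (illustrative only;
        # the measurement matrix A and all names here are hypothetical).
        import itertools
        import numpy as np

        def l0_recover(A, y, max_sparsity, tol=1e-9):
            """Return the sparsest non-negative x with A @ x = y."""
            m, n = A.shape
            for k in range(1, max_sparsity + 1):           # grow support size
                for support in itertools.combinations(range(n), k):
                    sub = A[:, list(support)]
                    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
                    fits = np.linalg.norm(sub @ coef - y) < tol
                    if fits and np.all(coef >= -tol):      # non-negative exact fit
                        x = np.zeros(n)
                        x[list(support)] = coef
                        return x
            return None

        rng = np.random.default_rng(0)
        A = rng.standard_normal((8, 20))     # 8 measurements of a length-20 vector
        x_true = np.zeros(20)
        x_true[[3, 11]] = [2.0, 0.5]         # sparse, non-negative ground truth
        print(np.allclose(l0_recover(A, A @ x_true, 3), x_true))

    With generic measurements and sparsity well below the number of measurements, the sparsest consistent solution is unique, so the search returns the ground truth; the cost of enumerating supports is what motivates the convex relaxation discussed above.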

    Constructing Linear Encoders with Good Spectra

    Linear encoders with good joint spectra are suitable candidates for optimal lossless joint source-channel coding (JSCC), where the joint spectrum is a variant of the input-output complete weight distribution and is considered good if it is close to the average joint spectrum of all linear encoders of the same coding rate. Although such encoders are known to exist, little is known about how to construct them in practice. This paper is devoted to their construction. In particular, two families of linear encoders are presented and proved to have good joint spectra. The first family is derived from Gabidulin codes, a class of maximum-rank-distance codes. The second family is constructed by serially concatenating an encoder of a low-density parity-check (LDPC) code (as the outer encoder) with a low-density generator matrix (LDGM) encoder (as the inner encoder). In addition, criteria for good linear encoders are defined for three coding applications: lossless source coding, channel coding, and lossless JSCC. In the framework of the code-spectrum approach, these three scenarios correspond to the problems of constructing linear encoders with good kernel spectra, good image spectra, and good joint spectra, respectively. Good joint spectra imply both good kernel spectra and good image spectra, and for every linear encoder having a good kernel (resp., image) spectrum, it is proved that there exists a linear encoder not only with the same kernel (resp., image) but also with a good joint spectrum. Thus a good joint spectrum is the most important feature of a linear encoder.
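    As a companion illustration of what a joint spectrum records, the sketch below exhaustively tabulates the (input weight, output weight) distribution of a small binary linear encoder x → xG over GF(2). The 3×6 generator matrix and the enumeration are our own toy example; the paper's joint spectrum is a variant of the complete weight distribution, and its actual constructions (Gabidulin codes, LDPC-LDGM concatenation) are not reproduced here.

        # Tabulate the joint (input-weight, output-weight) distribution of a
        # toy binary linear encoder. Illustrative sketch only; the generator
        # matrix G below is hypothetical, not from the paper.
        from collections import Counter
        from itertools import product
        import numpy as np

        G = np.array([[1, 0, 0, 1, 1, 0],    # systematic generator of a [6,3] code
                      [0, 1, 0, 1, 0, 1],
                      [0, 0, 1, 0, 1, 1]], dtype=np.uint8)

        def joint_spectrum(G):
            """Count pairs (wt(x), wt(xG mod 2)) over all 2^k inputs x."""
            k, _ = G.shape
            counts = Counter()
            for bits in product([0, 1], repeat=k):
                x = np.array(bits, dtype=np.uint8)
                y = (x @ G) % 2                  # encode over GF(2)
                counts[(int(x.sum()), int(y.sum()))] += 1
            return counts

        for (wx, wy), c in sorted(joint_spectrum(G).items()):
            print(f"wt(x)={wx}, wt(xG)={wy}: {c}")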

    Acceleration of a Full-scale Industrial CFD Application with OP2

