
    Large-Scale Sensor Network Localization via Rigid Subnetwork Registration

    In this paper, we describe an algorithm for sensor network localization (SNL) that proceeds by dividing the whole network into smaller subnetworks, localizing these in parallel using a fast and accurate algorithm, and finally registering the localized subnetworks in a global coordinate system. We demonstrate that this divide-and-conquer strategy lets existing high-precision SNL algorithms, which could otherwise only be applied to small-to-medium-sized networks, scale to large networks. The main contribution of this paper concerns the final registration phase. In particular, we consider a least-squares formulation of the registration problem (both with and without anchor constraints) and demonstrate how this otherwise non-convex problem can be relaxed into a tractable convex program. We provide preliminary simulation results for large-scale SNL demonstrating that the proposed registration algorithm (together with an accurate localization scheme) offers a good tradeoff between run time and accuracy.

    Comment: 5 pages, 8 figures, 1 table. To appear in Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing, April 19-24, 2015.
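    Below is a minimal NumPy sketch of the basic building block behind such a registration step: least-squares rigid alignment (orthogonal Procrustes / Kabsch) of one localized subnetwork onto the global coordinates of the nodes it shares with the global frame. The paper's actual contribution, a convex relaxation that registers all subnetworks jointly (with or without anchors), is not reproduced here; the function name and the toy data are illustrative only.

```python
import numpy as np

def rigid_register(local_pts, global_pts):
    """Least-squares rigid alignment (orthogonal Procrustes / Kabsch).

    local_pts, global_pts: (k, d) arrays holding the k nodes one subnetwork
    shares with the global frame, in local and global coordinates.
    Returns a rotation R (d, d) and translation t (d,) minimizing
    sum_i || R @ local_pts[i] + t - global_pts[i] ||^2.
    """
    mu_l = local_pts.mean(axis=0)
    mu_g = global_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (local_pts - mu_l).T @ (global_pts - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Closest rotation (the determinant correction rules out a reflection).
    D = np.eye(H.shape[0])
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ D @ U.T
    t = mu_g - R @ mu_l
    return R, t

# Toy check: recover a planted 2-D rigid motion from noisy correspondences.
rng = np.random.default_rng(0)
P = rng.standard_normal((20, 2))                      # local coordinates
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
Q = P @ R_true.T + np.array([2.0, -1.0]) + 0.01 * rng.standard_normal((20, 2))
R_est, t_est = rigid_register(P, Q)
print(np.round(R_est, 3), np.round(t_est, 3))
```

    Aligning a single subnetwork this way is easy; it is estimating the rigid transforms of all subnetworks simultaneously, so that they agree on every shared node and anchor, that makes the registration problem non-convex and motivates the convex relaxation developed in the paper.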

    Approximating the Little Grothendieck Problem over the Orthogonal and Unitary Groups

    The little Grothendieck problem consists of maximizing $\sum_{ij}C_{ij}x_ix_j$ over binary variables $x_i\in\{\pm1\}$, where $C$ is a positive semidefinite matrix. In this paper we focus on a natural generalization of this problem, the little Grothendieck problem over the orthogonal group. Given a $dn \times dn$ positive semidefinite matrix $C$, the objective is to maximize $\sum_{ij}\mathrm{Tr}(C_{ij}^T O_i O_j^T)$, restricting $O_i$ to take values in the group of orthogonal matrices, where $C_{ij}$ denotes the $(ij)$-th $d \times d$ block of $C$. We propose an approximation algorithm, which we refer to as Orthogonal-Cut, to solve this problem and show a constant approximation ratio. Our method is based on semidefinite programming. For a given $d \geq 1$, we show a constant approximation ratio of $\alpha_{R}(d)^2$, where $\alpha_{R}(d)$ is the expected average singular value of a $d \times d$ matrix with i.i.d. Gaussian $N(0,1/d)$ entries. For $d=1$ we recover the known $\alpha_{R}(1)^2=2/\pi$ approximation guarantee for the classical little Grothendieck problem. Our algorithm and analysis naturally extend to the complex-valued case, also providing a constant approximation ratio for the analogous problem over the unitary group. Orthogonal-Cut also serves as an approximation algorithm for several applications, including the Procrustes problem, where it improves over the best previously known approximation ratio of $\frac{1}{2\sqrt{2}}$. The little Grothendieck problem falls under the class of problems approximated by a recent algorithm proposed in the context of the non-commutative Grothendieck inequality. Nonetheless, our approach is simpler and provides a more efficient algorithm with better approximation ratios and matching integrality gaps. Finally, we also provide an improved approximation algorithm for the more general little Grothendieck problem over the orthogonal (or unitary) group with rank constraints.

    Comment: Updates in version 2: extension to the complex-valued (unitary group) case, sharper lower bounds on the approximation ratios, matching integrality gap, and a generalized rank-constrained version of the problem. Updates in version 3: improvement on the exposition.
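    The following NumPy sketch illustrates two quantities that appear in the abstract: the projection of an arbitrary matrix onto the orthogonal group (the kind of rounding step used after solving a semidefinite relaxation) and a Monte Carlo estimate of $\alpha_{R}(d)$, the expected average singular value of a $d \times d$ matrix with i.i.d. Gaussian $N(0,1/d)$ entries; for $d=1$ its square should come out near $2/\pi \approx 0.637$. This is only an illustration of the ingredients, not the Orthogonal-Cut algorithm itself (the SDP solve and the Gaussian rounding scheme are omitted).

```python
import numpy as np

def project_to_orthogonal(M):
    """Polar factor of M: the closest orthogonal matrix in Frobenius norm."""
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt

def estimate_alpha(d, trials=10000, seed=0):
    """Monte Carlo estimate of alpha_R(d): the expected average singular
    value of a d x d matrix with i.i.d. Gaussian N(0, 1/d) entries."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(trials):
        G = rng.standard_normal((d, d)) / np.sqrt(d)
        total += np.linalg.svd(G, compute_uv=False).mean()
    return total / trials

if __name__ == "__main__":
    # Rounding-style step: map an arbitrary d x d block to an orthogonal matrix.
    O = project_to_orthogonal(np.random.default_rng(1).standard_normal((3, 3)))
    print(np.allclose(O @ O.T, np.eye(3)))            # True

    for d in (1, 2, 3):
        a = estimate_alpha(d)
        print(d, round(a, 4), round(a ** 2, 4))       # d=1: a^2 approx 2/pi
```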

    A Riemannian low-rank method for optimization over semidefinite matrices with block-diagonal constraints

    We propose a new algorithm to solve optimization problems of the form $\min f(X)$ for a smooth function $f$ under the constraints that $X$ is positive semidefinite and the diagonal blocks of $X$ are small identity matrices. Such problems often arise as the result of relaxing a rank constraint (lifting). In particular, many estimation tasks involving phases, rotations, orthonormal bases or permutations fit in this framework, and so do certain relaxations of combinatorial problems such as Max-Cut. The proposed algorithm exploits the facts that (1) such formulations admit low-rank solutions, and (2) their rank-restricted versions are smooth optimization problems on a Riemannian manifold. Combining insights from both the Riemannian and the convex geometries of the problem, we characterize when second-order critical points of the smooth problem reveal KKT points of the semidefinite problem. We compare against state-of-the-art, mature software and find that, on certain interesting problem instances, what we call the staircase method is orders of magnitude faster, more accurate, and scales better. Code is available.

    Comment: 37 pages, 3 figures.
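    As an illustration of the low-rank (Burer-Monteiro-style) factorization idea, the sketch below treats the simplest member of this problem family, the Max-Cut relaxation, where the diagonal blocks of $X$ are $1 \times 1$ identities: it factors $X = Y^\top Y$ with unit-norm columns of $Y$ and runs plain Riemannian gradient ascent on that factor (a product of spheres). It is not the authors' staircase method (no rank increments, no second-order certificates), and the parameters and toy graph are illustrative only.

```python
import numpy as np

def maxcut_low_rank(L, p=8, steps=500, lr=0.01, seed=0):
    """Riemannian gradient ascent on the low-rank factor Y of X = Y^T Y.

    Approximates max <L, X> s.t. X PSD with unit diagonal, via the
    factorization X = Y^T Y where Y is p x n with unit-norm columns.
    L: symmetric (n, n) matrix, e.g. a graph Laplacian for Max-Cut.
    """
    n = L.shape[0]
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((p, n))
    Y /= np.linalg.norm(Y, axis=0)                # unit-norm columns
    for _ in range(steps):
        G = 2.0 * Y @ L                           # Euclidean gradient of <L, Y^T Y>
        # Project onto the tangent space of the product of spheres:
        # remove each column's component along the current point.
        G -= Y * np.sum(Y * G, axis=0)
        Y += lr * G                               # ascent step
        Y /= np.linalg.norm(Y, axis=0)            # retraction back to unit norm
    return Y

if __name__ == "__main__":
    # Toy graph: a 5-cycle; its Laplacian defines the Max-Cut relaxation objective.
    n = 5
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    Lap = np.diag(A.sum(axis=1)) - A
    Y = maxcut_low_rank(Lap)
    # Hyperplane rounding of the low-rank solution to a +/-1 cut.
    x = np.sign(np.random.default_rng(1).standard_normal(Y.shape[0]) @ Y)
    print("cut value:", 0.25 * x @ Lap @ x)
```

    For the general block-diagonal constraints in the paper (d x d identity blocks), the columns of the factor would be replaced by Stiefel-type blocks, and the rank p would be increased in stages when a second-order critical point fails to certify optimality; that staged increase is what the abstract calls the staircase method.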