
    HVOX: Scalable Interferometric Synthesis and Analysis of Spherical Sky Maps

    Full text link
    Analysis and synthesis are key steps of the radio-interferometric imaging process, serving as a bridge between the visibility and sky domains. They can be expressed as partial Fourier transforms involving a large number of non-uniform frequencies and spherically-constrained spatial coordinates. Due to the data non-uniformity, these partial Fourier transforms are computationally expensive and represent a serious bottleneck in the image reconstruction process. The W-gridding algorithm achieves log-linear complexity for both steps by applying a series of 2D non-uniform FFTs (NUFFTs) to the data sliced along the so-called w frequency coordinate. A major drawback of this method, however, is its restriction to direction-cosine meshes, which are fundamentally ill-suited for large fields of view. This paper introduces the HVOX gridder, a novel analysis/synthesis algorithm based on a 3D-NUFFT. Unlike W-gridding, HVOX is compatible with arbitrary spherical meshes, such as the popular HEALPix scheme for spherical data processing. The 3D-NUFFT allows one to optimally select the size of the inner FFTs, in particular the number of W-planes. This results in a better-performing, auto-tuned algorithm with controlled accuracy guarantees backed by strong results from approximation theory. To cope with the challenging scale of next-generation radio telescopes, we moreover propose a chunked evaluation strategy: by partitioning the visibility and sky domains, the 3D-NUFFT is decomposed into sub-problems which execute in parallel, while simultaneously cutting memory requirements. Our benchmarking results demonstrate the scalability of HVOX for both SKA and LOFAR, considering challenging, state-of-the-art imaging setups. HVOX is moreover computationally competitive with W-gridding, despite the absence of domain-specific optimizations in our implementation.
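
    For concreteness, the synthesis step above amounts (up to the phase convention) to the partial Fourier sum V_j = sum_k I_k exp(-2*pi*i*(u_j*l_k + v_j*m_k + w_j*(n_k - 1))), with the analysis step given by the adjoint sum. The sketch below is a brute-force NumPy reference of these two operators, not the HVOX implementation: HVOX evaluates the same sums through a single auto-tuned 3D type-3 NUFFT (plus chunking), which is what yields log-linear complexity. Function names, array shapes and the (n - 1) w-term convention are illustrative assumptions.

        import numpy as np

        def phase_matrix(uvw, lmn):
            """Dense (N_vis, N_pix) partial-Fourier kernel
            exp(-2j*pi*(u*l + v*m + w*(n - 1))), using one common phase convention."""
            return np.exp(-2j * np.pi * (uvw[:, :2] @ lmn[:, :2].T
                                         + np.outer(uvw[:, 2], lmn[:, 2] - 1.0)))

        def synthesis(sky, uvw, lmn):
            """Synthesis step: sky intensities on a spherical mesh -> visibilities."""
            return phase_matrix(uvw, lmn) @ sky

        def analysis(vis, uvw, lmn):
            """Analysis step: visibilities -> sky map (adjoint of synthesis)."""
            return phase_matrix(uvw, lmn).conj().T @ vis

        # toy example: random baselines and a small patch of a spherical mesh
        rng = np.random.default_rng(0)
        uvw = rng.uniform(-100.0, 100.0, (500, 3))     # baseline coordinates, in wavelengths
        lm = rng.uniform(-0.05, 0.05, (300, 2))        # direction cosines (l, m)
        lmn = np.column_stack([lm, np.sqrt(1.0 - (lm ** 2).sum(axis=1))])
        vis = synthesis(rng.standard_normal(300), uvw, lmn)
        dirty = analysis(vis, uvw, lmn)

    Both operators cost O(N_vis * N_pix) in time and memory in this dense form; the point of HVOX (and of W-gridding) is precisely to avoid ever forming this kernel.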

    A Fast and Scalable Polyatomic Frank-Wolfe Algorithm for the LASSO

    No full text
    We propose a fast and scalable Polyatomic Frank-Wolfe (P-FW) algorithm for the resolution of high-dimensional LASSO regression problems. It improves upon traditional Frank-Wolfe methods by considering generalized greedy steps with polyatomic (i.e., linear combinations of multiple atoms) update directions, hence allowing for a more efficient exploration of the search space. To preserve the sparsity of the intermediate iterates, we moreover re-optimize the LASSO problem over the set of selected atoms at each iteration. For efficiency reasons, the accuracy of this re-optimization step is relatively low in early iterations and gradually increases with the iteration count. We provide convergence guarantees for our algorithm and validate it in simulated compressed-sensing setups. Our experiments reveal that P-FW outperforms state-of-the-art methods in terms of runtime, including both FW methods and optimal first-order proximal gradient methods such as the Fast Iterative Soft-Thresholding Algorithm (FISTA).
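
    The following NumPy sketch illustrates the polyatomic idea under simplifying assumptions; it is not the authors' P-FW implementation. At each iteration it adds every atom whose dual certificate |A^T(Ax - y)| exceeds the regularization weight (rather than the single atom of classical Frank-Wolfe), then re-optimizes the LASSO restricted to the selected atoms with a few ISTA steps. The function name, the selection rule and the fixed inner-iteration budget are assumptions; the paper instead lets the re-optimization accuracy grow with the iteration count.

        import numpy as np

        def polyatomic_fw_lasso(A, y, lam, n_iter=50, inner_iter=10):
            """Simplified polyatomic Frank-Wolfe-style solver for
            min_x 0.5*||A x - y||^2 + lam*||x||_1  (illustrative sketch only)."""
            n = A.shape[1]
            x = np.zeros(n)
            active = np.zeros(n, dtype=bool)
            step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L step size for the inner ISTA
            for _ in range(n_iter):
                grad = A.T @ (A @ x - y)
                # polyatomic step: select *all* atoms violating the dual constraint,
                # instead of the single most-correlated atom of classical FW
                active |= np.abs(grad) > lam
                idx = np.flatnonzero(active)
                if idx.size == 0:                       # x = 0 is already optimal
                    break
                # re-optimize the LASSO restricted to the selected atoms (inner ISTA);
                # the paper gradually tightens the accuracy of this step instead
                A_s, x_s = A[:, idx], x[idx]
                for _ in range(inner_iter):
                    z = x_s - step * (A_s.T @ (A_s @ x_s - y))
                    x_s = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
                x = np.zeros(n)
                x[idx] = x_s
                active = x != 0                         # drop zeroed atoms, keep iterates sparse
            return x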

    A polyatomic version of the Frank-Wolfe algorithm for solving the high-dimensional LASSO problem

    No full text
    We consider the sparse reconstruction of images via the regularized LASSO optimization problem. In many practical applications, the large dimensions of the objects to be reconstructed limit, or even preclude, the use of classical proximal solvers. This is the case, for example, in radio astronomy. In this article we detail the workings of the Polyatomic Frank-Wolfe algorithm, developed specifically to solve the LASSO problem in these demanding settings. We demonstrate its superiority over proximal methods in high-dimensional settings with Fourier measurements, when solving simulated problems inspired by radio interferometry.
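
    As a usage illustration only, a simulated Fourier-measurement setup of the kind described here could look as follows. It assumes the hypothetical polyatomic_fw_lasso sketch from the previous entry is in scope; the problem sizes, noise level and regularization weight are arbitrary choices, not the paper's experimental settings.

        import numpy as np

        rng = np.random.default_rng(1)
        n, m, k = 2048, 400, 20                        # signal size, measurements, sparsity
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.exponential(1.0, k)

        rows = rng.choice(n, m, replace=False)         # random subset of DFT frequencies
        F = np.exp(-2j * np.pi * np.outer(rows, np.arange(n)) / n) / np.sqrt(n)
        A = np.vstack([F.real, F.imag])                # stack into a real (2m, n) sensing matrix
        y = A @ x_true + 0.01 * rng.standard_normal(2 * m)

        lam = 0.1 * np.max(np.abs(A.T @ y))            # common heuristic for the LASSO weight
        x_hat = polyatomic_fw_lasso(A, y, lam)         # sketch defined after the previous abstract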