
    Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Automated source extraction and parameterization represents a crucial challenge for next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper we present a new algorithm, dubbed CAESAR (Compact And Extended Source Automated Recognition), to detect and parameterize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parameterization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR as a modular software library, which also includes different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australia Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the ASKAP-EMU survey. The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane, and compared with existing algorithms. When compared to a human-driven analysis, the algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields. (15 pages, 9 figures)
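    The pre-filtering ideas mentioned in the abstract (local background estimation followed by thresholded detection) can be illustrated with a minimal sigma-clipping sketch. This is a generic illustration, not CAESAR's actual implementation; all function names and parameters are made up for the example:

```python
import numpy as np

def sigma_clipped_background(image, n_sigma=3.0, n_iter=5):
    """Estimate background mean and rms by iteratively clipping outliers."""
    data = image.ravel().astype(float)
    mask = np.ones_like(data, dtype=bool)
    for _ in range(n_iter):
        mu, sigma = data[mask].mean(), data[mask].std()
        mask = np.abs(data - mu) < n_sigma * sigma
    return mu, sigma

def detect_pixels(image, n_sigma=5.0):
    """Flag pixels above the clipped background as source candidates."""
    mu, sigma = sigma_clipped_background(image)
    return image > mu + n_sigma * sigma

# Toy map: Gaussian noise plus one bright compact source.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (64, 64))
img[30:33, 30:33] += 50.0          # injected 3x3 compact source

mask = detect_pixels(img)
print(mask.sum())                  # only pixels near the injected source survive
```

    Real pipelines add spatial filtering and segmentation on top of this, but the background/threshold step is the common starting point.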

    Extremely fast focal-plane wavefront sensing for extreme adaptive optics

    We present a promising approach to the extremely fast sensing and correction of small wavefront errors in adaptive optics systems. As our algorithm's computational complexity is roughly proportional to the number of actuators, it is particularly suitable for systems with 10,000 to 100,000 actuators. Our approach is based on sequential phase diversity and on simple relations between the point-spread function and the wavefront error in the case of small aberrations. The particular choice of phase diversity, introduced by the deformable mirror itself, minimizes the wavefront error as well as the computational complexity. The method is well suited for high-contrast astronomical imaging of point sources, such as the direct detection and characterization of exoplanets around stars, and it works even in the presence of a coronagraph that suppresses the diffraction pattern. The accompanying paper in these proceedings by Korkiakoski et al. describes the performance of the algorithm using numerical simulations and laboratory tests. (SPIE Paper 8447-7)
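    The simple small-aberration relation between wavefront error and PSF that the abstract refers to can be illustrated with the Maréchal approximation, under which the Strehl ratio is roughly 1 − σ², with σ² the wavefront phase variance. The numpy sketch below is a generic illustration of that relation, not the authors' algorithm:

```python
import numpy as np

n = 128
y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
pupil = (x**2 + y**2) <= 1.0               # circular aperture

phase = 0.05 * np.sin(6 * np.pi * x) * pupil   # small wavefront error [rad]
phase -= phase[pupil].mean()                    # remove piston term

# PSF = |FT of the complex pupil field|^2; compare aberrated vs perfect.
field = pupil * np.exp(1j * phase)
psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
psf_ref = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2

strehl = psf.max() / psf_ref.max()
sigma2 = phase[pupil].var()
print(strehl, 1 - sigma2)   # nearly equal while the aberration is small
```

    For large aberrations this linearization breaks down, which is why the approach targets the small residual errors of an extreme-AO loop.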

    Statistical and systematic uncertainties in pixel-based source reconstruction algorithms for gravitational lensing

    Gravitational lens modeling of spatially resolved sources is a challenging inverse problem with many observational constraints and model parameters. We examine established pixel-based source reconstruction algorithms for de-lensing the source and constraining lens model parameters. Using test data for four canonical lens configurations, we explore statistical and systematic uncertainties associated with gridding, source regularisation, interpolation errors, noise, and telescope pointing. Specifically, we compare two gridding schemes in the source plane: a fully adaptive grid that follows the lens mapping but is irregular, and an adaptive Cartesian grid. We also consider regularisation schemes that minimise derivatives of the source (using two finite-difference methods) and introduce a scheme that minimises deviations from an analytic source profile. Careful choice of gridding and regularisation can reduce the "discreteness noise" in the χ² surface that is inherent in the pixel-based methodology. With a gridded source, some degree of interpolation is unavoidable, and errors due to interpolation need to be taken into account (especially for high signal-to-noise data). Different realisations of the noise and telescope pointing lead to slightly different values for lens model parameters, and the scatter between different "observations" can be comparable to or larger than the model uncertainties themselves. The same effects create scatter in the lensing magnification at the level of a few percent for a peak signal-to-noise ratio of 10, which decreases as the data quality improves. (20 pages, 18 figures, accepted to MNRAS; see http://physics.rutgers.edu/~tagoreas/papers/ for high-resolution images)
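    The pixel-based inversion with derivative regularisation described above can be sketched, in heavily simplified 1-D form, as a Tikhonov-style least-squares problem: minimise ||d − Ls||² + λ||Ds||², where L maps source pixels to image pixels and D is a finite-difference operator. The lensing operator and all parameters below are illustrative stand-ins, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30                                        # source pixels (1-D for brevity)
s_true = np.exp(-0.5 * ((np.arange(n) - 15) / 3.0)**2)   # smooth source

L = rng.normal(size=(40, n)) / np.sqrt(n)     # toy lens/blur operator
d = L @ s_true + 0.01 * rng.normal(size=40)   # noisy "image" data

D = np.diff(np.eye(n), axis=0)                # first-derivative regulariser
lam = 0.1                                     # regularisation strength

# Normal equations of the regularised least-squares problem.
s_hat = np.linalg.solve(L.T @ L + lam * D.T @ D, L.T @ d)
print(np.abs(s_hat - s_true).max())           # reconstruction error
```

    The paper's point about regularisation choices maps directly onto the choice of D and λ here: too little regularisation amplifies noise, too much biases the reconstructed source toward smoothness.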

    Adaptive content mapping for internet navigation

    The Internet, the biggest human library ever assembled, keeps on growing. Although all kinds of information carriers (e.g. audio, video and hybrid file formats) are available, text-based documents dominate: an estimated 80% of all information stored electronically worldwide exists in (or can be converted into) text form. More and more documents of all kinds are generated by means of a text processing system and are therefore available electronically. Nowadays, many printed journals are also published online and may even cease to appear in print in the future. This development has many convincing advantages: the documents are available faster (cf. prepress services) and cheaper, they can be searched more easily, the physical storage needs only a fraction of the space previously necessary, and the medium does not age. For most people, fast and easy access is the most interesting feature of the new age; computer-aided search for specific documents or Web pages has become the basic tool for information-oriented work. But this tool has problems: the current keyword-based search engines on the Internet are not really appropriate for such a task, since either far too many documents matching the specified keywords are presented, or none at all. The underlying problem is that it is often very difficult to choose appropriate terms describing the desired topic in the first place. This contribution discusses the current state-of-the-art techniques in content-based searching (along with common visualization/browsing approaches) and proposes an adaptive solution for intuitive Internet document navigation, which not only enables the user to provide full texts instead of manually selected keywords (where available), but also allows them to explore the whole database.
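    The shift from keyword matching to full-text, content-based matching can be illustrated with a minimal TF-IDF cosine-similarity ranking. This is a generic sketch of the idea, not the contribution's actual method, and the toy corpus is invented for the example:

```python
import math
from collections import Counter

docs = [
    "adaptive optics corrects wavefront errors in telescopes",
    "internet navigation and document search on the web",
    "searching web documents by content instead of keywords",
]

def tfidf(texts):
    """Weight each term by frequency times (smoothed) inverse document frequency."""
    tokenised = [t.split() for t in texts]
    df = Counter(w for toks in tokenised for w in set(toks))
    n = len(texts)
    return [
        {w: c * math.log((1 + n) / (1 + df[w])) for w, c in Counter(toks).items()}
        for toks in tokenised
    ]

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A full-text query instead of hand-picked keywords.
query = "find web documents by their full text content"
vecs = tfidf(docs + [query])
scores = [cosine(v, vecs[-1]) for v in vecs[:-1]]
best = max(range(len(docs)), key=scores.__getitem__)
print(best, scores)   # the content-based search document ranks highest
```

    Ranking by whole-document similarity sidesteps the keyword-selection problem the abstract describes: the user supplies a text, and the system finds topically similar documents.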

    Adaptive Binning of X-ray data with Weighted Voronoi Tessellations

    We present a technique to adaptively bin sparse X-ray data using weighted Voronoi tessellations (WVTs). WVT binning is a generalisation of the Voronoi binning algorithm of Cappellari & Copin (2001), developed for integral-field spectroscopy. WVT binning is applicable to many types of data and creates unbiased binning structures with compact bins that do not lead the eye. We apply the algorithm to simulated data, as well as to several X-ray data sets, to create adaptively binned intensity images, hardness-ratio maps and temperature maps with constant signal-to-noise ratio per bin. We also illustrate the separation of diffuse gas emission from the contributions of unresolved point sources in elliptical galaxies. We compare the performance of WVT binning with other adaptive binning and adaptive smoothing techniques. We find that the CIAO tool csmooth creates serious artefacts and advise against its use in interpreting diffuse X-ray emission. (14 pages; submitted to MNRAS; code freely available at http://www.phy.ohiou.edu/~diehl/WVT/index.html with user manual, examples and a high-resolution version of this paper)
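    The core idea of binning to a constant signal-to-noise ratio per bin can be sketched in 1-D as simple bin accretion: grow each bin until the summed S/N = Σs / sqrt(Σn²) reaches a target. The real WVT algorithm operates on 2-D pixels with weighted Voronoi seeds, so everything below is an illustrative simplification:

```python
import numpy as np

def accrete_bins(signal, noise, target_sn):
    """Greedily accrete consecutive pixels until each bin reaches target S/N."""
    bins, start = [], 0
    for i in range(1, len(signal) + 1):
        s = signal[start:i].sum()
        n = np.sqrt((noise[start:i] ** 2).sum())
        if n > 0 and s / n >= target_sn:
            bins.append((start, i))
            start = i
    if start < len(signal):              # leftover pixels join the last bin
        if bins:
            bins[-1] = (bins[-1][0], len(signal))
        else:
            bins = [(0, len(signal))]
    return bins

rng = np.random.default_rng(3)
signal = rng.uniform(0.5, 2.0, 100)      # faint, sparse counts
noise = np.ones(100)

bins = accrete_bins(signal, noise, target_sn=3.0)
sn = [signal[a:b].sum() / np.sqrt((noise[a:b] ** 2).sum()) for a, b in bins]
print(len(bins), min(sn))                # each bin reaches (roughly) the target
```

    Faint regions end up with large bins and bright regions with small ones, which is exactly the adaptive behaviour the abstract exploits for sparse X-ray data.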

    Finding faint HI structure in and around galaxies: scraping the barrel

    Soon-to-be-operational HI survey instruments such as APERTIF and ASKAP will produce large datasets. These surveys will provide information about the HI in and around hundreds of galaxies, with a typical signal-to-noise ratio of ∼10 in the inner regions and ∼1 in the outer regions. In addition, such surveys will make it possible to probe faint HI structures, typically located in the vicinity of galaxies, such as extra-planar gas, tails and filaments. These structures are crucial for understanding galaxy evolution, particularly when they are studied in relation to the local environment. Our aim is to find optimized kernels for the discovery of faint and morphologically complex HI structures. Therefore, using HI data from a variety of galaxies, we explore state-of-the-art filtering algorithms. We show that the intensity-driven gradient filter, due to its adaptive characteristics, is the optimal choice. In fact, this filter requires only minimal tuning of the input parameters to enhance the signal-to-noise ratio of faint components. In addition, it does not degrade the resolution of the high signal-to-noise components of a source. The filtering process must be fast and embedded in an interactive visualization tool in order to support rapid inspection of a large number of sources. To achieve such interactive exploration, we implemented a multi-core CPU (OpenMP) and a GPU (OpenGL) version of this filter in a 3D visualization environment (SlicerAstro). (17 pages, 9 figures, 4 tables; accepted to Astronomy and Computing)
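    An intensity/gradient-adaptive filter of the general kind described above can be sketched as a Perona-Malik-style diffusion step, which smooths flat noisy regions strongly while suppressing diffusion across sharp, high signal-to-noise features. This is a generic illustration in 2-D, not the filter implemented in SlicerAstro:

```python
import numpy as np

def adaptive_smooth(img, n_iter=20, kappa=0.5, dt=0.2):
    """Gradient-adaptive diffusion: strong in flat regions, weak across edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Nearest-neighbour differences (north/south/east/west).
        dn = np.roll(u, -1, 0) - u
        ds = np.roll(u, 1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        # Edge-stopping conductivity: small where the local gradient is large.
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(4)
img = np.zeros((40, 40))
img[:, 20:] = 2.0                         # sharp "edge" between two levels
noisy = img + 0.1 * rng.normal(size=img.shape)

smooth = adaptive_smooth(noisy)
print(np.std(noisy - img), np.std(smooth - img))   # residual noise shrinks
```

    As in the abstract, the adaptive behaviour means faint, extended components gain signal-to-noise while bright, well-resolved structure is left essentially untouched; a real pipeline would run this on 3-D HI data cubes and on the GPU for interactivity.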