
    Cleaning sky survey databases using Hough Transform and Renewal String approaches

    Large astronomical databases obtained from sky surveys such as the SuperCOSMOS Sky Survey (SSS) invariably suffer from spurious records caused by artefactual effects of the telescope, by satellites and junk objects in orbit around Earth, and by physical defects on the photographic plate or CCD. Though relatively small in number, these spurious records present a significant problem in many situations, where they can become a large proportion of the records potentially of interest to a given astronomer. Accurate and robust techniques are needed for locating and flagging such spurious objects, and we are undertaking a programme investigating the use of machine learning techniques in this context. In this paper we focus on the four most common causes of unwanted records in the SSS: satellite or aeroplane tracks; scratches, fibres and other linear phenomena introduced to the plate; circular halos around bright stars due to internal reflections within the telescope; and diffraction spikes near bright stars. Appropriate techniques are developed for the detection of each of these. The methods are applied to the SSS data to develop a dataset of spurious object detections, along with confidence measures, which allow these unwanted data to be removed from consideration. These methods are general and can be adapted to other astronomical survey data.
    Comment: Accepted for MNRAS. 17 pages, latex2e, uses mn2e.bst, mn2e.cls, md706.bbl, shortbold.sty (all included). All figures included here as low-resolution JPEGs. A version of this paper including the figures can be downloaded from http://www.anc.ed.ac.uk/~amos/publications.html and more details on this project can be found at http://www.anc.ed.ac.uk/~amos/sattrackres.htm
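    As a concrete illustration of the Hough-based track detection described above, the sketch below finds prominent straight lines in a thresholded survey image using scikit-image's standard Hough routines. This is a minimal sketch under our own assumptions: the function name, the 3-sigma binarisation, and the peak threshold are illustrative choices, not the authors' pipeline.

        # Minimal Hough-transform sketch for linear artefacts such as
        # satellite or aeroplane tracks. Assumes numpy and scikit-image.
        import numpy as np
        from skimage.transform import hough_line, hough_line_peaks

        def detect_linear_artefacts(image, peak_frac=0.5):
            """Return (angle, distance) pairs for prominent straight lines."""
            # Binarise: treat unusually bright pixels as plate detections
            # (an illustrative 3-sigma cut, not the paper's threshold).
            bright = image > image.mean() + 3.0 * image.std()
            # Vote over the (angle, distance) line parameter space.
            angles = np.linspace(-np.pi / 2, np.pi / 2, 360, endpoint=False)
            accumulator, thetas, dists = hough_line(bright, theta=angles)
            # Keep only peaks well above the accumulator's maximum; each
            # surviving peak is one candidate track crossing the image.
            _, peak_thetas, peak_dists = hough_line_peaks(
                accumulator, thetas, dists,
                threshold=peak_frac * accumulator.max())
            return list(zip(peak_thetas, peak_dists))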

    Ship Wake Detection in SAR Images via Sparse Regularization

    In order to analyse synthetic aperture radar (SAR) images of the sea surface, ship wake detection is essential for extracting information on the wake-generating vessels. One possibility is to assume a linear model for wakes, in which case detection approaches are based on transforms such as the Radon and Hough transforms. These express the bright (dark) lines as peak (trough) points in the transform domain. In this paper, ship wake detection is posed as an inverse problem, with the associated cost function including a sparsity-enforcing penalty, namely the generalized minimax concave (GMC) function. Although the GMC penalty is itself non-convex, it can be constructed so that the overall cost function remains convex. The proposed solution is based on a Bayesian formulation, whereby point estimates are recovered using maximum a posteriori (MAP) estimation. To quantify the performance of the proposed method, various types of SAR images are used, corresponding to TerraSAR-X, COSMO-SkyMed, Sentinel-1, and ALOS2. The performance of various priors in solving the proposed inverse problem is first studied by investigating the GMC along with the L1, Lp, nuclear, and total variation (TV) norms. We show that the GMC achieves the best results, and we subsequently study the merits of the corresponding method in comparison to two state-of-the-art approaches for ship wake detection. The results show that our proposed technique offers the best performance, achieving an 80% success rate.
    Comment: 18 pages
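    For reference, the standard form of a GMC-regularized inverse problem (following Selesnick's generalized minimax concave construction; the notation here is ours and may differ from the paper's) is

        \min_{\mathbf{x}} \; F(\mathbf{x}) =
            \tfrac{1}{2}\lVert \mathbf{y} - \mathbf{A}\mathbf{x} \rVert_2^2
            + \lambda\, \psi_{\mathbf{B}}(\mathbf{x}),
        \qquad
        \psi_{\mathbf{B}}(\mathbf{x}) = \lVert \mathbf{x} \rVert_1
            - \min_{\mathbf{v}} \Bigl\{ \lVert \mathbf{v} \rVert_1
            + \tfrac{1}{2}\lVert \mathbf{B}(\mathbf{x} - \mathbf{v}) \rVert_2^2 \Bigr\},

    where \mathbf{y} is the observed image, \mathbf{A} the forward operator, and \mathbf{x} the transform-domain representation of the wakes; F remains convex whenever \mathbf{B}^{\top}\mathbf{B} \preceq \lambda^{-1}\mathbf{A}^{\top}\mathbf{A}.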

    A disaster risk assessment model for the conservation of cultural heritage sites in Melaka Malaysia

    There are ongoing efforts to reduce the exposure of Cultural Heritage Sites (CHSs) to Disaster Risk (DR). However, a complicated issue these efforts face is that of ‘estimation’: no standardised unit exists for assessing the effects of DR on Cultural Heritage (CH), in contrast to other exposed items with standardised assessment units, such as ‘number of people’ for deaths, injuries and displacements, ‘dollars’ for economic impact, or ‘number of units’ for building stock or animals. This issue inhibits the effective assessment of CHSs exposed to DR. Although several DR assessment frameworks for conserving CHSs exist, the conceptualisation of DR in these studies falls short of good practice such as the United Nations’ International Strategy for Disaster Reduction, which expresses DR as a holistic interplay of three variables (hazard, vulnerability and capacity). Adopting such good practice, this research proposes a mechanism of DR assessment aimed at reducing the exposure of CHSs to DR. The quantitative method adopted for data collection involved a survey of 365 respondents at CHSs in Melaka using a structured questionnaire, and data analysis consisted of two-step Structural Equation Modelling (measurement and structural modelling). The achievement of the recommended thresholds for unidimensionality, validity and reliability by the measurement models confirms model fitness for all 8 first-order independent variables and 2 first-order dependent variables. Hazard had a ‘small’ negative effect and vulnerability a ‘very large’ negative effect on the exposure of CHSs to DR, while capacity had a ‘small’ positive effect. The outcome of this study is a Disaster Risk Assessment Model (DRAM) aimed at reducing DR to CHSs. The implication of this research is to provide institutions, policymakers and statutory bodies with insights for DR assessment decisions in their approach to enhancing the conservation of CHSs
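    For context, the ISDR-style conceptualisation referred to above is often summarised by the schematic relation below; this is a widely used shorthand in the disaster-risk literature, not a formula taken from this study:

        \text{DR} \;=\; \frac{\text{Hazard} \times \text{Vulnerability}}{\text{Capacity}}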

    Massively Parallel Computing and the Search for Jets and Black Holes at the LHC

    Massively parallel computing at the LHC could be the next leap necessary to reach an era of new discoveries after the Higgs discovery. Scientific computing is a critical component of the LHC experiments, including operation, trigger, the LHC Computing Grid, simulation, and analysis. One way to improve the physics reach of the LHC is to take advantage of the flexibility of the trigger system by integrating coprocessors based on Graphics Processing Units (GPUs) or the Many Integrated Core (MIC) architecture into its server farm. This cutting-edge technology provides not only the means to accelerate existing algorithms, but also the opportunity to develop new algorithms that select events in the trigger that previously would have evaded detection. In this article we describe new algorithms that would allow the trigger to select new topological signatures, including non-prompt jets and black-hole-like objects in the silicon tracker.
    Comment: 15 pages, 11 figures, submitted to NIM

    The image ray transform for structural feature detection

    The use of analogies to physical phenomena is an exciting paradigm in computer vision that allows unorthodox approaches to feature extraction, creating new techniques with unique properties. A technique known as the "image ray transform" has been developed based upon an analogy to the propagation of light as rays. The transform analogises an image to a set of glass blocks with refractive index linked to pixel properties, and then casts a large number of rays through the image. The course of these rays is accumulated into an output image. The technique can successfully extract tubular and circular features, and we show successful circle detection, ear biometrics and retinal vessel extraction. The transform has also been extended through the use of multiple rays arranged as a beam to increase robustness to noise, and we show quantitative results for fully automatic ear recognition, achieving 95.2% rank-one recognition across 63 subjects.
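    A heavily simplified sketch of this ray-casting idea is given below: pixel intensities define a refractive-index field, rays are traced through it with Snell's-law refraction (and total internal reflection where it applies), and an accumulator records how often each pixel is crossed. The gradient-based surface-normal estimate and all parameter values are our simplifying assumptions, not the published transform.

        # Simplified image-ray-transform sketch: intensities set a
        # refractive-index field; rays refract or reflect as they cross
        # index changes; the accumulator is the output image.
        import numpy as np

        def image_ray_transform(image, n_rays=1000, n_max=2.0,
                                max_steps=500, rng=None):
            rng = rng or np.random.default_rng(0)
            img = image.astype(float) / image.max()
            n_field = 1.0 + img * (n_max - 1.0)   # refractive index per pixel
            gy, gx = np.gradient(img)             # normals from the gradient
            h, w = img.shape
            accumulator = np.zeros_like(img)
            for _ in range(n_rays):
                # Random start position and unit direction.
                pos = rng.uniform([0, 0], [h - 1, w - 1])
                ang = rng.uniform(0, 2 * np.pi)
                d = np.array([np.sin(ang), np.cos(ang)])
                for _ in range(max_steps):
                    i, j = int(pos[0]), int(pos[1])
                    accumulator[i, j] += 1
                    nxt = pos + d
                    ni, nj = int(nxt[0]), int(nxt[1])
                    if not (0 <= ni < h and 0 <= nj < w):
                        break                     # ray left the image
                    n1, n2 = n_field[i, j], n_field[ni, nj]
                    normal = np.array([gy[ni, nj], gx[ni, nj]])
                    norm = np.linalg.norm(normal)
                    if norm > 1e-6 and abs(n2 - n1) > 1e-6:
                        normal /= norm
                        cos_i = -np.dot(d, normal)
                        if cos_i < 0:             # make normal face the ray
                            normal, cos_i = -normal, -cos_i
                        ratio = n1 / n2
                        sin2_t = ratio**2 * (1.0 - cos_i**2)
                        if sin2_t > 1.0:          # total internal reflection
                            d = d + 2.0 * cos_i * normal
                        else:                     # Snell's-law refraction
                            d = (ratio * d +
                                 (ratio * cos_i - np.sqrt(1.0 - sin2_t)) * normal)
                        d /= np.linalg.norm(d)
                    pos = nxt
            return accumulator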