
    Learning Local Metrics and Influential Regions for Classification

    The performance of distance-based classifiers heavily depends on the underlying distance metric, so it is valuable to learn a suitable metric from the data. To address the problem of multimodality, it is desirable to learn local metrics. In this short paper, we define a new intuitive distance with local metrics and influential regions, and subsequently propose a novel local metric learning method for distance-based classification. Our key intuition is to partition the metric space into influential regions and a background region, and then restrict the effectiveness of each local metric to its related influential regions. We learn the local metrics and influential regions to reduce the empirical hinge loss, and regularize the parameters on the basis of a resultant learning bound. Encouraging experimental results are obtained on various public and popular data sets.
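
    To make the distance construction concrete, the sketch below applies a region-specific Mahalanobis metric only when a pair of points falls inside an influential region and otherwise falls back to a background metric. This is not the authors' implementation: the spherical parameterization of regions, the midpoint rule, and all names are assumptions made purely for illustration.

```python
import numpy as np

def local_metric_distance(x, y, regions, background_M):
    """Distance with local metrics restricted to influential regions.

    regions: list of (center, radius, M) tuples -- a hypothetical spherical
             parameterization of an influential region, each with its own
             positive semi-definite metric matrix M.
    background_M: metric matrix used outside all influential regions.
    """
    diff = x - y
    midpoint = 0.5 * (x + y)
    # Use the metric of the influential region containing the midpoint of
    # the pair; otherwise fall back to the background metric (a simplifying
    # assumption, not necessarily the paper's exact rule).
    M = background_M
    for center, radius, M_local in regions:
        if np.linalg.norm(midpoint - center) <= radius:
            M = M_local
            break
    return float(np.sqrt(diff @ M @ diff))

# Toy usage: one influential region around the origin with a stretched metric.
regions = [(np.zeros(2), 1.0, np.diag([4.0, 1.0]))]
background = np.eye(2)
print(local_metric_distance(np.array([0.1, 0.2]), np.array([0.3, -0.1]),
                            regions, background))
```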

    Application of Wavelet Analysis in Detecting Runway Foreign Object Debris

    Foreign Object Debris (FOD) poses a serious hazard to aircraft safety, and image processing technology can be applied to FOD detection. An image processing system, a major sub-system of a runway FOD detection system, allows FOD images to be observed efficiently and rapidly, at low cost and with high accuracy and reliability. The paper analyses the characteristics and principles of the wavelet transform algorithm and applies wavelet theory to FOD identification and detection. Identifying the FOD's shape and marking characteristic points on the runway against a poor visual background is accomplished by programming in MATLAB with the wavelet algorithm. The results show that the approach is applicable. Moreover, it is of significance for realizing real-time FOD detection and for further testing with greater feasibility and efficiency.
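
    The paper's implementation is in MATLAB; as a rough illustration of the same idea in another language, the sketch below flags candidate FOD pixels by thresholding the detail coefficients of a single-level 2-D wavelet transform. The use of PyWavelets, the Haar wavelet, and the simple mean-plus-k-sigma threshold are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
import pywt

def fod_candidates(image, wavelet="haar", k=3.0):
    """Flag candidate FOD pixels via a single-level 2-D wavelet transform.

    image: 2-D grayscale array. Returns a boolean mask (at half resolution)
    marking locations whose detail-coefficient energy exceeds the mean by
    k standard deviations.
    """
    _, (cH, cV, cD) = pywt.dwt2(image.astype(float), wavelet)
    detail_energy = cH**2 + cV**2 + cD**2
    threshold = detail_energy.mean() + k * detail_energy.std()
    return detail_energy > threshold

# Toy usage: a flat "runway" image with one small bright object.
img = np.zeros((64, 64))
img[30:33, 40:43] = 255.0
print("candidate pixels:", int(fod_candidates(img).sum()))
```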

    Domain Fingerprints for No-reference Image Quality Assessment

    Human fingerprints are detailed and nearly unique markers of human identity. Such a unique and stable fingerprint is also left on each acquired image: it can reveal how an image was degraded during the image acquisition procedure and is thus closely related to the quality of the image. In this work, we propose a new no-reference image quality assessment (NR-IQA) approach called domain-aware IQA (DA-IQA), which for the first time introduces the concept of a domain fingerprint to the NR-IQA field. The domain fingerprint of an image is learned from image collections with different degradations and then used as the unique characteristic to identify the degradation sources and assess the quality of the image. To this end, we design a new domain-aware architecture, which enables simultaneous determination of both the distortion sources and the quality of an image. With the distortion in an image better characterized, the image quality can be assessed more accurately, as verified by extensive experiments, which show that the proposed DA-IQA performs better than almost all the compared state-of-the-art NR-IQA methods. Comment: accepted by IEEE Transactions on Circuits and Systems for Video Technology (TCSVT).
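
    The abstract does not detail the architecture, but the core idea of jointly predicting the distortion source and the quality score can be sketched as a shared encoder with two heads. The layer choices, sizes, and names below are assumptions for illustration (written in PyTorch), not the DA-IQA network itself.

```python
import torch
import torch.nn as nn

class TwoHeadIQA(nn.Module):
    """Shared encoder with a distortion-type head and a quality head."""

    def __init__(self, num_distortions=5):
        super().__init__()
        self.encoder = nn.Sequential(          # stand-in feature extractor
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.distortion_head = nn.Linear(32, num_distortions)  # which degradation
        self.quality_head = nn.Linear(32, 1)                   # scalar quality score

    def forward(self, x):
        feats = self.encoder(x)                 # the "fingerprint" features
        return self.distortion_head(feats), self.quality_head(feats)

# Toy usage on a random batch of RGB patches.
model = TwoHeadIQA()
logits, score = model(torch.randn(2, 3, 64, 64))
print(logits.shape, score.shape)  # torch.Size([2, 5]) torch.Size([2, 1])
```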

    Exploring the total Galactic extinction with SDSS BHB stars

    Aims: We used 12,530 photometrically selected blue horizontal branch (BHB) stars from the Sloan Digital Sky Survey (SDSS) to estimate the total extinction of the Milky Way at high Galactic latitudes, R_V and A_V, in each line of sight. Methods: A Bayesian method was developed to estimate the reddening values in the given lines of sight. Based on the most likely values of reddening in multiple colors, we were able to derive the values of R_V and A_V. Results: We selected 94 zero-reddened BHB stars from seven globular clusters as the template. The reddening in the four SDSS colors for the northern Galactic cap was estimated by comparing the field BHB stars with the template stars. The accuracy of this estimation is around 0.01 mag for most lines of sight. We also obtained R_V to be around 2.40 ± 1.05 and an A_V map with an uncertainty of 0.1 mag. The results, including the reddening values in the four SDSS colors, A_V, and R_V in each line of sight, are released online. In this work, we employ an up-to-date parallel technique on a GPU to overcome the time-consuming computations, and we plan to release online the C++ CUDA code used for this analysis. Conclusions: The extinction map derived from BHB stars is highly consistent with that of Schlegel, Finkbeiner & Davis (1998). The derived R_V is around 2.40 ± 1.05; contamination probably makes the derived R_V larger. Comment: 16 pages, 13 figures, 4 tables, accepted for publication in A&A.
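
    As a reminder of how the two reported quantities relate, the ratio of total to selective extinction is R_V = A_V / E(B-V). The values in the snippet below are illustrative only and are not taken from the paper's released catalogue.

```python
def total_to_selective_ratio(A_V, E_BV):
    """R_V = A_V / E(B-V): ratio of total to selective extinction."""
    return A_V / E_BV

# Illustrative values only (not from the paper's catalogue).
A_V = 0.75   # visual extinction along one sight line, in magnitudes
E_BV = 0.25  # reddening E(B-V), in magnitudes
print(total_to_selective_ratio(A_V, E_BV))  # -> 3.0
```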