
    Adaptive Quantisation in HEVC for Contouring Artefacts Removal in UHD Content

    Contouring artefacts affect the visual experience of certain types of compressed Ultra High Definition (UHD) sequences characterised by smoothly textured areas and gradual transitions in pixel values. This paper proposes a technique to adjust the quantisation process at the encoder so that contouring artefacts are avoided. The devised method does not require any change at the decoder side and introduces a negligible coding rate increment (up to 3.4% for the same objective quality). This result compares favourably with the average 11.2% bit-rate penalty introduced by a method where the quantisation step is reduced in contour-prone areas.
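    This is not the paper's algorithm, but a minimal sketch of the underlying idea: contour-prone regions are smooth blocks with shallow, gradual luma variation, which an encoder would need to identify before deciding how to adapt quantisation. The block size and thresholds below are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): flag contour-prone blocks in a luma
# frame as smooth regions that still contain a shallow gradient, i.e. the areas
# where coarse quantisation tends to produce banding/contouring.
import numpy as np

def contour_prone_blocks(luma, block=64, smooth_std=2.0, min_range=1):
    """Return a boolean map (one entry per block) of contour-prone blocks.

    luma       : 2-D uint8/float array of luma samples
    block      : block size in samples (e.g. a 64x64 CTU)
    smooth_std : blocks with a sample standard deviation below this are 'smooth'
    min_range  : blocks must still span more than this range (not perfectly flat)
    """
    h, w = luma.shape
    rows, cols = h // block, w // block
    flags = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            blk = luma[r*block:(r+1)*block, c*block:(c+1)*block].astype(np.float64)
            smooth = blk.std() < smooth_std
            gradual = (blk.max() - blk.min()) > min_range
            flags[r, c] = smooth and gradual
    return flags

# Example: a synthetic shallow gradient is flagged, pure noise is not.
grad = np.tile(np.linspace(16, 32, 256), (256, 1)).astype(np.uint8)
noise = np.random.default_rng(0).integers(0, 255, (256, 256), dtype=np.uint8)
print(contour_prone_blocks(grad).any(), contour_prone_blocks(noise).any())
```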

    High Efficiency Video Coding (HEVC) tools for next generation video content


    Benchmarking of mobile phone cameras


    Image Quality Evaluation in Lossy Compressed Images

    This research focuses on the quantification of image quality in lossy compressed images, exploring the impact of digital artefacts and scene characteristics upon image quality evaluation. A subjective paired comparison test was implemented to assess the perceived quality of JPEG 2000 against baseline JPEG over a range of different scene types. Interval scales were generated for both algorithms, which indicated a subjective preference for JPEG 2000, particularly at low bit rates, and these were confirmed by an objective distortion measure. The subjective results did not follow this trend for some scenes, however, and both algorithms were found to be scene dependent as a result of the artefacts produced at high compression rates. The scene dependencies were explored from the interval scale results, which allowed scenes to be grouped according to their susceptibility to each of the algorithms. Groupings were correlated with scene measures applied in a linked study.

    A pilot study was undertaken to explore perceptibility thresholds of JPEG 2000 compression for the same set of images. This work was developed into a further experiment investigating the thresholds of perceptibility and acceptability of higher-resolution JPEG 2000 compressed images. A set of images was captured using a professional-level full-frame Digital Single Lens Reflex camera, using a raw workflow and a carefully controlled image-processing pipeline. The scenes were quantified using a set of simple scene metrics to classify them as average, higher than average, or lower than average for a number of scene properties known to affect image compression and perceived image quality; these were used to make a final selection of test images. Image fidelity was investigated using the method of constant stimuli to quantify perceptibility thresholds and just noticeable differences (JNDs) of perceptibility. Thresholds and JNDs of acceptability were also quantified to explore suprathreshold quality evaluation. The relationships between the two thresholds were examined and correlated with the results from the scene measures to identify more and less susceptible scenes. It was found that the levels of, and differences between, the two thresholds were an indicator of scene dependency and could be predicted by certain types of scene characteristics.

    A third study implemented the soft-copy quality ruler as an alternative psychophysical method, matching the quality of compressed images to a set of images varying in a single attribute, separated by known JND increments of quality. The imaging chain and image-processing workflow were evaluated using objective measures of tone reproduction and spatial frequency response. An alternative approach to the creation of ruler images was implemented and tested, and the resulting quality rulers were used to evaluate a subset of the images from the previous study. The quality ruler was found to be successful in identifying scene susceptibilities and observer sensitivity.

    The fourth investigation explored the implementation of four image quality metrics: the Modular Image Difference Metric, the Structural Similarity Metric, the Multi-scale Structural Similarity Metric, and the Weighted Structural Similarity Metric. The metrics were tested against the subjective results, and all were found to correlate linearly with them in predicting image quality.
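    As a minimal, hypothetical illustration of the kind of full-reference evaluation described above (not the thesis pipeline), the sketch below JPEG-compresses a greyscale image at several quality settings and scores each version with SSIM and PSNR via Pillow and scikit-image; the multi-scale and weighted SSIM variants and the Modular Image Difference Metric are not included, and the synthetic test scene stands in for a real capture.

```python
# Minimal sketch: encode one greyscale image as JPEG at several quality levels
# and compare each decoded version against the original with SSIM and PSNR.
import io
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def jpeg_round_trip(img_array, quality):
    """Encode an 8-bit greyscale array as JPEG at the given quality and decode it."""
    buf = io.BytesIO()
    Image.fromarray(img_array, mode="L").save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return np.asarray(Image.open(buf))

# Placeholder scene: a smooth synthetic pattern, used here instead of a real capture.
yy, xx = np.mgrid[0:512, 0:512]
scene = ((np.sin(xx / 16.0) + np.cos(yy / 24.0) + 2.0) * 63.0).astype(np.uint8)

for q in (10, 30, 50, 75, 95):
    degraded = jpeg_round_trip(scene, q)
    ssim = structural_similarity(scene, degraded, data_range=255)
    psnr = peak_signal_noise_ratio(scene, degraded, data_range=255)
    print(f"quality={q:3d}  SSIM={ssim:.3f}  PSNR={psnr:.2f} dB")
```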

    Present and Future of Gravitational Wave Astronomy

    The first detection on Earth of a gravitational wave signal from the coalescence of a binary black hole system in 2015 established a new era in astronomy, allowing the scientific community to observe the Universe with a new form of radiation for the first time. More than five years later, many more gravitational wave signals have been detected, including the first binary neutron star coalescence in coincidence with a gamma-ray burst and a kilonova observation. The field of gravitational wave astronomy is rapidly evolving, making it difficult to keep up with the pace of new detector designs, discoveries, and astrophysical results. This Special Issue is, therefore, intended as a review of the current status and future directions of the field from the perspective of detector technology, data analysis, and the astrophysical implications of these discoveries. Rather than presenting new results, the articles collected in this issue will serve as a reference and an introduction to the field. This Special Issue will include reviews of the basic properties of gravitational wave signals; the detectors that are currently operating and the main sources of noise that limit their sensitivity; planned upgrades of the detectors in the short and long term; spaceborne detectors; data analysis of the gravitational wave detector output, focusing on the main classes of detected and expected signals; and the implications of current and future discoveries for our understanding of astrophysics and cosmology.

    Abstracts on Radio Direction Finding (1899 - 1995)

    The files on this record represent the various databases that originally composed the CD-ROM issue of the "Abstracts on Radio Direction Finding" database, which is now part of the Dudley Knox Library's Abstracts and Selected Full Text Documents on Radio Direction Finding (1899 - 1995) Collection. (See Calhoun record https://calhoun.nps.edu/handle/10945/57364 for further information on this collection and the bibliography.) Because technological obsolescence prevents current and future audiences from accessing the bibliography, DKL exported the various databases contained on the CD-ROM and converted them into the three files on this record. The contents of these files are: 1) RDFA_CompleteBibliography_xls.zip [RDFA_CompleteBibliography.xls: Metadata for the complete bibliography, in Excel 97-2003 Workbook format; RDFA_Glossary.xls: Glossary of terms, in Excel 97-2003 Workbook format; RDFA_Biographies.xls: Biographies of leading figures, in Excel 97-2003 Workbook format]; 2) RDFA_CompleteBibliography_csv.zip [RDFA_CompleteBibliography.TXT: Metadata for the complete bibliography, in CSV format; RDFA_Glossary.TXT: Glossary of terms, in CSV format; RDFA_Biographies.TXT: Biographies of leading figures, in CSV format]; 3) RDFA_CompleteBibliography.pdf: A human-readable display of the bibliographic data, as a means of double-checking any possible deviations due to conversion.
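    For readers who want to work with the converted files programmatically, the sketch below loads the CSV export of the bibliography with pandas; the delimiter and encoding are assumptions, since the record only states that the files are in CSV format.

```python
# Minimal sketch: load the exported bibliography metadata and report its shape
# and column names. Adjust sep/encoding to match the actual export if needed.
import pandas as pd

def load_bibliography(path="RDFA_CompleteBibliography.TXT"):
    """Read the exported bibliography metadata; returns a DataFrame."""
    return pd.read_csv(path, sep=",", encoding="latin-1", on_bad_lines="skip")

if __name__ == "__main__":
    bib = load_bibliography()
    print(f"{len(bib)} records, columns: {list(bib.columns)}")
```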