
    Wavelet and Sine Based Analysis of Print Quality Evaluations

    Recent advances in imaging technology have resulted in a proliferation of images across different media. Before they reach the end user, these signals undergo several transformations, which may introduce defects or artifacts that affect perceived image quality. To design and evaluate imaging systems, perceived image quality must be measured. This work focuses on the analysis of print image defects and the characterization of printer artifacts such as banding and graininess using a human visual system (HVS) based framework. Specifically, the work addresses the prediction of the visibility of print defects (banding and graininess) by representing the defects in terms of orthogonal wavelet and sinusoidal basis functions and combining the detection probabilities of the individual basis functions to predict the response of the HVS. The detection probabilities for the basis-function components and for the simulated print defects are obtained from separate subjective tests. The prediction performance of both the wavelet-based and sine-based approaches is compared with the subjective testing results. The wavelet-based prediction performs better than the sinusoidal approach and can be a useful technique in developing HVS-based measures and methods for print quality evaluation.
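    The abstract does not spell out the pooling rule used to combine per-component detection probabilities, but a common choice consistent with its description is probability summation over independent detectors. A minimal sketch along those lines; `defect_visibility`, `detect_prob`, and all parameter values are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
import pywt

def defect_visibility(profile, detect_prob, wavelet="db4", level=4):
    """Predict overall defect visibility via probability summation.

    profile     : 1-D luminance profile of a simulated print defect
    detect_prob : callable mapping (subband index, coefficient energy)
                  to a per-component detection probability in [0, 1],
                  e.g. a psychometric function fitted to subjective data
    """
    # Decompose the defect into orthogonal wavelet subbands.
    coeffs = pywt.wavedec(profile, wavelet, level=level)

    # Probability that at least one component is detected, assuming
    # independent detectors: P = 1 - prod_i (1 - p_i).
    p_miss = 1.0
    for band, c in enumerate(coeffs):
        energy = float(np.sum(c ** 2))
        p_miss *= 1.0 - detect_prob(band, energy)
    return 1.0 - p_miss

# Toy usage: a banding-like sinusoid and a saturating psychometric stand-in.
profile = 0.05 * np.sin(2 * np.pi * np.arange(512) / 64)
print(defect_visibility(profile, lambda band, e: 1.0 - np.exp(-5.0 * e)))
```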

    Virtual Satisfaction of Human With the Need in Attribute of Things

    In the real world, the entity and attributes of an object are unified. It is generally believed that the inseparability of entity and attributes is fully embodied in the energy character of the object. Although entity and attributes are inseparable, it is possible to realize a relative separation of entity and attributes by means of virtual reality technology (VRT). In fact, this kind of separation is a relative separation of entity and attributes with virtual reality technology as the medium. This is of great significance, as it enables people to satisfy their desire to perceive the attributes of an object, and it is a way to satisfy people's infinite desire through technology. With the swift progress of this technology, all kinds of limitations in the real environment are successfully broken through and the degree of human freedom is greatly improved.

    Tent-pole spatial defect pooling for prediction of subjective quality assessment of streaks and bands in color printing

    An algorithm is described for measuring the subjective, visual impact of 1-D defects (streaks and bands) in color printing.
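    The abstract is cut off before the pooling rule itself is described. One plausible reading of "tent-pole" pooling is that overall quality is dominated by the single most visible streak or band, which a high-exponent Minkowski norm captures; the sketch below illustrates that idea only, with all names and values hypothetical:

```python
import numpy as np

def pool_defects(visibility, p=6.0):
    """Pool per-location 1-D defect visibilities into one score.

    A high Minkowski exponent p makes the pooled value track the single
    strongest streak or band (the 'tent pole'); p = 1 is a plain mean,
    and p -> infinity approaches the maximum.
    """
    v = np.asarray(visibility, dtype=float)
    return np.mean(v ** p) ** (1.0 / p)

# One strong band among weak ones dominates the pooled score.
profile = np.full(100, 0.1)
profile[40] = 0.9
print(pool_defects(profile, p=1.0))  # plain mean, about 0.108
print(pool_defects(profile, p=6.0))  # about 0.42, pulled toward the peak
```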

    Self-Supervised Low-Dose Computed Tomography Image Denoising Using Invertible Network Exploiting Inter-Slice Congruence

    The resurgence of deep neural networks has created an alternative pathway for low-dose computed tomography denoising by learning a nonlinear transformation function between low-dose CT (LDCT) and normal-dose CT (NDCT) image pairs. However, such paired LDCT and NDCT images are rarely available in the clinical environment, making deployment of these networks infeasible. This study proposes a novel method for self-supervised low-dose CT denoising that alleviates the requirement for paired LDCT and NDCT images. Specifically, an invertible neural network is trained to minimize the pixel-wise mean square distance between a noisy slice and the average of its two immediately adjacent noisy slices, which is shown to be similar to training a network to minimize the distance between clean NDCT and noisy LDCT image pairs. In addition, during the reverse mapping of the invertible network, the output image is mapped back to the original input image, analogous to a cycle-consistency loss. Finally, the trained network's forward mapping is used for denoising LDCT images. Extensive experiments on two publicly available datasets showed that the method performs favourably against existing unsupervised methods.
    Comment: 10 pages, accepted at the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) 202
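    A rough sketch of the training objective as the abstract describes it: the forward mapping of an invertible network is regressed onto the average of the two adjacent noisy slices. The additive coupling block below is a generic stand-in, not the paper's architecture, and all shapes and names are illustrative:

```python
import torch
import torch.nn as nn

class AdditiveCoupling(nn.Module):
    """A minimal invertible block: y1 = x1, y2 = x2 + t(x1)."""
    def __init__(self, ch):
        super().__init__()
        self.t = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=1)
        return torch.cat([x1, x2 + self.t(x1)], dim=1)

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=1)
        return torch.cat([y1, y2 - self.t(y1)], dim=1)

def congruence_loss(net, prev_slice, cur_slice, next_slice):
    """Pixel-wise MSE between the denoised slice and the average of its
    two immediate noisy neighbours, standing in for a clean target."""
    denoised = net(cur_slice)
    target = 0.5 * (prev_slice + next_slice)
    return ((denoised - target) ** 2).mean()

# Toy usage on random "slices" shaped (batch, channels, H, W).
net = AdditiveCoupling(ch=2)  # coupling splits 4 channels into 2 + 2
prev_s, cur_s, next_s = (torch.randn(1, 4, 64, 64) for _ in range(3))
loss = congruence_loss(net, prev_s, cur_s, next_s)
loss.backward()

# Exact invertibility gives the reverse mapping back to the input for free.
with torch.no_grad():
    recon = net.inverse(net(cur_s))
print(float(loss), float((recon - cur_s).abs().max()))  # reconstruction ~ 0
```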

    Image Quality Evaluation in Lossy Compressed Images

    This research focuses on the quantification of image quality in lossy compressed images, exploring the impact of digital artefacts and scene characteristics upon image quality evaluation. A subjective paired comparison test was implemented to assess the perceived quality of JPEG 2000 against baseline JPEG over a range of different scene types. Interval scales were generated for both algorithms, which indicated a subjective preference for JPEG 2000, particularly at low bit rates, and these were confirmed by an objective distortion measure. The subjective results did not follow this trend for some scenes, however, and both algorithms were found to be scene dependent as a result of the artefacts produced at high compression rates. The scene dependencies were explored from the interval scale results, which allowed scenes to be grouped according to their susceptibilities to each of the algorithms. Groupings were correlated with scene measures applied in a linked study.

    A pilot study explored perceptibility thresholds of JPEG 2000 for the same set of images. This work was developed into a further experiment investigating the thresholds of perceptibility and acceptability of higher-resolution JPEG 2000 compressed images. A set of images was captured using a professional-level full-frame digital single lens reflex camera, using a raw workflow and a carefully controlled image-processing pipeline. The scenes were quantified using a set of simple scene metrics to classify them as average, higher than average, or lower than average for a number of scene properties known to affect image compression and perceived image quality; these were used to make a final selection of test images. Image fidelity was investigated using the method of constant stimuli to quantify perceptibility thresholds and just noticeable differences (JNDs) of perceptibility. Thresholds and JNDs of acceptability were also quantified to explore suprathreshold quality evaluation. The relationships between the two thresholds were examined and correlated with the results from the scene measures to identify more and less susceptible scenes. The level of, and difference between, the two thresholds was found to be an indicator of scene dependency and could be predicted by certain types of scene characteristics.

    A third study implemented the soft-copy quality ruler as an alternative psychophysical method, by matching the quality of compressed images to a set of images varying in a single attribute, separated by known JND increments of quality. The imaging chain and image-processing workflow were evaluated using objective measures of tone reproduction and spatial frequency response. An alternative approach to the creation of ruler images was implemented and tested, and the resulting quality rulers were used to evaluate a subset of the images from the previous study. The quality ruler was found to be successful in identifying scene susceptibilities and observer sensitivity.

    The fourth investigation explored the implementation of four image quality metrics: the Modular Image Difference Metric, the Structural Similarity Metric, the Multi-Scale Structural Similarity Metric, and the Weighted Structural Similarity Metric. The metrics were tested against the subjective results, and all were found to correlate linearly in terms of predictability of image quality.
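    The thesis does not reproduce its scaling computation here, but the standard way to turn paired-comparison preferences into an interval scale is Thurstone's Case V model. A minimal sketch, with the win counts invented for illustration:

```python
import numpy as np
from scipy.stats import norm

def thurstone_case_v(wins):
    """Interval scale from a paired-comparison win matrix.

    wins[i, j] = number of times stimulus i was preferred over j.
    Returns zero-mean scale values in z-score (JND-like) units.
    """
    n = wins + wins.T                       # trials per pair
    with np.errstate(divide="ignore", invalid="ignore"):
        p = np.where(n > 0, wins / n, 0.5)  # preference proportions
    p = np.clip(p, 0.01, 0.99)              # avoid infinite z-scores
    z = norm.ppf(p)                         # Thurstone Case V transform
    np.fill_diagonal(z, 0.0)
    scale = z.mean(axis=1)                  # row means give the scale
    return scale - scale.mean()

# Toy usage: 3 compression levels, 20 comparisons per pair.
wins = np.array([[0, 15, 18],
                 [5, 0, 14],
                 [2, 6, 0]])
print(thurstone_case_v(wins))  # higher value = higher perceived quality
```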

    The effect of scene content on image quality

    Device-dependent metrics attempt to predict image quality from an 'average signal', usually embodied in test targets. Consequently, the metrics perform well on individual 'average looking' scenes and test targets, but provide lower correlation with subjective assessments when working with a variety of scenes whose characteristics differ from the 'average signal'. This study considers the issue of scene dependency in image quality. It aims to quantify the change in quality with scene content, to research the problem of scene dependency in relation to device-dependent image quality metrics, and to provide a solution to it. A novel subjective scaling method was developed to derive individual attribute scales from the results of overall image quality assessments. This was an analytical, top-down approach, which does not require separate scaling of individual attributes and does not assume that each attribute is independent of the others. From the measurements, interval scales were created and the effective scene dependency factor was calculated for each attribute. Two device-dependent image quality metrics, the Effective Pictorial Information Capacity (EPIC) and the Perceived Information Capacity (PIC), were used to predict subjective image quality for a test set that varied in sharpness and noisiness. These metrics were found to be reliable predictors of image quality; however, they were not equally successful in predicting quality for images with varying scene content. Objective scene classification was therefore employed to deal with the problem of scene dependency in device-dependent metrics. It used objective scene descriptors that correlated with subjective criteria on scene susceptibility. This process resulted in a fully automatic classification of scenes into 'standard' and 'non-standard' groups, allowing the calculation of calibrated metric values for each group, as sketched below. The classification and metric calibration performance was encouraging, not only because it improved mean image quality predictions across all scenes, but also because it catered for non-standard scenes, which originally produced low correlations. The findings indicate that the proposed automatic scene classification method has great potential for tackling the problem of scene dependency when modelling device-dependent image quality. Possible further studies of objective scene classification are also discussed.
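    A rough illustration of the classification-plus-calibration idea; the descriptors, band limits, and calibration coefficients below are invented placeholders, not values from the study:

```python
import numpy as np

def scene_descriptors(img):
    """Simple objective descriptors of a greyscale image in [0, 1]."""
    gy, gx = np.gradient(img.astype(float))
    grad = np.hypot(gx, gy)
    return {
        "rms_contrast": float(img.std()),
        "edge_density": float((grad > 0.1).mean()),  # illustrative threshold
    }

def classify_scene(img, contrast_band=(0.1, 0.3), edge_band=(0.02, 0.2)):
    """Label a scene 'standard' when its descriptors fall within bands
    typical of average scenes; 'non-standard' otherwise."""
    d = scene_descriptors(img)
    ok = (contrast_band[0] <= d["rms_contrast"] <= contrast_band[1]
          and edge_band[0] <= d["edge_density"] <= edge_band[1])
    return "standard" if ok else "non-standard"

# Group-specific linear calibration of a raw metric value.
CALIBRATION = {"standard": (1.00, 0.0), "non-standard": (0.85, 5.0)}

def calibrated_metric(raw_value, group):
    gain, offset = CALIBRATION[group]
    return gain * raw_value + offset

img = np.clip(np.random.rand(256, 256), 0, 1)
group = classify_scene(img)
print(group, calibrated_metric(72.0, group))
```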

    Thermomechanical streaking defects in architectural aluminium extrusions

    This study determined the microstructural and optical origins of the surface defect known as "thermomechanical streaking". The findings will help reduce the prevalence of streaking in the aluminium extrusion industry, and consequently reduce the significant material and energy wastage that coincides with every occurrence.

    Time and Understanding


    Radiofrequency and Gaseous Technologies for Enhancing the Microbiological Safety of Low Moisture Food Ingredients

    High heat resistance and long survival of Salmonella in low-moisture food ingredients (LMFIs) such as spices and seeds are concerning, as these ingredients are typically consumed without cooking. It is therefore challenging to inactivate pathogenic bacteria effectively without negatively impacting the quality of the treated product. This dissertation aimed to develop and evaluate novel intervention technologies, in-package radiofrequency steaming and non-thermal gaseous technologies, to improve the microbial safety of LMFIs. The dissertation can be divided into three parts.

    The first part, on the thermal inactivation kinetics of Salmonella and a surrogate, Enterococcus faecium NRRL B-2354, on black pepper powder, indicated that microbial inactivation increased with increasing treatment temperature and water activity. The inoculation protocol also influenced the heat resistance of Salmonella: black peppercorns inoculated before grinding yielded higher D-values than those inoculated after grinding.

    The second part aimed at developing an in-package pasteurization process to inactivate Salmonella enterica in spices (black peppercorn) and herbs (dried basil leaves). During RF heating, a one-way steam vent enabled the accumulation of steam inside the package, improving heating uniformity before venting off excess steam. In-package radiofrequency steaming reduced Salmonella below detection levels on dried basil leaves within 35 s in a bottle sealed with a steam vent and within 40 s in polymer packages with a steam vent, and on black peppercorns within 155 s in a polymer package.

    A single intervention technology is not fit for all LMF matrices; thermal processing would not be feasible for chia seeds because of the potential oxidation of fats and gelling in the presence of moisture. The third part of the study therefore explored non-thermal antimicrobial gaseous technologies, chlorine dioxide (ClO2) and ethylene oxide (EtO) gas, for the decontamination of chia seeds. The developed response surface model suggested that increases in gas concentration, relative humidity, and treatment time enhanced the microbial reduction on chia seeds. At a gas concentration of 10 mg/L and 80% RH over a 5 h exposure period, Salmonella and E. faecium populations were reduced by 3.7 ± 0.2 and 3.2 ± 0.3 log CFU/g, respectively. Mild heating at 60 °C after ClO2 treatment (90% RH, 3 mg/L for 2 h), followed by ambient storage for seven days, enhanced the inactivation to achieve a 5-log reduction. The quality of treated products was not significantly impacted, except for an increase in peroxide value after ClO2 treatment. EtO inactivation was faster than ClO2 treatment on chia seeds, providing more than a 5-log reduction of Salmonella within 10 min at 50% RH and 60 °C without significantly affecting quality.

    E. faecium was a suitable surrogate for Salmonella in all intervention technologies investigated in this study. The developed predictive models would benefit food industries in identifying process parameters for improving LMFI safety without altering the nutritional and sensory qualities of food.
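    For context on the D-value results in the first part: under standard log-linear inactivation kinetics, the log reduction achieved after holding time t is t / D(T), with the D-value's temperature dependence given by the Bigelow model. A small sketch with hypothetical numbers, not values from the dissertation:

```python
def log_reduction(t, temp, d_ref, t_ref, z):
    """Log10 reduction under log-linear inactivation kinetics.

    The D-value at temperature T follows the Bigelow model:
        log10 D(T) = log10 D_ref - (T - T_ref) / z
    and the reduction after holding time t is t / D(T).
    """
    d_t = d_ref * 10 ** (-(temp - t_ref) / z)
    return t / d_t

# Hypothetical numbers for illustration only (not values from the study):
# D = 4.0 min at 75 degC, z = 10 degC, 6 min treatment at 80 degC.
print(f"{log_reduction(t=6.0, temp=80.0, d_ref=4.0, t_ref=75.0, z=10.0):.2f} log reduction")
```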