
    Image Utility Assessment and a Relationship with Image Quality Assessment

    Present quality assessment (QA) algorithms aim to generate scores for natural images consistent with subjective scores for the quality assessment task. For the quality assessment task, human observers evaluate a natural image based on its perceptual resemblance to a reference. Natural images communicate useful information to humans, and this paper investigates the utility assessment task, where human observers evaluate the usefulness of a natural image as a surrogate for a reference. Current QA algorithms implicitly assess utility insofar as an image that exhibits strong perceptual resemblance to a reference is also of high utility. However, a perceived quality score is not a proxy for a perceived utility score: a decrease in perceived quality may not affect the perceived utility. Two experiments are conducted to investigate the relationship between the quality assessment and utility assessment tasks. The results from these experiments provide evidence that an algorithm optimized to predict perceived quality scores cannot immediately predict perceived utility scores. Several QA algorithms are evaluated in terms of their ability to predict subjective scores for the quality and utility assessment tasks. Among the QA algorithms evaluated, the visual information fidelity (VIF) criterion, which is frequently reported to provide the highest correlation with perceived quality, predicted both perceived quality and utility scores reasonably well. The consistent performance of VIF for both tasks raised suspicions in light of the evidence from the psychophysical experiments. A thorough analysis of VIF revealed that it artificially emphasizes evaluations at finer image scales (i.e., higher spatial frequencies) over those at coarser image scales (i.e., lower spatial frequencies).
A modified implementation of VIF, denoted VIF*, is presented that provides statistically significant improvement over VIF for the quality assessment task and statistically worse performance for the utility assessment task. A novel utility assessment algorithm, referred to as the natural image contour evaluation (NICE), is introduced that compares the contours of a test image to those of a reference image across multiple image scales to score the test image. NICE demonstrates a viable departure from traditional QA algorithms that incorporate energy-based approaches and is capable of predicting perceived utility scores.
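The core idea behind NICE, comparing contour maps of a test and reference image across several scales, can be sketched as follows. This is a minimal illustration, not the published algorithm: the edge detector, threshold, and pooling rule here are stand-in assumptions, and the name `nice_score` is hypothetical.

```python
import numpy as np
from scipy import ndimage

def nice_score(test, ref, n_scales=3, threshold=0.1):
    """Toy multi-scale contour comparison in the spirit of NICE.

    Extracts binary contour maps from each image at several Gaussian
    scales and accumulates their disagreement. The real NICE algorithm's
    edge detector and pooling details are not reproduced here.
    """
    score = 0.0
    for s in range(n_scales):
        sigma = 2.0 ** s  # coarser contours at each successive scale
        edges = []
        for img in (test, ref):
            smoothed = ndimage.gaussian_filter(img.astype(float), sigma)
            gy, gx = np.gradient(smoothed)
            mag = np.hypot(gx, gy)
            # Stand-in edge detector: threshold the gradient magnitude
            edges.append(mag > threshold * mag.max())
        # Count contour pixels present in one image but not the other
        score += np.logical_xor(edges[0], edges[1]).mean()
    return score / n_scales  # 0 = identical contours at every scale
```

A test image whose contours survive degradation would score near zero here, matching the paper's intuition that utility tracks contour preservation rather than pixel-level energy differences.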

    Mapping the disease-specific LupusQoL to the SF-6D

    Purpose To derive a mapping algorithm to predict SF-6D utility scores from the non-preference-based LupusQoL and to test the performance of the developed algorithm on a separate, independent validation data set. Method LupusQoL and SF-6D data were collected from 320 patients with systemic lupus erythematosus (SLE) attending routine rheumatology outpatient appointments at seven centres in the UK. Ordinary least squares (OLS) regression was used to estimate models of increasing complexity in order to predict individuals’ SF-6D utility scores from their responses to the LupusQoL questionnaire. Model performance was judged on predictive ability through the size and pattern of the prediction errors generated. The performance of the selected model was externally validated on an independent data set containing 113 female SLE patients who had completed both the LupusQoL and SF-36 questionnaires. Results Four of the eight LupusQoL domains (physical health, pain, emotional health, and fatigue) were selected as dependent variables in the final model. Overall model fit was good, with R2 = 0.7219, MAE = 0.0557, and RMSE = 0.0706 when applied to the estimation data set, and R2 = 0.7431, MAE = 0.0528, and RMSE = 0.0663 when applied to the validation sample. Conclusion This study provides a method by which health state utility values can be estimated from patient responses to the non-preference-based LupusQoL, generalisable beyond the data set upon which it was estimated. Despite concerns over the use of OLS to develop mapping algorithms, we find this method suitable in this case due to the normality of the SF-6D data.
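The mapping approach described above, OLS regression from a handful of LupusQoL domain scores to an SF-6D utility, can be sketched in a few lines. This is a generic illustration of the technique: the function names are hypothetical, and the published model's actual coefficients are not reproduced.

```python
import numpy as np

def fit_mapping(X, y):
    """Fit an OLS mapping from LupusQoL domain scores to SF-6D utilities.

    X: (n_patients, 4) array of domain scores (e.g. physical health,
       pain, emotional health, fatigue); y: (n_patients,) SF-6D utilities.
    Returns coefficients [intercept, b1..b4] via least squares.
    """
    A = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict_utility(beta, X):
    """Apply the fitted mapping to new patients' domain scores."""
    return np.column_stack([np.ones(len(X)), X]) @ beta
```

In practice, as the abstract notes, such a model is judged by the size and pattern of prediction errors (MAE, RMSE) on a held-out validation sample rather than by in-sample fit alone.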

    Endeavor agility on consumption value through affirming an acceptable degree of utilization esteem for new items

    Purpose: This comparative study holistically assesses how agility becomes a standard of business practice and a means of progress. Design/Methodology/Approach: The contribution and the methodology rest on a dual purpose: (i) a quantitative research design was used to carry out the investigation, and (ii) both primary and secondary sources were used to gather data. Findings: The literature review yielded several parameters that explain agility and consumption values, which were used to construct a questionnaire. The study then sought a fitting pattern between consumption values and the underlying agility dimensions. Practical implications: In serving its dual purpose, this study sheds new light on the mall-intercept strategy used to gather responses. Originality/Value: Although this study builds on recent work, it offers a detailed examination of the area. The questionnaire includes a section on the demographic profile of the respondents. The study applies the consumption value model as its primary framework, incorporating functional value, social value, emotional value, epistemic value, and conditional value. Peer-reviewed.

    Utility Analysis for Optimizing Compact Adaptive Spectral Imaging Systems for Subpixel Target Detection Applications

    Since the development of spectral imaging systems, where we transitioned from panchromatic, single-band images to multiple bands, we have pursued ways to evaluate the quality of spectral images. As spectral imaging capabilities improved and the collected bands extended beyond the visible spectrum, spectral images could be used to gain information about the earth, such as material identification, that would have been challenging with panchromatic images. We now have imaging systems capable of collecting images with hundreds of contiguous bands across the reflective portion of the electromagnetic spectrum, allowing us to extract information at subpixel levels. Prediction and assessment methods for panchromatic image quality, while well established, continue to be improved. For spectral images, however, methods for analyzing quality have yet to form a solid framework. In this research, we built on previous work to develop a process for optimizing the design of spectral imaging systems. We used methods for predicting the quality of spectral images and extended the existing framework for analyzing the efficacy of miniature systems. We comprehensively analyzed the utility of spectral images and the efficacy of compact systems for a set of application scenarios designed to test the relationships among system parameters, figures of merit, and mission requirements in the trade space for spectral images collected by a compact imaging system, from design to operation. We focused on subpixel target detection to analyze the spectral image quality of compact spaceborne systems with adaptive band selection capabilities. To adequately account for the operational aspect of exploiting adaptive band collection capabilities, we developed a method for band selection. Dimension reduction is a step often employed in processing spectral images, not only to improve computation time but also to avoid errors associated with high dimensionality.
An adaptive system with a tunable filter can select which bands to collect for each target, so the dimension reduction happens at the collection stage instead of the processing stage. We developed the band selection method to optimize detection probability using only the target reflectance signature. This method was conceived to be simple enough to be calculated by a small on-board CPU, to drive collection decisions, and to reduce data processing requirements. We predicted the utility of the selected bands using this method, then validated the results using real images, and cross-validated them using simulated images with perfect truth data. In this way, we simultaneously validated the band selection method we developed and the combined use of the simulation and prediction tools used as part of the analytic process to optimize system design. We selected a small set of mission scenarios and demonstrated the use of this process to provide example recommendations for efficacy and utility based on the mission. The key parameters we analyzed to drive the design recommendations were target abundance, noise, number of bands, and scene complexity. We found critical points in the system design trade space and, coupled with operational requirements, formed a set of mission feasibility and system design recommendations. The selected scenarios demonstrated the relationship between imaging system design and operational requirements based on the mission. We found key points in the spectral imaging trade space that indicated relationships within the spectral image utility trade space that can be used to further solidify the frameworks for compact spectral imaging systems.
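The idea of a band selection rule cheap enough for a small on-board CPU, driven only by the target reflectance signature, can be sketched as below. The ranking criterion here (deviation of the signature from its spectral mean) is an assumption for illustration; the dissertation's actual detection-probability criterion is not specified in the abstract, and `select_bands` is a hypothetical name.

```python
import numpy as np

def select_bands(target_signature, k):
    """Signature-only band selection for an adaptive tunable-filter system.

    Stand-in criterion: rank bands by how far the target's reflectance
    departs from its spectral mean, assuming the most distinctive bands
    carry the most detection information. Requires only the signature,
    so it can run on board before collection.
    """
    sig = np.asarray(target_signature, dtype=float)
    deviation = np.abs(sig - sig.mean())
    # Indices of the k most distinctive bands, returned in band order
    return np.sort(np.argsort(deviation)[-k:])
```

Because the rule needs no scene statistics, the dimension reduction happens at collection time, as the abstract describes, rather than during ground processing.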

    The effect of time constraint on anticipation, decision making, and option generation in complex and dynamic environments

    Researchers interested in performance in complex and dynamic situations have focused on how individuals predict their opponents' potential courses of action (i.e., during assessment) and generate potential options about how to respond (i.e., during intervention). When generating predictive options, previous research supports the use of cognitive mechanisms consistent with long-term working memory (LTWM) theory (Ericsson and Kintsch in Psychol Rev 102(2):211–245, 1995; Ward et al. in J Cogn Eng Decis Mak 7:231–254, 2013). However, when generating options about how to respond, the extant research supports the use of the take-the-first (TTF) heuristic (Johnson and Raab in Organ Behav Hum Decis Process 91:215–229, 2003). While these models provide possible explanations of how options are generated in situ, often under time pressure, few researchers have tested the claims of these models experimentally by explicitly manipulating time pressure. The current research investigates the effect of time constraint on option-generation behavior during the assessment and intervention phases of decision making by employing a modified version of an established option-generation task in soccer. The results provide additional support for the use of LTWM mechanisms during assessment across both time conditions. During the intervention phase, option-generation behavior appeared consistent with TTF, but only in the non-time-constrained condition. Counter to our expectations, the implementation of time constraint resulted in a shift toward the use of LTWM-type mechanisms during the intervention phase. Modifications to the cognitive-process-level descriptions of decision making during intervention are proposed, and implications for training during both phases of decision making are discussed.
