
    Utility Analysis for Optimizing Compact Adaptive Spectral Imaging Systems for Subpixel Target Detection Applications

    Since the development of spectral imaging systems, when we transitioned from panchromatic, single-band images to multiple bands, we have pursued ways to evaluate the quality of spectral images. As spectral imaging capabilities improved and the collected bands extended beyond the visible spectrum, they could be used to gain information about the Earth, such as material identification, that would have been challenging with panchromatic images. We now have imaging systems capable of collecting images with hundreds of contiguous bands across the reflective portion of the electromagnetic spectrum, allowing us to extract information at subpixel levels. Prediction and assessment methods for panchromatic image quality, while well established, are continuing to be improved. For spectral images, however, methods for analyzing quality have yet to form a solid framework. In this research, we built on previous work to develop a process to optimize the design of spectral imaging systems. We used methods for predicting the quality of spectral images and extended the existing framework for analyzing the efficacy of miniature systems. We comprehensively analyzed the utility of spectral images and the efficacy of compact systems for a set of application scenarios designed to test the relationships of system parameters, figures of merit, and mission requirements in the trade space for spectral images collected by a compact imaging system, from design to operation. We focused on subpixel target detection to analyze the spectral image quality of compact spaceborne systems with adaptive band selection capabilities. To adequately account for the operational aspect of exploiting adaptive band collection capabilities, we developed a method for band selection. Dimension reduction is a step often employed in processing spectral images, not only to improve computation time but also to avoid errors associated with high dimensionality.
    An adaptive system with a tunable filter can select which bands to collect for each target, so the dimension reduction happens at the collection stage instead of the processing stage. We developed the band selection method to optimize detection probability using only the target reflectance signature. This method was conceived to be simple enough to be calculated by a small on-board CPU, to drive collection decisions, and to reduce data processing requirements. We predicted the utility of the selected bands using this method, then validated the results using real images and cross-validated them using simulated images associated with perfect truth data. In this way, we simultaneously validated the band selection method we developed and the combined use of the simulation and prediction tools used as part of the analytic process to optimize system design. We selected a small set of mission scenarios and demonstrated the use of this process to provide example recommendations for efficacy and utility based on the mission. The key parameters we analyzed to drive the design recommendations were target abundance, noise, number of bands, and scene complexity. We found critical points in the system design trade space and, coupled with operational requirements, formed a set of mission feasibility and system design recommendations. The selected scenarios demonstrated the relationship between imaging system design and operational requirements based on the mission. We found key points in the spectral imaging trade space that indicate relationships within the spectral image utility trade space and that can be used to further solidify the frameworks for compact spectral imaging systems.
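The flavor of such a lightweight, collection-time band ranking can be sketched as a greedy criterion simple enough for a small on-board CPU. This is a hypothetical simplification that assumes a known background mean and per-band noise variance; it is not the paper's exact method, which relies on the target reflectance signature alone.

```python
import numpy as np

def select_bands(target_sig, background_mean, noise_var, k):
    """Hypothetical greedy band selection: rank bands by a per-band
    contrast-to-noise detection proxy and keep the k strongest.
    (Assumed criterion, not the paper's actual algorithm.)"""
    contrast = target_sig - background_mean      # per-band target/background contrast
    snr = contrast ** 2 / noise_var              # per-band detection SNR proxy
    return np.argsort(snr)[::-1][:k]             # indices of the k most informative bands

# Toy example: 20-band signatures
rng = np.random.default_rng(0)
target = rng.random(20)
background = rng.random(20)
noise = np.full(20, 0.01)
bands = select_bands(target, background, noise, k=5)
```

An adaptive system would then command its tunable filter to collect only the returned band indices, performing dimension reduction at acquisition time.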

    Optimizing the procedure of grain nutrient predictions in barley via hyperspectral imaging

    Hyperspectral imaging enables researchers and plant breeders to analyze various traits of interest, like nutritional value, in high throughput. To achieve this, the optimal design of a reliable calibration model, linking the measured spectra with the investigated traits, is necessary. In the present study we investigated the impact of different regression models, calibration set sizes and calibration set compositions on prediction performance. For this purpose, we analyzed concentrations of six globally relevant grain nutrients of the wild barley population HEB-YIELD as a case study. The data comprised 1,593 plots, grown in 2015 and 2016 at the locations Dundee and Halle, which were entirely analyzed through traditional laboratory methods and hyperspectral imaging. The results indicated that a linear regression model based on partial least squares outperformed neural networks in this particular data modelling task. There was a positive relationship between the number of samples in a calibration model and prediction performance, with a local optimum at a calibration set size of ~40% of the total data. The inclusion of samples from several years and locations clearly improved the predictions of the investigated nutrient traits at small calibration set sizes. It should be stated that the expansion of calibration models with additional samples is only useful as long as the samples increase trait variability. Models obtained in a certain environment were transferable to other environments only to a limited extent. They should therefore be successively upgraded with new calibration data to enable reliable prediction of the desired traits. The presented results will assist the design and conceptualization of future hyperspectral imaging projects in order to achieve reliable predictions. More generally, they will help to establish practical applications of hyperspectral imaging systems, for instance in plant breeding concepts.
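The partial-least-squares calibration step can be sketched with a minimal PLS1 (NIPALS) implementation. This is a toy on synthetic "spectra" with a single nutrient trait; the study's actual preprocessing, model selection, and data are not reproduced here.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 (NIPALS) for a single response. Illustrative sketch only."""
    Xr, yr = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)            # weight vector
        t = Xr @ w                        # latent scores
        tt = t @ t
        p = Xr.T @ t / tt                 # X loadings
        qk = (yr @ t) / tt                # y loading
        Xr = Xr - np.outer(t, p)          # deflate X
        yr = yr - qk * t                  # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # regression coefficients
    return B, X.mean(0), y.mean()

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean

# Toy calibration: 200 'spectra' with 50 bands, nutrient = sparse linear mix + noise
rng = np.random.default_rng(1)
X = rng.random((200, 50))
beta = np.zeros(50); beta[[3, 17, 30]] = [2.0, -1.5, 1.0]
y = X @ beta + 0.05 * rng.standard_normal(200)
# calibrate on ~40% of the data, in the spirit of the study's local optimum
B, xm, ym = pls1_fit(X[:80], y[:80], n_components=5)
rmse = np.sqrt(np.mean((pls1_predict(X[80:], B, xm, ym) - y[80:]) ** 2))
```

Varying the `[:80]` slice reproduces the calibration-set-size experiment: prediction error on the held-out plots shrinks as the calibration set grows, until added samples stop contributing trait variability.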

    Terahertz Security Image Quality Assessment by No-reference Model Observers

    To enable the development of objective image quality assessment (IQA) algorithms for terahertz (THz) security images, we constructed the THz security image database (THSID), comprising 181 THz security images at a resolution of 127×380. The main distortion types in THz security images were first analyzed to design the subjective evaluation criteria and acquire mean opinion scores. Subsequently, existing no-reference IQA algorithms, namely five opinion-aware approaches (NFERM, GMLF, DIIVINE, BRISQUE and BLIINDS2) and opinion-unaware approaches (QAC, SISBLIM, NIQE, FISBLIM, CPBD, S3 and Fish_bb), were executed to evaluate THz security image quality. The statistical results demonstrated the superiority of Fish_bb over the other tested IQA approaches, with PLCC (SROCC) values of 0.8925 (-0.8706) and an RMSE of 0.3993. Linear regression analysis and a Bland-Altman plot further verified that Fish_bb could substitute for subjective IQA. Nonetheless, for the classification of THz security images, we tended to use S3 as the criterion for ranking THz security image grades because of its relatively low false positive rate in classifying bad THz image quality into the acceptable category (24.69%). Interestingly, due to the specific properties of THz images, the average pixel intensity outperformed the more complicated IQA algorithms above, with PLCC, SROCC and RMSE of 0.9001, -0.8800 and 0.3857, respectively. This study will help users such as researchers or security staff to obtain THz security images of good quality. Currently, our research group is attempting to make this research more comprehensive.
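The three performance figures quoted (PLCC, SROCC, RMSE) are standard agreement measures between objective scores and mean opinion scores, and can be computed as below. This is a minimal sketch; full IQA evaluation protocols usually also fit a nonlinear logistic mapping between scores and MOS before computing PLCC and RMSE.

```python
import numpy as np

def plcc(x, y):
    """Pearson linear correlation coefficient."""
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def _ranks(x):
    """1-based ranks with tied values assigned their average rank."""
    order = np.argsort(x)
    ranks = np.empty(len(x))
    ranks[order] = np.arange(1, len(x) + 1)
    for v in np.unique(x):
        mask = x == v
        ranks[mask] = ranks[mask].mean()
    return ranks

def srocc(x, y):
    """Spearman rank-order correlation: Pearson correlation on ranks."""
    return plcc(_ranks(x), _ranks(y))

def rmse(x, y):
    return float(np.sqrt(np.mean((x - y) ** 2)))

# Toy check: an objective score that decreases as MOS rises,
# giving a negative correlation like Fish_bb's reported SROCC
mos = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
score = np.array([0.9, 0.7, 0.5, 0.3, 0.1])
```

A negative PLCC/SROCC, as reported for Fish_bb, simply means the objective score moves opposite to the subjective score; the magnitude is what indicates agreement.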

    Infrared thermography-calorimetric quantitation of energy expenditure in biomechanically different types of jūdō throwing techniques: a pilot study

    The purpose of this pilot study was to assess the energy expenditure (EE) of two biomechanically different jūdō throws, namely the simple mechanical couple-based uchi-mata vs. the lever-based throw ippon-seoi-nage, using infrared thermal calorimetry (ITC). Subjects were one Caucasian female elite athlete (age: 26.4 years) and one male veteran jūdōka (age: 50.8 years). ITC images were captured by an Avio NEC InfRec R300 camera, and the thermal data obtained were plotted into a proprietary equation for estimation of EE. Data were compared with respiratory data obtained by a Cosmed K4 b2 portable gas analyzer. Oxygen consumption as estimated by ITC capture during practice of uchi-mata was markedly lower than during performance of ippon-seoi-nage in the female (457 mL·min⁻¹ vs. 540 mL·min⁻¹, P<0.05) and male subject (1,078 mL·min⁻¹ vs. 1,088 mL·min⁻¹, NS), with the difference in values between the two subjects being significant (P<0.01). The metabolic cost of the exercise itself (uchi-mata vs. ippon-seoi-nage) was 1.26 kcal·min⁻¹ (88 W) vs. 1.68 kcal·min⁻¹ (117 W) (P<0.05) in the female subject, and 2.97 kcal·min⁻¹ (207 W) (P<0.01) vs. 3.02 kcal·min⁻¹ (211 W) (NS) in the male subject. Values for the female were significantly different (P<0.01) from those of the male subject. The results support the initial hypothesis that couple-based jūdō throws (in this case, uchi-mata) are energetically more efficient than lever-based throws, such as ippon-seoi-nage. Application of this approach may be of practical use for coaches in optimizing energy-saving strategies in both elite and veteran jūdō athletes.
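The paired kcal·min⁻¹ and watt figures above are mutually consistent under the thermochemical calorie (1 kcal = 4184 J); a one-line conversion reproduces all four reported wattages:

```python
# Convert a metabolic rate in kcal per minute to watts (J/s),
# using the thermochemical calorie: 1 kcal = 4184 J, 1 min = 60 s.
def kcal_per_min_to_watts(kcal_min: float) -> float:
    return kcal_min * 4184.0 / 60.0

# kcal/min -> W pairs as reported in the abstract
reported = {1.26: 88, 1.68: 117, 2.97: 207, 3.02: 211}
```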

    Flow velocity mapping using contrast enhanced high-frame-rate plane wave ultrasound and image tracking: methods and initial in vitro and in vivo evaluation

    Ultrasound imaging is the most widely used method for visualising and quantifying blood flow in medical practice, but existing techniques have various limitations in terms of imaging sensitivity, field of view, flow angle dependence, and imaging depth. In this study, we developed an ultrasound imaging velocimetry approach capable of visualising and quantifying dynamic flow by combining high-frame-rate plane-wave ultrasound imaging, microbubble contrast agents, pulse-inversion contrast imaging, and speckle image tracking algorithms. The system was initially evaluated in vitro on both straight and carotid-mimicking vessels with steady and pulsatile flows, and in vivo in the rabbit aorta. Colour and spectral Doppler measurements were also made. Initial flow mapping results were compared with theoretical predictions and reference Doppler measurements, and they indicate the potential of the new system as a highly sensitive, accurate, angle-independent and full field-of-view velocity mapping tool capable of tracking and quantifying fast and dynamic flows.
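The speckle image tracking step can be sketched as exhaustive block matching by normalized cross-correlation between consecutive frames. This is a toy sketch on synthetic speckle; the study's actual tracking algorithm, contrast processing, and beamforming are not reproduced here.

```python
import numpy as np

def track_block(frame0, frame1, y, x, size, search):
    """Estimate the (dy, dx) displacement of a speckle block between two
    frames via exhaustive normalized cross-correlation over a search window.
    Velocity follows as displacement x (frame rate / pixel spacing)."""
    ref = frame0[y:y+size, x:x+size].astype(float)
    ref = (ref - ref.mean()) / (ref.std() + 1e-12)
    best, best_d = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = frame1[y+dy:y+dy+size, x+dx:x+dx+size].astype(float)
            cand = (cand - cand.mean()) / (cand.std() + 1e-12)
            ncc = float((ref * cand).mean())     # normalized cross-correlation
            if ncc > best:
                best, best_d = ncc, (dy, dx)
    return best_d

# Synthetic speckle frame shifted by a known displacement
rng = np.random.default_rng(2)
f0 = rng.random((64, 64))
f1 = np.roll(f0, shift=(3, -2), axis=(0, 1))     # true motion: dy=3, dx=-2
```

At the high frame rates of plane-wave imaging, even sub-pixel interframe motion of fast flows stays within a small search window, which is what makes this kind of tracking tractable.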

    Expanding Dimensionality in Cinema Color: Impacting Observer Metamerism through Multiprimary Display

    Television and cinema displays are both trending towards greater ranges and saturation of reproduced colors, made possible by near-monochromatic RGB illumination technologies. Through current broadcast and digital cinema standards work, system designs employing laser light sources, narrow-band LEDs, quantum dots and others are being actively endorsed in promotion of Wide Color Gamut (WCG). Despite the artistic benefits brought to creative content producers, spectrally selective excitation of naturally different human color response functions exacerbates variability of observer experience. An exaggerated variation in color sensing is explicitly counter to the exhaustive controls and calibrations employed in modern motion picture pipelines. Further, singular standard-observer summaries of human color vision, such as the CIE's 1931 and 1964 color matching functions used extensively in motion picture color management, are deficient in recognizing expected human vision variability. Many researchers have confirmed the magnitude of observer metamerism in color matching in both uniform colors and imagery, but few have shown explicit color management with an aim of minimizing differences in observer perception. This research shows not only that observer metamerism influences can be quantitatively predicted and confirmed psychophysically, but that intentionally engineered multiprimary displays employing more than three primaries can offer increased color gamut with drastically improved consistency of experience. To this end, a seven-channel prototype display has been constructed based on observer metamerism models and color difference indices derived from the latest color vision demographic research.
    This display has further been proven in forced-choice paired comparison tests to deliver superior color matching to reference stimuli versus both contemporary standard RGB cinema projection and recently ratified standard laser projection, across a large population of color-normal observers.
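The underlying effect (two spectrally different stimuli that match for one observer yet mismatch for another) can be illustrated numerically with fabricated Gaussian color matching functions. All data below are toy values, not CIE data or prototype-display measurements.

```python
import numpy as np

wl = np.linspace(400, 700, 31)                   # toy wavelength grid (nm)

def gaussian_cmfs(peaks, width=40.0):
    """Fabricated 3-channel color matching functions (NOT real CIE data)."""
    return np.stack([np.exp(-((wl - p) / width) ** 2) for p in peaks])

obs_a = gaussian_cmfs([600, 550, 450])           # toy 'standard observer'
obs_b = gaussian_cmfs([605, 545, 455])           # toy observer with shifted peaks

rng = np.random.default_rng(3)
s1 = rng.random(31)                              # stimulus 1: arbitrary spectrum

# Build a metamer for observer A: add a spectral component orthogonal to A's
# CMF rows (invisible to A), so A's tristimulus values are unchanged while
# the physical spectrum differs.
c = rng.standard_normal(31)
coef, *_ = np.linalg.lstsq(obs_a.T, c, rcond=None)
null_vec = c - obs_a.T @ coef                    # component invisible to observer A
null_vec /= np.linalg.norm(null_vec)
s2 = s1 + 0.5 * null_vec                         # stimulus 2: metamer for A only

t_a1, t_a2 = obs_a @ s1, obs_a @ s2              # identical for observer A
t_b1, t_b2 = obs_b @ s1, obs_b @ s2              # differ for observer B
```

Narrow-band primaries enlarge exactly these invisible-to-one-observer spectral differences, which is why WCG displays amplify observer metamerism and why a display with more than three primaries gains the freedom to minimize it.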

    Automated reliability assessment for spectroscopic redshift measurements

    We present a new approach to automating spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function (PDF). We propose to rephrase spectroscopic redshift estimation in a Bayesian framework, in order to incorporate all sources of information and uncertainty related to the redshift estimation process and to produce a redshift posterior PDF that is the starting point for ML algorithms to provide an automated assessment of redshift reliability. As a use case, public data from the VIMOS VLT Deep Survey are exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification to describe different types of redshift PDFs but, owing to the subjective definition of these flags, soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions, unlabelled data from preliminary mock simulations for the Euclid space mission were projected into this mapping to predict their redshift reliability labels.
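Feature extraction from a redshift posterior PDF, the kind of descriptors such clustering could operate on, can be sketched as below. The features (entropy, dispersion, secondary-mode height ratio) and the toy PDFs are illustrative; the paper's actual feature set is not reproduced here.

```python
import numpy as np

def pdf_descriptors(z, pdf):
    """Toy reliability descriptors of a redshift posterior PDF:
    Shannon entropy, dispersion about the mean, and the height ratio of the
    two strongest local modes (0 if unimodal). Illustrative sketch only."""
    dz = z[1] - z[0]
    p = pdf / (pdf.sum() * dz)                   # normalize to unit area
    w = p * dz                                   # discrete probabilities
    entropy = -np.sum(w[w > 0] * np.log(w[w > 0]))
    mean = np.sum(w * z)
    sigma = np.sqrt(np.sum(w * (z - mean) ** 2))
    peaks = np.where((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:]))[0] + 1
    heights = np.sort(p[peaks])[::-1]
    ratio = heights[1] / heights[0] if len(heights) > 1 else 0.0
    return entropy, sigma, ratio

z = np.linspace(0.0, 3.0, 301)
narrow = np.exp(-0.5 * ((z - 1.2) / 0.01) ** 2)            # confident, unimodal
messy = (np.exp(-0.5 * ((z - 0.8) / 0.05) ** 2)
         + 0.8 * np.exp(-0.5 * ((z - 2.1) / 0.05) ** 2))   # ambiguous, bimodal
```

A sharply peaked, unimodal posterior yields low entropy and dispersion and no secondary mode, while an ambiguous bimodal posterior scores high on all three, which is the kind of separation an unsupervised clustering of reliability classes can exploit.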