
    Stability effects on results of diffusion tensor imaging analysis by reduction of the number of gradient directions due to motion artifacts: an application to presymptomatic Huntington's disease.

    In diffusion tensor imaging (DTI), an improvement in the signal-to-noise ratio (SNR) of the fractional anisotropy (FA) maps can be obtained when the number of recorded gradient directions (GD) is increased. Vice versa, elimination of motion-corrupted or noisy GD leads to a more accurate characterization of the diffusion tensor. We previously suggested a slice-wise method for artifact detection in FA maps. The current study applies this approach to a cohort of 18 premanifest Huntington's disease (pHD) subjects and 23 controls. By 2-D voxelwise statistical comparison of original FA maps and FA maps with a reduced number of GD, the effect of eliminating GD affected by motion was demonstrated. We present an evaluation metric that tests whether the computed FA maps (with a reduced number of GD) still reflect a "true" FA map, as defined by simulations in the control sample. Furthermore, we investigated whether omitting data volumes affected by motion in the pHD cohort could lead to an increased SNR in the resulting FA maps. A high agreement between original FA maps (with all GD) and corrected FA maps (i.e. without GD corrupted by motion) was observed even for numbers of eliminated GD up to 13. Even in one data set in which 46 GD had to be eliminated, the results showed a moderate agreement
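    The quantity being compared between the original and GD-reduced maps is the standard FA scalar computed from the eigenvalues of the fitted diffusion tensor. The following is only a minimal sketch of that computation, assuming the tensor has already been fitted per voxel; it is not the authors' slice-wise artifact-detection pipeline.

```python
import numpy as np

def fractional_anisotropy(eigenvalues):
    """Compute FA from the three eigenvalues of a fitted diffusion tensor.

    `eigenvalues` has shape (..., 3); the trailing axis holds lambda_1..lambda_3
    for each voxel.  This is the standard FA formula, shown only to illustrate
    the map values compared in the study.
    """
    ev = np.asarray(eigenvalues, dtype=float)
    mean_diffusivity = ev.mean(axis=-1, keepdims=True)
    numerator = np.sqrt(1.5 * np.sum((ev - mean_diffusivity) ** 2, axis=-1))
    denominator = np.sqrt(np.sum(ev ** 2, axis=-1))
    return np.divide(numerator, denominator,
                     out=np.zeros_like(numerator),
                     where=denominator > 0)

# Example: a nearly isotropic voxel versus a strongly anisotropic one.
print(fractional_anisotropy([[1.0, 0.95, 0.9], [1.7, 0.2, 0.2]]))
```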

    Rapid analyses of dry matter content and carotenoids in fresh cassava roots using a portable visible and near infrared spectrometer (Vis/NIRS)

    Portable Vis/NIRS instruments are flexible tools for fast and unbiased analyses of constituents with minimal sample preparation. This study developed calibration models for dry matter content (DMC) and carotenoids in fresh cassava roots using a portable Vis/NIRS system. We examined the effects of eight data pre-treatment combinations on the calibration models and assessed calibrations on processed and intact root samples. We also compared Vis/NIRS-derived DMC to other phenotyping methods. The results showed that the combination of standard normal variate and de-trend (SNVD) with a first derivative calculated on two data points and no smoothing (SNVD+1111) was adequate for a robust model. Calibration performance was higher with processed than with intact root samples for all traits, although intact-root models for some traits, especially total carotenoid content (TCC) (R2c = 96%, R2cv = 90%, RPD = 3.6 and SECV = 0.63), were sufficient for screening purposes. Using three key quality traits as templates, we developed models with processed fresh root samples. Robust calibrations were established for DMC (R2c = 99%, R2cv = 95%, RPD = 4.5 and SECV = 0.9), TCC (R2c = 99%, R2cv = 91%, RPD = 3.5 and SECV = 2.1) and all-trans β-carotene (ATBC) (R2c = 98%, R2cv = 91%, RPD = 3.5 and SECV = 1.6). Coefficients of determination on an independent validation set (R2p) for these traits were also satisfactory: ATBC (91%), TCC (88%) and DMC (80%). Compared with the other methods, Vis/NIRS-derived DMC from both intact and processed roots showed a very high correlation (>0.95) with the reference oven-drying method, but a much lower correlation with the specific gravity method (0.49). There was equally a high correlation (0.94) between the intact- and processed-root Vis/NIRS DMC. The portable Vis/NIRS system could therefore be employed for rapid analysis of DMC and quantification of carotenoids in cassava for nutritional and breeding purposes
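    As a rough illustration of the workflow described above (spectral pre-treatment followed by a calibration model evaluated with SECV and RPD), the sketch below chains SNV with a Savitzky-Golay first derivative and a cross-validated PLS regression. The exact derivative setting ("SNVD+1111"), the number of PLS components, and the data arrays are placeholders, not the paper's configuration.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def snv(spectra):
    """Standard normal variate: centre and scale each spectrum individually."""
    spectra = np.asarray(spectra, dtype=float)
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

def pretreat(spectra):
    """SNV followed by a first derivative (Savitzky-Golay used here as a
    stand-in for the paper's exact derivative settings)."""
    return savgol_filter(snv(spectra), window_length=5, polyorder=2, deriv=1, axis=1)

# Placeholder data: X would hold Vis/NIRS spectra, y the oven-drying DMC reference.
rng = np.random.default_rng(0)
X = rng.random((60, 200))
y = 30 + 10 * rng.random(60)

model = PLSRegression(n_components=8)          # component count is illustrative
y_cv = cross_val_predict(model, pretreat(X), y, cv=10).ravel()
secv = np.sqrt(np.mean((y - y_cv) ** 2))       # standard error of cross-validation
rpd = y.std(ddof=1) / secv                     # RPD as reported in the abstract
print(f"SECV = {secv:.2f}, RPD = {rpd:.2f}")
```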

    A statistical method for estimating activity uncertainty parameters to improve project forecasting

    Just like any physical system, projects have entropy that must be managed by spending energy. The entropy is the project’s tendency to move to a state of disorder (schedule delays, cost overruns), and the energy process is an inherent part of any project management methodology. In order to manage the inherent uncertainty of these projects, accurate estimates (for durations, costs, resources, …) are crucial to make informed decisions. Without these estimates, managers have to fall back on their own intuition and experience, which are undoubtedly crucial for making decisions, but are often subject to biases and hard to quantify. This paper builds further on two published calibration methods that aim to extract data from real projects and calibrate them to better estimate the parameters for the probability distributions of activity durations. Both methods rely on the lognormal distribution model to estimate uncertainty in activity durations and perform a sequence of statistical hypothesis tests that take the possible presence of two human biases into account. Based on these two existing methods, a new so-called statistical partitioning heuristic is presented that integrates the best elements of the two methods to further improve the accuracy of estimating the distribution of activity duration uncertainty. A computational experiment has been carried out on an empirical database of 83 projects. The experiment shows that the new statistical partitioning method performs at least as well as, and often better than, the two existing calibration methods. The improvement will allow a better quantification of the activity duration uncertainty, which will eventually lead to a better prediction of the project schedule and more realistic expectations about the project outcomes. Consequently, the project manager will be able to better cope with the inherent uncertainty (entropy) of projects with a minimum managerial effort (energy)
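    To make the lognormal modelling step concrete, the sketch below fits a lognormal distribution to the ratio of actual to planned activity durations and checks the fit with a normality test on the log-ratios. It is only a minimal illustration with made-up durations; it does not reproduce the paper's statistical partitioning heuristic or its bias tests.

```python
import numpy as np
from scipy import stats

# Hypothetical per-activity data: planned (baseline) and actual durations in days.
planned = np.array([5, 8, 3, 10, 6, 4, 12, 7, 9, 5], dtype=float)
actual = np.array([6, 9, 3, 14, 7, 4, 15, 8, 10, 6], dtype=float)

# The calibration methods referenced in the abstract model activity duration
# uncertainty with a lognormal distribution; here we simply fit one to the
# actual/planned ratios and test whether the log-ratios look normal.
log_ratios = np.log(actual / planned)
mu, sigma = log_ratios.mean(), log_ratios.std(ddof=1)
statistic, p_value = stats.shapiro(log_ratios)
print(f"mu = {mu:.3f}, sigma = {sigma:.3f}, Shapiro-Wilk p = {p_value:.3f}")

# The fitted distribution can then be sampled to simulate durations for forecasting.
rng = np.random.default_rng(1)
simulated = planned[:, None] * np.exp(rng.normal(mu, sigma, (len(planned), 1000)))
print("Simulated mean durations:", simulated.mean(axis=1).round(2))
```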

    Photo-z Performance for Precision Cosmology

    Current and future weak lensing surveys will rely on photometrically estimated redshifts of very large numbers of galaxies. In this paper, we address several different aspects of the demanding photo-z performance that will be required for future experiments, such as the proposed ESA Euclid mission. It is first shown that the proposed all-sky near-infrared photometry from Euclid, in combination with anticipated ground-based photometry (e.g. PanStarrs-2 or DES), should yield the required precision in individual photo-z of sigma(z) < 0.05(1+z) at I_AB < 24.5. Simple a priori rejection schemes based on the photometry alone can be tuned to recognise objects with wildly discrepant photo-z and to reduce the outlier fraction to < 0.25% with only modest loss of otherwise usable objects. Turning to the more challenging problem of determining the mean redshift of a set of galaxies to a precision of 0.002(1+z), we argue that, for many different reasons, this is best accomplished by relying on the photo-z themselves rather than on the direct measurement of the mean redshift from spectroscopic redshifts of a representative subset of the galaxies. A simple adaptive scheme based on the statistical properties of the photo-z likelihood functions is shown to meet this stringent systematic requirement. We also examine the effect of an imprecise correction for Galactic extinction and the effects of contamination by fainter overlapping objects in photo-z determination. The overall conclusion of this work is that the acquisition of photometrically estimated redshifts with the precision required for Euclid, or other similar experiments, will be challenging but possible. (abridged) Comment: 16 pages, 11 figures; submitted to MNRAS
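    The two headline requirements quoted above (per-galaxy scatter below 0.05(1+z) and a mean-redshift error of 0.002(1+z)) are easy to evaluate on a catalogue of photo-z paired with reference redshifts. The sketch below computes the usual normalised scatter, mean bias, and outlier fraction on a toy catalogue; the 0.15 outlier cut and the Gaussian error model are common conventions assumed here, not necessarily the paper's definitions.

```python
import numpy as np

def photoz_metrics(z_true, z_phot, outlier_threshold=0.15):
    """Scatter, mean bias and outlier fraction of photometric redshifts,
    all normalised by (1 + z_true)."""
    dz = (np.asarray(z_phot) - np.asarray(z_true)) / (1.0 + np.asarray(z_true))
    sigma = 1.4826 * np.median(np.abs(dz - np.median(dz)))  # robust (NMAD) scatter
    bias = dz.mean()                                         # error on the mean redshift
    outlier_fraction = np.mean(np.abs(dz) > outlier_threshold)
    return sigma, bias, outlier_fraction

# Toy catalogue: true redshifts with Gaussian photo-z errors of 0.04(1+z).
rng = np.random.default_rng(2)
z_true = rng.uniform(0.2, 2.0, 100_000)
z_phot = z_true + 0.04 * (1 + z_true) * rng.standard_normal(z_true.size)

sigma, bias, f_out = photoz_metrics(z_true, z_phot)
print(f"sigma = {sigma:.3f} (1+z), bias = {bias:.4f} (1+z), outliers = {100*f_out:.2f}%")
```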

    2D Reconstruction of Small Intestine's Interior Wall

    Examining and interpreting a large number of wireless endoscopic images from the gastrointestinal tract is a tiresome task for physicians. A practical solution is to automatically construct a two-dimensional representation of the gastrointestinal tract for easy inspection. However, little has been done on wireless endoscopic image stitching, let alone systematic investigation. The proposed wireless endoscopic image stitching method consists of two main steps to improve the accuracy and efficiency of image registration. First, keypoints are extracted by the Principal Component Analysis and Scale Invariant Feature Transform (PCA-SIFT) algorithm and refined with Maximum Likelihood Estimation SAmple Consensus (MLESAC) outlier removal to find the most reliable keypoints. Second, the optimal transformation parameters obtained from the first step are fed to the Normalised Mutual Information (NMI) algorithm as an initial solution. With a modified Marquardt-Levenberg search strategy in a multiscale framework, the NMI can find the optimal transformation parameters in the shortest time. The proposed methodology has been tested on two different datasets - one with real wireless endoscopic images and another with images obtained from Micro-Ball (a new wireless cubic endoscopy system with six image sensors). The results have demonstrated the accuracy and robustness of the proposed methodology both visually and quantitatively. Comment: Journal draft
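    A rough sketch of the first, feature-based registration stage is shown below. OpenCV's SIFT and RANSAC homography estimation stand in for the paper's PCA-SIFT and MLESAC, and the mutual-information refinement stage is not reproduced; the file names are placeholders.

```python
import cv2
import numpy as np

# Feature-based initial registration of two consecutive endoscopic frames.
img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep only distinctive matches (Lowe's ratio test).
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# Robust estimate of the inter-frame transform; RANSAC replaces MLESAC here.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)

# Warp the second frame into the first frame's coordinates and paste.
h, w = img1.shape
canvas = cv2.warpPerspective(img2, np.linalg.inv(H), (2 * w, h))
canvas[:h, :w] = img1
cv2.imwrite("stitched.png", canvas)
```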