
    Longitudinal Voxel-based morphometry with unified segmentation: evaluation on simulated Alzheimer’s disease

    The goal of this work is to evaluate Voxel-Based Morphometry (VBM) and three longitudinally-tailored methods of VBM. We use a cohort of simulated images produced by deforming original scans with a Finite Element Method, guided to emulate Alzheimer-like changes. The simulated images provide realistic data with a known pattern of spatial atrophy, against which VBM’s findings can be meaningfully compared. We believe this is the first evaluation of VBM for which anatomically plausible ‘gold-standard’ results are available. The three longitudinal VBM methods have been implemented within the unified segmentation framework of SPM5; one of them is a newly developed procedure that shows promising potential.
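
    The core of such an evaluation is applying a known, smooth deformation field to a real scan so that the ground-truth atrophy pattern is available by construction. A minimal sketch of that idea in Python follows; the Gaussian radial-contraction field is an illustrative stand-in for the paper's FEM-derived deformations, and all names are mine:

```python
# Sketch: simulate "atrophy" by resampling a scan through a known deformation.
import numpy as np
from scipy.ndimage import map_coordinates

def simulate_atrophy(volume, center, strength=0.05, sigma=15.0):
    """Contract image content toward `center`, mimicking local volume loss."""
    grid = np.indices(volume.shape).astype(float)            # (3, X, Y, Z)
    offset = grid - np.asarray(center, float).reshape(3, 1, 1, 1)
    r2 = (offset ** 2).sum(axis=0)
    # Sampling points are pushed away from the center, so content shrinks inward.
    weight = strength * np.exp(-r2 / (2 * sigma ** 2))
    coords = grid + weight * offset
    return map_coordinates(volume, coords, order=1, mode='nearest')

scan = np.random.rand(64, 64, 64).astype(np.float32)  # placeholder for an MR volume
atrophied = simulate_atrophy(scan, center=(32, 32, 32))
```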

    Evaluation of local and global atrophy measurement techniques with simulated Alzheimer's disease data

    The main goal of this work was to evaluate several well-known methods that provide global (BSI and SIENA) or local (Jacobian integration) estimates of atrophy in brain structures using Magnetic Resonance images. For that purpose, we generated realistic simulated Alzheimer's disease images in which volume changes are modelled with a Finite Element thermoelastic model that mimics the patterns of change obtained from a cohort of 19 real controls and 27 probable Alzheimer's disease patients. SIENA and BSI results correlate very well with the gold-standard data (BSI mean absolute error <0.29%; SIENA <0.44%). Jacobian integration was guided by both fluid and FFD-based registration techniques, and the resulting deformation fields and associated Jacobians were compared, region by region, with the gold-standard ones. The FFD registration technique provided more satisfactory results than the fluid one. Mean absolute errors between the volume changes given by the FFD-based technique and the gold standard were: sulcal CSF <2.49%; lateral ventricles 2.25%; brain <0.36%; hippocampi <0.42%.
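
    Jacobian integration, referenced above, estimates regional volume change by integrating the determinant of the deformation's Jacobian over a region of interest; comparing that estimate with the gold standard yields errors like those quoted. A hedged NumPy sketch (finite-difference Jacobian of a dense displacement field; names are illustrative):

```python
# Sketch: percent volume change of a region via Jacobian integration.
import numpy as np

def jacobian_determinant(disp):
    """disp: dense displacement field, shape (3, X, Y, Z), in voxel units."""
    grads = [np.gradient(disp[i]) for i in range(3)]   # grads[i][j] = d(disp_i)/d(axis_j)
    J = np.empty(disp.shape[1:] + (3, 3))
    for i in range(3):
        for j in range(3):
            J[..., i, j] = (i == j) + grads[i][j]      # identity + displacement gradient
    return np.linalg.det(J)

def region_volume_change(disp, mask):
    """Mean (det J - 1) over a binary region mask, as a percentage."""
    detj = jacobian_determinant(disp)
    return 100.0 * float((detj[mask] - 1.0).mean())

# Usage: disp from a baseline-to-follow-up registration; mask e.g. a hippocampus label.
```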

    Why item response theory should be used for longitudinal questionnaire data analysis in medical research

    Background: Multi-item questionnaires are important instruments for monitoring health in epidemiological longitudinal studies. Sum-scores are mostly used as a summary measure for these multi-item questionnaires. The objective of this study was to show the negative impact of using sum-score-based longitudinal data analysis instead of Item Response Theory (IRT)-based plausible values. Methods: In a simulation study (varying the number of items, the sample size, and the distribution of the outcomes), the parameter estimates resulting from both modeling techniques were compared to the true values. Next, the models were applied to an example dataset from the Amsterdam Growth and Health Longitudinal Study (AGHLS). Results: Using sum-scores leads to overestimation of the within-person (repeated-measurement) variance and underestimation of the between-person variance. Conclusions: We recommend using IRT-based plausible-value techniques for analyzing repeatedly measured multi-item questionnaire data.
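
    One way to probe this distortion: simulate repeated measures from a two-parameter logistic (2PL) IRT model, then compare a naive between/within variance decomposition of the sum-score against the known latent ratio. The toy simulation below uses arbitrary parameters, not the AGHLS design:

```python
# Toy check: 2PL responses from a latent trait with person and occasion parts,
# then a naive between/within variance decomposition of the sum-score.
import numpy as np

rng = np.random.default_rng(0)
n_person, n_time, n_item = 200, 5, 10
a = rng.uniform(0.8, 2.0, n_item)     # item discriminations (arbitrary)
d = rng.normal(0.0, 1.0, n_item)      # item difficulties (arbitrary)

person = rng.normal(0.0, 1.0, n_person)                            # between-person effect
theta = person[:, None] + rng.normal(0.0, 0.5, (n_person, n_time)) # plus occasion noise

p = 1.0 / (1.0 + np.exp(-a * (theta[..., None] - d)))  # (person, time, item)
responses = rng.random(p.shape) < p
sumscore = responses.sum(axis=2).astype(float)

between = sumscore.mean(axis=1).var(ddof=1)   # variance of person means
within = sumscore.var(axis=1, ddof=1).mean()  # mean within-person variance
print(f"sum-score between/within ratio: {between / within:.2f}")
# On the latent scale the true ratio is Var(person)/Var(occasion) = 1.0/0.25 = 4.
```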

    Phenomenological model of diffuse global and regional atrophy using finite-element methods

    The main goal of this work is the generation of ground-truth data for the validation of atrophy measurement techniques, commonly used in the study of neurodegenerative diseases such as dementia. Several techniques have been used to measure atrophy in cross-sectional and longitudinal studies, but it is extremely difficult to compare their performance since they have been applied to different patient populations. Furthermore, assessment of performance based on phantom measurements or simple scaled images overestimates these techniques' ability to capture the complexity of neurodegeneration of the human brain. We propose a method for atrophy simulation in structural magnetic resonance (MR) images based on finite-element methods. The method produces cohorts of brain images with known change that is physically and clinically plausible, providing data for objective evaluation of atrophy measurement techniques. Atrophy is simulated in different tissue compartments or in different neuroanatomical structures with a phenomenological model. This model of diffuse global and regional atrophy is based on volumetric measurements, such as those of the brain or the hippocampus, from patients with known disease, and is guided by clinical knowledge of the relative pathological involvement of regions and tissues. The consequent biomechanical readjustment of structures is modelled using conventional physics-based techniques based on biomechanical tissue properties, simulating plausible tissue deformations with finite-element methods. A thermoelastic model of tissue deformation is employed, controlling the rate of progression of atrophy by means of a set of thermal coefficients, each one corresponding to a different type of tissue. Tissue characterization is performed by meshing a labelled brain atlas, creating a reference volumetric mesh that is introduced to a finite-element solver to create the simulated deformations. Preliminary work on the simulation of acquisition artefacts is also presented. Cross-sectional and …
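
    Schematically, the thermoelastic analogy rests on the standard linear equations below: atrophy enters as an isotropic "thermal" strain whose coefficient differs per tissue class, and the finite-element solver finds the displacement field satisfying static equilibrium (notation mine, not the paper's):

```latex
\nabla \cdot \boldsymbol{\sigma} = \mathbf{0}, \qquad
\boldsymbol{\sigma} = \mathsf{C} : \bigl( \boldsymbol{\varepsilon}(\mathbf{u}) - \alpha_t \,\Delta T\, \mathbf{I} \bigr), \qquad
\boldsymbol{\varepsilon}(\mathbf{u}) = \tfrac{1}{2} \bigl( \nabla \mathbf{u} + \nabla \mathbf{u}^{\top} \bigr)
```

    Here $\alpha_t$ is the thermal coefficient assigned to tissue class $t$, so varying the fictitious temperature change $\Delta T$ controls the simulated rate and tissue specificity of atrophy.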

    Dynamic Bayesian Combination of Multiple Imperfect Classifiers

    Classifier combination methods need to make the best use of the outputs of multiple imperfect classifiers to enable higher-accuracy classification. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision-making strategies, and show that natural groupings are formed. Finally, we present a dynamic Bayesian classifier combination approach and investigate the changes in base classifier performance over time. Comment: 35 pages, 12 figures.
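
    Bayesian classifier combination of this kind builds on the Dawid-Skene generative model: each base classifier has its own confusion matrix, and true labels and confusion matrices are inferred jointly. The paper uses variational Bayes; the sketch below is a simpler EM-style treatment of the same model (all names illustrative):

```python
# Sketch: EM for the Dawid-Skene model underlying Bayesian classifier combination.
# votes[n, k] in {0..n_labels-1}: label assigned by base classifier k to item n.
import numpy as np

def dawid_skene(votes, n_labels, n_iter=50, smoothing=1.0):
    n_items, n_clf = votes.shape
    # Initialise the posterior over true labels from vote frequencies.
    q = np.zeros((n_items, n_labels))
    for l in range(n_labels):
        q[:, l] = (votes == l).sum(axis=1)
    q /= q.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # M-step: per-classifier confusion matrices and class prior (smoothed).
        pi = np.full((n_clf, n_labels, n_labels), smoothing)
        for k in range(n_clf):
            for l in range(n_labels):
                pi[k, :, l] += q[votes[:, k] == l].sum(axis=0)
        pi /= pi.sum(axis=2, keepdims=True)
        rho = q.mean(axis=0)
        # E-step: posterior over each item's true label given all votes.
        log_q = np.tile(np.log(rho), (n_items, 1))
        for k in range(n_clf):
            log_q += np.log(pi[k, :, votes[:, k]]).T
        q = np.exp(log_q - log_q.max(axis=1, keepdims=True))
        q /= q.sum(axis=1, keepdims=True)
    return q, pi
```

    The Bayesian version replaces these point estimates with Dirichlet posteriors over each confusion-matrix row, which is what lets sparse or unreliable voters be handled gracefully.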

    Anomalous diffusion and the first passage time problem

    We study the distribution of the first passage time (FPT) in Lévy-type anomalous diffusion. Using the recently formulated fractional Fokker-Planck equation, we obtain three results. (1) We derive an explicit expression for the FPT distribution in terms of Fox (H-) functions when the diffusion has zero drift. (2) For the nonzero-drift case we obtain an analytical expression for the Laplace transform of the FPT distribution. (3) We express the FPT distribution in terms of a power series for the case of two absorbing barriers. The known results for ordinary diffusion (Brownian motion) are obtained as special cases of our more general results. Comment: 25 pages, 4 figures.
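
    For orientation, the classical limit such results must recover: for ordinary Brownian motion with diffusion constant $D$, zero drift, started a distance $x_0$ from a single absorbing barrier, the FPT density is the Lévy-Smirnov law

```latex
f(t) \;=\; \frac{x_0}{\sqrt{4\pi D t^{3}}}\,
       \exp\!\left(-\frac{x_0^{2}}{4 D t}\right), \qquad t > 0
```

    whose heavy $t^{-3/2}$ tail already makes the mean FPT diverge; the fractional Fokker-Planck treatment generalizes this picture to Lévy-type transport.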

    Radiative Corrections to One-Photon Decays of Hydrogenic Ions

    Radiative corrections to the decay rate of $n=2$ states of hydrogenic ions are calculated. The transitions considered are the M1 decay of the $2s$ state to the ground state and the E1(M2) decays of the $2p_{1/2}$ and $2p_{3/2}$ states to the ground state. The radiative corrections start in order $\alpha(Z\alpha)^2$, but the method used sums all orders of $Z\alpha$. The leading $\alpha(Z\alpha)^2$ correction for the E1 decays is calculated and compared with the exact result. The extension of the calculational method to parity-nonconserving transitions in neutral atoms is discussed. Comment: 22 pages, 2 figures.
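
    As a schematic orientation (my notation, not the paper's), results of this type are usually quoted as a correction factor multiplying the lowest-order rate:

```latex
\Gamma \;=\; \Gamma_{0}\,\bigl[\, 1 + \delta(Z\alpha) \,\bigr], \qquad
\delta(Z\alpha) \;=\; \frac{\alpha}{\pi}\,(Z\alpha)^{2}\, F(Z\alpha)
```

    where $\Gamma_0$ is the lowest-order decay rate; summing all orders of $Z\alpha$ amounts to evaluating $F(Z\alpha)$ numerically rather than truncating its expansion at the leading term.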

    A comparison of hirudin with heparin in the prevention of restenosis after coronary angioplasty

    __Background:__ The likelihood of restenosis is a major limitation of coronary angioplasty. We studied whether hirudin, a highly selective inhibitor of thrombin with irreversible effects, would prevent restenosis after angioplasty. We compared two regimens of recombinant hirudin with heparin. __Methods:__ We randomly assigned 1141 patients with unstable angina who were scheduled for angioplasty to receive one of three treatments: (1) a bolus dose of 10,000 IU of heparin followed by an intravenous infusion of heparin for 24 hours and subcutaneous placebo twice daily for three days (382 patients), (2) a bolus dose of 40 mg of hirudin followed by an intravenous infusion of hirudin for 24 hours and subcutaneous placebo twice daily for three days (381 patients), or (3) the same hirudin regimen except that 40 mg of hirudin was given subcutaneously instead of placebo twice daily for three days (378 patients). The primary end point was event-free survival at seven months. Other end points were early cardiac events (within 96 hours), bleeding and other complications of the study treatment, and angiographic measurements of coronary diameter at six months of follow-up. __Results:__ At seven months, event-free survival was 67.3 percent in the group receiving heparin, 63.5 percent in the group receiving intravenous hirudin, and 68.0 percent in the group receiving both intravenous and subcutaneous hirudin (P=0.61). However, the administration of hirudin was associated with a significant reduction in early cardiac events, which occurred in 11.0, 7.9, and 5.6 percent of patients in the respective groups (combined relative risk with hirudin, 0.61; 95 percent confidence interval, 0.41 to 0.90; P=0.023). The mean minimal luminal diameters in the respective groups on follow-up angiography at six months were 1.54, 1.47, and 1.56 mm. __Conclusions:__ Although significantly fewer early cardiac events occurred with hirudin than with heparin, hirudin had no apparent benefit with longer-term follow-up.
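
    The reported combined relative risk can be reproduced from the abstract's own numbers. A quick check in Python (event counts are reconstructed from the quoted percentages and group sizes, so they are approximate):

```python
# Quick check: combined relative risk of early cardiac events (hirudin vs. heparin).
import math

n_hep, n_iv, n_ivsc = 382, 381, 378
events_hep = 0.110 * n_hep                   # ~42 events in the heparin group
events_hir = 0.079 * n_iv + 0.056 * n_ivsc   # ~51 events across both hirudin groups
n_hir = n_iv + n_ivsc

rr = (events_hir / n_hir) / (events_hep / n_hep)
se = math.sqrt(1 / events_hir - 1 / n_hir + 1 / events_hep - 1 / n_hep)
low, high = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
print(f"RR = {rr:.2f} (95% CI {low:.2f} to {high:.2f})")  # ~0.61 (0.42 to 0.91)
```

    The result matches the published 0.61 (0.41 to 0.90) up to rounding of the reconstructed counts.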

    Observations and modeling of wave-supported sediment gravity flows on the Po prodelta and comparison to prior observations from the Eel shelf

    Author Posting. © The Authors, 2006. This is the author's version of the work. It is posted here by permission of Elsevier B.V. for personal use, not for redistribution. The definitive version was published in Continental Shelf Research 27 (2007): 375-399, doi:10.1016/j.csr.2005.07.008. A mooring and tripod array was deployed from the fall of 2002 through the spring of 2003 on the Po prodelta to measure sediment transport processes associated with sediment delivered from the Po River. Observations on the prodelta revealed wave-supported gravity flows of high-concentration mud suspensions that are dynamically and kinematically similar to those observed on the Eel shelf (Traykovski et al., 2000). Owing to the dynamic similarity between the two sites, a simple one-dimensional across-shelf model with the appropriate bottom boundary condition was used to examine fluxes associated with this transport mechanism at both locations. To calculate the sediment concentrations associated with wave-dominated and wave-current resuspension, a bottom boundary condition using a reference concentration was combined with an “active layer” formulation to limit the amount of sediment in suspension. Whereas the wave-supported gravity flow mechanism dominates the transport on the Eel shelf, on the Po prodelta the flux due to this mechanism is equal in magnitude to the transport due to wave resuspension and wind-forced mean currents in the cross-shore direction. Southward transport due to wave resuspension and wind-forced mean currents moves an order of magnitude more sediment alongshore than the downslope flux associated with wave-supported gravity flows. This work was funded by the U.S. Office of Naval Research under grant number N00014-02-10378, under the direction of program manager Tom Drake.
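
    The one-dimensional balance underlying such models equates the downslope gravitational forcing on the wave-suspended mud to bottom friction from the combined wave-current flow; a schematic Chezy-type form of that balance (notation mine, not the paper's) is

```latex
g \sin\beta \int_{0}^{h} \frac{\rho_s - \rho_w}{\rho_w}\, C(z)\, dz
\;=\; c_D\, \lvert \mathbf{u}_{wc} \rvert\, u_g
```

    where $\beta$ is the seabed slope, $C(z)$ the sediment volume concentration, $c_D$ a drag coefficient, $\mathbf{u}_{wc}$ the combined wave-current velocity, and $u_g$ the downslope gravity-flow velocity. Wave stirring keeps the depth-integrated concentration high enough for such flows to move on very low slopes.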