
    Optimization-based interactive segmentation interface for multiregion problems.

    Interactive segmentation is attracting increasing interest in the medical imaging community because it combines the positive aspects of both manual and automated segmentation. However, general-purpose tools have been lacking when it comes to segmenting multiple regions simultaneously with a high degree of coupling between groups of labels. Hierarchical max-flow segmentation has taken advantage of this coupling for individual applications, but until recently these algorithms were constrained to a particular hierarchy and could not be considered general-purpose. In a generalized form, the hierarchy for any given segmentation problem is specified at run-time, allowing different hierarchies to be explored quickly. We present an interactive segmentation interface that uses generalized hierarchical max-flow for optimization-based multiregion segmentation guided by user-defined seeds. Cardiac and neonatal brain segmentation are given as examples of its generality.
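The seeded-segmentation machinery described here rests on max-flow/min-cut optimization. The following is a minimal sketch of the underlying idea on a toy 1D binary problem, not the paper's generalized hierarchical multiregion solver; the function names, the intensity-based data terms, and the smoothness weight are all illustrative assumptions.

```python
from collections import deque

def min_cut_source_side(cap, s, t):
    """Edmonds-Karp max-flow; returns the source side of the min cut.

    cap: dict {(u, v): capacity}, modified in place as the residual graph.
    """
    for (u, v) in list(cap):
        cap.setdefault((v, u), 0.0)          # reverse residual edges
    adj = {}
    for (u, v) in cap:
        adj.setdefault(u, set()).add(v)

    def bfs_path():
        parent = {s: None}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj.get(u, ()):
                if v not in parent and cap[(u, v)] > 1e-9:
                    parent[v] = u
                    if v == t:
                        return parent
                    q.append(v)
        return None

    while (parent := bfs_path()) is not None:
        f, v = float("inf"), t
        while parent[v] is not None:          # bottleneck capacity
            f, v = min(f, cap[(parent[v], v)]), parent[v]
        v = t
        while parent[v] is not None:          # augment along the path
            u = parent[v]
            cap[(u, v)] -= f
            cap[(v, u)] += f
            v = u
    seen, q = {s}, deque([s])
    while q:                                  # residual reachability from s
        u = q.popleft()
        for v in adj.get(u, ()):
            if v not in seen and cap[(u, v)] > 1e-9:
                seen.add(v)
                q.append(v)
    return seen

def seeded_segmentation(intensity, fg_seeds, bg_seeds, smooth=2.0):
    """Binary seeded graph-cut on a 1D signal with values in [0, 1]."""
    n = len(intensity)
    s, t, BIG = n, n + 1, 1e9
    cap = {}
    for i, v in enumerate(intensity):
        # Terminal links: seeds are hard constraints, others data-driven.
        cap[(s, i)] = BIG if i in fg_seeds else v
        cap[(i, t)] = BIG if i in bg_seeds else 1.0 - v
        if i:                                 # pairwise smoothness links
            cap[(i - 1, i)] = smooth
            cap[(i, i - 1)] = smooth
    side = min_cut_source_side(cap, s, t)
    return [int(i in side) for i in range(n)]

signal = [0.9, 0.8, 0.85, 0.2, 0.1, 0.15]     # bright left, dark right
print(seeded_segmentation(signal, fg_seeds={0}, bg_seeds={5}))  # → [1, 1, 1, 0, 0, 0]
```

The generalized hierarchical formulation extends this binary cut to coupled groups of labels, with the label hierarchy supplied at run-time rather than fixed in the solver.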

    Robust Cardiac Motion Estimation using Ultrafast Ultrasound Data: A Low-Rank-Topology-Preserving Approach

    Cardiac motion estimation is an important diagnostic tool for detecting heart disease, and it has been explored with modalities such as MRI and conventional ultrasound (US) sequences. US cardiac motion estimation still presents challenges because of complex motion patterns and the presence of noise. In this work, we propose a novel approach to estimating cardiac motion using ultrafast ultrasound data. Our solution is based on a variational formulation of the L2-regularized class. The displacement is represented by a lattice of B-splines, and robustness is ensured by applying a maximum-likelihood-type estimator. While this is an important part of our solution, the main highlight of this paper is the combination of a low-rank data representation with topology preservation. The low-rank representation (achieved by keeping the k dominant singular values of a Casorati matrix assembled from the data sequence) speeds up the global solution and reduces noise. Topology preservation (achieved by monitoring the Jacobian determinant) rules out distortions while carefully controlling the size of allowed expansions and contractions. Our variational approach is evaluated on a realistic dataset as well as a simulated one. We demonstrate through careful numerical experiments how the proposed solution handles complex deformations, and show that the low-rank preprocessing speeds up convergence of the variational problem while maintaining the accuracy of the solution. Beyond cardiac motion estimation, our approach is promising for the analysis of other organs that experience motion. Comment: 15 pages, 10 figures, Physics in Medicine and Biology, 201
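The low-rank preprocessing step can be sketched directly: flatten each frame into a column of a Casorati matrix and keep only the k dominant singular values. This is a minimal illustration of that one step under assumed toy data, not the paper's full variational solver or its topology-preservation machinery.

```python
import numpy as np

def low_rank_denoise(frames, k):
    """Truncated-SVD denoising of an image sequence via its Casorati matrix.

    frames: array of shape (T, H, W). Each frame becomes one column of the
    Casorati matrix; a rank-k truncation of its SVD suppresses noise that
    is spread across the discarded singular directions.
    """
    T, H, W = frames.shape
    C = frames.reshape(T, H * W).T           # Casorati matrix, shape (H*W, T)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    Ck = (U[:, :k] * s[:k]) @ Vt[:k]         # rank-k reconstruction
    return Ck.T.reshape(T, H, W)

# Toy sequence: a rank-1 temporal signal plus additive noise.
rng = np.random.default_rng(0)
clean = np.stack([np.full((8, 8), np.sin(t / 3.0)) for t in range(20)])
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
denoised = low_rank_denoise(noisy, k=2)
print(np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))  # True
```

In the paper's pipeline, the denoised low-rank sequence then feeds the B-spline variational registration, whose Jacobian determinant is monitored to preserve topology.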

    Four-dimensional tomographic reconstruction by time domain decomposition

    Since the beginnings of tomography, reconstruction has required that the sample remain static during the acquisition of one tomographic rotation. We derived and successfully implemented a tomographic reconstruction method that relaxes this decades-old requirement. In the presented method, dynamic tomographic data sets are decomposed in the temporal domain using basis functions, with an L1 regularization technique in which the penalty is applied to the spatial and temporal derivatives. We implemented the iterative algorithm for solving the regularization problem on modern GPU systems to demonstrate its practical use.
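The core idea of expanding a time-varying signal over temporal basis functions with an L1 penalty can be sketched with a proximal-gradient (ISTA) iteration. This toy fits one voxel's time series; the basis choice, penalty placement (plain L1 on the coefficients rather than on derivatives, for brevity), and all names are assumptions, not the paper's GPU solver.

```python
import numpy as np

def ista_l1(Phi, y, lam, iters=500):
    """Solve min_c 0.5*||Phi c - y||^2 + lam*||c||_1 by ISTA.

    Phi: columns are temporal basis functions sampled at the frame times;
    y: one voxel's time series. The L1 penalty keeps only a sparse set of
    temporal modes active.
    """
    L = np.linalg.norm(Phi, 2) ** 2                    # Lipschitz constant
    c = np.zeros(Phi.shape[1])
    for _ in range(iters):
        g = Phi.T @ (Phi @ c - y)                      # gradient of data term
        z = c - g / L
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return c

t = np.linspace(0, 1, 64)
Phi = np.stack([np.cos(2 * np.pi * k * t) for k in range(8)], axis=1)
y = 2.0 * Phi[:, 1] + 0.5 * Phi[:, 4]                  # two active temporal modes
c = ista_l1(Phi, y, lam=0.5)
print(np.nonzero(np.abs(c) > 0.1)[0])                  # recovers modes 1 and 4
```

In the full 4D problem the same regularized objective couples all voxels and also penalizes spatial derivatives, which is why the authors solve it iteratively on GPUs.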

    Uncertainty Quantification and Reduction in Cardiac Electrophysiological Imaging

    Cardiac electrophysiological (EP) imaging involves solving an inverse problem that infers cardiac electrical activity from body-surface electrocardiography data on a physical domain defined by the body torso. To avoid unreasonable solutions that may nevertheless fit the data, this inference is often guided by data-independent prior assumptions about properties of the cardiac electrical sources as well as the physical domain. However, these prior assumptions may involve errors and uncertainties that affect the inference accuracy. For example, common prior assumptions on the source properties, such as fixed spatial and/or temporal smoothness or sparseness, may not match the true source properties under different conditions, leading to uncertainties in the inference. Furthermore, prior assumptions on the physical domain, such as the anatomy and tissue conductivity of the organs in the thorax model, are approximations that introduce errors into the inference. To determine the robustness of EP imaging systems for future clinical practice, it is important to identify these errors and uncertainties and assess their impact on the solution. This dissertation focuses on quantifying and reducing the impact on the EP imaging solution of uncertainties in the prior models of cardiac source properties and in the anatomical model. To assess the effect of fixed prior assumptions about cardiac source properties, we propose a novel yet simple Lp-norm regularization method for volumetric cardiac EP imaging. This study demonstrates the necessity of an adaptive prior model (rather than a fixed model) for constraining the complex, spatiotemporally changing properties of the cardiac sources.
We then propose a multiple-model Bayesian approach to cardiac EP imaging that employs a continuous combination of prior models, each reflecting a specific spatial property of volumetric sources. The 3D source estimate is then obtained as a weighted combination of the solutions across all models. By including a continuous combination of prior models, the proposed method reduces the chance of a mismatch between the prior models and the true source properties, which in turn enhances the robustness of the EP imaging solution. To quantify the impact of anatomical modeling uncertainties on the EP imaging solution, we propose a systematic statistical framework. Founded on statistical shape modeling and the unscented transform, our method quantifies anatomical modeling uncertainties and establishes their relation to the EP imaging solution. Applied to anatomical models generated from different image resolutions and different segmentations, it reports the robustness of the EP imaging solution to these variations in anatomical shape detail. We then propose a simplified anatomical model of the heart that incorporates only certain subject-specific anatomical parameters while discarding local shape details. Requiring fewer resources and less processing for successful EP imaging, this simplified model provides a simple, clinically compatible anatomical modeling workflow for EP imaging systems. The components of the proposed methods are validated through a comprehensive set of synthetic and real-data experiments covering typical pathological conditions and diagnostic procedures, such as myocardial infarction and pacing. Overall, the methods presented in this dissertation for quantifying and reducing uncertainty in cardiac EP imaging enhance its robustness, helping to close the gap between EP imaging in research and its clinical application.
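The unscented transform mentioned above propagates a Gaussian description of shape-parameter uncertainty through a nonlinear map using deterministic sigma points. A minimal generic sketch follows; the function names are invented here, and the linear map in the example is only a stand-in for the dissertation's EP forward model.

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear map f.

    Builds 2n+1 sigma points from the matrix square root of cov, pushes
    each through f, and recovers the output mean and covariance from the
    weighted transformed points.
    """
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)          # scaled square root
    sigma = [mean] \
        + [mean + S[:, i] for i in range(n)] \
        + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    ys = np.array([f(p) for p in sigma])
    y_mean = w @ ys
    d = ys - y_mean
    y_cov = (w[:, None] * d).T @ d
    return y_mean, y_cov

# Sanity check: for a linear map the transform is exact.
A = np.array([[1.0, 0.5], [0.0, 2.0]])
m = np.array([1.0, -1.0])
C = np.array([[0.2, 0.05], [0.05, 0.1]])
ym, yc = unscented_transform(m, C, lambda x: A @ x)
print(np.allclose(ym, A @ m), np.allclose(yc, A @ C @ A.T))  # True True
```

In the dissertation's setting, the input Gaussian would come from a statistical shape model of the anatomy and f would map shape parameters to the EP imaging solution, giving the solution's sensitivity to anatomical uncertainty.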

    State of the art: iterative CT reconstruction techniques

    Owing to recent advances in computing power, iterative reconstruction (IR) algorithms have become a clinically viable option in computed tomographic (CT) imaging. Substantial evidence is accumulating about the advantages of IR algorithms over established analytical methods, such as filtered back projection. IR improves image quality through cyclic image processing. Although all available solutions share the common mechanisms of artifact reduction and/or potential for radiation dose savings, chiefly through image noise suppression, the magnitude of these effects depends on the specific IR algorithm. In the first section of this contribution, the technical bases of IR are briefly reviewed and the currently available algorithms released by the major CT manufacturers are described. In the second part, the current status of their clinical implementation is surveyed. Regardless of the applied IR algorithm, the available evidence attests to the substantial potential of IR algorithms for overcoming traditional limitations in CT imaging.
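The "cyclic image processing" at the heart of IR can be illustrated with one classical scheme, SIRT: forward-project the current image estimate, compare with the measured data, and back-project the normalized discrepancy. This is a generic textbook sketch on a toy linear system, not any vendor's algorithm; the matrix A stands in for a CT system matrix.

```python
import numpy as np

def sirt(A, b, iters=200):
    """Simultaneous iterative reconstruction technique on A x = b.

    Each cycle: residual = b - A x (measured minus forward projection),
    then a row/column-normalized back-projection updates the image x.
    """
    R = 1.0 / A.sum(axis=1)                  # inverse row sums (per ray)
    C = 1.0 / A.sum(axis=0)                  # inverse column sums (per voxel)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + C * (A.T @ (R * (b - A @ x)))
    return x

# Toy "scanner": 4 rays through a 3-voxel object, consistent data.
A = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 1.]])
x_true = np.array([1.0, 2.0, 3.0])
x = sirt(A, A @ x_true)
print(np.round(x, 4))  # → [1. 2. 3.]
```

Commercial IR products add statistical noise models, regularization, and system-optics modeling on top of this basic correction cycle, which is where the vendor-specific differences in dose savings and artifact behavior arise.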

    Lp-Norm Regularization in Volumetric Imaging of Cardiac Current Sources

    Advances in computer vision have substantially improved our ability to analyze the structure and mechanics of the heart. In comparison, our ability to observe and analyze cardiac electrical activity remains much more limited. Progress in computationally reconstructing cardiac current sources from noninvasive voltage data sensed on the body surface has been hindered by the ill-posedness of the reconstruction problem and the lack of a unique solution. Common L2- and L1-norm regularizations tend to produce a solution that is either too diffuse or too scattered to reflect the complex spatial structure of the current source distribution in the heart. In this work, we propose a general regularization with an Lp-norm constraint to bridge the gap between an overly smeared and an overly focal solution in cardiac source reconstruction. In a set of phantom experiments, we demonstrate the superiority of the proposed Lp-norm method over its L1 and L2 counterparts in imaging cardiac current sources of increasing extent. Through computer-simulated and real-data experiments, we further demonstrate the feasibility of the proposed method in imaging the complex structure of the excitation wavefront, as well as current sources distributed along the post-infarction scar border. This ability to preserve the spatial structure of the source distribution is important for revealing potential disruptions to normal heart excitation.
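A standard way to handle an Lp penalty between the L1 and L2 extremes is iteratively reweighted least squares (IRLS): each step replaces the Lp term with a weighted L2 term and solves a linear system. The sketch below is a generic IRLS illustration under assumed toy data (identity forward operator), not the paper's solver; the parameter values and function name are invented.

```python
import numpy as np

def lp_irls(A, b, lam=0.1, p=1.5, iters=50, eps=1e-8):
    """min_x ||A x - b||^2 + lam * sum_i |x_i|^p via IRLS.

    Each iteration approximates |x_i|^p by w_i * x_i^2 with weights
    w_i = (x_i^2 + eps)^(p/2 - 1), then solves the resulting
    Tikhonov-like normal equations. With 1 < p < 2 the solution
    interpolates between the focal L1 and diffuse L2 behavior.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]     # unregularized start
    for _ in range(iters):
        w = (x ** 2 + eps) ** (p / 2 - 1)        # IRLS weights for |x|^p
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
    return x

# Toy example: identity forward operator, so the effect of the penalty
# on each coefficient is easy to see (shrinkage toward zero).
A = np.eye(3)
b = np.array([1.0, 0.1, 0.0])
x = lp_irls(A, b, lam=0.1, p=1.5)
print(np.round(x, 3))
```

Large coefficients are shrunk only mildly while small ones are pushed strongly toward zero, which is the behavior the abstract exploits to keep extended sources coherent without smearing them.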