    Recovering 3D Shape with Absolute Size from Endoscope Images Using RBF Neural Network

    In medical diagnosis, the status of a polyp is judged from its size and 3D shape as seen in a medical endoscope image. Because doctors currently make this judgment empirically from the 2D image, more accurate 3D shape recovery from the endoscope image is in demand to support diagnosis. The VBW (Vogel-Breuß-Weickert) model recovers 3D shape quickly under point-light-source illumination and perspective projection, but it recovers only the relative shape; the absolute size cannot be obtained. Here, a shape-modification step is introduced to recover the exact shape from the VBW result. A radial basis function neural network (RBF-NN) learns the mapping between input and output: the input is the surface gradient parameters produced by the VBW model for a generated sphere, and the output is the true gradient parameters of that sphere. The learned mapping corrects the gradients, and the depth is then recovered from the modified gradient parameters. The performance of the proposed approach is confirmed via computer simulation and real experiments.
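
    A minimal sketch of the gradient-correction idea (not the authors' implementation: RBF kernel ridge regression stands in for the RBF-NN, and the VBW output is mocked as a scaled, noisy copy of the true gradients):

```python
# Sketch only: learn a mapping from VBW-estimated surface gradients (p, q) to
# ground-truth gradients on a synthetic sphere, using RBF kernel ridge
# regression as a stand-in for an RBF neural network.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Synthetic sphere of radius R: true gradients are p = -x/z, q = -y/z.
R = 10.0
x, y = rng.uniform(-5, 5, (2, 500))
z = np.sqrt(R**2 - x**2 - y**2)
true_pq = np.column_stack([-x / z, -y / z])

# Mock VBW output: relative-scale, noisy version of the true gradients.
vbw_pq = 0.7 * true_pq + rng.normal(0, 0.02, true_pq.shape)

# Fit the correction mapping: VBW gradients -> true gradients.
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1.0).fit(vbw_pq, true_pq)
corrected_pq = model.predict(vbw_pq)

# Depth would then be re-integrated from the corrected gradient field.
print("RMS error before:", np.sqrt(np.mean((vbw_pq - true_pq) ** 2)))
print("RMS error after: ", np.sqrt(np.mean((corrected_pq - true_pq) ** 2)))
```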

    Multidimensional image analysis of cardiac function in MRI

    Cardiac morphology is a key indicator of cardiac health. Important metrics currently in clinical use are left-ventricle ejection fraction, cardiac muscle (myocardium) mass, myocardium thickness, and myocardium thickening over the cardiac cycle. Advances in imaging technologies have led to an increase in temporal and spatial resolution, and this increase in data makes analysis by medical practitioners laborious. In this thesis, measurement of left-ventricle function is achieved by developing novel methods for the automatic segmentation of the left-ventricle blood pool and the left-ventricle myocardium boundaries. A preliminary challenge in this task is the removal of noise from Magnetic Resonance Imaging (MRI) data, which is addressed using advanced data-filtering procedures. Two mechanisms for left-ventricle segmentation are employed. Firstly, segmentation of the left-ventricle blood pool for the measurement of ejection fraction is undertaken in the signal-intensity domain. Utilising the high discrimination between blood and tissue, a novel methodology based on a statistical partitioning method successfully localises and segments the blood pool of the left ventricle. From this initialisation, the outer wall (epicardium) of the left ventricle is estimated using gradient information and prior knowledge. Secondly, a more involved method for extracting the myocardium of the left ventricle is developed that performs better in higher dimensions. Spatial information is incorporated into the segmentation by employing a gradient-based boundary evolution. A level-set scheme is implemented, and a novel formulation for the extraction of the cardiac muscle is introduced. Two surfaces, representing the inner and outer boundaries of the left ventricle, are simultaneously evolved using a coupling function and supervised with a probabilistic model built from expertly assisted manual segmentations.
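
    The thesis' statistical partitioning method is not specified in this abstract; as an illustrative sketch, a two-component Gaussian mixture over signal intensities can separate the hyperintense blood pool from darker tissue:

```python
# Illustrative sketch only (not the thesis' method): partition a short-axis
# cine MRI slice into two intensity classes and keep the brighter one, since
# blood is hyperintense relative to myocardium in cine MRI.
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_blood_pool(slice_2d: np.ndarray) -> np.ndarray:
    """Return a binary mask of the brighter intensity class."""
    intensities = slice_2d.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(intensities)
    labels = gmm.predict(intensities).reshape(slice_2d.shape)
    bright_class = int(np.argmax(gmm.means_.ravel()))
    return labels == bright_class

# Ejection fraction then follows from the end-diastolic and end-systolic
# blood-pool volumes: EF = (EDV - ESV) / EDV.
```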

    Simulation of pore-scale flow using finite element-methods

    I present a new finite element (FE) simulation method for pore-scale flow. Within the pore space, I solve a simplified form of the incompressible Navier-Stokes equations, yielding the velocity field in a two-step solution approach: first, Poisson's equation is solved with homogeneous boundary conditions; then the pore pressure is computed and the velocity field is obtained under no-slip conditions at the grain boundaries. From the computed velocity field I estimate the effective permeability of porous media samples characterized by thin-section micrographs, micro-CT scans and synthetically generated grain packings. This two-step process is much simpler than solving the full Navier-Stokes equations and therefore makes it possible to study pore geometries with hundreds of thousands of pores at a much lower computational cost. My numerical model is verified against an analytical solution and validated on samples whose permeabilities and porosities had been measured in laboratory experiments (Akanji and Matthai, 2010). Comparisons were also made with a Stokes solver and with published experimental, approximate and exact permeability data. Starting from numerically constructed synthetic grain packings, I also investigated the extent to which the details of the pore microstructure affect hydraulic permeability (Garcia et al., 2009), and then estimated the hydraulic anisotropy of unconsolidated granular packings. With the future aim of simulating multiphase flow within the pore space, I also compute pore radii and derive capillary pressures from the Young-Laplace equation (Akanji and Matthai, 2010).
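
    A sketch of the permeability post-processing step implied here (illustrative names and values, not the thesis code): given the mean superficial velocity q through the sample, Darcy's law gives k = q·μ·L/ΔP.

```python
# Illustrative post-processing sketch: effective permeability from Darcy's law,
# k = q_mean * mu * L / dP, where q_mean is the mean superficial velocity
# through the sample (grain nodes counted as zero velocity).
import numpy as np

def effective_permeability(vx, mu, dP, L):
    """Permeability [m^2] along x for viscosity mu [Pa.s], pressure drop
    dP [Pa] over sample length L [m], and velocity field vx [m/s]."""
    return vx.mean() * mu * L / dP

# Placeholder uniform field: 1e-4 m/s through a 1 mm water-saturated sample
# under a 100 Pa drop gives k = 1e-12 m^2, i.e. about 1 Darcy.
vx = np.full((64, 64, 64), 1e-4)
print(effective_permeability(vx, mu=1e-3, dP=100.0, L=1e-3))
```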

    CONNECTIVITIES OF VARIOUS COMPONENTS IN ORGANIC-RICH SHALE

    The physical properties of shale are fundamentally controlled by its microstructure. The connectivity of the various components in shale is an important property that governs the transport of mass, energy and momentum, and quantifying it is critical to understanding the microstructure of shales. Scanning electron microscope (SEM) imaging is a popular technique for capturing the microstructure of materials. Before the connectivity of the components captured in an SEM image can be quantified, the different components need to be identified and segmented. In the first part of this study, an automated SEM-image segmentation workflow involving feature extraction followed by machine learning is developed and tested on SEM images of shale. The proposed workflow is an alternative to classical threshold-based and object-based segmentation. Four components, namely pore/crack, pyrite, organic/kerogen, and rock matrix (including clay, calcite and quartz), are automatically identified and segmented. The performance of the workflow on the validation dataset, quantified in terms of overall F1 score, was higher than 0.9. In the second part of this study, five connectivity-quantification metrics, namely the two-point statistical function (S2), the two-point cluster function (C2), the cluster size distribution, travel times computed using the fast marching method (FMM), and the Euler number, are tested on SEM images of shale. First, the relationships between connectivity and the responses of the five metrics are determined and validated by statistical analysis on a synthetic dataset of binary images containing six connectivity levels, from lowest to highest. Second, these relationships are applied directly to quantify the connectivity of the organic/kerogen and pore/crack components in the SEM images of shale.
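
    An illustrative sketch of one of the listed metrics, the cluster size distribution, computed with connected-component labelling (synthetic data, not the study's code):

```python
# Cluster (connected component) size distribution of a binary phase.
import numpy as np
from scipy import ndimage

def cluster_sizes(binary_img: np.ndarray) -> np.ndarray:
    """Sizes (pixel counts) of connected clusters of the foreground phase."""
    labeled, n_clusters = ndimage.label(binary_img)
    return np.bincount(labeled.ravel())[1:]  # drop label 0 (background)

rng = np.random.default_rng(1)
pores = rng.random((256, 256)) < 0.3  # synthetic binary image, 30% porosity
sizes = cluster_sizes(pores)
# A phase dominated by one spanning cluster is well connected; many small
# clusters indicate poor connectivity.
print(f"{sizes.size} clusters, largest fraction = {sizes.max() / sizes.sum():.2f}")
```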

    Reservoir Characterization And Simulation Of Enhanced Oil Recovery For Bakken

    The Bakken formation is one of the largest unconventional resources in the world, with approximately 92 billion barrels of recoverable oil. However, the primary oil recovery factor remains below 10% of the original oil in place (OOIP). Given the vast Bakken resources and the low primary recovery, there is both a need and enormous potential for enhanced oil recovery (EOR) in the Bakken. Two comprehensive numerical compositional models were built to simulate CO2 Huff-n-Puff and cyclic surfactant injection in an actual Middle Bakken horizontal well, and a good history match of primary production was obtained. The Embedded Discrete Fracture Model (EDFM) method, a new technique in this simulation study, was used to handle hydraulic fractures efficiently through non-neighboring connections; it is faster than the traditional local-grid-refinement method. The results of the CO2 Huff-n-Puff and cyclic surfactant injection processes are compared and discussed, and the simulations show that both EOR processes can significantly increase oil recovery.
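
    As an illustrative sketch of the EDFM coupling (the standard matrix-fracture formulation from the EDFM literature, not necessarily the exact factors used in this study): a fracture segment embedded in a matrix cell is tied to it by a non-neighboring connection whose transmissibility is T = k_m·A_f/⟨d⟩, with ⟨d⟩ the average normal distance from the cell to the fracture plane.

```python
# Hypothetical EDFM sketch; names and values are illustrative.

def avg_normal_distance(cell_dx: float) -> float:
    """Average normal distance <d> from points of a cell of width dx to a
    planar fracture bisecting it: uniformly distributed points give dx / 4."""
    return cell_dx / 4.0

def nnc_transmissibility(k_matrix: float, frac_area: float, avg_dist: float) -> float:
    """Non-neighboring-connection factor T = k_m * A_f / <d> (consistent units)."""
    return k_matrix * frac_area / avg_dist

# ~0.1 mD matrix (1e-16 m^2), 100 m^2 fracture patch in a 10 m wide cell:
T = nnc_transmissibility(1e-16, 100.0, avg_normal_distance(10.0))
print(T, "m^3 (connection factor passed to the simulator)")
```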

    Implicit Shape and Appearance Priors for Few-Shot Full Head Reconstruction

    Recent advancements in learning techniques that employ coordinate-based neural representations have yielded remarkable results in multi-view 3D reconstruction tasks. However, these approaches often require a substantial number of input views (typically several tens) and computationally intensive optimization procedures to be effective. In this paper, we address these limitations specifically for the problem of few-shot full 3D head reconstruction. We accomplish this by incorporating a probabilistic shape and appearance prior into coordinate-based representations, enabling faster convergence and improved generalization when working with only a few input images (as few as a single image). At test time, we leverage this prior to guide the fitting of a signed distance function using a differentiable renderer. By combining the statistical prior with parallelizable ray tracing and dynamic caching strategies, we achieve an efficient and accurate approach to few-shot full 3D head reconstruction. Moreover, we extend the H3DS dataset, which now comprises 60 high-resolution 3D full-head scans and their corresponding posed images and masks, and use it for evaluation. On this dataset, our approach achieves state-of-the-art results in geometry reconstruction while being an order of magnitude faster than previous approaches.
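
    A minimal sketch of the basic ingredient described here, a coordinate-based network that predicts signed distances conditioned on a latent shape code (architecture and sizes are assumptions, not the paper's):

```python
# Hypothetical sketch, not the paper's architecture: a coordinate MLP that
# maps a 3D query point plus a latent shape code to a signed distance value.
import torch
import torch.nn as nn

class LatentSDF(nn.Module):
    def __init__(self, latent_dim: int = 64, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + latent_dim, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, 1),  # signed distance to the head surface
        )

    def forward(self, xyz: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        # xyz: (N, 3) query points; z: (latent_dim,) code shared by all points.
        return self.net(torch.cat([xyz, z.expand(xyz.shape[0], -1)], dim=-1))

model = LatentSDF()
z = torch.zeros(64, requires_grad=True)  # latent code optimized at test time
sdf = model(torch.randn(1024, 3), z)     # (1024, 1) signed distances
# Few-shot fitting would backpropagate a photometric loss from a
# differentiable renderer into z (and optionally the network weights).
```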

    High Data Output and Automated 3D Correlative Light–Electron Microscopy Method

    Correlative light/electron microscopy (CLEM) allows the simultaneous observation of a given subcellular structure by fluorescence light microscopy (FLM) and electron microscopy, and its use is becoming increasingly frequent in cell biology. In this study, we report a new high-data-output CLEM method based on the use of cryosections. We successfully applied the method to analyze the structure of rough and smooth Russell bodies used as model systems. The major advantages of our method are (i) the possibility to correlate several hundred events at the same time, (ii) the possibility to perform three-dimensional (3D) correlation, (iii) the possibility to immunolabel both endogenous and recombinantly expressed proteins at the same time, and (iv) the possibility to combine the high data-analysis capability of FLM with the high precision and accuracy of transmission electron microscopy in a CLEM hybrid morphometry analysis. We have identified and optimized critical steps in sample preparation, defined routines for sample analysis and for retracing regions of interest, developed software for semi- and fully automatic 3D reconstruction, and defined preliminary conditions for a hybrid light/electron microscopy morphometry approach.

    Review of the Synergies Between Computational Modeling and Experimental Characterization of Materials Across Length Scales

    With the increasing interplay between experimental and computational approaches at multiple length scales, new research directions are emerging in materials science and computational mechanics. Such cooperative interactions find many applications in the development, characterization and design of complex material systems. This manuscript provides a broad and comprehensive overview of recent trends in which predictive modeling capabilities are developed in conjunction with experiments and advanced characterization to gain greater insight into structure-property relationships and to study various physical phenomena and mechanisms. The focus of this review is on the intersections of multiscale materials experiments and modeling relevant to the materials mechanics community. After a general discussion of the perspectives of various communities, the article focuses on the latest experimental and theoretical opportunities. Emphasis is given to the role of experiments in multiscale models, including insights into how computations can be used as discovery tools for materials engineering, rather than to "simply" support experimental work. This is illustrated by examples from several application areas on structural materials. The manuscript ends with a discussion of some problems and open scientific questions that are being explored in order to advance this relatively new field of research.
    Comment: 25 pages, 11 figures; review article accepted for publication in J. Mater. Sci.

    Geometric Variational Models for Inverse Problems in Imaging

    This dissertation develops geometric variational models for different ill-posed inverse problems in imaging and, at the same time, designs efficient numerical algorithms to compute their solutions. Variational methods solve inverse problems in two steps: formulation of a variational model as a minimization problem, and design of a minimization algorithm to solve it. This dissertation is organized in the same manner: it first formulates minimization problems associated with geometric models for different inverse problems in imaging, and then designs efficient minimization algorithms to compute their solutions. The minimization problem summarizes both the data available from the measurements and the prior knowledge about the solution in its objective functional; this naturally leads to the combination of a measurement or data term and a prior term. Geometry can play a role in either term, depending on the properties of the data-acquisition system or of the object being imaged. In this context, each chapter formulates a variational model that incorporates geometry into the objective functional in a different manner, depending on the inverse problem at hand.
    In the context of compressed sensing, the first chapter exploits the geometric properties of images to add an alignment term to the sparsity prior of compressed sensing; this additional prior term aligns the normal vectors of the level curves of the image with the reconstructed signal and improves the quality of reconstruction. A two-step recovery method is designed for this purpose: first, the normal vectors to the level curves of the image are estimated; second, an image is reconstructed that matches the compressed-sensing measurements, the geometric alignment of normals, and the sparsity constraint of compressed sensing. The proposed method is extended to non-local operators on graphs for the recovery of textures.
    The harmonic active contours of Chapter 2 use differential geometry to interpret the segmentation of an image as a minimal-surface manifold. In this case, geometry is exploited both in the measurement term, by coupling the different image channels in a robust edge detector, and in the prior term, by imposing smoothness on the segmentation. The proposed technique generalizes existing active contours to higher-dimensional spaces and non-flat images; in the plane, it improves the segmentation of images with inhomogeneities and weak edges.
    Shape-from-shading is investigated in Chapter 3 for the reconstruction of a silicon wafer from images of printed circuits taken with a scanning electron microscope. Here, geometry plays a role in the image-acquisition system, that is, in the measurement term of the objective functional. The prior term involves a smoothness constraint on the surface and a shape prior on the expected pattern in the circuit. The proposed reconstruction method also estimates a deformation field between the ideal pattern design and the reconstructed surface, replacing the model of shape variability usually required by shape priors with an elastic deformation field that quantifies deviations in the manufacturing process.
    Finally, the techniques used to design efficient numerical algorithms are explained with an example problem based on the level set method. To this purpose, Chapter 4 develops an efficient algorithm for the level set method when the level set function is constrained to remain a signed distance function. The distance function is preserved by introducing an explicit constraint into the minimization problem, and the minimization algorithm is made efficient through variable-splitting and augmented-Lagrangian techniques. These techniques introduce additional variables, constraints and Lagrange multipliers into the original minimization problem and decompose it into sub-optimization problems that are simple and can be solved efficiently. As a result, the proposed algorithm is five to six times faster than the original algorithm for the level set method.
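
    The abstract does not reproduce Chapter 4's exact formulation; a generic sketch of the variable-splitting and augmented-Lagrangian idea for the signed-distance constraint (notation assumed, not the dissertation's) is:

```latex
% Generic sketch: enforce the signed-distance property |\nabla\phi| = 1
% through an auxiliary vector field q and an augmented Lagrangian.
\min_{\phi,\,\mathbf{q}} \; E(\phi)
   \quad\text{s.t.}\quad \mathbf{q} = \nabla\phi, \quad |\mathbf{q}| = 1,
\qquad
\mathcal{L}_r(\phi,\mathbf{q},\boldsymbol{\lambda})
   = E(\phi)
   + \langle \boldsymbol{\lambda},\, \mathbf{q} - \nabla\phi \rangle
   + \frac{r}{2}\,\| \mathbf{q} - \nabla\phi \|_2^2 .
```

    Alternating minimization then splits the problem into simple pieces: the update in phi is a standard (often linear) PDE solve, the update in q is a pointwise projection onto the unit sphere, and the multipliers are updated as lambda <- lambda + r(q - grad phi).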