85 research outputs found

    Modelling the microstructure and the viscoelastic behaviour of carbon black filled rubber materials from 3D simulations

    Volume fraction and spatial distribution of fillers impact the physical properties of rubber. Extended percolating networks of nano-sized fillers significantly modify the macroscopic mechanical properties of rubbers. Random models that describe the multiscale microstructure of rubber are combined with efficient Fourier-based numerical algorithms to predict the material’s mechanical properties. From TEM image analysis, various types of multiscale models were proposed and validated, accounting for the non-homogeneous distribution of fillers: in the present work, aggregates are located outside of an exclusion polymer simulated by two families of random models. The first model generates the exclusion polymer by a Boolean model of spheres. In the second model, the exclusion polymer is a mosaic model built from a Johnson-Mehl tessellation; here the exclusion polymer and the polymer containing the filler show a similar morphology, contrary to the Boolean model. Aggregates are then described as the intersection of a Boolean model of spheres and the complement of the exclusion polymer. Carbon black particles are simulated by a Cox model of spheres in the aggregates. The models rely on a limited number of parameters fitted from experimental covariance and cumulative granulometry. The influence of the model parameters on the percolation properties of the models is studied numerically from 3D simulations. Finally, a novel Fourier-based algorithm is proposed to estimate the viscoelastic properties of linear heterogeneous media in the harmonic regime. The method is compared to analytical results and to a different, time-discretized FFT scheme. As shown in this work, the proposed numerical method is efficient for computing the viscoelastic response of microstructures containing rubbers and fillers.
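To make the Boolean-model construction concrete, here is a minimal NumPy sketch (an illustration, not the authors' code): it draws a Poisson number of sphere centres on a periodic voxel grid, marks the covered voxels, and compares the realized volume fraction with the closed-form Boolean coverage probability 1 − exp(−λv). The grid size, intensity and radius are arbitrary assumptions.

```python
import numpy as np

def boolean_spheres(shape=(64, 64, 64), intensity=1e-4, radius=5, seed=0):
    """Boolean model: union of equal spheres with Poisson-distributed centres.

    `intensity` is the expected number of centres per voxel; the grid is
    treated as periodic (toroidal) so spheres wrap around the edges.
    """
    rng = np.random.default_rng(seed)
    n_centres = rng.poisson(intensity * np.prod(shape))
    centres = rng.integers(0, shape[0], size=(n_centres, 3))  # cubic grid assumed
    grid = np.zeros(shape, dtype=bool)
    zz, yy, xx = np.indices(shape)
    for cz, cy, cx in centres:
        # periodic distances so spheres wrap around the grid edges
        dz = np.minimum(np.abs(zz - cz), shape[0] - np.abs(zz - cz))
        dy = np.minimum(np.abs(yy - cy), shape[1] - np.abs(yy - cy))
        dx = np.minimum(np.abs(xx - cx), shape[2] - np.abs(xx - cx))
        grid |= dz ** 2 + dy ** 2 + dx ** 2 <= radius ** 2
    return grid

grid = boolean_spheres()
vf = grid.mean()                             # realized volume fraction of the union
v_sphere = 4.0 / 3.0 * np.pi * 5 ** 3        # one sphere's volume in voxels
p_theory = 1.0 - np.exp(-1e-4 * v_sphere)    # Boolean coverage probability
```

The realized fraction fluctuates around the theoretical value; fitting such parameters against measured covariance and granulometry is what the paper does at much larger scale.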

    Metal artifact reduction in dental CT images using polar mathematical morphology

    Most dental implant planning systems use a 3D representation of the CT scan of the patient under study, as it provides a more intuitive view of the human jaw. The presence of metallic objects in human jaws, such as amalgam or gold fillings, provokes artifacts like streaking and beam hardening, which make the reconstruction process difficult. In order to reduce these artifacts, several methods have been proposed that use the raw data obtained directly from the tomographs in different ways. However, in DICOM-based applications this information is not available, hence the need for a new method that handles this task in the DICOM domain. The presented method performs morphological filtering in the polar domain, yielding output images less affected by artifacts (even in cases of multiple metallic objects) without causing significant smoothing of the anatomic structures, which allows a great improvement in the 3D reconstruction. The algorithm has been automated and compared to other image denoising methods with successful results. This work has been supported by the project MIRACLE (DPI2007-66782-C03-01-AR07) of the Spanish Ministerio de Educacion y Ciencia.
    Naranjo Ornedo, V.; Llorens Rodríguez, R.; Alcañiz Raya, M. L.; López-Mir, F. (2011). Metal artifact reduction in dental CT images using polar mathematical morphology. Computer Methods and Programs in Biomedicine, 102(1), 64-74. https://doi.org/10.1016/j.cmpb.2010.11.009
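The key idea — resample the slice around the metal object into polar coordinates, where radial streaks become thin columns, then apply a grey-level opening elongated along the angular axis — can be sketched as follows. This is an illustrative reconstruction with SciPy, not the published implementation; the synthetic image, grid resolution and structuring-element size are assumptions.

```python
import numpy as np
from scipy.ndimage import map_coordinates, grey_opening

def to_polar(img, centre, n_r=64, n_theta=180):
    """Resample `img` onto an (r, theta) grid around `centre` (bilinear)."""
    r = np.linspace(0, min(img.shape) / 2 - 1, n_r)
    t = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, t, indexing="ij")
    rows = centre[0] + rr * np.sin(tt)
    cols = centre[1] + rr * np.cos(tt)
    return map_coordinates(img, [rows, cols], order=1)

# Synthetic slice: a bright ring (anatomy) plus a radial streak (artifact).
img = np.zeros((128, 128))
yy, xx = np.indices(img.shape)
rad = np.hypot(yy - 64, xx - 64)
img[(rad > 30) & (rad < 40)] = 1.0   # circular anatomic structure
img[64, 64:] += 2.0                  # streak radiating from the centre

polar = to_polar(img, (64, 64))
# In polar coordinates the streak is a thin column (constant theta), so a
# grey opening elongated along the theta axis suppresses it, while
# structures extended in theta, like the ring, survive.
filtered = grey_opening(polar, size=(1, 9))
```

Because opening is anti-extensive, the filtered image never exceeds the input; the streak column is strongly attenuated while the ring is left essentially intact.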

    Wavelet-Based Multicomponent Denoising Profile for the Classification of Hyperspectral Images

    The high resolution of the hyperspectral remote sensing images available allows the detailed analysis of even small spatial structures. As a consequence, the study of techniques to efficiently extract spatial information is a very active field. In this paper, we propose a novel denoising wavelet-based profile for the extraction of spatial information that does not require parameters fixed by the user. Over each band obtained by a wavelet-based feature extraction technique, a denoising profile (DP) is built through the recursive application of discrete wavelet transforms followed by a thresholding process. Each component of the DP consists of features reconstructed by recursively applying inverse wavelet transforms to the thresholded coefficients. Several thresholding methods are explored. In order to show the effectiveness of the extended DP (EDP), we propose a classification scheme based on the computation of the EDP and supervised classification by extreme learning machine. The obtained results are compared to other state-of-the-art methods based on profiles in the literature. An additional study of behavior in the presence of added noise is also performed, showing the high reliability of the proposed EDP. This work was supported in part by the Consellería de Educación, Universidade e Formación Profesional under Grants GRC2014/008 and ED431C 2018/2019 and the Ministerio de Economía y Empresa, Gobierno de España under Grant TIN2016-76373-P. Both are cofunded by the European Regional Development Fund.
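A single level of the DP's core operation — forward wavelet transform, thresholding of detail coefficients, inverse transform — can be sketched with a hand-rolled Haar transform. This is illustrative only: the paper's profile stacks multiple levels over extracted bands and explores several thresholding rules, whereas the universal threshold and MAD noise estimate below are standard textbook choices, not necessarily the authors'.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal 1-D Haar transform (even length)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-pass)
    return a, d

def haar_idwt(a, d):
    """Inverse of `haar_dwt` (perfect reconstruction)."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(c, t):
    """Soft thresholding: shrink coefficients towards zero by `t`."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + 0.3 * rng.standard_normal(256)

a, d = haar_dwt(noisy)
sigma = np.median(np.abs(d)) / 0.6745            # MAD noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))    # universal threshold
denoised = haar_idwt(a, soft(d, thr))

err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
```

Each reconstructed `denoised` signal would be one component of the profile; repeating the decomposition on the approximation gives the recursive levels.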

    Counting of RBCs and WBCs in noisy normal blood smear microscopic images

    This work focuses on the segmentation and counting of peripheral blood smear particles, which plays a vital role in medical diagnosis. Our approach profits from several powerful processing techniques. Firstly, the method used for denoising a blood smear image is based on the bivariate wavelet. Secondly, image edge preservation uses the Kuwahara filter. Thirdly, a new binarization technique is introduced by merging the Otsu and Niblack methods. We have also proposed an efficient step-by-step procedure to determine solid binary objects by merging modified binary, edged images and modified Chan-Vese active contours. The separation of White Blood Cells (WBCs) from Red Blood Cells (RBCs) into two sub-images, based on an estimate of the size of the RBC (the blood’s dominant particle), is a critical step. Using granulometry, we get an approximation of the RBC size. The proposed separation algorithm is an iterative mechanism based on morphological theory, saturation amount and RBC size. A primary aim of this work is to introduce an accurate mechanism for counting blood smear particles. This is accomplished by using the immersion watershed algorithm, which counts red and white blood cells separately. To evaluate the capability of the proposed framework, experiments were conducted on normal blood smear images. This framework was compared to other published approaches and found to have lower complexity and better performance in its constituent steps; hence, it has a better overall performance.
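The merged Otsu/Niblack binarization step can be illustrated as follows: a pixel is kept as foreground only when it is below both the global Otsu threshold and the local Niblack threshold. This is a hedged sketch on a synthetic image; the window size, the Niblack `k`, and the AND merge rule are assumptions, not necessarily the paper's exact settings.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def otsu_threshold(img, bins=256):
    """Global Otsu threshold: maximise the between-class variance."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centres = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                # class-0 weight up to each bin
    mu = np.cumsum(p * centres)      # class-0 first moment
    w1, mu_t = 1.0 - w0, mu[-1]
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu_t * w0 - mu)[valid] ** 2 / (w0 * w1)[valid]
    return centres[np.argmax(between)]

def niblack_threshold(img, window=15, k=-0.2):
    """Local Niblack threshold: mean + k * std over a sliding window."""
    mean = uniform_filter(img, window)
    sq_mean = uniform_filter(img ** 2, window)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    return mean + k * std

# Synthetic smear: two dark "cells" on a bright, slightly noisy background.
rng = np.random.default_rng(2)
img = np.full((64, 64), 0.8)
img[10:20, 10:20] = 0.30   # hypothetical cell
img[40:52, 30:42] = 0.35   # hypothetical cell
img += 0.02 * rng.standard_normal(img.shape)

t_otsu = otsu_threshold(img)
t_local = niblack_threshold(img)
# Merge rule (an assumption): foreground only where both tests call it dark.
cells = (img < t_otsu) & (img < t_local)
```

The global test rejects bright background that Niblack alone would flag in flat regions, while the local test sharpens cell boundaries where illumination varies.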

    Automatic Segmentation and Classification of Red and White Blood cells in Thin Blood Smear Slides

    In this work we develop a system for automatic detection and classification of cytological images, which plays an increasingly important role in medical diagnosis. A primary aim of this work is the accurate segmentation of cytological images of blood smears and subsequent feature extraction, along with studying related classification problems such as the identification and counting of peripheral blood smear particles, and the classification of white blood cells into five types. Our proposed approach benefits from powerful image processing techniques to perform a complete blood count (CBC) without human intervention. The general framework in this blood smear analysis research is as follows. Firstly, a digital blood smear image is denoised using an optimized Bayesian non-local means filter to design a dependable cell counting system that may be used under different image capture conditions. Then an edge preservation technique with the Kuwahara filter is used to recover degraded and blurred white blood cell boundaries in blood smear images while reducing the residual negative effect of noise. After denoising and edge enhancement, the next step is binarization using a combination of the Otsu and Niblack methods to separate the cells from the stained background. Cell separation and counting is achieved by granulometry, advanced active contours without edges, and morphological operators with the watershed algorithm. Following this is the recognition of different types of white blood cells (WBCs), and also red blood cell (RBC) segmentation. The next step uses three main types of features — shape, intensity, and texture invariant features — in combination with a variety of classifiers. The following features are used in this work: intensity histogram features, invariant moments, the relative area, co-occurrence and run-length matrices, dual-tree complex wavelet transform features, and Haralick and Tamura features.
Next, different statistical approaches involving correlation, distribution and redundancy are used to measure the dependency between a set of features and to select feature variables for white blood cell classification. A global sensitivity analysis with random sampling-high dimensional model representation (RS-HDMR), which can deal with independent and dependent input feature variables, is used to assess dominant discriminatory power and the reliability of features, leading to an efficient feature selection. These feature selection results are compared in experiments with the branch and bound method and with sequential forward selection (SFS), respectively. This work examines support vector machines (SVM) and convolutional neural networks (LeNet5) in connection with white blood cell classification. Finally, the white blood cell classification system is validated in experiments conducted on cytological images of normal poor-quality blood smears. These experimental results are also assessed against ground truth manually obtained from medical experts.
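Of the texture descriptors listed, the co-occurrence features are the easiest to sketch: build a grey-level co-occurrence matrix (GLCM) for one pixel offset and derive a few Haralick-style statistics from it. A minimal, illustrative NumPy version follows; the quantisation depth and the single horizontal offset are assumptions, not the thesis's exact configuration.

```python
import numpy as np

def glcm(img, levels=8):
    """Symmetric, normalised grey-level co-occurrence matrix, offset (0, 1)."""
    q = np.minimum((img * levels).astype(int), levels - 1)  # quantise grey levels
    m = np.zeros((levels, levels))
    np.add.at(m, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)  # count horizontal pairs
    m = m + m.T                                             # make symmetric
    return m / m.sum()

def haralick(p):
    """A few Haralick-style statistics of a co-occurrence matrix."""
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    energy = np.sum(p ** 2)
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    return contrast, energy, homogeneity

rng = np.random.default_rng(3)
smooth = np.full((32, 32), 0.5)   # flat patch: no texture
rough = rng.random((32, 32))      # white-noise patch: strong texture
c_smooth, e_smooth, _ = haralick(glcm(smooth))
c_rough, e_rough, _ = haralick(glcm(rough))
```

A flat patch concentrates all co-occurrence mass on the diagonal (zero contrast, maximal energy), while a noisy patch spreads it out — exactly the discriminative behaviour such features exploit for cell classification.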

    A graph-based mathematical morphology reader

    This survey paper aims at providing a "literary" anthology of mathematical morphology on graphs. It describes in the English language many ideas stemming from a large number of different papers, hence providing a unified view of an active and diverse field of research.
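On graphs, the structuring element of classical morphology is replaced by vertex adjacency: dilation takes the maximum of a vertex value over its closed neighbourhood, erosion the minimum, and their compositions give openings and closings just as in the Euclidean case. A toy sketch on a path graph (my own illustration, not taken from the survey):

```python
def graph_dilate(values, adj):
    """Graph dilation: max over the closed neighbourhood of each vertex."""
    return [max([values[v]] + [values[u] for u in adj[v]])
            for v in range(len(values))]

def graph_erode(values, adj):
    """Graph erosion: min over the closed neighbourhood of each vertex."""
    return [min([values[v]] + [values[u] for u in adj[v]])
            for v in range(len(values))]

# Path graph 0 - 1 - 2 - 3 - 4, with a single spike at vertex 2.
adj = [[1], [0, 2], [1, 3], [2, 4], [3]]
peak = [0, 0, 5, 0, 0]

dilated = graph_dilate(peak, adj)                   # the spike spreads to neighbours
opened = graph_dilate(graph_erode(peak, adj), adj)  # opening removes the isolated spike
```

The opening removing the one-vertex spike is the graph analogue of an area opening filtering out small bright structures.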

    Detection of Neovascularization Based on Fractal and Texture Analysis with Interaction Effects in Diabetic Retinopathy

    Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complications and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment. We propose a novel new-vessel detection method combining statistical texture analysis (STA), high order spectrum analysis (HOS) and fractal analysis (FA); most importantly, we show that by incorporating their associated interactions, the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) are obtained: 96.3%, 99.1% and 98.5% (99.3%), respectively. It is found that the proposed method improves the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.
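Fractal analysis of retinal vasculature typically reduces to a box-counting dimension estimate: count the boxes of side s that touch the vessel mask and fit the slope of log N(s) against log s. A generic sketch on synthetic masks (not the paper's implementation; box sizes and mask shapes are assumptions):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Box-counting dimension: negated slope of log N(s) against log s."""
    h, w = mask.shape
    counts = []
    for s in sizes:
        # number of s x s boxes containing at least one foreground pixel
        boxes = mask[: h - h % s, : w - w % s].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity checks: a filled square is 2-D, a straight line is 1-D.
square = np.ones((128, 128), dtype=bool)
line = np.zeros((128, 128), dtype=bool)
line[64, :] = True

dim_square = box_counting_dimension(square)
dim_line = box_counting_dimension(line)
```

Neovascular regions, being more tortuous and space-filling than normal vessels, push this dimension upward, which is what makes it a useful severity feature.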