56 research outputs found

    Symmetric inverse consistent nonlinear registration driven by mutual information.

    A nonlinear viscoelastic image registration algorithm based on the demons paradigm and incorporating an inverse consistent constraint (ICC) is implemented. An inverse consistent and symmetric cost function using mutual information (MI) as the similarity measure is employed. The cost function also includes regularization of the transformation and the inverse consistency error (ICE). The uncertainty in balancing the various terms in the cost function is avoided by alternately minimizing the similarity measure, the regularization of the transformation, and the ICE term. Diffeomorphic registration, which prevents folding and/or tearing of the deformation, is achieved through a composition scheme. The quality of the registration is first demonstrated by constructing a brain atlas from 20 adult brains (age range 30-60). It is shown that with this registration technique: (1) the Jacobian determinant is positive for all voxels and (2) the average ICE is around 0.004 voxels, with a maximum value below 0.1 voxels. Further, deformation-based segmentation on the Internet Brain Segmentation Repository, a publicly available dataset, yielded a high Dice similarity index (DSI) of 94.7% for the cerebellum and 74.7% for the hippocampus, attesting to the quality of our registration method.
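
    The two quality criteria reported above, a positive Jacobian determinant at every voxel and a sub-voxel inverse consistency error, can be checked directly from the estimated displacement fields. The following is a minimal NumPy/SciPy sketch of those checks, not the authors' implementation; the synthetic displacement field, grid size, and the approximate inverse are placeholders for illustration.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def jacobian_determinant(disp):
            """disp: (3, Z, Y, X) displacement field in voxel units.
            Jacobian determinant of the mapping x -> x + disp(x)."""
            grads = np.stack([np.stack(np.gradient(disp[i]), axis=0) for i in range(3)], axis=0)
            jac = grads + np.eye(3)[:, :, None, None, None]    # add the identity part
            return np.linalg.det(np.moveaxis(jac, (0, 1), (-2, -1)))

        def inverse_consistency_error(fwd, inv):
            """Per-voxel residual |T_inv(T_fwd(x)) - x| in voxels."""
            grid = np.indices(fwd.shape[1:]).astype(float)     # identity grid x
            warped = grid + fwd                                 # T_fwd(x)
            coords = warped.reshape(3, -1)
            inv_at_warped = np.stack([
                map_coordinates(inv[i], coords, order=1, mode='nearest').reshape(fwd.shape[1:])
                for i in range(3)])
            residual = warped + inv_at_warped - grid            # T_inv(T_fwd(x)) - x
            return np.linalg.norm(residual, axis=0)

        # Toy example: a smooth synthetic deformation and a rough approximate inverse.
        shape = (32, 32, 32)
        z, y, x = np.indices(shape)
        fwd = np.stack([0.5 * np.sin(2 * np.pi * x / 32), np.zeros(shape), np.zeros(shape)])
        inv = -fwd   # a real inverse would be estimated iteratively during registration
        print("min Jacobian determinant:", jacobian_determinant(fwd).min())
        print("mean ICE (voxels):", inverse_consistency_error(fwd, inv).mean())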

    Deep gray matter atrophy in multiple sclerosis: a tensor-based morphometry study.

    Tensor-based morphometry (TBM) was applied to determine the atrophy of deep gray matter (DGM) structures in 88 relapsing multiple sclerosis (MS) patients. For the group analysis of atrophy, an unbiased atlas was constructed from 20 normal brains. The MS brain images were co-registered with the unbiased atlas using a symmetric inverse consistent nonlinear registration. These studies demonstrate significant atrophy of the thalamus, caudate nucleus, and putamen even at modest clinical disability, as assessed by the expanded disability status score (EDSS). A significant correlation between atrophy and EDSS was observed for the different DGM structures (thalamus: r=-0.51, p=3.85 × 10⁻⁷; caudate nucleus: r=-0.43, p=2.35 × 10⁻⁵; putamen: r=-0.36, p=6.12 × 10⁻⁶). Atrophy of these structures also correlated with 1) T2 hyperintense lesion volumes (thalamus: r=-0.56, p=9.96 × 10⁻⁹; caudate nucleus: r=-0.31, p=3.10 × 10⁻³; putamen: r=-0.50, p=6.06 × 10⁻⁷), 2) T1 hypointense lesion volumes (thalamus: r=-0.61, p=2.29 × 10⁻¹⁰; caudate nucleus: r=-0.35, p=9.51 × 10⁻⁴; putamen: r=-0.43, p=3.51 × 10⁻⁵), and 3) normalized CSF volume (thalamus: r=-0.66, p=3.55 × 10⁻¹²; caudate nucleus: r=-0.52, p=2.31 × 10⁻⁷; putamen: r=-0.66, p=2.13 × 10⁻¹²). More severe atrophy was observed mainly in the thalamus at higher EDSS scores. These studies appear to suggest a link between white matter damage and DGM atrophy in MS.
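
    In TBM, local atrophy is typically summarized from the Jacobian determinant of the subject-to-atlas deformation within atlas-defined regions, which can then be correlated with clinical measures. Below is a minimal sketch of that kind of ROI analysis, not the study's pipeline; the Jacobian maps, the thalamus ROI, and the EDSS values are hypothetical placeholders.

        import numpy as np
        from scipy.stats import pearsonr

        def roi_log_jacobian(jac_det, roi_mask):
            """Mean log-Jacobian inside an atlas ROI; values < 0 indicate local shrinkage."""
            return float(np.log(jac_det[roi_mask]).mean())

        rng = np.random.default_rng(0)
        n_subjects, shape = 88, (16, 16, 16)
        thalamus_mask = np.zeros(shape, dtype=bool)
        thalamus_mask[4:12, 4:12, 4:12] = True                # hypothetical atlas ROI

        edss = rng.uniform(0, 6.5, n_subjects)                # hypothetical clinical scores
        atrophy = []
        for s in range(n_subjects):
            # Placeholder Jacobian determinant map; in practice it comes from the
            # subject-to-atlas nonlinear registration of each patient.
            jac_det = np.exp(rng.normal(-0.01 * edss[s], 0.05, shape))
            atrophy.append(roi_log_jacobian(jac_det, thalamus_mask))

        r, p = pearsonr(atrophy, edss)
        print(f"thalamus atrophy vs EDSS: r={r:.2f}, p={p:.2e}")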

    Identification of Selective Inhibitors of Cancer Stem Cells by High-Throughput Screening

    Screens for agents that specifically kill epithelial cancer stem cells (CSCs) have not been possible due to the rarity of these cells within tumor cell populations and their relative instability in culture. We describe here an approach to screening for agents with epithelial CSC-specific toxicity. We implemented this method in a chemical screen and discovered compounds showing selective toxicity for breast CSCs. One compound, salinomycin, reduces the proportion of CSCs by >100-fold relative to paclitaxel, a commonly used breast cancer chemotherapeutic drug. Treatment of mice with salinomycin inhibits mammary tumor growth in vivo and induces increased epithelial differentiation of tumor cells. In addition, global gene expression analyses show that salinomycin treatment results in the loss of expression of breast CSC genes previously identified by analyses of breast tissues isolated directly from patients. This study demonstrates the ability to identify agents with specific toxicity for epithelial CSCs.

    Liver segmentation from registered multiphase CT data sets with EM clustering and GVF level set

    In this study, clinically produced multiphase CT volumetric data sets (pre-contrast, arterial, and venous enhanced phases) are drawn upon to transcend the intrinsic limitations of single-phase data sets for robust and accurate segmentation of the liver in typically challenging cases. As an initial step, all other phase volumes are registered to either the arterial or the venous phase volume by a symmetric nonlinear registration method using mutual information as the similarity metric. Once registered, the multiphase CT volumes are pre-filtered to prepare for the subsequent steps. Under the assumption that the intensity vectors of different organs follow a Gaussian mixture model (GMM), expectation maximization (EM) is then used to classify the multiphase voxels into different clusters. The clusters for liver parenchyma, vessels, and tumors are combined to provide the initial liver mask, which is used to generate the initial zero level set. Conversely, the voxels classified as non-liver guide the speed image of the level set in order to reduce leakage. A geodesic active contour level set using the gradient vector flow (GVF) derived from one of the enhanced phase volumes is then performed to further evolve the liver segmentation mask. Using the EM clusters as the reference, the resulting liver mask is finally morphologically post-processed to add missing clusters and reduce leakage. The proposed method has been tested on the clinical data sets of ten patients with relatively complex and/or extensive liver cancer or metastases. A Dice similarity index of 95.8% relative to expert manual segmentation demonstrates the high performance and robustness of the proposed method, even for challenging cancer data sets, and confirms the potential of a more thorough computational exploitation of currently available clinical data sets. © 2010 SPIE - The International Society for Optical Engineering
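
    As a rough illustration of the EM clustering step described above (a sketch under assumptions, not the paper's implementation), each voxel can be represented by its intensity vector across the registered pre-contrast, arterial, and venous phases and clustered with a Gaussian mixture model; the toy volumes and the cluster count below are placeholders.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def cluster_multiphase(pre, art, ven, n_clusters=6, seed=0):
            """Fit a GMM to per-voxel multiphase intensity vectors; return a label volume."""
            features = np.stack([pre, art, ven], axis=-1).reshape(-1, 3).astype(float)
            gmm = GaussianMixture(n_components=n_clusters, covariance_type='full',
                                  random_state=seed)
            labels = gmm.fit_predict(features)
            return labels.reshape(pre.shape), gmm

        # Toy volumes standing in for registered multiphase CT data.
        rng = np.random.default_rng(0)
        shape = (32, 32, 32)
        pre = rng.normal(60, 10, shape)                 # pre-contrast intensities (placeholder)
        art = pre + rng.normal(40, 10, shape)           # arterial enhancement
        ven = pre + rng.normal(60, 10, shape)           # venous enhancement
        labels, gmm = cluster_multiphase(pre, art, ven)

        # In the described pipeline, the clusters for liver parenchyma, vessels, and
        # tumors would be merged into an initial mask that seeds the GVF-driven
        # geodesic active contour level set.
        print("cluster sizes:", np.bincount(labels.ravel()))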

    Study on Advanced Anchor-Grouting Support of Mining Roadway in a Close-Distance Lower Coal Seam

    To address the difficulty of supporting a mining roadway beneath the goaf of a close-distance coal seam, where the surrounding rock exhibits pronounced large deformation and long-term continuous deformation, an advanced bolt-grouting support system is proposed. The system addresses roadway support through its design, construction, and monitoring-and-evaluation subsystems, and enables intelligent evaluation of the bolt-grouting support. Based on the engineering background of the 29204 working face in Dongqu Coal Mine, the supporting materials and scheme, grouting parameters, and construction technology are studied and analyzed. The support scheme is then adjusted and optimized in a timely manner using the intelligent bolt-grouting evaluation system. Field practice shows that the advanced bolt-grouting support system adapts well to roadway deformation under different conditions: the maximum roof subsidence in the test section is reduced by 73.48% compared with the original support scheme, and the maximum anchor cable stress is reduced by 50.68%. The results provide a safe and effective support system, together with feasible technical ideas and methods, for controlling the surrounding rock of mining roadways beneath the goaf of close-distance coal seams.

    Recovery of titanium from undissolved residue (tionite) in the titanium oxide industry via NaOH hydrothermal conversion and H₂SO₄ leaching

    To recover titanium from tionite, a new process consisting of NaOH hydrothermal conversion, water washing, and H₂SO₄ leaching for TiO₂ preparation was developed. The experimental results show that under the optimum hydrothermal conversion conditions, i.e., 50% NaOH (mass fraction) solution, a NaOH/tionite mass ratio of 4:1, a reaction temperature of 240 °C, a reaction time of 1 h, and an oxygen partial pressure of 0.25 MPa, the titanium was mainly converted into Na₂TiO₃, with a conversion of 97.2%. The unwanted product Na₂TiSiO₅ remained stable during water washing, and its formation was prevented by increasing the NaOH concentration. In the water washing process, about 97.6% of the Na⁺ could be recycled by washing the hydrothermal product, and the NaOH solutions could be reused after concentration. About 96.7% of the titanium in the washed product was easily leached in H₂SO₄ solution at low temperatures, forming a titanyl sulfate solution for further preparation of TiO₂.
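
    As a small worked check, and under the assumption that the stage yields combine multiplicatively and that titanium losses during water washing are negligible (not stated above), the overall titanium recovery implied by the reported conversion and leaching yields can be estimated as follows.

        # Assumed: overall Ti recovery = hydrothermal conversion x acid-leaching yield,
        # with negligible Ti loss in the water washing step (an assumption).
        conversion_to_na2tio3 = 0.972   # Ti converted to Na2TiO3 in the NaOH hydrothermal step
        leaching_yield = 0.967          # fraction of Ti leached from the washed product
        overall_recovery = conversion_to_na2tio3 * leaching_yield
        print(f"approximate overall Ti recovery: {overall_recovery:.1%}")   # about 94.0%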

    Emission Characteristics of Particulate Matter from Boiling Food

    Cooking food in water or soup, such as hot pot, is a widely used cooking method in China. This type of cooking requires no oil and cooks at a lower temperature, but that does not mean it produces fewer pollutants or is less harmful. Few studies have examined the emission characteristics and mechanisms of particulate matter released while eating hot pot (the boiling process), which has led to poorly designed ventilation systems for this kind of catering. In this paper, the effects of boiling different ingredients (including noodles, potatoes, fish, tofu, meatballs, and pork) on particle number concentration emissions were studied. The particle number concentration and particle size distribution of PM with diameters of 0.3 μm and less, 0.3–0.5 μm, and 0.5–1.0 μm (PM0.3, PM0.3–0.5, and PM0.5–1.0, respectively) were measured in an experimental chamber. Food type and shape had very little effect on the PM emission characteristics of boiling. Once the boiling state was reached, the number concentration, particle size distribution, and arithmetic mean diameter of the particles all fluctuated within 60 s. The emission characteristics of particles produced by boiling water and by heating oil were compared: heating oil produced more small particles, whereas boiling water released more large particles. Transient and steady-state methods were used to calculate the particle emission rate; the steady-state calculation yields a higher estimate of the emission rate.
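
    For reference, the transient and steady-state emission-rate estimates named above are usually derived from a well-mixed chamber mass balance, dC/dt = E/V - L*C, where C is the particle number concentration, E the emission rate, V the chamber volume, and L the total loss rate (ventilation plus deposition). The sketch below assumes that model; the chamber volume, loss rate, and concentration trace are placeholders, not measured data.

        import numpy as np

        V = 30.0           # chamber volume, m^3 (assumed)
        L = 1.5 / 3600.0   # total loss rate, 1/s (assumed: 1.5 per hour)
        E_true = 2.0e9     # "true" emission rate used to synthesize the trace, #/s
        C0 = 1.0e8         # initial concentration, #/m^3

        t = np.arange(0.0, 1800.0, 60.0)                       # time, s
        C = C0 * np.exp(-L * t) + E_true / (L * V) * (1 - np.exp(-L * t))

        # Transient estimate: solve the mass balance for E(t) using finite differences.
        dCdt = np.gradient(C, t)
        E_transient = V * (dCdt + L * C)                       # #/s, one value per time step

        # Steady-state estimate: assume dC/dt ~ 0 and use the last measured concentration.
        E_steady = V * L * C[-1]

        print(f"transient estimate (mean over the run): {E_transient.mean():.2e} #/s")
        print(f"steady-state estimate:                  {E_steady:.2e} #/s")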