
    An Automatic Level Set Based Liver Segmentation from MRI Data Sets

    A fast and accurate liver segmentation method is a challenging task in medical image analysis. Liver segmentation is an important step in computer-assisted diagnosis, pre-evaluation of liver transplantation, and therapy planning for liver tumors. Magnetic resonance imaging offers several advantages, such as freedom from ionizing radiation and good contrast visualization of soft tissue. Innovations in recent technology and image acquisition techniques have also made magnetic resonance imaging a major tool in modern medicine. However, the adoption of magnetic resonance images for liver segmentation has been slow compared with applications involving the central nervous and musculoskeletal systems. The reasons are the irregular shape, size, and position of the liver, contrast agent effects, and the similar gray values of neighboring organs. Therefore, in this study, we present a fully automatic liver segmentation method using an approximation of level set based contour evolution on T2-weighted magnetic resonance data sets. The method avoids solving partial differential equations and applies only integer operations within a two-cycle segmentation algorithm. The efficiency of the proposed approach is achieved by applying the algorithm to all slices with a constant number of iterations and by performing the contour evolution without any user-defined initial contour. The obtained results are evaluated with four different similarity measures and show that the automatic segmentation approach gives successful results.
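    The PDE-free evolution described above can be illustrated with a minimal sketch: the contour is stored as a signed integer map and updated by switching boundary pixels in and out, so each iteration uses only integer comparisons. The grow/shrink rules and the `speed` field below are illustrative assumptions, not the paper's exact two-cycle algorithm.

```python
import numpy as np

def neighbors_of(mask):
    """True where a pixel has at least one 4-neighbor inside `mask`."""
    out = np.zeros_like(mask)
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def evolve(phi, speed, iterations):
    """Integer-only contour evolution on a binary level set map.

    phi: int8 array, -1 inside the contour, +1 outside.
    speed: float array; positive values push the contour outward.
    Runs a fixed number of iterations, mirroring the paper's
    constant-iteration-per-slice strategy.
    """
    phi = phi.copy()
    for _ in range(iterations):
        inside = phi < 0
        # grow: outside pixels touching the contour where speed > 0
        grow = (~inside) & neighbors_of(inside) & (speed > 0)
        phi[grow] = -1
        inside = phi < 0
        # shrink: inside pixels touching the background where speed < 0
        shrink = inside & neighbors_of(~inside) & (speed < 0)
        phi[shrink] = 1
    return phi

# usage: grow a small seed to segment a bright blob, with an
# intensity-based speed (positive inside the blob, negative outside)
img = np.zeros((64, 64)); img[20:40, 20:40] = 1.0
phi = np.ones_like(img, dtype=np.int8); phi[30:34, 30:34] = -1
seg = evolve(phi, img - 0.5, iterations=60)
```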

    3D mesh processing using GAMer 2 to enable reaction-diffusion simulations in realistic cellular geometries

    Recent advances in electron microscopy have enabled the imaging of single cells in 3D at nanometer length scale resolutions. An uncharted frontier for in silico biology is the ability to simulate cellular processes using these observed geometries. Enabling such simulations requires watertight meshing of electron micrograph images into 3D volume meshes, which can then form the basis of computer simulations of such processes using numerical techniques such as the Finite Element Method. In this paper, we describe the use of our recently rewritten mesh processing software, GAMer 2, to bridge the gap between poorly conditioned meshes generated from segmented micrographs and boundary-marked tetrahedral meshes which are compatible with simulation. We demonstrate the application of a workflow using GAMer 2 to a series of electron micrographs of neuronal dendrite morphology explored at three different length scales and show that the resulting meshes are suitable for finite element simulations. This work is an important step towards making physical simulations of biological processes in realistic geometries routine. Innovations in algorithms to reconstruct and simulate cellular length scale phenomena based on emerging structural data will enable realistic physical models and advance discovery at the interface of geometry and cellular processes. We posit that a new frontier at the intersection of computational technologies and single cell biology is now open.
    Comment: 39 pages, 14 figures. High resolution figures and supplemental movies available upon request.
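    What "poorly conditioned" means in practice can be made concrete with a standard element-quality metric; the sketch below is a generic illustration of such a check and does not use the GAMer 2 API.

```python
import numpy as np

def triangle_quality(verts, faces):
    """Per-triangle shape quality q = 4*sqrt(3)*A / (a^2 + b^2 + c^2).

    q is 1 for an equilateral triangle and approaches 0 for slivers,
    the degenerate elements that make finite element stiffness
    matrices ill-conditioned.
    verts: (V, 3) float array; faces: (F, 3) int array.
    """
    p0, p1, p2 = (verts[faces[:, i]] for i in range(3))
    e0, e1, e2 = p1 - p0, p2 - p1, p0 - p2
    area = 0.5 * np.linalg.norm(np.cross(e0, -e2), axis=1)
    sq = (e0 ** 2).sum(1) + (e1 ** 2).sum(1) + (e2 ** 2).sum(1)
    return 4.0 * np.sqrt(3.0) * area / sq

# example: a near-degenerate triangle scores close to 0
verts = np.array([[0.0, 0, 0], [1.0, 0, 0], [0.5, 1e-3, 0]])
faces = np.array([[0, 1, 2]])
print(triangle_quality(verts, faces))  # ~0.0023
```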

    Absolute proteomic quantification reveals design principles of sperm flagellar chemosensation

    © The Author(s), 2020. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Troetschel, C., Hamzeh, H., Alvarez, L., Pascal, R., Lavryk, F., Boenigk, W., Koerschen, H. G., Mueller, A., Poetsch, A., Rennhack, A., Gui, L., Nicastro, D., Struenker, T., Seifert, R., & Kaupp, U. B. Absolute proteomic quantification reveals design principles of sperm flagellar chemosensation. EMBO Journal, 39(4), (2020): e102723, doi:10.15252/embj.2019102723.

    Cilia serve as cellular antennae that translate sensory information into physiological responses. In the sperm flagellum, a single chemoattractant molecule can trigger a Ca2+ rise that controls motility. The mechanisms underlying such ultra‐sensitivity are ill‐defined. Here, we determine by mass spectrometry the copy number of nineteen chemosensory signaling proteins in sperm flagella from the sea urchin Arbacia punctulata. Proteins are up to 1,000‐fold more abundant than the free cellular messengers cAMP, cGMP, H+, and Ca2+. Opto‐chemical techniques show that high protein concentrations kinetically compartmentalize the flagellum: Within milliseconds, cGMP is relayed from the receptor guanylate cyclase to a cGMP‐gated channel that serves as a perfect chemo‐electrical transducer. cGMP is rapidly hydrolyzed, possibly via “substrate channeling” from the channel to the phosphodiesterase PDE5. The channel/PDE5 tandem encodes cGMP turnover rates rather than concentrations. The rate‐detection mechanism allows continuous stimulus sampling over a wide dynamic range. The textbook notion of signal amplification—few enzyme molecules process many messenger molecules—does not hold for sperm flagella. Instead, high protein concentrations ascertain messenger detection. Similar mechanisms may occur in other small compartments like primary cilia or dendritic spines.

    We thank Heike Krause for preparing the manuscript. Financial support by the Deutsche Forschungsgemeinschaft (DFG) via the priority program SPP 1726 “Microswimmers” and the Cluster of Excellence 1023 “ImmunoSensation” is gratefully acknowledged. We thank D. Stoddard for management of the UTSW cryo‐electron microscope facility, which is funded in part by a Cancer Prevention and Research Institute of Texas (CPRIT) Core Facility Award (RP170644). This study was supported by HHS|National Institutes of Health (NIH) grant R01 GM083122 and by CPRIT grant RR140082 to D. Nicastro.
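    The rate-detection argument can be sketched with a one-variable turnover model: cGMP is produced by the cyclase and removed by fast first-order hydrolysis, so its steady-state level stays low while tracking the synthesis rate. All rate constants below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal relay model: guanylate cyclase synthesizes cGMP at rate s(t),
# PDE hydrolyzes it with first-order rate k_hyd. All numbers are
# illustrative assumptions.
k_hyd = 100.0          # 1/s; fast hydrolysis -> millisecond turnover
dt, t_end = 1e-4, 0.2  # s

def synthesis(t):
    # step increase in cyclase activity at t = 0.05 s
    return 5.0 if t < 0.05 else 50.0

t = np.arange(0.0, t_end, dt)
g = np.zeros_like(t)
for i in range(1, len(t)):
    # explicit Euler step of dG/dt = s(t) - k_hyd * G
    g[i] = g[i - 1] + dt * (synthesis(t[i - 1]) - k_hyd * g[i - 1])

# With fast hydrolysis the steady state G* = s / k_hyd settles within
# ~10 ms and scales with the synthesis *rate*: cGMP levels stay low
# while their turnover encodes the stimulus, as in the rate-detection
# picture above.
print(g[int(0.04 / dt)], synthesis(0.0) / k_hyd)   # ~0.05
print(g[-1], synthesis(0.1) / k_hyd)               # ~0.5
```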

    Toroidal Imploding Detonation Wave Initiator for Pulse Detonation Engines

    Imploding toroidal detonation waves were used to initiate detonations in propane–air and ethylene–air mixtures inside a tube. The imploding wave was generated by an initiator consisting of an array of channels filled with acetylene–oxygen gas and ignited with a single spark. The initiator was designed as a low-drag initiator tube for use with pulse detonation engines. To detonate hydrocarbon–air mixtures, the initiator was overfilled so that some acetylene–oxygen spilled into the tube. The overfill amount required to detonate propane–air was less than 2% of the volume of the 1-m-long, 76-mm-diameter tube. The energy necessary to create an implosion strong enough to detonate propane–air mixtures was estimated to be 13% more than that used by a typical initiator tube, although the initiator was also estimated to use less oxygen. Images and pressure traces show a regular, repeatable imploding wave that generates focal pressures in excess of 6 times the Chapman–Jouguet pressure. A theoretical analysis of the imploding toroidal wave performed using Whitham’s method was found to agree well with experimental data and showed that, unlike imploding cylindrical and spherical geometries, imploding toroids initially experience a period of diffraction before wave focusing occurs. A nonreacting numerical simulation was used to assist in the interpretation of the experimental data.
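    For scale, the quoted 2% overfill bound corresponds to a small absolute volume, which can be checked directly from the tube dimensions given above.

```python
import math

# Worked check of the overfill figure quoted in the abstract. Tube
# dimensions come from the abstract; 2% is the reported upper bound.
radius = 0.076 / 2      # m, 76-mm-diameter tube
length = 1.0            # m
tube_volume = math.pi * radius**2 * length
overfill = 0.02 * tube_volume

print(f"tube volume  = {tube_volume*1e3:.2f} L")   # ~4.54 L
print(f"2% overfill <= {overfill*1e3:.3f} L")      # ~0.091 L
```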

    Integration of a big data emerging on large sparse simulation and its application on green computing platform

    Analyzing and verifying a large data set is a challenge for understanding the fundamental concepts behind it. Many big data analysis techniques suffer from poor scalability, variation inequality, instability, slow convergence, and weak accuracy in large-scale numerical algorithms. These limitations open a wide opportunity for numerical analysts to develop efficient and novel parallel algorithms. Big data analytics plays an important role in science and engineering for extracting patterns, trends, and actionable information from large data sets and for improving decision-making strategies. A large data set may consist of large-scale data collected via sensor networks, transformations from signals to digital images, high-resolution sensing systems, industry forecasts, and existing customer records used to predict trends and prepare for new demand. This paper proposes three types of big data analytics, according to the analytics requirements, involving large-scale numerical simulation and mathematical modeling for solving complex problems. The first is big data analytics for the theory and fundamentals of nanotechnology numerical simulation. The second is big data analytics for enhancing digital images in 3D visualization and for performance analysis of embedded systems based on the large sparse data sets generated by the device. The last is the extraction of patterns from electroencephalogram (EEG) data sets for detecting horizontal and vertical eye movements. The process of examining big data analytics is thus to investigate the behavior of hidden patterns and unknown correlations, identify anomalies, discover structure inside unstructured data, and extract the essence, together with trend prediction, multi-dimensional visualization, and real-time observation using the mathematical model. Parallel algorithms, mesh generation, domain-function decomposition approaches, inter-node communication design, subdomain mapping, numerical analysis, and parallel performance evaluation (PPE) are the processes of the big data analytics implementation. The superiority of parallel numerical methods such as AGE, Brian, and IADE was demonstrated for solving large sparse models on a green computing platform that utilizes obsolete computers, old-generation servers, and outdated hardware with distributed virtual memory and multiple processors. The integration of low-cost message passing software with the green computing platform increases the PPE by up to 60% compared with the limited memory of a single processor. In conclusion, large-scale numerical algorithms with good scalability, equality, stability, convergence, and accuracy are important features in analyzing big data simulation.
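    The large sparse iterations described above can be illustrated with a minimal stand-in: a Jacobi sweep on a 1D Poisson system, whose per-point updates can be split across subdomains exactly as in the domain-decomposition approach named in the abstract. The AGE, Brian, and IADE update rules themselves are not reproduced here.

```python
import numpy as np

# Plain Jacobi iteration on the tridiagonal system A u = b with
# A = tridiag(-1, 2, -1), a discrete -u'' = 1 on a fine grid. This is
# only a stand-in for the AGE/IADE-class solvers named above.
n = 100_000                      # unknowns in the sparse system
h = 1.0 / (n + 1)
b = np.full(n, h * h)            # right-hand side of -u'' = 1
u = np.zeros(n)

for sweep in range(500):
    # each point is updated from its two neighbors; rows of this loop
    # can be split across subdomains for parallel execution
    left = np.concatenate(([0.0], u[:-1]))
    right = np.concatenate((u[1:], [0.0]))
    u = 0.5 * (left + right + b)

# residual of A u = b after a fixed number of sweeps
res = b - (2 * u - np.concatenate(([0.0], u[:-1]))
                 - np.concatenate((u[1:], [0.0])))
print("residual norm:", np.linalg.norm(res))
```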