
    Histopathological image analysis: a review

    Over the past decade, dramatic increases in computational power and improvements in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole-slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging, which complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state-of-the-art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology-related problems being pursued in the United States and Europe.

    An Evaluation of multispectral earth-observing multi-aperture telescope designs for target detection and characterization

    Earth-observing satellites have fundamental size and weight design limits since they must be launched into space. These limits constrain the spatial resolutions that such imaging systems can achieve with traditional telescope design strategies. Segmented- and sparse-aperture imaging system designs may offer solutions to this problem. Segmented and sparse apertures can be viewed as competing technologies; both approaches offer solutions for achieving finer-resolution imaging from space. Segmented-aperture systems offer greater fill factor, and therefore greater signal-to-noise ratio (SNR), for a given encircled diameter than their sparse-aperture counterparts, though their larger segments often suffer from greater optical aberration than those of smaller, sparse designs. Regardless, the use of any multi-aperture imaging system comes at a price: the increased effective aperture size and improvement in spatial resolution are offset by a reduction in image quality due to signal loss (less photon-collecting area) and aberrations introduced by misalignments between individual sub-apertures as compared with monolithic collectors. Introducing multispectral considerations to a multi-aperture imaging system further starves the system of photons and reduces SNR in each spectral band. This work explores multispectral design considerations inherent in 9-element tri-arm sparse-aperture, hexagonal-element segmented-aperture, and monolithic-aperture imaging systems. The primary thrust of this work is to develop an objective, target detection-based metric that can be used to compare the achieved image utility of these competing multi-aperture telescope designs over a designated design-parameter trade space. Characterizing complex multi-aperture system designs in this way may lead to improved assessment of programmatic risk and reward in the development of higher-resolution imaging capabilities. This method assumes that the stringent requirements for limiting the wavefront error (WFE) of multi-aperture imaging systems when producing imagery for visual assessment can be relaxed when employing target detection-based metrics to evaluate system utility. Simple target detection algorithms were used to determine Receiver Operating Characteristic (ROC) curves for the various simulated multi-aperture system designs, which could be used in an objective assessment of each system's ability to support target detection activities. Also, a set of regressed equations was developed that allows one to predict multi-aperture system target detection performance within the bounds of the designated trade space. Suitable metrics for comparing the shapes of two individual ROC curves, such as the total area under the curve (AUC) and the sample Pearson correlation coefficient, were found to be useful tools in validating the predicted results of the trade-space regression models. Lastly, some simple rules of thumb relating to multi-aperture system design were identified from the inspection of various points of equivalency between competing system designs, as determined from the comparison metrics employed. The goal of this work, the development of a process for simulating multi-aperture imaging systems and comparing them in terms of target detection tasks, was successfully accomplished. The process presented here could be tailored to the needs of any specific multi-aperture development effort and used as a tool for system design engineers.
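
    The ROC-based comparison described above can be illustrated with a brief sketch: given detector scores from two simulated system designs, one can construct ROC curves, compute their AUCs, and correlate detection rates at matched false-alarm rates. The detector scores, grids, and helper names below are illustrative assumptions, not the study's simulation outputs.

```python
# Sketch of an ROC/AUC/Pearson comparison between two hypothetical systems.
# Scores are synthetic placeholders, not data from the dissertation.
import numpy as np

def roc_curve(scores, labels, n_thresh=200):
    """Probability of detection vs. false-alarm rate over a threshold sweep."""
    thresholds = np.linspace(scores.min(), scores.max(), n_thresh)
    pd = np.array([(scores[labels == 1] >= t).mean() for t in thresholds])
    pfa = np.array([(scores[labels == 0] >= t).mean() for t in thresholds])
    return pfa, pd

def auc(pfa, pd):
    """Area under the ROC curve via the trapezoid rule (sorted by Pfa)."""
    order = np.argsort(pfa)
    return np.trapz(pd[order], pfa[order])

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=5000)
scores_a = labels * 1.0 + rng.normal(0, 1.0, 5000)   # "system A" detector scores
scores_b = labels * 0.7 + rng.normal(0, 1.0, 5000)   # "system B" detector scores

pfa_a, pd_a = roc_curve(scores_a, labels)
pfa_b, pd_b = roc_curve(scores_b, labels)

# Compare curve shapes: AUCs plus Pearson correlation of Pd at common Pfa values.
pfa_grid = np.linspace(0.01, 0.99, 50)
pd_a_i = np.interp(pfa_grid, pfa_a[::-1], pd_a[::-1])
pd_b_i = np.interp(pfa_grid, pfa_b[::-1], pd_b[::-1])
r = np.corrcoef(pd_a_i, pd_b_i)[0, 1]
print(f"AUC A = {auc(pfa_a, pd_a):.3f}, AUC B = {auc(pfa_b, pd_b):.3f}, Pearson r = {r:.3f}")
```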

    Automated Reconstruction of Neurovascular Networks in Knife-Edge Scanning Microscope Mouse and Rat Brain Nissl Stained Data Sets

    The Knife-Edge Scanning Microscope (KESM), developed at the Brain Network Laboratory at Texas A&M University, can image a whole small-animal brain at sub-micrometer resolution. Nissl data from the KESM enable us to look at vasculature and cell bodies at the same time. Hence, analyzing KESM mouse and rat Nissl data can help in understanding interactions between cerebral blood flow and its surrounding tissue. However, analysis is difficult since the image data contain complex cellular features as well as imaging artifacts, which make it hard to extract the geometry of the vasculature and the cells. In this project, we propose a novel approach to reconstructing the neurovascular networks from whole-brain mouse and partial rat Nissl data sets. The proposed method consists of (1) pre-processing, (2) thresholding, and (3) post-processing. Initially, we enhanced the raw image data in the pre-processing step. Next, we applied dynamic global thresholding to extract vessels in the thresholding step. Subsequently, in the post-processing step, we computed local properties of the connected components to remove various sources of noise, and we applied artificial neural networks to extract vasculature. Concurrently, the proposed method connected small and large discontinuities in the vascular traces. To validate the performance of the proposed method, we compared its reconstruction results with an alternative method (Lim's method). The comparison shows that the proposed method significantly outperforms Lim's method, being nine times faster and more robust to noise. As a consequence, the proposed method provides a framework that can be applied to other data sets, even when a large portion of the images across the stacks are low-contrast. We expect that the proposed method will contribute to studies investigating the correlation between the soma of cells and microvascular networks.
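
    A minimal sketch of the three-stage pipeline (pre-processing, thresholding, post-processing) is given below, assuming a per-slice dynamic global threshold derived from slice statistics and simple size-based connected-component filtering; the actual filters, threshold rule, and neural-network vessel classifier from the project are not reproduced.

```python
# Illustrative three-stage slice reconstruction: smooth, threshold, then remove
# small connected components.  Parameters and the threshold rule are assumptions.
import numpy as np
from scipy import ndimage

def reconstruct_slice(image, min_size=50):
    # (1) pre-processing: smooth to suppress imaging noise
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma=1.0)

    # (2) thresholding: a dynamic global threshold from slice statistics
    #     (vessels appear dark in Nissl data, so keep low-intensity pixels)
    threshold = smoothed.mean() - 1.5 * smoothed.std()
    binary = smoothed < threshold

    # (3) post-processing: discard small connected components (cell bodies, noise)
    labeled, n = ndimage.label(binary)
    sizes = ndimage.sum(binary, labeled, index=np.arange(1, n + 1))
    return np.isin(labeled, np.flatnonzero(sizes >= min_size) + 1)

# Example on a synthetic slice
slice_img = np.random.default_rng(2).normal(128, 10, (256, 256))
mask = reconstruct_slice(slice_img)
print("vessel-candidate pixels:", int(mask.sum()))
```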

    Toward Regional Characterizations of the Oceanic Internal Wavefield

    Many major oceanographic internal wave observational programs of the last four decades are reanalyzed in order to characterize the variability of the deep-ocean internal wavefield. The observations are discussed in the context of the universal spectral model proposed by Garrett and Munk. The Garrett and Munk model is a good description of wintertime conditions at Site-D on the continental rise north of the Gulf Stream. Elsewhere, and at other times, significant deviations in terms of amplitude, separability of the 2-D vertical wavenumber-frequency spectrum, and departure from the model's functional form are noted. Subtle geographic patterns are apparent in the deviations from the high-frequency and high-vertical-wavenumber power laws of the Garrett and Munk spectrum. Moreover, such deviations tend to co-vary: whiter frequency spectra are partnered with redder vertical wavenumber spectra. Attempts are made to interpret the variability in terms of the interplay between generation, propagation, and nonlinearity using a statistical radiative balance equation. This process frames major questions for future research, with the insight that such integrative studies could constrain both observationally and theoretically based interpretations.
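
    One concrete form of the diagnostic implied above is to estimate the high-frequency and high-vertical-wavenumber spectral slopes by least-squares fitting in log-log space and compare them against the canonical Garrett and Munk power laws (roughly $\omega^{-2}$ and $m^{-2}$). The sketch below uses synthetic placeholder spectra, not the observational records.

```python
# Fit power-law slopes of frequency and vertical-wavenumber spectra in log-log
# space.  Spectra here are synthetic placeholders for illustration only.
import numpy as np

def power_law_slope(k, spectrum):
    """Fit log10(S) = slope * log10(k) + b and return the slope."""
    slope, _ = np.polyfit(np.log10(k), np.log10(spectrum), 1)
    return slope

# Synthetic example: a frequency spectrum slightly "whiter" than omega^-2 ...
omega = np.logspace(-4, -2, 100)          # rad/s, above the inertial frequency
S_omega = 1e-3 * omega**-1.8
# ... paired with a vertical wavenumber spectrum slightly "redder" than m^-2
m = np.logspace(-3, -1, 100)              # cycles per meter
S_m = 1e-2 * m**-2.3

print("frequency slope:", round(power_law_slope(omega, S_omega), 2))
print("wavenumber slope:", round(power_law_slope(m, S_m), 2))
```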

    An accurate calculation of the nucleon axial charge with lattice QCD

    We report on a lattice QCD calculation of the nucleon axial charge, $g_A$, using Möbius domain-wall fermions solved on the dynamical $N_f = 2+1+1$ HISQ ensembles after they are smeared using the gradient-flow algorithm. The calculation is performed with three pion masses, $m_\pi \sim \{310, 220, 130\}$ MeV. Three lattice spacings ($a \sim \{0.15, 0.12, 0.09\}$ fm) are used with the heaviest pion mass, while the coarsest two spacings are used with the middle pion mass and only the coarsest spacing is used with the near-physical pion mass. At the $m_\pi \sim 220$ MeV, $a \sim 0.12$ fm point, a dedicated volume study is performed with $m_\pi L \sim \{3.22, 4.29, 5.36\}$. Using a new strategy motivated by the Feynman-Hellmann theorem, we achieve a precise determination of $g_A$ with relatively low statistics, and demonstrable control over the excited-state, continuum, infinite-volume, and chiral-extrapolation systematic uncertainties, the last of which remains the dominant uncertainty. Our final determination, at 2.6% total uncertainty, is $g_A = 1.278(21)(26)$, with the first uncertainty including statistical and systematic uncertainties from fitting and the second including model-selection systematics related to the chiral and continuum extrapolation. The largest reduction of the second uncertainty will come from a greater number of pion-mass points as well as more precise lattice QCD results near the physical pion mass.
    Comment: 17 pages + 11 pages of references and appendices. 15 figures. Interested readers can download the Python analysis scripts and an hdf5 data file at https://github.com/callat-qcd/project_gA_v
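
    As a quick consistency check (assuming the two quoted error components are independent and combined in quadrature), the stated 2.6% total uncertainty follows directly from the central value and the two errors: $\delta g_A / g_A = \sqrt{0.021^2 + 0.026^2}\,/\,1.278 \approx 0.033 / 1.278 \approx 2.6\%$.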

    Polarized Light Applications towards Biomedical Diagnosis and Monitoring

    Utilization of polarized light for improved specificity and sensitivity in disease diagnosis is occurring more often in the fields of sensing, measurement, and medical diagnostics. This dissertation focuses on two distinct areas where polarized light is applied in biomedical sensing and monitoring. The first portion of the work reported in this dissertation addresses several major obstacles that prohibit the use of polarized light in an optical, non-invasive polarimetric glucose sensor intended to improve the quality of life and disease monitoring for the millions of people currently afflicted by diabetes mellitus. Two key areas were focused on that require further technical advances for the technology to be realized as a viable solution. First, in vivo studies performed on New Zealand White (NZW) rabbits using a dual-wavelength polarimeter were conducted to allow for performance validation and modeling for predictive glucose measurements, accounting for the time delay between blood and aqueous humor glucose concentrations and overcoming motion-induced birefringence using multiple linear regression analysis. Further, the feasibility of non-matched index-of-refraction coupling between the system and the corneal surface was evaluated using modeling and verified with in vitro testing: the system was initially modeled, and the non-matched coupling configuration was then constructed for in vitro testing. The second half of the dissertation focuses on a polarized light microscope designed, built, and tested as a low-cost, high-quality, cellphone-based polarimetric imaging system to aid medical health professionals in improved diagnosis of disease in the clinic and in low-resource settings. Malaria remains a major global health burden, and new methods for low-cost, high-sensitivity diagnosis of malaria are needed, particularly in remote, low-resource areas throughout the world. Here, a cost-effective, cellphone-based transmission polarized light microscope system is presented and used for imaging the malaria pigment known as hemozoin. Validation testing of the optical resolution required to provide diagnosis comparable to commercial polarized imaging systems is conducted, and the optimal design is employed, along with image processing, to improve the diagnostic capability.
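
    A minimal sketch of the multiple-linear-regression idea is shown below: the rotations measured at the two wavelengths are regressed jointly against reference glucose so that a birefringence artifact common to both channels can be separated from the wavelength-dependent glucose response. Variable names, coefficients, and data are illustrative assumptions, not values from the dissertation.

```python
# Hypothetical sketch: regress dual-wavelength polarimetric rotations against
# reference glucose to suppress a common-mode motion/birefringence artifact.
import numpy as np

rng = np.random.default_rng(0)

glucose_ref = rng.uniform(50, 400, size=200)           # reference glucose, mg/dL
motion = rng.normal(0, 5, size=200)                     # common-mode birefringence artifact
theta_635 = 0.004 * glucose_ref + motion + rng.normal(0, 0.2, 200)  # rotation, channel 1
theta_523 = 0.007 * glucose_ref + motion + rng.normal(0, 0.2, 200)  # rotation, channel 2

# Design matrix: both wavelength channels plus an intercept.  Because the motion
# artifact is (approximately) common to both channels while the glucose response
# is wavelength-dependent, the joint regression can separate the two effects.
X = np.column_stack([theta_635, theta_523, np.ones_like(theta_635)])
coeffs, *_ = np.linalg.lstsq(X, glucose_ref, rcond=None)

glucose_pred = X @ coeffs
rmse = np.sqrt(np.mean((glucose_pred - glucose_ref) ** 2))
print(f"coefficients: {coeffs}, RMSE: {rmse:.1f} mg/dL")
```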

    Development of a Hybrid Particle Continuum Solver

    When simulating complex flows, some physical situations exhibit large fluctuations in particle density, such as planetary reentry, ablation due to arcing, and rocket exhaust plumes. When simulating these events, a high level of physical accuracy can be achieved with kinetic methods, otherwise known as particle methods. However, this high level of physical accuracy requires large amounts of computation time. If the simulated flow is in collisional equilibrium, then less computationally intensive continuum methods, otherwise known as fluid methods, can be utilized. Hybrid Particle-Continuum (HPC) codes attempt to blend particle and fluid solutions in order to reduce computation time for transitional flows that exhibit both continuum and rarefied flow in a single domain. This thesis details the development of an HPC code in OpenFOAM for Cal Poly's Aerospace Engineering department. The primary benchmark for the solver, named hybridFoam, was a 1D Sod shock-tube simulation. This primary goal was achieved, and a collection of test simulations was conducted to map out the solver's current capabilities and identify where future development efforts should focus.
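
    The blending decision in a hybrid particle-continuum solver hinges on detecting where the continuum assumption breaks down. One common criterion, assumed here for illustration and not stated in the abstract, is a gradient-length-local Knudsen number computed per cell; the sketch below flags cells for the particle solver when that parameter exceeds a threshold.

```python
# Illustrative sketch (not the thesis solver): flag cells for the particle
# (kinetic) solver using a gradient-length-local Knudsen number,
# Kn_GLL = lambda * |d rho / dx| / rho.  Criterion and threshold are assumptions.
import numpy as np

def mean_free_path(rho, d_ref=4.17e-10, mass=4.65e-26):
    """Hard-sphere mean free path [m] from mass density [kg/m^3]."""
    n = rho / mass                                   # number density [1/m^3]
    return 1.0 / (np.sqrt(2.0) * np.pi * d_ref**2 * n)

def particle_cells(x, rho, threshold=0.02):
    """Boolean mask of cells assigned to the particle solver."""
    kn_gll = mean_free_path(rho) * np.abs(np.gradient(rho, x)) / rho
    return kn_gll > threshold

# Example: a smoothed density jump reminiscent of a Sod shock-tube initial state
x = np.linspace(0.0, 1.0, 200)
rho = 1e-4 * (1.0 - 0.875 / (1.0 + np.exp(-(x - 0.5) / 0.01)))   # rarefied gas
mask = particle_cells(x, rho)
print(f"{mask.sum()} of {mask.size} cells flagged for the particle solver")
```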