
    Elevation and Deformation Extraction from TomoSAR

    3D SAR tomography (TomoSAR) and 4D differential SAR tomography (Diff-TomoSAR) exploit multi-baseline SAR data stacks to extend SAR interferometry to complex scenes in which multiple scatterers are mapped into the same SAR pixel cell. They remain affected, however, by DEM uncertainty, temporal decorrelation, orbital, tropospheric and ionospheric phase distortion, and height blurring. This thesis explores these techniques. The systematic procedures for DEM generation, DEM quality assessment, DEM quality improvement and DEM applications are studied first; the thesis then addresses the whole cycle of 3D & 4D TomoSAR imaging for height and deformation retrieval, from problem formulation, through method development, to testing on real SAR data.

    After introducing DEM generation from spaceborne bistatic InSAR (TanDEM-X) and airborne photogrammetry (Bluesky), a new DEM co-registration method with line-feature validation (river network lines, ridgelines, valley lines, crater boundaries and so on) is developed and demonstrated for assessing the quality of wide-area DEM data. This method aligns two DEMs irrespective of the linear distortion model, significantly improves the accuracy of vertical DEM comparison, and is therefore well suited to DEM quality assessment.

    A systematic TomoSAR algorithm and processing method are established, tested, analysed and demonstrated for various applications (urban buildings, bridges, dams) to achieve better 3D & 4D tomographic SAR imaging results. These include COSMO-SkyMed X-band single-polarisation data over the Zipingpu dam, Dujiangyan, Sichuan, China, for topographic mapping, and ALOS L-band data over the San Francisco Bay area for mapping urban buildings and bridges. A new ionospheric correction method, based on a tile approach using IGS TEC data, split-spectrum processing and a least-squares ionospheric model, is developed to correct ionospheric distortion and improve the accuracy of 3D & 4D tomographic SAR imaging. A pixel-by-pixel orbit baseline estimation method is also developed to close the gap in baseline estimation for 3D & 4D spaceborne SAR tomography. Moreover, a SAR tomography imaging algorithm and a 4D differential tomography algorithm based on compressive sensing, InSAR phase calibration referenced to a DEM with DEM error correction, and a new phase-error calibration and compensation algorithm based on PS, SVD, PGA, weighted least squares and minimum entropy are developed to obtain accurate 3D & 4D tomographic SAR imaging results.

    The new baseline estimation method and the resulting TomoSAR processing show that accurate baseline estimation is essential for building the TomoSAR model. After baseline estimation, phase calibration experiments (via the FFT and Capon methods) indicate that a phase calibration step is indispensable for TomoSAR imaging and ultimately determines the quality of the inversion. A super-resolution study based on compressive sensing shows that X-band data with the CS method are unsuitable for forest reconstruction but work well for large civil engineering structures such as dams and urban buildings, while L-band data with the FFT, Capon and CS methods are shown to work for reconstructing large man-made structures (such as bridges) and urban buildings.
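
    The Capon and FFT (beamforming) spectral estimators mentioned above reconstruct, for each range-azimuth pixel, a reflectivity profile along elevation from the multi-baseline measurements. The following Python sketch illustrates that step on a simulated two-scatterer cell; the wavelength, slant range, baselines and scatterer positions are assumed values for illustration, not data from the thesis.

    import numpy as np

    rng = np.random.default_rng(0)

    wavelength = 0.031                         # X-band wavelength [m] (assumed)
    r0 = 6.2e5                                 # slant range [m] (assumed)
    baselines = np.linspace(-200, 200, 15)     # perpendicular baselines [m] (assumed)
    xi = 2.0 * baselines / (wavelength * r0)   # elevation (spatial) frequencies

    s_grid = np.linspace(-60, 60, 241)                 # elevation search grid [m]
    A = np.exp(2j * np.pi * np.outer(xi, s_grid))      # steering matrix, N x Ns

    # Two point scatterers in the same range-azimuth cell (e.g. ground and facade)
    s_true = np.array([-5.0, 25.0])
    amp = np.array([1.0, 0.8])
    steer = np.exp(2j * np.pi * np.outer(xi, s_true))

    # Multilook snapshots with additive noise, used to estimate the covariance matrix
    L = 30
    phases = np.exp(2j * np.pi * rng.random((2, L)))
    snapshots = steer @ (amp[:, None] * phases) + 0.1 * (
        rng.standard_normal((xi.size, L)) + 1j * rng.standard_normal((xi.size, L)))
    R = snapshots @ snapshots.conj().T / L

    # Beamforming (FFT / matched-filter) spectrum for a single look
    g = snapshots[:, 0]
    p_bf = np.abs(A.conj().T @ g) ** 2 / xi.size ** 2

    # Capon spectrum: P(s) = 1 / (a(s)^H R^-1 a(s)), with diagonal loading for stability
    R_inv = np.linalg.inv(R + 1e-3 * np.trace(R).real / xi.size * np.eye(xi.size))
    p_capon = 1.0 / np.real(np.einsum('ij,jk,ik->i', A.conj().T, R_inv, A.T))

    for name, p in [("beamforming", p_bf), ("Capon", p_capon)]:
        is_peak = (p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])          # crude local maxima
        top = s_grid[1:-1][is_peak][np.argsort(p[1:-1][is_peak])[-2:]]
        print(f"{name:11s} strongest peaks near elevations {np.sort(top)} m")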

    Electrical Impedance Tomography for Biomedical Applications: Circuits and Systems Review

    There has been considerable interest in electrical impedance tomography (EIT) as a low-cost, radiation-free, real-time and wearable means of physiological status monitoring. To be competitive with other well-established imaging modalities, it is important to understand the requirements of the specific application and to determine a suitable system design. This paper presents an overview of EIT circuits and systems, including architectures, current drivers, analog front-end and demodulation circuits, with emphasis on integrated circuit implementations. Commonly used circuit topologies are detailed, and trade-offs are discussed to aid in choosing an appropriate design based on the application and system priorities. The paper also describes a number of integrated EIT systems for biomedical applications and discusses current challenges and possible future directions.
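
    The demodulation stage mentioned above recovers the magnitude and phase of the measured boundary voltage at the excitation frequency. Below is a minimal numerical sketch of quadrature (synchronous) demodulation, assuming illustrative values for the sampling rate, excitation frequency and electrode voltage; it is not taken from any of the reviewed systems.

    import numpy as np

    rng = np.random.default_rng(0)

    fs = 1e6                 # sampling rate [Hz] (assumed)
    f_exc = 50e3             # excitation (current-drive) frequency [Hz] (assumed)
    n = 2000                 # window length: an integer number of excitation cycles
    t = np.arange(n) / fs

    # Voltage on one measurement electrode pair; amplitude and phase encode the
    # transfer impedance along the current path (values assumed)
    v_amp, v_phase = 2.3e-3, np.deg2rad(-12.0)
    v = v_amp * np.cos(2 * np.pi * f_exc * t + v_phase) + 1e-4 * rng.standard_normal(n)

    # Multiply by in-phase and quadrature references, then average (acts as a low-pass filter)
    i_comp = 2 * np.mean(v * np.cos(2 * np.pi * f_exc * t))
    q_comp = 2 * np.mean(v * -np.sin(2 * np.pi * f_exc * t))

    magnitude = np.hypot(i_comp, q_comp)     # approximately v_amp
    phase = np.arctan2(q_comp, i_comp)       # approximately v_phase
    print(f"|V| = {magnitude * 1e3:.2f} mV, phase = {np.degrees(phase):.1f} deg")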

    Lab coat 2

    In recent years, a tremendous number of safety products have been created, invented and marketed worldwide. These products are designed to minimise the risks of working with chemicals and hazardous equipment in any workplace, especially the laboratory. To ensure that the safety of laboratory workers is at an optimal level, manufacturers and importers are responsible for meeting the relevant safety specifications. This includes providing instructions for operating the product safely, placing an appropriate label on the product, and considering the materials used to develop the product.

    Simulation of positron range and additional gamma emissions in PET

    Unpublished doctoral thesis, Universidad Complutense de Madrid, Facultad de Ciencias Físicas, Departamento de Física Atómica, Molecular y Nuclear; defended on 03-04-2014.

    Optical Fiber Interferometric Sensors

    The contributions presented in this book series portray advances in research on interferometric photonic technology and its novel applications. The wide scope covered by the different contributions is intended to provide a synopsis of current research trends and the state of the art in this field, covering recent technological improvements, new production methodologies and emerging applications, for researchers coming from different fields of science and industry. The manuscripts published in the Special Issue, and reprinted in this book series, report on topics that range from interferometric sensors for thickness and dynamic displacement measurement to pulse wave and spirometry applications.

    Exploration, Registration, and Analysis of High-Throughput 3D Microscopy Data from the Knife-Edge Scanning Microscope

    Advances in high-throughput, high-volume microscopy have enabled the acquisition of extremely detailed anatomical structures of human or animal organs. The Knife-Edge Scanning Microscope (KESM) is one of the first instruments to produce sub-micrometre resolution (~1 µm³) data from whole small animal brains. Using the KESM, we successfully imaged entire mouse brains stained with Golgi (neuronal morphology), India ink (vascular network), and Nissl (soma distribution). These data sets fill the gap left by most existing data sets, which either cover only part of an organ or have orders of magnitude lower resolution. Even with such unprecedented data, however, we still lack a suitable informatics platform to visualize and quantitatively analyse them. This dissertation addresses three key gaps: (1) because of the large volume (several teravoxels) and the multiscale nature of the data, visualization alone is a huge challenge, let alone quantitative connectivity analysis; (2) the uncompressed KESM data exceed a few terabytes, and to compare and combine them with data sets from other imaging modalities they must be registered to a standard coordinate space; and (3) quantitative analysis that seeks to count every neuron in this massive, growing, and sparsely labeled data set is a serious challenge.

    The goals of the dissertation are: (1) to develop an online neuroinformatics framework for efficient visualization and analysis of the multiscale KESM data sets; (2) to develop a robust landmark-based 3D registration method for mapping the KESM Nissl-stained whole-mouse-brain data into the Waxholm Space (a canonical coordinate system for the mouse brain); and (3) to develop a scalable, incremental learning algorithm for cell detection in high-resolution KESM Nissl data.

    For the web-based neuroinformatics framework, I prepared multiscale data sets at different zoom levels from the original data and extended the Google Maps API to provide atlas features such as scale bars, panel browsing, and transparent overlays for 3D rendering. I then adapted the OpenLayers API, a free mapping and layering API with similar functionality to the Google Maps API, and prepared multiscale data sets as vector graphics to reduce file size and improve page loading time. To better convey the full 3D morphology of objects embedded in the data volumes, I developed a WebGL-based approach that complements the web-based framework with interactive viewing. For the registration work, I adapted and customized a stable 2D rigid deformation method to map our data sets into the Waxholm Space. For the analysis of neuronal distribution, I designed and implemented a scalable quantitative analysis method based on supervised learning, using Principal Component Analysis (PCA) in a supervised manner and parallelizing the algorithm with MapReduce. I expect these frameworks to enable effective exploration and analysis of the KESM data sets, and the approaches to be broadly applicable to other high-throughput medical imaging data.
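
    As an illustration of the landmark-based registration step, the following sketch fits a 2D rigid transform (rotation plus translation) to corresponding landmark pairs by least squares, the kind of alignment used to bring sections into a canonical space such as the Waxholm Space. The landmark coordinates and helper function are made up for this example; this is not the dissertation's code.

    import numpy as np

    rng = np.random.default_rng(0)

    def rigid_fit_2d(src, dst):
        """Least-squares rotation R (2x2) and translation t such that R @ src + t ~ dst."""
        src_c = src - src.mean(axis=0)
        dst_c = dst - dst.mean(axis=0)
        u, _, vt = np.linalg.svd(src_c.T @ dst_c)
        d = np.sign(np.linalg.det(vt.T @ u.T))        # guard against reflections
        R = vt.T @ np.diag([1.0, d]) @ u.T
        t = dst.mean(axis=0) - R @ src.mean(axis=0)
        return R, t

    # Hypothetical landmark pairs (atlas space vs. an imaged section), in millimetres
    atlas = np.array([[0.0, 0.0], [5.0, 0.5], [2.5, 4.0], [1.0, 6.0]])
    theta = np.deg2rad(8.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    section = (R_true @ atlas.T).T + np.array([1.2, -0.7]) + 0.01 * rng.standard_normal(atlas.shape)

    R, t = rigid_fit_2d(section, atlas)               # map section landmarks onto the atlas
    aligned = (R @ section.T).T + t
    print("residual RMS [mm]:", np.sqrt(np.mean(np.sum((aligned - atlas) ** 2, axis=1))))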

    RELION: Implementation of a Bayesian approach to cryo-EM structure determination

    RELION, for REgularized LIkelihood OptimizatioN, is an open-source computer program for the refinement of macromolecular structures by single-particle analysis of electron cryo-microscopy (cryo-EM) data. Whereas alternative approaches often rely on user expertise for the tuning of parameters, RELION uses a Bayesian approach to infer parameters of a statistical model from the data. This paper describes developments that reduce the computational costs of the underlying maximum a posteriori (MAP) algorithm, as well as statistical considerations that yield new insights into the accuracy with which the relative orientations of individual particles may be determined. A so-called gold-standard Fourier shell correlation (FSC) procedure to prevent overfitting is also described. The resulting implementation yields high-quality reconstructions and reliable resolution estimates with minimal user intervention and at acceptable computational costs.
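
    The gold-standard FSC mentioned above is computed between two reconstructions obtained from independent halves of the data, and the spatial frequency at which the curve drops below a threshold (commonly 0.143) is reported as the resolution. Below is a minimal numpy sketch of that calculation on synthetic half-maps; the volume size, voxel size and threshold convention are assumptions for illustration, not RELION's implementation.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fsc(map1, map2, voxel_size):
        """Return (spatial frequency [1/A], FSC) per Fourier shell for two cubic 3D maps."""
        f1, f2 = np.fft.fftn(map1), np.fft.fftn(map2)
        n = map1.shape[0]
        freqs = np.fft.fftfreq(n, d=voxel_size)                # 1/A along each axis
        kx, ky, kz = np.meshgrid(freqs, freqs, freqs, indexing="ij")
        radius = np.sqrt(kx**2 + ky**2 + kz**2)
        shell_width = 1.0 / (n * voxel_size)
        shells = np.round(radius / shell_width).astype(int)

        freq_vals, fsc_vals = [], []
        for s in range(1, n // 2):
            mask = shells == s
            num = np.sum(f1[mask] * np.conj(f2[mask]))
            den = np.sqrt(np.sum(np.abs(f1[mask])**2) * np.sum(np.abs(f2[mask])**2))
            freq_vals.append(s * shell_width)
            fsc_vals.append((num / den).real)
        return np.array(freq_vals), np.array(fsc_vals)

    # Two noisy half-maps sharing the same synthetic low-pass-filtered "signal" volume
    rng = np.random.default_rng(1)
    signal = gaussian_filter(rng.standard_normal((64, 64, 64)), sigma=3.0)
    half1 = signal + 0.5 * rng.standard_normal(signal.shape)
    half2 = signal + 0.5 * rng.standard_normal(signal.shape)

    freq, corr = fsc(half1, half2, voxel_size=1.2)
    crossing = freq[np.argmax(corr < 0.143)] if np.any(corr < 0.143) else freq[-1]
    print(f"FSC = 0.143 crossing at about {1.0 / crossing:.1f} A")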

    Urban Deformation Monitoring using Persistent Scatterer Interferometry and SAR tomography

    This book focuses on remote sensing for urban deformation monitoring. In particular, it highlights how deformation monitoring in urban areas can be carried out using Persistent Scatterer Interferometry (PSI) and Synthetic Aperture Radar (SAR) Tomography (TomoSAR). Several contributions show the capabilities of Interferometric SAR (InSAR) and PSI techniques for urban deformation monitoring, and some show the advantages of TomoSAR in un-mixing multiple scatterers for urban mapping and monitoring. The book is addressed to the technical and scientific community interested in urban applications; it is useful for choosing the appropriate technique and for assessing the expected performance, and it will also be useful to researchers as it reports the state of the art and new trends in this field.

    Statistical simulation of nanoindentation on hardmetals

    Hardmetals combine high hardness and wear resistance with fair fracture toughness and fatigue resistance. This unique combination has made the material ubiquitous in cutting tools and other applications with demanding tribological requirements. However, the microstructural features that give rise to these properties are complex and have not yet been adapted for accurate modelling in engineering software. The main purpose of this master's thesis is therefore to join experimentation with numerical modelling. A typical test for studying the mechanical properties of hardmetals is nanoindentation, as it allows measurement at the nanometric scale. This test is simulated in a commercial finite element (FEM) package, ABAQUS. Moreover, the sample under study is a portion of a real hardmetal microstructure obtained beforehand by focused ion beam (FIB) tomography, which is particularly novel because artificial microstructures are normally used. The numerical model was developed successfully, and its results agree reasonably well with experimental data reported in the literature. Three different plasticity models for the binder phase (cobalt) were tested and compared. In addition, a stiffening effect caused by the proximity of the boundary conditions was identified; this discourages some modelling strategies found in the literature that apply such conditions to simulate the effect of the reinforcing material on the metallic matrix.
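
    For context, nanoindentation experiments are commonly reduced to hardness and modulus values from the load-displacement curve, which is also the quantity a finite element simulation of the test produces for comparison. The sketch below applies the standard Oliver-Pharr relations to an assumed set of peak load, depth and unloading stiffness; it is illustrative only and not part of the thesis's ABAQUS workflow.

    import numpy as np

    P_max = 100e-3          # peak indentation load [N] (assumed)
    h_max = 5.7e-7          # penetration depth at peak load [m] (assumed)
    S = 1.2e6               # unloading stiffness dP/dh at peak load [N/m] (assumed)

    eps, beta = 0.75, 1.034                  # Berkovich indenter constants
    h_c = h_max - eps * P_max / S            # contact depth
    A_c = 24.5 * h_c ** 2                    # ideal Berkovich projected contact area [m^2]

    H = P_max / A_c                                          # hardness [Pa]
    E_r = np.sqrt(np.pi) / (2.0 * beta) * S / np.sqrt(A_c)   # reduced modulus [Pa]
    print(f"H = {H / 1e9:.1f} GPa, E_r = {E_r / 1e9:.0f} GPa")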

    Statistical Performance Evaluation, System Modeling, Distributed Computation, and Signal Pattern Matching for a Compton Medical Imaging System.

    Radionuclide cancer therapy requires imaging radiotracers that concentrate in tumors and emit high-energy charged particles that kill tumor cells. These tracers, such as 131I, generally emit high-energy photons that need to be imaged to estimate tumor dose and changes in size during treatment. This research describes the performance of a dual-planar silicon-based Compton imaging system and compares it to a conventional parallel-hole collimated Anger camera with a high-energy general-purpose (HEGP) lead collimator for imaging photons emitted by 131I. The collimated Anger camera imposes a tradeoff between resolution and sensitivity due to its mechanical collimation, and as photon energies exceed 364 keV, increased septal penetration and scattering further degrade imaging performance. Simulations of the Anger camera and the Compton imaging system demonstrate a 20-fold advantage in detection efficiency and higher spatial resolution for the Compton camera when detecting high-energy photons, since it decouples this tradeoff.

    The system performance and comparison are analyzed using the modified uniform Cramer-Rao bound algorithms we developed, together with Monte Carlo calculations and system modeling. The bounds show that Doppler broadening is the limiting factor in Compton camera performance for imaging 364 keV photons. The performance of the two systems was compared by simulating a 2D disk with uniform activity. When the two imaging systems detect the same number of events, the proposed Compton imaging system has lower image variance than the Anger camera with the HEGP collimator when the FWHM of the desired point-source response is less than 1.2 cm. This advantage was also demonstrated by imaging and reconstructing a 2D hot-spot phantom.

    In addition to the performance analysis, a distributed Maximum Likelihood Expectation Maximization (MLEM) algorithm with a chessboard data partition was evaluated to speed up image reconstruction for the Compton imaging system; a 1 x 64 distributed computing system sped up computation by about a factor of 22 compared to a single processor. Finally, a real-time signal processing and pattern matching system employing state-of-the-art digital electronics is described for handling event pile-up caused by the high photon count rate in the second detector.

    Ph.D., Biomedical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/60851/1/lhan_1.pd
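
    The MLEM reconstruction referred to above iteratively updates the image estimate with the ratio of measured to predicted counts; it is this per-iteration update that the dissertation distributes across processors with a chessboard partition. The sketch below shows the basic (serial) MLEM update on a made-up toy system matrix; it is not the Compton-camera system model.

    import numpy as np

    rng = np.random.default_rng(2)

    n_pixels, n_bins = 16, 64
    A = rng.random((n_bins, n_pixels))          # system matrix: P(detected in bin i | emitted in pixel j)
    A /= A.sum(axis=0, keepdims=True)           # normalise detection probabilities per pixel

    x_true = rng.exponential(1.0, n_pixels)     # "true" activity image
    y = rng.poisson(A @ x_true * 200)           # measured counts per detector bin

    x = np.ones(n_pixels)                       # uniform initial estimate
    sens = A.sum(axis=0)                        # sensitivity (back-projection of ones)
    for _ in range(50):
        expected = A @ x                        # forward projection of the current estimate
        ratio = y / np.maximum(expected, 1e-12) # measured / expected counts
        x *= (A.T @ ratio) / sens               # multiplicative MLEM update

    print("relative error of normalised image:",
          np.linalg.norm(x / x.sum() - x_true / x_true.sum()) /
          np.linalg.norm(x_true / x_true.sum()))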