
    A DTM Multi-Resolution Compressed Model for Efficient Data Storage and Network Transfer

    In recent years, the technological evolution of terrestrial, aerial and satellite surveying has considerably increased measurement accuracy and, consequently, the quality of the derived information. At the same time, the ever-decreasing limitations of data storage devices, in terms of capacity and cost, have allowed the storage and processing of a much larger number of instrumental observations. A significant example is terrain height surveyed by LiDAR (Light Detection And Ranging) technology, where several height measurements can be obtained for each square metre of land. The availability of such a large quantity of observations is an essential requisite for an in-depth knowledge of the phenomena under study. At the same time, however, the most common Geographical Information Systems (GISs) show latency in visualizing and analyzing these kinds of data. The problem becomes more evident in the case of Internet GIS. These systems rely on a very frequent flow of geographical information over the Internet; for this reason, the bandwidth of the network and the size of the data to be transmitted are two fundamental factors to consider in order to guarantee the actual usability of these technologies. In this paper we focus on digital terrain models (DTMs) and briefly analyse the problem of defining the minimal information necessary to store and transmit DTMs over a network, within a fixed tolerance, starting from a huge number of observations. We then propose an innovative compression approach for sparse observations based on multi-resolution spline function approximation. The method provides metrical accuracy at least comparable to that of the most common deterministic interpolation algorithms (inverse distance weighting, local polynomial, radial basis functions), while dramatically reducing the amount of information required to store, transmit and rebuild a digital terrain model dataset. A brief description of the method is presented, together with comparisons of the accuracy and storage compression obtained with respect to other interpolators.
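    The compression idea can be sketched in 1-D with a hedged example (the synthetic profile, knot count and tolerance below are illustrative assumptions, not the paper's actual configuration): a least-squares fit of linear spline coefficients replaces many raw height samples while keeping the reconstruction error small.

```python
import numpy as np

def hat_basis(x, knots):
    """Linear (tent) spline basis on uniform knots, evaluated at points x."""
    h = knots[1] - knots[0]                     # uniform knot spacing
    d = np.abs(x[:, None] - knots[None, :]) / h
    return np.clip(1.0 - d, 0.0, None)          # tent functions, support 2*h

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 2000))       # 2000 raw "observations"
z = np.sin(x) + 0.05 * x**2                     # smooth synthetic terrain profile

knots = np.linspace(0.0, 10.0, 41)              # only 41 coefficients to store
A = hat_basis(x, knots)
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)  # batch least squares

max_err = np.max(np.abs(z - A @ coeffs))
print(f"stored {coeffs.size} coefficients instead of {x.size} heights")
print(f"max reconstruction error: {max_err:.4f}")
```

    Storing the 41 spline coefficients instead of the 2000 heights gives a compression factor of roughly 50 for this smooth profile, while the reconstruction stays well within a typical height tolerance.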

    A NEW MULTI-RESOLUTION ALGORITHM TO STORE AND TRANSMIT COMPRESSED DTM

    WebGIS and virtual globes allow the distribution and three-dimensional representation of DTMs to the Web users' community. In these applications, the database storage size is a critical point. DTMs are obtained by sampling or interpolating the raw observations and are typically stored and distributed by data-based models, for example regular grids. A new approach to store and transmit DTMs is presented. The idea is to use multi-resolution bilinear spline functions to interpolate the observations and model the terrain. In more detail, the algorithm performs the following steps. 1) The spatial distribution of the observations is investigated: where few data are available, few levels of splines are activated, while more levels are activated where the raw observations are denser; each new level corresponds to a halving of the spline support with respect to the previous level. 2) After the selection of the spline functions to activate, the relevant coefficients are estimated by interpolating the observations; the interpolation is computed by batch least squares. 3) Finally, the estimated coefficients of the splines are stored. The model guarantees a local resolution consistent with the data density and can be considered analytical, because the coefficients of a given function are stored instead of a set of heights. The approach is discussed and compared with traditional techniques to interpolate, store and transmit DTMs, considering accuracy and storage requirements; it is also compared with another multi-resolution technique. The research has been funded by the INTERREG HELI-DEM (Helvetia-Italy Digital Elevation Model) project.
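    The three steps can be sketched in 1-D as follows (the activation threshold, number of levels and synthetic data are illustrative assumptions, not the published implementation): each level halves the spline support, splines are activated only where enough observations fall in their support, and the activated coefficients are estimated by batch least squares on the residual of the coarser levels.

```python
import numpy as np

def tent(x, knots, h):
    """Linear spline (tent) basis functions with support 2*h, evaluated at x."""
    return np.clip(1.0 - np.abs(x[:, None] - knots[None, :]) / h, 0.0, None)

rng = np.random.default_rng(1)
# Non-uniform sampling: dense observations on [0, 5), sparse on [5, 10]
x = np.concatenate([rng.uniform(0, 5, 900), rng.uniform(5, 10, 100)])
z = np.sin(2 * x) + 0.1 * x

residual = z.copy()
stored = 0
for level in range(4):                         # each level halves the support
    knots = np.linspace(0, 10, 5 * 2**level + 1)
    h = knots[1] - knots[0]
    # step 1: activate a spline only if enough observations fall in its support
    counts = np.array([np.sum(np.abs(x - k) < h) for k in knots])
    active = counts >= 20
    if not active.any():
        break
    # step 2: batch least squares on the current residual, active splines only
    A = tent(x, knots[active], h)
    c, *_ = np.linalg.lstsq(A, residual, rcond=None)
    residual = residual - A @ c
    stored += int(active.sum())                # step 3: store these coefficients
    print(f"level {level}: {int(active.sum())} of {knots.size} splines active")

rms = float(np.sqrt(np.mean(residual**2)))
print(f"coefficients stored: {stored}, observations: {x.size}, RMS error: {rms:.3f}")
```

    The finest levels activate only over the densely sampled half of the domain, so the local resolution follows the data density, as the abstract describes.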

    Digital Image Processing of Electron Micrographs: The PIC System II

    The PIC system, an integrated package of Fortran programs and subroutines designed to run on the Digital Equipment Corporation VAX family of computers, has been developed for the analysis of electron micrographs, with emphasis on the particular requirements for structural analysis of biological macromolecules. The substantially improved VAX version of PIC reported here was developed from an earlier PDP-11 version, which was in turn developed from a set of IBM 370 programs called MDPP. PIC now encompasses over 150 commands affording a comprehensive range of image processing operations, including image restoration, enhancement, Fourier analysis, correlation averaging, and multivariate statistical analysis with clustering and classification. In particular, we describe our software for the correction of imperfect lattices, as well as programs for correlation alignment and averaging of single-particle images.
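    Correlation alignment, one of the operations mentioned, can be illustrated with a minimal stand-alone sketch (plain Python, not the Fortran PIC code; the synthetic particle image and noise level are assumptions): the integer shift aligning a particle image to a reference is found at the peak of their FFT-based cross-correlation.

```python
import numpy as np

def align_shift(ref, img):
    """Integer (dy, dx) shift that best aligns img to ref, via FFT cross-correlation."""
    cc = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(cc), cc.shape)
    # map wrapped peak positions to signed shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, cc.shape))

rng = np.random.default_rng(2)
ref = np.zeros((64, 64))
ref[24:40, 24:40] = 1.0                         # synthetic "particle"
img = np.roll(np.roll(ref, 5, axis=0), -3, axis=1) \
      + 0.1 * rng.standard_normal((64, 64))     # shifted, noisy copy

dy, dx = align_shift(ref, img)
print(f"estimated shift: ({dy}, {dx})")         # expected: (5, -3)
aligned = np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)
```

    After alignment, many such particle images can simply be averaged, which is the essence of correlation averaging for single-particle work.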

    Wavelet-Based Enhancement Technique for Visibility Improvement of Digital Images

    Image enhancement techniques for the visibility improvement of color digital images in the wavelet transform domain are investigated in this dissertation research. A novel, fast and robust wavelet-based dynamic range compression and local contrast enhancement (WDRC) algorithm has been developed to improve the visibility of digital images captured under non-uniform lighting conditions. A wavelet transform is used mainly for dimensionality reduction, so that the dynamic range compression with local contrast enhancement is applied only to the approximation coefficients, obtained by low-pass filtering and down-sampling the original intensity image. The normalized approximation coefficients are transformed using a hyperbolic sine curve, and contrast enhancement is realized by tuning the magnitude of each coefficient with respect to its surrounding coefficients. The transformed coefficients are then de-normalized to their original range. The detail coefficients are also modified to prevent edge deformation. The inverse wavelet transform then yields an intensity image with lower dynamic range and enhanced contrast. A color restoration process based on the relationship between the spectral bands and the luminance of the original image converts the enhanced intensity image back to a color image. Although the colors of the enhanced images produced by this algorithm are consistent with those of the original image, it fails to produce color-constant results for some pathological scenes that have very strong spectral characteristics in a single band. The linear color restoration process is the main reason for this drawback, so a different approach is required to tackle the color constancy problem. The illuminant is modeled as producing a linear shift of the image histogram, and the histogram is adjusted to discount the illuminant.
    The WDRC algorithm is then applied with a slight modification: instead of a linear color restoration, a non-linear color restoration process employing the spectral context relationships of the original image is applied. This technique solves the color constancy issue, and the overall enhancement algorithm provides attractive results, improving visibility even for scenes with near-zero visibility conditions. This research also presents a new wavelet-based image interpolation technique for improving the visibility of tiny features in an image. In wavelet-domain interpolation, the input image is usually treated as the low-pass filtered subbands of an unknown wavelet-transformed high-resolution (HR) image, and the unknown HR image is produced by estimating the wavelet coefficients of the high-pass filtered subbands. Here, the same approach is used to obtain an initial estimate of the HR image by zero-filling the high-pass filtered subbands. Detail coefficients are then estimated by feeding this initial estimate to an undecimated wavelet transform (UWT); taking an inverse transform after replacing the approximation coefficients of the UWT with the initially estimated HR image yields the final interpolated image. Experimental results demonstrate the superiority of the proposed algorithms over state-of-the-art enhancement and interpolation techniques.
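    The core WDRC steps (wavelet decomposition, a hyperbolic-sine mapping of the normalized approximation coefficients, inverse transform) can be sketched in 1-D. The wavelet choice, curve steepness and test signal below are illustrative assumptions, and the neighborhood-dependent tuning and detail-coefficient modification described above are omitted for brevity.

```python
import numpy as np

def haar_forward(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2)      # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)      # detail (high-pass)
    return a, d

def haar_inverse(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

# Synthetic intensity signal: a dark half and a bright half
intensity = np.concatenate([np.full(128, 10.0), np.full(128, 240.0)])
a, d = haar_forward(intensity)

a_norm = a / a.max()                          # normalize approximation coefficients
k = 3.0                                       # curve steepness (illustrative)
a_comp = np.sinh(k * a_norm) / np.sinh(k)     # hyperbolic-sine tone curve on [0, 1]
a_new = a_comp * a.max()                      # de-normalize to the original range

out = haar_inverse(a_new, d)                  # inverse transform
print(f"input range: {intensity.min():.0f}..{intensity.max():.0f}")
print(f"output range: {out.min():.1f}..{out.max():.1f}")
```

    Only the approximation band is remapped, while the (here unchanged) detail band carries the edges back through the inverse transform, which is the structure of the pipeline the abstract describes.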

    Isogeometric Analysis for Electromagnetism

    The combination of numerical analysis with scanning technology has seen increased use in many research areas, and there is an emerging need for high-fidelity geometric modeling and meshing in practical applications. Isogeometric Analysis (IGA) is a comprehensive computational framework that integrates geometric modeling and meshing with analysis. Unlike other existing numerical methods, IGA can generate analysis-ready models without loss of geometric accuracy, and the continuity and quality of a solution can be conveniently controlled and refined. These features enable IGA to integrate modeling, analysis, and design in a unified framework, which is the root idea of IGA. IGA for electromagnetics is studied here for steady and transient electromagnetics as well as electromagnetic scattering. The solution procedure and the associated Matlab codes are developed to simulate electromagnetic radiation on biological tissue. The scattered and total electric fields are computed over the complex geometry of a brain section with realistic material properties, and a perfectly matched layer (PML) is developed to model the far-field boundary condition. The IGA platform developed here offers a reliable simulation with an accurate representation of the geometry. The results of this research can be used both to evaluate the potential health and safety risks of electromagnetic radiation and to optimize the design of radiating devices used in non-invasive diagnostics and therapies.
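    At the heart of IGA, the same spline basis describes both the geometry and the solution field. A minimal sketch of the Cox-de Boor recursion for B-spline basis functions (independent of the thesis code; the knot vector and degree are illustrative):

```python
def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u,
    by the Cox-de Boor recursion."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    denom = knots[i + p] - knots[i]
    if denom > 0:
        left = (u - knots[i]) / denom * bspline_basis(i, p - 1, u, knots)
    denom = knots[i + p + 1] - knots[i + 1]
    if denom > 0:
        right = (knots[i + p + 1] - u) / denom * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

# Open (clamped) knot vector: quadratic basis, 4 basis functions / control points
knots = [0, 0, 0, 0.5, 1, 1, 1]
u = 0.25
vals = [bspline_basis(i, 2, u, knots) for i in range(4)]
print(vals, sum(vals))   # the basis is a partition of unity inside the domain
```

    In an IGA discretization, these same functions multiply the control points to give the geometry and multiply unknown coefficients to give the approximate field, so refinement never degrades the geometry.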

    Numerical Approaches for Solving the Combined Reconstruction and Registration of Digital Breast Tomosynthesis

    Heavy demands on the development of medical imaging modalities for breast cancer detection have been witnessed in the last three decades in an attempt to reduce the mortality associated with the disease. Recently, Digital Breast Tomosynthesis (DBT) has shown promise for early diagnosis when lesions are small. In particular, it offers potential benefits over X-ray mammography - the current modality of choice for breast screening - of increased sensitivity and specificity for comparable X-ray dose, speed, and cost. An important feature of DBT is that it provides a pseudo-3D image of the breast. This is of particular relevance for the heterogeneous dense breasts of young women, which can inhibit the detection of cancer using conventional mammography. In the same way that it is difficult to see a bird from the edge of the forest, detecting cancer in a conventional 2D mammogram is a challenging task. Three-dimensional DBT, however, enables us to step through the forest, i.e. the breast, reducing the confounding effect of superimposed tissue and so (potentially) increasing the sensitivity and specificity of cancer detection. The clinical workflow in which DBT would be used involves two key tasks: reconstruction, to generate a 3D image of the breast, and registration, to enable images from different visits to be compared, as is routinely done by radiologists working with conventional mammograms. Conventional approaches in the literature separate these steps, solving each task independently. This can be effective when reconstructing from a complete set of data. However, for ill-posed limited-angle problems such as DBT, estimating the deformation is difficult because of the significant artefacts associated with DBT reconstructions, leading to severe inaccuracies in the registration. The aim of my work is to find and evaluate methods capable of allying these two tasks, which in turn enhances the performance of each.
    Consequently, I show that the processes of reconstruction and registration of DBT are not independent but reciprocal. This thesis proposes innovative numerical approaches that combine the reconstruction of a pair of temporal DBT acquisitions with their registration, iteratively and simultaneously. To evaluate the performance of these methods I use synthetic images, breast MRI, and DBT simulations with in-vivo breast compressions. I show that, compared to the conventional sequential method, jointly estimating image intensities and transformation parameters gives superior results with respect to both reconstruction fidelity and registration accuracy.
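    The joint idea can be illustrated with a toy 1-D example (this is not the thesis algorithm; the rigid integer shift, full data and noise model are simplifying assumptions): for each candidate transformation, the optimal common image follows in closed form as the average of the two acquisitions brought into one frame, and the transformation minimizing the total data misfit is selected, so intensities and transformation parameters are estimated together rather than sequentially.

```python
import numpy as np

rng = np.random.default_rng(3)
truth = np.zeros(100)
truth[40:60] = 1.0                            # underlying structure
true_shift = 7
y1 = truth + 0.05 * rng.standard_normal(100)                      # visit 1
y2 = np.roll(truth, true_shift) + 0.05 * rng.standard_normal(100) # visit 2, shifted

best = None
for s in range(100):                          # candidate transformations
    x = 0.5 * (y1 + np.roll(y2, -s))          # optimal intensities for this s
    cost = np.sum((y1 - x) ** 2) + np.sum((np.roll(y2, -s) - x) ** 2)
    if best is None or cost < best[0]:
        best = (cost, s, x)

cost, s_hat, x_hat = best
print(f"estimated shift: {s_hat} (true {true_shift})")
```

    The reconstruction and the registration are solved as one optimization problem: the intensity estimate depends on the shift and the shift is scored through the intensity estimate, the reciprocity the thesis argues for.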

    On the development of an efficient truly meshless discretization procedure in computational mechanics

    Thesis (Sc.D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2001. Includes bibliographical references (leaves 157-163).
    The objective of this thesis is to present an efficient and reliable meshless computational technique - the method of finite spheres - for the solution of boundary value problems on complex domains. This method is truly meshless in the sense that the approximation spaces are generated and the numerical integration is performed without a mesh. While the theory behind meshless techniques is rather straightforward, the generation of a computationally efficient scheme is quite difficult. Computational efficiency may be achieved by a proper choice of the interpolation functions, effective ways of incorporating the essential boundary conditions, and efficient, specialized numerical integration rules. The pure displacement formulation is observed to exhibit volumetric "locking" during incompressible (or nearly incompressible) analysis; a displacement/pressure mixed formulation is developed to overcome this problem. The stability and optimality of the mixed formulation are tested using numerical inf-sup tests for a variety of discretization schemes. Solutions to several example problems are presented, showing the application of the method of finite spheres to problems in solid and fluid mechanics. A very specialized application of the technique to physically based real-time medical simulations in multimodal virtual environments is also presented. In its current implementation, the method of finite spheres is about five times slower than finite element techniques for problems in two-dimensional elastostatics.
    by Suvranu De. Sc.D.
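    The approximation spaces in such meshless methods are built from partition-of-unity functions supported on overlapping spheres. A minimal 1-D sketch using Shepard functions with a compactly supported quartic weight (the weight function and node layout are illustrative choices, not necessarily those of the thesis):

```python
import numpy as np

def weight(r, R):
    """Compactly supported weight: nonzero only inside the sphere of radius R."""
    t = np.clip(np.abs(r) / R, 0.0, 1.0)
    return (1.0 - t**2) ** 2

centers = np.linspace(0.0, 1.0, 6)            # sphere centers (nodes)
R = 0.35                                      # radius chosen so spheres overlap

x = np.linspace(0.0, 1.0, 201)                # evaluation points
W = weight(x[:, None] - centers[None, :], R)  # (201, 6) nodal weights
phi = W / W.sum(axis=1, keepdims=True)        # Shepard functions: sum to 1

# zeroth-order consistency: the functions reproduce a constant exactly
dev = np.abs(phi.sum(axis=1) - 1.0).max()
print(f"partition of unity max deviation: {dev:.2e}")
```

    Because the weights vanish outside each sphere, every shape function is local, and no mesh is ever needed to define the approximation, only the spheres and their overlaps.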