    Inference and experimental design for percolation and random graph models.

    The problem of optimally arranging the nodes of a random weighted graph is studied in this thesis. The nodes of the graphs under study are fixed, but their edges are random and established according to a so-called edge-probability function. This function is assumed to depend on the weights attributed to pairs of graph nodes (or the distances between them) and on a statistical parameter. The purpose of experimentation is to make inference on the statistical parameter and thus extract as much information about it as possible. We also distinguish between two experimentation scenarios: progressive and instructive designs. We adopt a utility-based Bayesian framework to tackle the optimal design problem for random graphs of this kind. Simulation-based optimisation methods, mainly Monte Carlo and Markov chain Monte Carlo, are used to obtain the solution. We study the optimal design problem for inference based on partial observations of random graphs by employing a data augmentation technique. We prove that infinitely growing or diminishing node configurations asymptotically represent the worst node arrangements. We also obtain an exact solution to the optimal design problem for proximity (geometric) graphs and a numerical solution for graphs with threshold edge-probability functions. We consider inference and optimal design problems for finite clusters from bond percolation on the integer lattice Z^d and derive a range of both numerical and analytical results for these graphs. We introduce inner-outer plots by deleting some of the lattice nodes and show that the ‘mostly populated’ designs are not necessarily optimal in the case of incomplete observations under both progressive and instructive design scenarios. Finally, we formulate the problem of approximating finite point sets with lattice nodes and describe a solution to it.
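    The thesis scores designs by a Bayesian expected utility estimated with Monte Carlo; as a minimal sketch of why the node arrangement matters for inference about the parameter, the snippet below instead scores an arrangement by the Fisher information about θ, under a hypothetical exponential edge-probability function p(d; θ) = exp(−θd). Both the function and the criterion are simplifying assumptions for illustration, not the thesis's actual utility.

```python
import numpy as np

def edge_prob(d, theta):
    # Hypothetical edge-probability function: decays with distance d.
    return np.exp(-theta * d)

def fisher_information(nodes, theta):
    # Edges are independent Bernoulli(p(d_ij; theta)), so the Fisher
    # information about theta is a sum over node pairs:
    #   I(theta) = sum (dp/dtheta)^2 / (p * (1 - p))
    info = 0.0
    n = len(nodes)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(nodes[i] - nodes[j])
            p = edge_prob(d, theta)
            dp = -d * p                      # derivative of exp(-theta*d)
            if 0.0 < p < 1.0:
                info += dp ** 2 / (p * (1.0 - p))
    return info

# Compare two candidate node arrangements at a nominal parameter value:
rng = np.random.default_rng(0)
clustered = rng.normal(0.0, 0.3, size=(10, 2))
spread = rng.uniform(-3.0, 3.0, size=(10, 2))
print(fisher_information(clustered, 1.0))
print(fisher_information(spread, 1.0))
```

    Under this criterion, arrangements whose pairwise distances make the edge probabilities maximally sensitive to θ are preferred, which mirrors the thesis's finding that degenerate (infinitely growing or diminishing) configurations are the worst designs.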

    Discrete Tomography by Convex-Concave Regularization using Linear and Quadratic Optimization

    Discrete tomography concerns the reconstruction of objects that are made up of a few different materials, each comprising a homogeneous density distribution. Under the assumption that these densities are known a priori, new algorithms can be developed that typically need less projection data to achieve appealing reconstruction results.
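    The abstract does not spell out the algorithms, but the title points to convex-concave regularization. A common way to realize that idea for binary tomography is to relax pixel values to [0, 1] and add a concave penalty that vanishes only at 0 and 1, so that increasing its weight drives the relaxed solution toward a binary image. The sketch below is a generic illustration of that principle; the specific penalty, step size, and toy projection geometry are assumptions, not the paper's formulation.

```python
import numpy as np

def reconstruct_binary(A, b, lam=0.5, steps=2000, lr=0.05):
    # Projected gradient descent on ||Ax - b||^2 + lam * sum x*(1 - x)
    # over the box [0, 1]^n. The data term is convex; the penalty
    # x*(1 - x) is concave and zero only at binary values, so it pushes
    # the relaxed solution toward {0, 1}.
    x = np.full(A.shape[1], 0.5)
    for _ in range(steps):
        grad = 2.0 * A.T @ (A @ x - b) + lam * (1.0 - 2.0 * x)
        x = np.clip(x - lr * grad, 0.0, 1.0)   # projection onto the box
    return x

# Toy problem: recover the 2x2 binary image [[1, 1], [0, 1]] from its
# row and column sums (these projections determine it uniquely).
A = np.array([[1, 1, 0, 0],        # row sums
              [0, 0, 1, 1],
              [1, 0, 1, 0],        # column sums
              [0, 1, 0, 1]], float)
truth = np.array([1.0, 1.0, 0.0, 1.0])
print(np.round(reconstruct_binary(A, A @ truth), 2))
```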

    Nanoinformatics

    Machine learning; Big data; Atomic resolution characterization; First-principles calculations; Nanomaterials synthesis

    Reconstruction, Classification, and Segmentation for Computational Microscopy

    This thesis treats two fundamental problems in computational microscopy: image reconstruction for magnetic resonance force microscopy (MRFM) and image classification for electron backscatter diffraction (EBSD). In MRFM, as in many inverse problems, the true point spread function (PSF) that blurs the image may be only partially known, and image quality may suffer from this mismatch when standard reconstruction techniques are applied. To deal with the mismatch, we develop novel Bayesian sparse reconstruction methods that account for possible errors in the PSF of the microscope and for the inherent sparsity of MRFM images. Two methods are proposed, one stochastic and one variational; both jointly estimate the unknown PSF and the unknown image. The proposed reconstruction framework has the flexibility to incorporate sparsity-inducing priors, thus addressing the ill-posedness of this non-convex problem, as well as Markov random field priors, and it can be extended to other image models. To obtain scalable and tractable solutions, a dimensionality reduction technique is applied to the highly nonlinear PSF space. Experiments clearly demonstrate that the proposed methods have superior performance compared to previous methods.

    In EBSD we develop novel and robust dictionary-based methods for segmentation and classification of grain and sub-grain structures in polycrystalline materials. Our work is the first in EBSD analysis to use a physics-based forward model (the dictionary), to use full diffraction patterns, and to classify patterns efficiently into grains, boundaries, and anomalies. In particular, unlike previous methods, our method incorporates anomaly detection directly into the segmentation process. The proposed approach also permits super-resolution of grain mantle and grain boundary locations. Finally, the proposed dictionary-based segmentation method performs uncertainty quantification, i.e. p-values, for the classified grain interiors and grain boundaries. We demonstrate that the dictionary-based approach is robust to instrument drift and material differences that produce small amounts of dictionary mismatch.

    PhD, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/102296/1/seunpark_1.pd
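    As a rough illustration of the dictionary idea, and not the thesis's full method (which also handles segmentation, boundary super-resolution, and p-values), matching each observed pattern against physics-simulated dictionary entries and thresholding the best match score gives classification with built-in anomaly detection. The cosine score and the threshold tau below are assumptions made for the sketch.

```python
import numpy as np

def classify_patterns(patterns, dictionary, tau=0.7):
    # patterns:   (n, d) observed diffraction patterns, flattened.
    # dictionary: (k, d) forward-modeled patterns, one per candidate class.
    # Each pattern gets the label of the dictionary entry with the highest
    # cosine similarity; if the best score falls below tau, the pattern is
    # flagged as an anomaly (-1) instead of being forced into a class.
    P = patterns / np.linalg.norm(patterns, axis=1, keepdims=True)
    D = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    scores = P @ D.T                    # (n, k) cosine similarities
    labels = scores.argmax(axis=1)
    best = scores.max(axis=1)
    labels[best < tau] = -1             # anomaly detection step
    return labels, best
```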

    Relevance of accurate Monte Carlo modeling in nuclear medical imaging

    Monte Carlo techniques have become popular in different areas of medical physics, aided by the availability of powerful computing systems. In particular, they have been extensively applied to simulate processes involving random behavior and to quantify physical parameters that are difficult or even impossible to determine by experimental measurement. Recent nuclear medical imaging innovations such as single-photon emission computed tomography (SPECT), positron emission tomography (PET), and multiple emission tomography (MET) are ideal candidates for Monte Carlo modeling techniques because of the stochastic nature of radiation emission, transport, and detection processes. Factors that have contributed to their wider use include improved models of radiation transport processes, the practicality of application afforded by acceleration schemes, and the improved speed of computers. This paper presents the derivation and methodological basis of this approach and critically reviews its areas of application in nuclear imaging. An overview of existing simulation programs is provided and illustrated with examples of useful features of such sophisticated tools in connection with common computing facilities and more powerful multiple-processor parallel processing systems. Current and future trends in the field are also discussed.
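    The stochastic nature of photon transport that the review highlights is easy to see in a toy simulation. The sketch below estimates, by sampling exponential free path lengths, the fraction of photons that traverse a homogeneous slab without interacting, and checks the estimate against the analytic Beer-Lambert value; the attenuation coefficient used is an assumed round figure (roughly water at the 511 keV PET annihilation energy), not a value taken from the paper.

```python
import numpy as np

def transmitted_fraction(mu, thickness, n=100_000, seed=0):
    # Toy Monte Carlo: photons enter a homogeneous slab with attenuation
    # coefficient mu (1/cm) and given thickness (cm). Free path lengths
    # to the first interaction are exponential with mean 1/mu; a photon
    # is transmitted if that path exceeds the slab thickness. The
    # estimate should approach exp(-mu * thickness) as n grows.
    rng = np.random.default_rng(seed)
    paths = rng.exponential(1.0 / mu, size=n)
    return float(np.mean(paths > thickness))

# Sanity check against the analytic Beer-Lambert law for a 10 cm slab:
mu, t = 0.096, 10.0
print(transmitted_fraction(mu, t), np.exp(-mu * t))
```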

    A review of model designs

    The PAEQANN project aims to review current ecological theories that can help identify suitable models for predicting community structure in aquatic ecosystems, to select and discuss appropriate models depending on the type of target community (i.e. empirical vs. simulation models), and to examine how the results contribute to ecological water management objectives. To reach these goals, a number of classical statistical models, artificial neural networks, and dynamic models are presented. An even larger number of techniques within these groups will be tested later in the project. This report introduces all of them: the techniques are briefly introduced, their algorithms explained, and their advantages and disadvantages discussed.