
    Two and three dimensional segmentation of multimodal imagery

    In recent years, the role of segmentation in image understanding/analysis, computer vision, pattern recognition, remote sensing, and medical imaging has grown significantly, driven by rapid advances in the acquisition of image data. This low-level analysis protocol is critical to numerous applications, with the primary goal of expediting and improving the effectiveness of subsequent high-level operations by providing a condensed and pertinent representation of image information. In this research, we propose a novel unsupervised segmentation framework for meaningfully segregating 2-D/3-D image data across multiple modalities (color, remote-sensing, and biomedical imaging) into non-overlapping partitions using several spatial-spectral attributes. Initially, our framework exploits the information obtained from detecting edges inherent in the data. To this effect, using a vector gradient detection technique, edge-free pixels are grouped and individually labeled to form an initial partition of part of the input image content. Pixels with higher gradient densities are then incorporated through the dynamic generation of segments as the algorithm progresses, yielding an initial region map. Subsequently, texture modeling is performed, and the gradient, texture, and intensity information, together with the initial partition map, drive a multivariate refinement procedure that fuses groups with similar characteristics to yield the final output segmentation. Experimental results obtained in comparison to published/state-of-the-art segmentation techniques for color as well as multi/hyperspectral imagery demonstrate the advantages of the proposed method. Furthermore, to achieve improved computational efficiency, we extend the methodology to a multi-resolution framework, demonstrated on color images. Finally, this research also encompasses a 3-D extension of the algorithm, demonstrated on medical (Magnetic Resonance Imaging / Computed Tomography) volumes.
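    A minimal sketch of the general two-stage idea (edge-driven initial labeling followed by a crude region-fusion pass) is given below, assuming a grayscale image and scalar statistics; it is illustrative only and not the authors' implementation, which uses a vector (multiband) gradient, texture modeling, and a multivariate fusion criterion.

```python
# Illustrative sketch of an edge-driven initial partition plus a merge step.
# Not the paper's algorithm: grayscale input and mean-intensity similarity
# stand in for the multiband gradient / texture / intensity statistics.
import numpy as np
from scipy import ndimage

def initial_partition(image, grad_thresh=0.1):
    """Label connected low-gradient regions as the initial segment map."""
    gx = ndimage.sobel(image, axis=0, mode="reflect")
    gy = ndimage.sobel(image, axis=1, mode="reflect")
    grad = np.hypot(gx, gy)
    grad /= grad.max() + 1e-12
    low_grad = grad < grad_thresh          # pixels away from edges
    labels, _ = ndimage.label(low_grad)    # seed segments
    return labels, grad

def merge_similar(image, labels, sim_thresh=0.05):
    """Single-pass fusion of segments with close mean intensities
    (adjacency checks and multivariate statistics omitted for brevity)."""
    ids = [i for i in np.unique(labels) if i != 0]
    means = {i: image[labels == i].mean() for i in ids}
    out = labels.copy()
    for i in ids:
        for j in ids:
            if j <= i:
                continue
            if abs(means[i] - means[j]) < sim_thresh:
                out[out == j] = i
    return out
```

    In the framework described above, the similarity test would instead operate jointly on gradient, texture, and intensity attributes rather than on a single mean intensity.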

    Probing the dynamic structure factor of a neutral Fermi superfluid along the BCS-BEC crossover using atomic impurity qubits

    We study an impurity atom trapped in an anharmonic potential and immersed in a cold atomic Fermi gas with attractive interactions that realizes the crossover from a Bardeen-Cooper-Schrieffer (BCS) superfluid to a Bose-Einstein condensate (BEC). Considering the qubit comprising the lowest two vibrational energy eigenstates of the impurity, we demonstrate that its dynamics probes the equilibrium density fluctuations encoded in the dynamic structure factor of the superfluid. Observing the impurity's evolution is thus shown to facilitate nondestructive measurements of the superfluid order parameter and the contact between collective and single-particle excitation spectra. Our setup constitutes a novel model of an open quantum system interacting with a thermal reservoir, the latter supporting both bosonic and fermionic excitations that are also coupled to each other. Comment: Updated to final author version; 9+7 pages, 18 figures.
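    For context, the textbook definition of the dynamic structure factor, and the golden-rule relation by which a weakly coupled two-level impurity samples it, can be sketched as follows; these standard forms are not taken from the paper, and the splitting ω_0 and couplings g_q are illustrative parameters.

```latex
% Textbook definition of the dynamic structure factor (not reproduced from
% the paper), with density fluctuation operator \delta\hat\rho_{\mathbf{q}}:
S(\mathbf{q},\omega) = \frac{1}{2\pi}\int_{-\infty}^{\infty}\! dt\;
  e^{i\omega t}\,\big\langle \delta\hat\rho_{\mathbf{q}}(t)\,
  \delta\hat\rho_{-\mathbf{q}}(0)\big\rangle ,
\qquad
\delta\hat\rho_{\mathbf{q}} = \hat\rho_{\mathbf{q}}
  - \big\langle \hat\rho_{\mathbf{q}} \big\rangle .

% Schematic golden-rule rate for a weakly coupled two-level impurity with
% splitting \omega_0 and assumed couplings g_{\mathbf{q}} (units \hbar = 1):
\Gamma \simeq 2\pi \sum_{\mathbf{q}} |g_{\mathbf{q}}|^{2}\,
  S(\mathbf{q},\omega_0).
```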

    Rich probabilistic models for semantic labeling

    The goal of this monograph is to explore the methods and applications of semantic labeling. Our contributions to this rapidly evolving topic are specific aspects of modeling and inference in probabilistic models and their applications in the interdisciplinary fields of computer vision, medical image processing, and remote sensing.

    Multiscale Regional Liquefaction Hazard Assessment and Mapping

    Soil liquefaction is a major cause of damage during earthquakes and can trigger many kinds of ground failure, such as ground settlement, lateral spreading, and landslides. These ground failures can damage infrastructure such as buildings, bridges, and lifelines, resulting in significant economic losses. It is therefore of great importance to assess liquefaction hazard. The triggering of liquefaction and the consequent ground failures have been well investigated over the past decades. Nowadays, the dominant approach, which correlates observed field behavior with various in-situ "index" tests, can achieve reasonably precise assessments for free-field conditions at the site-specific scale. Regional-scale assessments of liquefaction hazard, however, are still underdeveloped. Issues such as cross-geologic-unit correlations have not been systematically investigated in regional liquefaction assessment. Therefore, the main objective of this dissertation is to develop a solution framework for reliable regional assessment of earthquake-induced liquefaction hazard. Another objective is to validate this framework by applying it to several earthquake-prone regions, so that liquefaction hazard maps of these regions can be added to the literature and guide designers, engineers, and researchers. Moreover, the dominant method of estimating liquefaction damage via empirical correlations is not adequate for complex site conditions, so a further objective of this dissertation is to study alternative approaches for general estimation of liquefaction damage. To achieve these objectives, a multiscale modeling framework is developed for improved estimation of regional liquefaction hazard that accounts for material randomness and heterogeneity. One advantage of the developed methodology is the extension of conventional random-field models to account for soil spatial variability at multiple scales and resolutions. The method allows random fields to be generated selectively and adaptively at smaller scales around critical areas, or around areas where soil properties are known in great detail from laboratory or field tests. The process is defined such that spatial correlation is consistent across length scales. Illustrative examples (Marina District in San Francisco, Alameda County in California, and Christchurch in New Zealand) are presented, and liquefaction hazard is evaluated at multiple scales. Compared with single-scale analyses, multiscale random fields provide more detailed information and higher-resolution soil properties around critical areas. This framework provides a new way to consistently incorporate small-scale local liquefaction analysis into large-scale liquefaction assessment and mapping. Furthermore, the finite element method is identified as a promising alternative to the traditional approach of estimating liquefaction via empirical correlations. A dynamic FEM model is built, upon which an effective-stress analysis is performed to estimate liquefaction-induced soil deformation at the site-specific scale. It is shown that the developed finite element model can be used as a numerical tool for predicting cyclic liquefaction in soils. This research is expected to shed light on the understanding of soil liquefaction during earthquakes, in the hope of reducing economic losses in the future.
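    A minimal sketch of the coarse-to-fine idea is given below, assuming a stationary Gaussian random field of some soil property sampled on a coarse grid and conditionally refined inside a critical subregion under one shared covariance model; the covariance form, parameter values, and function names are illustrative and not taken from the dissertation.

```python
# Illustrative coarse-to-fine Gaussian random field, loosely in the spirit of
# the multiscale random-field idea described above (not the dissertation code).
import numpy as np

def exp_cov(x1, x2, sill=1.0, corr_len=50.0):
    """Exponential covariance between two sets of 2-D points (metres)."""
    d = np.linalg.norm(x1[:, None, :] - x2[None, :, :], axis=-1)
    return sill * np.exp(-d / corr_len)

def sample_field(points, rng, mean=0.0, **cov_kw):
    """Unconditional simulation on a set of points (Cholesky method)."""
    C = exp_cov(points, points, **cov_kw) + 1e-8 * np.eye(len(points))
    return mean + np.linalg.cholesky(C) @ rng.standard_normal(len(points))

def conditional_refine(coarse_pts, coarse_vals, fine_pts, rng, mean=0.0, **cov_kw):
    """Simulate fine-scale values conditioned on the coarse realization,
    keeping one covariance model across both scales."""
    Kcc = exp_cov(coarse_pts, coarse_pts, **cov_kw) + 1e-8 * np.eye(len(coarse_pts))
    Kfc = exp_cov(fine_pts, coarse_pts, **cov_kw)
    Kff = exp_cov(fine_pts, fine_pts, **cov_kw)
    A = Kfc @ np.linalg.inv(Kcc)
    cond_mean = mean + A @ (coarse_vals - mean)
    cond_cov = Kff - A @ Kfc.T + 1e-8 * np.eye(len(fine_pts))
    return cond_mean + np.linalg.cholesky(cond_cov) @ rng.standard_normal(len(fine_pts))

rng = np.random.default_rng(0)
coarse = np.stack(np.meshgrid(np.arange(0, 500, 100.0),
                              np.arange(0, 500, 100.0)), -1).reshape(-1, 2)
z_coarse = sample_field(coarse, rng)                          # regional scale
critical = np.stack(np.meshgrid(np.arange(0, 100, 20.0),
                                np.arange(0, 100, 20.0)), -1).reshape(-1, 2)
z_fine = conditional_refine(coarse, z_coarse, critical, rng)  # refined patch
```

    Conditioning the fine grid on the coarse realization is what keeps the spatial correlation consistent across length scales, which is the property emphasized in the abstract above.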

    Ab initio few-mode theory for quantum potential scattering problems

    Few-mode models have been a cornerstone of theoretical work in quantum optics, with the famous single-mode Jaynes-Cummings model being only the most prominent example. In this work, we develop ab initio few-mode theory, a framework connecting few-mode system-bath models to ab initio theory. We first present a method to derive exact few-mode Hamiltonians for non-interacting quantum potential scattering problems and demonstrate how to rigorously reconstruct the scattering matrix from such few-mode Hamiltonians. We show that upon inclusion of a background scattering contribution, an ab initio version of the well-known input-output formalism is equivalent to standard scattering theory. On the basis of these exact results for non-interacting systems, we construct an effective few-mode expansion scheme for interacting theories, which allows one to extract the relevant degrees of freedom from a continuum in an open quantum system. As a whole, our results demonstrate that few-mode as well as input-output models can be extended to a general class of problems, and they open up the associated toolbox for application to various platforms and extreme regimes. We outline differences between the ab initio results and standard model assumptions, which may lead to qualitatively different effects in certain regimes. The formalism is exemplified in various simple physical scenarios. In the process, we provide a proof of concept of the method, demonstrate important properties of the expansion scheme, and exemplify new features in extreme regimes. Comment: 41 pages, 14 figures; substantially extended version, now also covering interacting and nonlinear problems.
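    For orientation, the kind of phenomenological system-bath model and input-output relation that such a framework would connect to first-principles scattering theory can be written in its textbook (Gardiner-Collett) form; ω_0, g(ω), and κ below are illustrative parameters, not quantities derived in the paper.

```latex
% Textbook single-mode system--bath Hamiltonian and input--output relation,
% shown only for orientation; the paper derives exact analogues of such
% structures ab initio rather than postulating them.
H = \omega_0\, a^{\dagger}a
  + \int\! d\omega\, \omega\, b^{\dagger}(\omega)\, b(\omega)
  + \int\! d\omega \,\big[\, g(\omega)\, a^{\dagger} b(\omega)
        + \mathrm{h.c.} \,\big],
\qquad
a_{\mathrm{out}}(t) = a_{\mathrm{in}}(t) + \sqrt{\kappa}\, a(t).
```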

    Spatial and stochastic epidemics : theory, simulation and control

    It is now widely acknowledged that spatial structure, and hence the spatial position of host populations, plays a vital role in the spread of infection. In this work I investigate an ensemble of techniques for understanding the stochastic dynamics of spatial and discrete epidemic processes, with special consideration given to SIR disease dynamics for the Levins-type metapopulation. I present a toolbox of techniques for the modeller of spatial epidemics. The highlight results are a novel form of moment closure derived directly from a stochastic differential representation of the epidemic, a stochastic simulation algorithm that, asymptotically in system size, greatly outperforms existing simulation methods for the spatial epidemic, and finally a method for tackling optimal vaccination scheduling problems for controlling the spread of an invasive pathogen.
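    As a point of reference, a minimal exact (Gillespie) simulation of a two-patch stochastic SIR metapopulation is sketched below; the coupling scheme, rates, and parameter names are illustrative assumptions, and the thesis's asymptotically faster algorithm and moment-closure scheme are not reproduced here.

```python
# Minimal exact (Gillespie) simulation of a two-patch stochastic SIR
# metapopulation. Illustrative baseline only.
import numpy as np

def gillespie_sir_metapop(S, I, R, beta=1.5, gamma=0.5, rho=0.05,
                          t_max=100.0, rng=None):
    rng = rng or np.random.default_rng()
    S, I, R = np.array(S, float), np.array(I, float), np.array(R, float)
    n = len(S)
    t, history = 0.0, [(0.0, I.copy())]
    while t < t_max and I.sum() > 0:
        N = S + I + R
        # Force of infection: mostly local transmission, plus weak coupling
        # to the mean prevalence of the other patches (Levins-style mixing).
        coupling = (I.sum() - I) / max(n - 1, 1)
        lam = beta * S * ((1 - rho) * I + rho * coupling) / N
        rates = np.concatenate([lam, gamma * I])   # infections, recoveries
        total = rates.sum()
        t += rng.exponential(1.0 / total)          # time to next event
        event = rng.choice(2 * n, p=rates / total) # which event fires
        if event < n:                              # infection in patch `event`
            S[event] -= 1; I[event] += 1
        else:                                      # recovery in patch `event-n`
            I[event - n] -= 1; R[event - n] += 1
        history.append((t, I.copy()))
    return history

trace = gillespie_sir_metapop(S=[990, 1000], I=[10, 0], R=[0, 0])
```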

    Scaling Multidimensional Inference for Big Structured Data

    In information technology, big data is a collection of data sets so large and complex that it becomes difficult to process using traditional data processing applications [151]. In a world of increasing sensor modalities, cheaper storage, and more data-oriented questions, we are quickly passing the limits of tractable computation using traditional statistical analysis methods. Methods that show great results on simple data often have difficulty processing complicated multidimensional data. Accuracy alone can no longer justify unwarranted memory use and computational complexity. Improving the scaling properties of these methods for multidimensional data is the only way to keep them relevant. In this work we explore methods for improving the scaling properties of parametric and nonparametric models. Namely, we focus on the structure of the data to lower the complexity of a specific family of problems. The two types of structure considered in this work are distributed optimization with separable constraints (Chapters 2-3) and scaling Gaussian processes for multidimensional lattice input (Chapters 4-5). By improving the scaling of these methods, we can expand their use to a wide range of applications that were previously intractable and open the door to new research questions.
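    A minimal sketch of one well-known way lattice structure is exploited, a Kronecker-factored covariance for a product kernel on a full grid, is shown below; it illustrates the flavor of grid-structured Gaussian process inference under assumed kernel and parameter choices, not the specific method developed in this work.

```python
# Sketch of GP regression on a full 2-D lattice using a Kronecker-structured
# covariance (product kernel). The full N x N kernel (N = n1 * n2) is never
# formed; only the two small per-axis factors are decomposed.
import numpy as np

def rbf(x1, x2, lengthscale=0.2, variance=1.0):
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def kron_gp_fit(x1, x2, Y, noise=1e-2):
    """Solve (K1 kron K2 + noise*I) vec(Alpha) = vec(Y) via per-axis
    eigendecompositions of the kernel factors."""
    K1, K2 = rbf(x1, x1), rbf(x2, x2)
    lam1, Q1 = np.linalg.eigh(K1)
    lam2, Q2 = np.linalg.eigh(K2)
    D = np.outer(lam1, lam2) + noise           # eigenvalues of the full kernel
    return Q1 @ ((Q1.T @ Y @ Q2) / D) @ Q2.T

def kron_gp_predict(x1, x2, Alpha, xs1, xs2):
    """Posterior mean on the test grid xs1 x xs2."""
    return rbf(xs1, x1) @ Alpha @ rbf(xs2, x2).T

x1 = np.linspace(0, 1, 40)                     # grid axes (40 x 50 lattice)
x2 = np.linspace(0, 1, 50)
Y = np.sin(6 * x1)[:, None] * np.cos(4 * x2)[None, :]
Y += 0.05 * np.random.default_rng(0).standard_normal(Y.shape)
Alpha = kron_gp_fit(x1, x2, Y)
mean = kron_gp_predict(x1, x2, Alpha, x1, x2)  # denoised field on the grid
```

    The key point is that only the small per-axis factors are ever decomposed, so the cost scales with the lengths of the grid axes rather than with their product.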