6,799 research outputs found

    A General Spatio-Temporal Clustering-Based Non-local Formulation for Multiscale Modeling of Compartmentalized Reservoirs

    Representing the reservoir as a network of discrete compartments with neighbor and non-neighbor connections is a fast yet accurate method for analyzing oil and gas reservoirs. Automatic and rapid detection of coarse-scale compartments with distinct static and dynamic properties is an integral part of such high-level reservoir analysis. In this work we present a novel hybrid framework specific to reservoir analysis that couples a physics-based non-local multiscale modeling approach with data-driven clustering techniques, automatically detecting clusters in space from spatial and temporal field data and providing fast, accurate modeling of compartmentalized reservoirs. This research also adds to the literature by presenting a comprehensive treatment of spatio-temporal clustering for reservoir-studies applications that accounts for the complexities of clustering, the intrinsically sparse and noisy nature of the data, and the interpretability of the outcome.
    Keywords: Artificial Intelligence; Machine Learning; Spatio-Temporal Clustering; Physics-Based Data-Driven Formulation; Multiscale Modeling
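The abstract does not disclose which clustering algorithm the authors use. As a rough illustration of spatio-temporal clustering for compartment detection, the sketch below groups hypothetical wells by combining their (x, y) coordinates with a short production-rate time series into one feature vector and running a minimal two-cluster k-means with deterministic farthest-pair initialization. All well coordinates and rates are invented for illustration; k-means stands in for whatever technique the paper actually employs.

```python
def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans2(points, iters=50):
    """Minimal 2-cluster k-means, initialized with the two most distant points."""
    c0, c1 = max(
        ((p, q) for p in points for q in points),
        key=lambda pq: dist2(pq[0], pq[1]),
    )
    centroids = [list(c0), list(c1)]
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = [0 if dist2(p, centroids[0]) <= dist2(p, centroids[1]) else 1
                  for p in points]
        # move each centroid to the mean of its members
        for j in (0, 1):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Hypothetical wells: spatial coordinates followed by three
# production-rate samples (the temporal part of the feature vector).
wells = [
    [0.0, 0.0, 9.0, 7.0, 5.0],    # compartment A: declining rates
    [1.0, 0.5, 8.5, 6.5, 4.5],
    [0.5, 1.0, 9.5, 7.5, 5.5],
    [10.0, 10.0, 3.0, 3.0, 3.0],  # compartment B: flat rates
    [11.0, 10.5, 2.5, 2.5, 2.5],
    [10.5, 11.0, 3.5, 3.5, 3.5],
]
labels = kmeans2(wells)
```

In practice the spatial and temporal features would need scaling to comparable ranges, and the number of compartments would be chosen by a model-selection criterion rather than fixed at two.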

    Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches

    Imaging spectrometers measure electromagnetic energy scattered in their instantaneous field of view in hundreds or thousands of spectral channels, with higher spectral resolution than multispectral cameras. Imaging spectrometers are therefore often referred to as hyperspectral cameras (HSCs). Higher spectral resolution enables material identification via spectroscopic analysis, which facilitates countless applications that require identifying materials in scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial resolution of HSCs, microscopic material mixing, and multiple scattering, spectra measured by HSCs are mixtures of the spectra of materials in a scene. Thus, accurate estimation requires unmixing. Pixels are assumed to be mixtures of a few materials, called endmembers. Unmixing involves estimating all or some of: the number of endmembers, their spectral signatures, and their abundances at each pixel. Unmixing is a challenging, ill-posed inverse problem because of model inaccuracies, observation noise, environmental conditions, endmember variability, and data set size. Researchers have devised and investigated many models in search of robust, stable, tractable, and accurate unmixing algorithms. This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models are first discussed. Signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms are described. Mathematical problems and potential solutions are described. Algorithm characteristics are illustrated experimentally.
    Comment: This work has been accepted for publication in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
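Under the linear mixing model the overview discusses, each pixel spectrum is a convex combination of endmember spectra weighted by their abundances. As a minimal sketch (with made-up three-band spectra, not data from the paper), the following recovers the abundances of two known endmembers in closed form under the sum-to-one constraint:

```python
def unmix_two(pixel, e1, e2):
    """Least-squares abundance of endmember e1 in `pixel`, assuming
    pixel ~= a*e1 + (1-a)*e2 with a in [0, 1] (sum-to-one constraint)."""
    d = [u - v for u, v in zip(e1, e2)]                  # e1 - e2
    num = sum((x - v) * w for x, v, w in zip(pixel, e2, d))
    den = sum(w * w for w in d)
    a = max(0.0, min(1.0, num / den))                    # clip: non-negativity
    return a, 1.0 - a

# Hypothetical 3-band endmember spectra and a noiseless 30/70 mixture.
soil = [1.0, 0.0, 0.5]
water = [0.0, 1.0, 0.2]
pixel = [0.3 * s + 0.7 * w for s, w in zip(soil, water)]
a_soil, a_water = unmix_two(pixel, soil, water)
```

Real unmixing must first estimate the number of endmembers and their signatures, which is where the geometrical, statistical, and sparse-regression methods the paper surveys come in; this closed form handles only the supervised two-endmember case.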

    Pascual Jordan, his contributions to quantum mechanics and his legacy in contemporary local quantum physics

    After recalling episodes from Pascual Jordan's biography, including his pivotal role in the shaping of quantum field theory and his much criticized conduct during the NS regime, I draw attention to his presentation of the first phase of development of quantum field theory in a talk given at the 1929 Kharkov conference. He starts with a comprehensive account of the beginnings of quantum theory, emphasising that particle-like properties arise as a consequence of treating wave motions quantum-mechanically. He then turns to his recent discovery of the quantization of ``wave fields'' and to problems of gauge invariance. The most surprising aspect of Jordan's presentation is, however, his strong belief that his field quantization is a transitory, not yet optimal formulation of the principles underlying causal, local quantum physics. That the main architect of field quantization expected a more radical future change so shortly after his discovery is certainly startling. I try to answer the question of to what extent Jordan's 1929 expectations have been vindicated. The larger part of the present essay consists in arguing that Jordan's plea for a formulation without ``classical correspondence crutches'', i.e. for an intrinsic approach (which avoids classical fields altogether), is successfully addressed in past and recent publications on local quantum physics.
    Comment: More biographical detail, expansion of the part referring to Jordan's legacy in quantum field theory, 37 pages, LaTeX

    Objective probability and quantum fuzziness

    This paper offers a critique of the Bayesian interpretation of quantum mechanics, with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the "objective preparations view" or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes...
    Comment: 21 pages, no graphics; inspired by "Subjective probability and quantum certainty" (quant-ph/0608190 v2)

    Interaction between high-level and low-level image analysis for semantic video object extraction


    The inner regions of protoplanetary disks

    To understand how planetary systems form in the dusty disks around pre-main-sequence stars, a detailed knowledge of the structure and evolution of these disks is required. While this is reasonably well understood for the regions of the disk beyond about 1 AU, the structure of these disks inward of 1 AU remains a puzzle. This is partly because it is very difficult to spatially resolve these regions with current telescopes. But it is also because the physics of this region, where the disk becomes so hot that the dust starts to evaporate, is poorly understood. With infrared interferometry it has become possible in recent years to directly spatially resolve the inner AU of protoplanetary disks, albeit in a somewhat limited way. These observations have partly confirmed current models of these regions, but have also posed new questions and puzzles. Moreover, it has turned out that the numerical modeling of these regions is extremely challenging. In this review we give a rough overview of the history and recent developments in this exciting field of astrophysics.
    Comment: 45 pages with 14 figures; to appear in Annual Review of Astronomy and Astrophysics (2010, Vol. 48)