
    Methods for detection and characterization of signals in noisy data with the Hilbert-Huang Transform

    The Hilbert-Huang Transform (HHT) is a novel, adaptive approach to time series analysis that makes no assumptions about the form of the data. Its adaptive, local character allows the decomposition of non-stationary signals with high time-frequency resolution, but also renders it susceptible to degradation from noise. We show that complementing the HHT with techniques such as zero-phase filtering, kernel density estimation, and Fourier analysis allows it to be used effectively to detect and characterize signals with low signal-to-noise ratio. Comment: submitted to PRD, 10 pages, 9 figures in color
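As a sketch of the kind of pre-processing this abstract describes, the zero-phase filtering step can be illustrated with SciPy's `filtfilt`, which runs a filter forward and then backward so that phase distortion cancels. The synthetic signal, filter order, and cutoff frequency below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Synthetic low-SNR signal: a 5 Hz sinusoid buried in white noise.
rng = np.random.default_rng(0)
fs = 1000.0                         # sampling rate in Hz (illustrative)
t = np.arange(0.0, 2.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 2.0 * rng.standard_normal(t.size)

# Zero-phase low-pass filter: filtfilt applies the Butterworth filter
# forward and backward, so the smoothed signal is not shifted in time
# relative to the original -- important before any time-frequency
# decomposition of the kind the HHT performs.
b, a = butter(4, 20.0 / (fs / 2.0), btype="low")
smoothed = filtfilt(b, a, noisy)

# The residual against the clean signal should shrink after filtering.
rms_before = np.sqrt(np.mean((noisy - clean) ** 2))
rms_after = np.sqrt(np.mean((smoothed - clean) ** 2))
```

Because the noise here is broadband while the signal lives below the cutoff, the zero-phase filter removes most of the noise power without displacing the signal in time.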

    Cosmological Information Contents on the Light-Cone

    We develop a theoretical framework to describe the cosmological observables on the past light cone, such as the luminosity distance, weak lensing, galaxy clustering, and the cosmic microwave background anisotropies. All of these observables comprise not only a background quantity but also a perturbation, and both are subject to cosmic variance, which sets the fundamental limit on the cosmological information that can be derived from such observables, even in an idealized survey with an infinite number of observations. To quantify the maximum cosmological information content, we apply the Fisher information matrix formalism and spherical harmonic analysis to cosmological observations, in which the angular and the radial positions of the observables on the light cone carry different information. We discuss the maximum cosmological information that can be derived from five different observables: (1) type Ia supernovae, (2) cosmic microwave background anisotropies, (3) weak gravitational lensing, (4) local baryon density, and (5) galaxy clustering. We compare our results with the cosmic variance obtained in the standard approaches, which treat the light cone volume as a cubic box of simultaneity. We discuss implications of our formalism and ways to overcome the fundamental limit. Comment: 39 pages, no figures, submitted to JCA
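The Fisher information matrix formalism mentioned in this abstract can be illustrated with a toy forecast. The linear model, redshift grid, and error bar below are assumptions chosen for illustration only; the paper's analysis works with spherical-harmonic modes on the light cone rather than this simple data vector.

```python
import numpy as np

# Toy Fisher forecast for a two-parameter model mu(z) = a + b*z,
# observed at several redshifts with independent Gaussian errors sigma.
z = np.linspace(0.1, 1.0, 10)
sigma = 0.1

# Derivatives of the model with respect to the parameters (a, b).
dmu_da = np.ones_like(z)
dmu_db = z

# F_ij = sum_k (dmu_k/dtheta_i)(dmu_k/dtheta_j) / sigma^2
derivs = np.stack([dmu_da, dmu_db])      # shape (2, N)
fisher = derivs @ derivs.T / sigma**2

# The inverse Fisher matrix bounds the parameter covariance
# (Cramer-Rao): no unbiased estimator can do better.
cov = np.linalg.inv(fisher)
```

The diagonal of `cov` gives the best attainable parameter variances; in the paper's setting the analogous bound is set by cosmic variance rather than measurement noise.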

    A General Spatio-Temporal Clustering-Based Non-local Formulation for Multiscale Modeling of Compartmentalized Reservoirs

    Representing the reservoir as a network of discrete compartments with neighbor and non-neighbor connections is a fast yet accurate method for analyzing oil and gas reservoirs. Automatic and rapid detection of coarse-scale compartments with distinct static and dynamic properties is an integral part of such high-level reservoir analysis. In this work, we present a novel hybrid framework specific to reservoir analysis that couples a physics-based non-local multiscale modeling approach with data-driven techniques for automatic detection of clusters in space from spatial and temporal field data, providing fast and accurate multiscale modeling of compartmentalized reservoirs. This research also adds to the literature by presenting a comprehensive treatment of spatio-temporal clustering for reservoir applications that accounts for the complexities of clustering, the intrinsically sparse and noisy nature of the data, and the interpretability of the outcome.
    Keywords: Artificial Intelligence; Machine Learning; Spatio-Temporal Clustering; Physics-Based Data-Driven Formulation; Multiscale Modeling
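A minimal sketch of spatio-temporal clustering in this spirit, assuming synthetic well data and a plain two-cluster k-means on standardized location-plus-rate features. This is a schematic stand-in for the paper's hybrid framework, not its actual algorithm; the feature choices and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic "compartments" of wells, each described by a spatial
# position (x, y) and a temporal summary (mean production rate).
wells_a = np.column_stack([rng.normal(0, 1, 20), rng.normal(0, 1, 20),
                           rng.normal(100, 5, 20)])
wells_b = np.column_stack([rng.normal(10, 1, 20), rng.normal(10, 1, 20),
                           rng.normal(40, 5, 20)])
X = np.vstack([wells_a, wells_b])

# Standardize so spatial and temporal features contribute comparably.
X = (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans2(X, iters=50):
    # Two-cluster k-means with deterministic seeds (first and last point).
    centers = X[[0, len(X) - 1]].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2),
                           axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

labels = kmeans2(X)
```

With well-separated compartments the two groups of wells receive distinct labels; a realistic pipeline would replace the mean-rate summary with richer time-series features and handle sparsity and noise explicitly, as the abstract emphasizes.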