
    Experimental Design of a Prescribed Burn Instrumentation

    Observational data collected during experiments, such as the planned Fire and Smoke Model Evaluation Experiment (FASMEE), are critical for advancing coupled fire-atmosphere models like WRF-SFIRE and WRF-SFIRE-CHEM and transitioning them into operational use. Historical meteorological data, representing typical weather conditions for the anticipated burn locations and times, have been processed to initialize and run a set of simulations representing the planned experimental burns. Based on an analysis of these numerical simulations, this paper provides recommendations on the experimental setup, including the ignition procedures, the size and duration of the burns, and optimal sensor placement. New techniques are developed to initialize coupled fire-atmosphere simulations with weather conditions typical of the planned burn locations and time of year. Analysis of variation and a sensitivity analysis of the simulation design to model parameters, carried out by repeated Latin Hypercube Sampling, are used to assess the sensor locations. The simulations identify the measurement locations that maximize the expected variation of the sensor outputs with the model parameters.
    Comment: 35 pages, 4 tables, 28 figures
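
    The sensor-placement idea above can be illustrated with a minimal sketch: draw model parameters by Latin Hypercube Sampling, evaluate a surrogate of the fire-atmosphere model at candidate sensor locations, and rank locations by how strongly their simulated readings vary with the parameters. Everything model-specific here is hypothetical (the two-parameter toy_fire_signal surrogate, the candidate grid); WRF-SFIRE itself is not called, and only the repeated-LHS variance ranking mirrors the abstract.

    import numpy as np

    rng = np.random.default_rng(0)

    def latin_hypercube(n_samples, n_dims, rng):
        """One stratified sample per interval in each dimension, independently permuted."""
        u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
        for d in range(n_dims):
            u[:, d] = rng.permutation(u[:, d])
        return u  # points in [0, 1]^n_dims

    def toy_fire_signal(params, xy):
        """Hypothetical stand-in for a coupled fire-atmosphere run:
        a scalar 'sensor reading' at location xy for given parameters."""
        wind, moisture = params
        return np.exp(-moisture * np.hypot(xy[0] - wind, xy[1]))

    # Candidate sensor locations on a small grid over the burn unit.
    grid = np.array([(x, y) for x in np.linspace(0, 1, 5) for y in np.linspace(0, 1, 5)])

    # Repeated LHS designs, echoing the paper's analysis-of-variation setup.
    n_repeats, n_samples = 10, 50
    var_per_location = np.zeros(len(grid))
    for _ in range(n_repeats):
        params = latin_hypercube(n_samples, 2, rng)       # (wind, moisture) in [0,1]^2
        readings = np.array([[toy_fire_signal(p, xy) for xy in grid] for p in params])
        var_per_location += readings.var(axis=0)          # variation across parameters
    var_per_location /= n_repeats

    # Keep the locations whose readings are most sensitive to the parameters.
    best = grid[np.argsort(var_per_location)[::-1][:3]]
    print("Top-3 candidate sensor locations:", best)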

    Continuum limit of total variation on point clouds

    We consider point clouds obtained as random samples of a measure on a Euclidean domain. A graph representing the point cloud is obtained by assigning weights to edges based on the distances between the points they connect. Our goal is to develop the mathematical tools needed to study the consistency, as the number of available data points increases, of graph-based machine learning algorithms for tasks such as clustering. In particular, we study when the cut capacity, and more generally the total variation, on these graphs is a good approximation of the perimeter (total variation) in the continuum setting. We address this question in the setting of Γ-convergence. We obtain almost optimal conditions on how the size of the neighborhood over which points are connected by an edge must scale, as the number of points increases, for the Γ-convergence to hold. Taking the limit is enabled by a transportation-based metric which allows one to suitably compare functionals defined on different point clouds.
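
    The discrete object studied here is easy to write down. A minimal sketch, assuming a uniform sample on the unit square, an indicator kernel at length scale eps, and one common 1/(n^2 eps) normalization (the paper's precise constants and the Γ-limit argument are not reproduced):

    import numpy as np

    rng = np.random.default_rng(1)
    n, eps = 1000, 0.1
    pts = rng.random((n, 2))                  # n uniform samples on the unit square

    # Edge weights from a compactly supported kernel of pairwise distance:
    # eta(t) = 1 for t < 1, else 0, evaluated at d/eps.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    w = (d < eps).astype(float)

    # Indicator of the left half of the square; its continuum perimeter is 1.
    u = (pts[:, 0] < 0.5).astype(float)

    # Graph total variation; under suitable scaling of eps with n it
    # approximates the perimeter up to a constant depending on the kernel.
    gtv = (w * np.abs(u[:, None] - u[None, :])).sum() / (n**2 * eps)
    print("graph total variation:", gtv)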

    Optimal stopping times for estimating Bernoulli parameters with applications to active imaging

    We address the problem of estimating the parameter of a Bernoulli process. This arises in many applications, including photon-efficient active imaging, where each illumination period is regarded as a single Bernoulli trial. We introduce a framework within which to minimize the mean-squared error (MSE) subject to an upper bound on the mean number of trials. This optimization has several simple and intuitive properties when the Bernoulli parameter has a beta prior. In addition, by exploiting typical spatial correlation using total variation regularization, we extend the developed framework to a rectangular array of Bernoulli processes representing the pixels in a natural scene. In simulations inspired by realistic active imaging scenarios, we demonstrate a 4.26 dB reduction in MSE due to the adaptive acquisition, as an average over many independent experiments and invariant to a factor of 3.4 variation in trial budget.
    Accepted manuscript
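
    The beta-prior machinery underlying this is compact: after k successes in n trials, a Beta(a, b) prior updates to a Beta(a + k, b + n - k) posterior, whose mean is the MMSE estimate. A minimal sketch follows; the variance-threshold stopping rule is an illustration of adaptive acquisition, not the paper's exact stopping policy.

    import numpy as np

    rng = np.random.default_rng(2)
    a, b = 1.0, 1.0            # uniform (Beta(1,1)) prior on the Bernoulli parameter
    var_stop = 1e-3            # stop once the posterior variance is this small
    p_true = 0.3               # unknown parameter being estimated

    k = n = 0
    while True:
        post_a, post_b = a + k, b + n - k
        post_var = post_a * post_b / ((post_a + post_b) ** 2 * (post_a + post_b + 1))
        if post_var < var_stop:
            break
        k += rng.random() < p_true     # run one more Bernoulli trial
        n += 1

    estimate = post_a / (post_a + post_b)    # posterior mean = MMSE estimate
    print(f"stopped after {n} trials, estimate {estimate:.3f}, true {p_true}")

    Because the posterior variance shrinks fastest when the parameter is near 0 or 1, such a rule naturally spends fewer trials on "easy" pixels, which is the intuition behind the adaptive gain reported above.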

    Network Coding Channel Virtualization Schemes for Satellite Multicast Communications

    In this paper, we propose two novel schemes to solve the problem of finding a quasi-optimal number of coded packets to multicast to a set of independent wireless receivers experiencing different channel conditions. In particular, we propose two network channel virtualization schemes that allow the set of intended receivers in a multicast group to be represented as a single virtual receiver. Such an approach allows for a transmission scheme adapted not only to per-receiver channel variation over time, but also to the network-virtualized channel representing all receivers in the multicast group. The first scheme builds on a maximum erasure criterion, introduced by creating a virtual reference channel from the worst per-receiver, per-slot erasures in the network. The second scheme builds on a maximum completion time criterion, using the worst-performing receiver's channel as the virtual reference for the network. We apply these schemes to a GEO satellite scenario and demonstrate their benefits by comparing them with a per-receiver point-to-point adaptive strategy.
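
    The two virtualization rules can be contrasted in a few lines. A minimal sketch, under the simplifying assumption of an ideal rateless/network code where a receiver decodes once K coded packets arrive, with made-up per-receiver, per-slot erasure probabilities:

    import numpy as np

    rng = np.random.default_rng(3)
    K = 32                                    # source packets per generation
    n_rx, n_slots = 8, 200
    erasure = rng.uniform(0.05, 0.4, size=(n_rx, n_slots))  # per receiver, per slot

    # Scheme 1: maximum erasure criterion -- per slot, take the worst erasure
    # probability seen by any receiver in the group.
    virt1 = erasure.max(axis=0)

    # Scheme 2: maximum completion time criterion -- use the single
    # worst-performing receiver (largest mean erasure) as the reference.
    worst_rx = erasure.mean(axis=1).argmax()
    virt2 = erasure[worst_rx]

    def packets_needed(virtual_erasure, K):
        """Smallest number of slots whose expected successful receptions reach K."""
        successes = np.cumsum(1.0 - virtual_erasure)
        return int(np.searchsorted(successes, K) + 1)

    print("scheme 1 (max erasure):   ", packets_needed(virt1, K), "coded packets")
    print("scheme 2 (worst receiver):", packets_needed(virt2, K), "coded packets")

    Scheme 1 is the more conservative of the two slot by slot, while scheme 2 tracks a single coherent channel; which virtual channel requires more packets depends on how the erasures are distributed across receivers and time.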

    Compression and Conditional Emulation of Climate Model Output

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and it is therefore important to develop methods for representing full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. The statistical model can be used to generate realizations representing the full dataset, along with characterizations of the uncertainties in the generated data. Thus, the methods are capable of both compression and conditional emulation of the climate models. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining which statistics to store, so that the variation in the original data can be closely captured while allowing for fast decompression and conditional emulation on modest computers.
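
    The compress-then-emulate pattern can be sketched with a deliberately crude stand-in for the paper's nonstationary spatial model: store a handful of summary statistics (here, leading principal components plus one residual variance), then "decompress" by drawing realizations from the implied conditional Gaussian. The toy random-walk "temperature" field and plain PCA-plus-iid-residual model are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(4)
    n_days, n_grid, n_keep = 365, 500, 20     # a year of daily fields, 20 stored modes

    data = rng.standard_normal((n_days, n_grid)).cumsum(axis=0) / 10  # toy fields

    # --- Compression: store the mean, n_keep modes, and a residual variance. ---
    mean = data.mean(axis=0)
    u, s, vt = np.linalg.svd(data - mean, full_matrices=False)
    scores, modes = u[:, :n_keep] * s[:n_keep], vt[:n_keep]   # summary statistics
    resid_var = ((data - mean - scores @ modes) ** 2).mean()  # one scalar

    # --- Conditional emulation: reconstruction plus a random residual draw,
    # giving a realization from the conditional model rather than a point estimate.
    emulated = mean + scores @ modes + rng.normal(0, np.sqrt(resid_var),
                                                  size=(n_days, n_grid))
    print("stored values:", mean.size + scores.size + modes.size + 1,
          "vs original:", data.size)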