
    Effective Visualizations of the Uncertainty in Hurricane Forecasts

    The track forecast cone developed by the U.S. National Hurricane Center is the uncertainty visualization most widely adopted by the general public, the news media, and government officials to convey hurricane forecasts and their underlying uncertainties. However, experimental research has shown that the cone has limitations that lead to misconceptions about the uncertainty it encodes. Most importantly, the area covered by the cone tends to be misinterpreted as the region the hurricane will affect. In addition, the cone summarizes forecasts for the next three days in a single representation, making it difficult for viewers to extract crucial time-specific information. To address these limitations, this research develops novel alternative visualizations. It begins with a technique that generates and smoothly interpolates robust statistics from ensembles of hurricane predictions, creating visualizations that inherently convey spatial uncertainty by displaying three levels of positional storm-strike risk at a specific point in time. To counter the misreading of the cone's area, the research develops time-specific visualizations of spatial information based on a sampling technique that selects a small, representative subset from an ensemble of points; these displays can also depict important storm characteristics such as size and intensity. Further, the research generalizes the representative sampling framework to ensembles of forecast tracks, selecting a subset that accurately preserves the original distributions of available storm characteristics while maintaining appropriately defined spatial separations. This framework supports an additional hurricane visualization that portrays prediction uncertainty implicitly by directly showing the members of the subset without visual clutter.
We collaborated on cognitive studies suggesting that these visualizations improve viewers' understanding of the forecasts because they are more readily interpreted as uncertainty distributions. Beyond hurricane forecasting, this research can benefit the visualization community more broadly. For instance, the representative sampling framework for 2D points developed here can enhance standard scatter plots and density plots by reducing data-set sizes. Moreover, because the idea of direct ensemble displays can be extended to more general numerical simulations, it has potential impact on a wide range of ensemble visualizations.
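The abstract does not specify the sampling algorithm, so as a rough illustration of picking a small, spatially well-separated subset from an ensemble of 2D points, here is a minimal greedy farthest-point sketch. The function name and the use of squared Euclidean distance are assumptions for this example; the full framework additionally preserves distributions of storm attributes, which this sketch does not attempt.

```python
import random

def farthest_point_subset(points, k, seed=0):
    """Greedily pick k points that are well separated spatially.

    A simplified stand-in for representative ensemble sampling: it
    preserves spatial spread, but not the attribute distributions
    (size, intensity) that the full framework also maintains.
    """
    k = min(k, len(points))
    rng = random.Random(seed)
    chosen = [rng.choice(points)]
    while len(chosen) < k:
        # Add the point whose nearest already-chosen neighbour is farthest away.
        best = max(
            (p for p in points if p not in chosen),
            key=lambda p: min((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2
                              for c in chosen),
        )
        chosen.append(best)
    return chosen
```

Greedy farthest-point selection is a common baseline for separation-preserving subsampling; it keeps the subset spread across the ensemble's spatial extent at O(n·k) cost.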

    Curve boxplot: Generalization of boxplot for ensembles of curves

    In simulation science, computational scientists often study the behavior of their simulations through repeated solutions with variations in parameters, boundary values, or initial conditions. Through such simulation ensembles, one can quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to growing interest in simulation ensembles, the visualization community has developed a suite of methods that let users observe and understand the properties of these ensembles efficiently and effectively. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, generalizing the traditional whisker plot or boxplot to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting, and fluid dynamics.
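To make the data-depth idea concrete, here is a minimal sketch of band depth for curves sampled on a common grid, assuming the simplest two-curve-band variant; the paper's actual depth definition and its 3D generalization may differ. A curve is deep when many pairwise envelopes of other curves contain it, and the curve boxplot is built from these depth ranks (deepest curve as median, central 50% band, fences, outliers).

```python
from itertools import combinations

def band_depth(curves):
    """Simplified band depth for curves on a shared sample grid.

    depth[i] = fraction of pairs (j, k), j != i != k, whose pointwise
    min/max envelope completely contains curve i. Higher depth means
    a more central curve in the ensemble.
    """
    n = len(curves)
    depths = []
    for i, c in enumerate(curves):
        inside = 0
        pairs = list(combinations([j for j in range(n) if j != i], 2))
        for j, k in pairs:
            lo = [min(a, b) for a, b in zip(curves[j], curves[k])]
            hi = [max(a, b) for a, b in zip(curves[j], curves[k])]
            # Curve i must lie within the band at every sample point.
            if all(l <= x <= h for x, l, h in zip(c, lo, hi)):
                inside += 1
        depths.append(inside / len(pairs))
    return depths
```

Sorting curves by descending depth yields the rank statistics the visualization strategies are built on; the all-or-nothing containment test is what makes the method nonparametric.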

    Illumination coding meets uncertainty learning: toward reliable AI-augmented phase imaging

    We propose a physics-assisted deep learning (DL) framework for large space-bandwidth-product (SBP) phase imaging. We design an asymmetric coded-illumination scheme to encode high-resolution phase information across a wide field of view (FOV), and then develop a matching DL algorithm to provide large-SBP phase estimation. We show that this illumination coding scheme scales to flexible resolutions and is robust to experimental variations. We demonstrate the technique on both static and dynamic biological samples, and show that it reliably achieves 5X resolution enhancement across 4X FOVs using only five multiplexed measurements -- more than 10X data reduction over the state of the art. Typical DL algorithms tend to produce over-confident predictions whose errors are discovered only in hindsight. We develop an uncertainty learning framework to overcome this limitation and provide a predictive assessment of the reliability of the DL output. We show that the predicted uncertainty maps can serve as a surrogate for the true error. We validate the robustness of our technique by analyzing the model uncertainty, and quantify the effect of noise, model errors, incomplete training data, and "out-of-distribution" testing data by assessing the data uncertainty. We further demonstrate that the predicted credibility maps allow identification of spatially and temporally rare biological events. Our technique enables scalable AI-augmented large-SBP phase imaging with dependable predictions.
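The paper's uncertainty maps come from a learned model, which is not reproducible here; as a generic stand-in, the sketch below computes a per-pixel mean and disagreement map from an ensemble of model predictions. Treating ensemble spread as a surrogate for predictive uncertainty is an assumption of this example, not the paper's method.

```python
import statistics

def uncertainty_map(predictions):
    """Per-pixel mean and standard deviation across model outputs.

    predictions : list of 2D images (lists of rows), one per model run.
    Returns (mean_image, std_image); large std marks pixels where the
    ensemble disagrees, i.e. where the prediction is least reliable.
    """
    mean = [[statistics.fmean(vals) for vals in zip(*rows)]
            for rows in zip(*predictions)]
    std = [[statistics.pstdev(vals) for vals in zip(*rows)]
           for rows in zip(*predictions)]
    return mean, std
```

In the spirit of the abstract, the std image plays the role of an error surrogate: pixels with high disagreement are the ones to distrust or flag for rare events.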

    The Ensemble Kalman Filter: A Signal Processing Perspective

    The ensemble Kalman filter (EnKF) is a Monte Carlo implementation of the Kalman filter (KF) for extremely high-dimensional, possibly nonlinear and non-Gaussian state estimation problems. Its ability to handle state dimensions on the order of millions has made the EnKF a popular algorithm in several geoscientific disciplines. Despite a similarly vital need for scalable algorithms in signal processing, e.g., to make sense of the ever-increasing amount of sensor data, the EnKF is hardly discussed in our field. This self-contained review paper is aimed at signal processing researchers and provides all the knowledge needed to get started with the EnKF. The algorithm is derived in a KF framework, without the often-encountered geoscientific terminology. Algorithmic challenges and required extensions of the EnKF are discussed, as are relations to sigma-point KFs and particle filters. The relevant EnKF literature is summarized in an extensive survey, and unique simulation examples, including popular benchmark problems, complement the theory with practical insights. The signal processing perspective highlights new directions of research and facilitates the exchange of potentially beneficial ideas, both for the EnKF and for high-dimensional nonlinear and non-Gaussian filtering in general.
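As a minimal sketch of the idea the review covers, here is one stochastic EnKF analysis step with a linear observation operator: the Kalman gain is formed from the sample covariance of the ensemble instead of a propagated covariance matrix, and each member is updated against a perturbed observation. Function and variable names are this example's own, and practical EnKFs add localization and inflation that are omitted here.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One stochastic EnKF analysis step.

    X : (n, N) ensemble matrix, columns are state members
    y : (m,)   observation vector
    H : (m, n) linear observation operator
    R : (m, m) observation noise covariance
    """
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (N - 1)                          # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # ensemble Kalman gain
    # Perturbed observations: one noisy copy of y per member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, N).T
    return X + K @ (Y - H @ X)                     # analysis ensemble
```

The scalability the abstract highlights comes from never forming P in the state dimension in practice; large-n implementations work with the anomaly matrix A directly, which this small dense sketch does not do.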

    Federating structural models and data: Outcomes from a workshop on archiving integrative structures

    Structures of biomolecular systems are increasingly computed by integrative modeling. In this approach, a structural model is constructed by combining information from multiple sources, including varied experimental methods and prior models. In 2019, a Workshop was held as a Biophysical Society Satellite Meeting to assess progress and discuss further requirements for archiving integrative structures. The primary goal of the Workshop was to build consensus on addressing the challenges involved in creating common data standards, building methods for federated data exchange, and developing mechanisms for validating integrative structures. A summary of the Workshop and the recommendations that emerged are presented here.