
    The Topology ToolKit

    This system paper presents the Topology ToolKit (TTK), a software platform designed for topological data analysis in scientific visualization. TTK provides a unified, generic, efficient, and robust implementation of key algorithms for the topological analysis of scalar data, including: critical points, integral lines, persistence diagrams, persistence curves, merge trees, contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots, Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users thanks to a tight integration with ParaView. It is also easily accessible to developers through a variety of bindings (Python, VTK/C++) for fast prototyping, or through direct, dependency-free C++ to ease integration into pre-existing complex systems. While developing TTK, we faced several algorithmic and software engineering challenges, which we document in this paper. In particular, we present an algorithm for the construction of a discrete gradient that complies with the critical points extracted in the piecewise-linear setting. This algorithm guarantees combinatorial consistency across the topological abstractions supported by TTK and, importantly, enables a unified implementation of topological data simplification for multi-scale exploration and analysis. We also present a cached triangulation data structure that supports time-efficient and generic traversals, self-adjusts its memory usage on demand for input simplicial meshes, and implicitly emulates a triangulation for regular grids with no memory overhead. Finally, we describe an original software architecture that guarantees memory-efficient and direct access to TTK features, while still offering researchers powerful and easy bindings and extensions. TTK is open source (BSD license), and its code, online documentation, and video tutorials are available on TTK's website.
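    As a toy illustration of one abstraction from the list above, the following is a minimal sketch of 0-dimensional sublevel-set persistence for a 1D scalar field, using the textbook union-find construction. This is purely illustrative and is not TTK's implementation (TTK is used through ParaView or its Python/VTK/C++ bindings, and handles general scalar fields on triangulations):

    ```python
    def persistence_pairs_1d(values):
        """0-dimensional sublevel-set persistence of a 1D scalar field.

        Returns (birth, death) pairs sorted by birth; the component of the
        global minimum never dies and is paired with +inf.
        """
        n = len(values)
        parent = list(range(n))
        processed = [False] * n

        def find(i):
            # union-find with path halving
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        birth = {}   # component root -> birth value
        pairs = []
        # sweep samples from lowest to highest value
        for i in sorted(range(n), key=lambda k: (values[k], k)):
            roots = {find(j) for j in (i - 1, i + 1)
                     if 0 <= j < n and processed[j]}
            if not roots:
                birth[i] = values[i]          # local minimum: component born
            else:
                ordered = sorted(roots, key=lambda r: birth[r])
                survivor = ordered[0]         # oldest component survives
                for r in ordered[1:]:         # younger components die here
                    pairs.append((birth[r], values[i]))
                    parent[r] = survivor
                parent[i] = survivor
            processed[i] = True
        gmin = min(range(n), key=lambda k: values[k])
        pairs.append((birth[find(gmin)], float("inf")))
        return sorted(pairs)
    ```

    For `[0, 3, 1, 4]` the minimum at value 1 merges into the global component at value 3, so the diagram contains the pair (1, 3) plus the essential pair (0, inf).
    
    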

    Evaluating forecasts of extreme events for hydrological applications: an approach for screening unfamiliar performance measures

    Many different performance measures have been developed to evaluate field predictions in meteorology. However, a researcher or practitioner encountering a new or unfamiliar measure may have difficulty interpreting its results, which may lead them to avoid new measures and rely on those that are familiar. In the context of evaluating forecasts of extreme events for hydrological applications, this article aims to promote the use of a range of performance measures. Several types of performance measures are introduced in order to demonstrate a six-step approach for tackling a new measure. Using the example of the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble precipitation predictions for the Danube floods of July and August 2002, it is shown how to apply new performance measures with this approach and how to choose between different performance measures based on their suitability for the task at hand. Copyright © 2008 Royal Meteorological Society
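    For readers unfamiliar with forecast-verification measures, here is a sketch of two standard ones: the Brier score for probabilistic forecasts of a binary event, and the critical success index (threat score) from a 2x2 contingency table. These are common textbook measures, chosen for illustration; they are not necessarily the specific measures screened in the article:

    ```python
    def brier_score(probs, outcomes):
        """Mean squared error of probabilistic binary-event forecasts.

        probs: forecast probabilities in [0, 1]; outcomes: 0/1 observations.
        0 is a perfect score; lower is better.
        """
        return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

    def critical_success_index(hits, misses, false_alarms):
        """CSI (threat score) from a contingency table of event forecasts.

        Ignores correct rejections, which makes it popular for rare
        (extreme) events; 1 is a perfect score.
        """
        return hits / (hits + misses + false_alarms)
    ```

    For example, a forecaster who always issues probability 0.5 scores a Brier score of 0.25 regardless of the outcomes, and 3 hits with 1 miss and 2 false alarms yield a CSI of 0.5.
    
    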

    The Lagrangian spectral relaxation model for differential diffusion in homogeneous turbulence

    The Lagrangian spectral relaxation (LSR) model is extended to treat turbulent mixing of two passive scalars (φa and φb) with different molecular diffusivity coefficients (i.e., differential-diffusion effects). Because of the multiscale description employed in the LSR model, the scale dependence of differential-diffusion effects is described explicitly, including the generation of scalar decorrelation at small scales and its backscatter to large scales. The model is validated against DNS data for differential diffusion of Gaussian scalars in forced, isotropic turbulence at four values of the turbulence Reynolds number (Rλ = 38, 90, 160, and 230) with and without uniform mean scalar gradients. The explicit Reynolds and Schmidt number dependencies of the model parameters allow for the determination of the Re (integral-scale Reynolds number) and Sc (Schmidt number) scaling of the scalar difference z = φa − φb. For example, its variance is shown to scale like ⟨z²⟩ ∼ Re^(−0.3). The rate of backscatter (βD) from the diffusive scales towards the large scales is found to be the key parameter in the model. In particular, it is shown that βD must be an increasing function of the Schmidt number for Sc < 1 in order to predict the correct scalar-to-mechanical time-scale ratios and the correct long-time scalar decorrelation rate in the absence of uniform mean scalar gradients.
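    The scalar decorrelation the model tracks can be quantified with the ordinary correlation coefficient ρ_ab = ⟨φa′φb′⟩ / (⟨φa′²⟩⟨φb′²⟩)^(1/2); for unit-variance scalars the difference variance satisfies ⟨z²⟩ = 2(1 − ρ_ab). A minimal sketch of this diagnostic on sampled data (a generic statistic, not the LSR model itself):

    ```python
    def decorrelation_coefficient(phi_a, phi_b):
        """Correlation rho_ab between two scalar fields sampled at the
        same points; 1 - rho_ab measures the decorrelation produced by
        differential diffusion."""
        n = len(phi_a)
        ma = sum(phi_a) / n
        mb = sum(phi_b) / n
        cov = sum((a - ma) * (b - mb) for a, b in zip(phi_a, phi_b)) / n
        va = sum((a - ma) ** 2 for a in phi_a) / n
        vb = sum((b - mb) ** 2 for b in phi_b) / n
        return cov / (va * vb) ** 0.5
    ```

    Equal-diffusivity scalars mixed from identical initial conditions stay at ρ_ab = 1; differential diffusion drives ρ_ab below 1, first at the small scales.
    
    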

    Selected Challenges From Spatial Statistics For Spatial Econometricians

    Griffith and Paelinck (2011) present selected non-standard spatial statistics and spatial econometrics topics that address issues associated with spatial econometric methodology. This paper addresses the following challenges posed by spatial autocorrelation, alluded to in and/or derived from the spatial statistics topics of this book: the Gaussian random variable Jacobian term for massive datasets; topological features of georeferenced data; eigenvector spatial filtering-based georeferenced data generating mechanisms; and interpreting random effects.
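    Moran's I is the standard statistic underlying the spatial-autocorrelation challenges mentioned above (its normalizing Jacobian term is what becomes expensive for massive datasets). A minimal pure-Python sketch, illustrative only and not taken from the book under review:

    ```python
    def morans_i(values, weights):
        """Moran's I spatial autocorrelation statistic.

        values:  observations indexed 0..n-1
        weights: dict mapping (i, j) -> spatial weight w_ij
        Returns (n / sum(w)) * sum_ij w_ij z_i z_j / sum_i z_i^2,
        where z_i are deviations from the mean.
        """
        n = len(values)
        mean = sum(values) / n
        dev = [v - mean for v in values]
        w_sum = sum(weights.values())
        num = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
        den = sum(d * d for d in dev)
        return (n / w_sum) * (num / den)
    ```

    For four sites on a line with symmetric nearest-neighbour weights and values 1..4, the smoothly increasing trend gives a positive I of 1/3.
    
    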

    Jacobi Fiber Surfaces for Bivariate Reeb Space Computation

    This paper presents an efficient algorithm for the computation of the Reeb space of an input bivariate piecewise linear scalar function f defined on a tetrahedral mesh. By extending and generalizing algorithmic concepts from the univariate case to the bivariate one, we report the first practical, output-sensitive algorithm for the exact computation of such a Reeb space. The algorithm starts by identifying the Jacobi set of f, the bivariate analog of critical points in the univariate case. Next, the Reeb space is computed by segmenting the input mesh along the new notion of Jacobi Fiber Surfaces, the bivariate analog of critical contours in the univariate case. We additionally present a simplification heuristic that enables the progressive coarsening of the Reeb space. Our algorithm is simple to implement and most of its computations can be trivially parallelized. We report performance numbers demonstrating orders of magnitude speedups over previous approaches, enabling for the first time the tractable computation of bivariate Reeb spaces in practice. Moreover, unlike range-based quantization approaches (such as the Joint Contour Net), our algorithm is parameter-free. We demonstrate the utility of our approach by using the Reeb space as a semi-automatic segmentation tool for bivariate data. In particular, we introduce continuous scatterplot peeling, a technique which enables the reduction of clutter in the continuous scatterplot by interactively selecting the features of the Reeb space to project. We provide a VTK-based C++ implementation of our algorithm that can be used for reproduction purposes or for the development of new Reeb space based visualization techniques.
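    The Jacobi set identification in the first step can be sketched for bivariate data (f, g): the Jacobi set is where the two gradients become parallel, i.e. where det[∇f, ∇g] vanishes. The paper's algorithm works on tetrahedral meshes; the following simplified finite-difference version on a regular grid is only an illustration of the criterion, not the paper's method:

    ```python
    def jacobi_determinant(f, g, i, j):
        """det[grad f, grad g] at cell (i, j) via forward differences;
        the Jacobi set of the pair (f, g) lies on its zero set."""
        dfx = f[i + 1][j] - f[i][j]
        dfy = f[i][j + 1] - f[i][j]
        dgx = g[i + 1][j] - g[i][j]
        dgy = g[i][j + 1] - g[i][j]
        return dfx * dgy - dfy * dgx

    def jacobi_cells(f, g):
        """Cells where the determinant vanishes or changes sign between
        neighbours -- a discrete approximation of the Jacobi set."""
        ni, nj = len(f) - 1, len(f[0]) - 1
        det = [[jacobi_determinant(f, g, i, j) for j in range(nj)]
               for i in range(ni)]
        cells = []
        for i in range(ni):
            for j in range(nj):
                if (det[i][j] == 0
                        or (i + 1 < ni and det[i][j] * det[i + 1][j] < 0)
                        or (j + 1 < nj and det[i][j] * det[i][j + 1] < 0)):
                    cells.append((i, j))
        return cells
    ```

    With f = x and g = y the gradients are everywhere independent and the Jacobi set is empty; with g = f the gradients are parallel everywhere and every cell is flagged.
    
    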

    Spatial particulate fields during high winds in the Imperial Valley, California

    We examined windblown dust within the Imperial Valley (CA) during strong springtime west-southwesterly (WSW) wind events. Analysis of routine agency meteorological and ambient particulate matter (PM) measurements identified 165 high WSW wind events between March and June of 2013 to 2019. The PM concentrations over these days are higher at northern valley monitoring sites, with daily PM mass concentration of particles less than 10 micrometers aerodynamic diameter (PM10) at these sites commonly greater than 100 μg/m³ and reaching around 400 μg/m³, and daily PM mass concentration of particles less than 2.5 micrometers aerodynamic diameter (PM2.5) commonly greater than 20 μg/m³ and reaching around 60 μg/m³. A detailed analysis utilizing 1 km resolution Multi-Angle Implementation of Atmospheric Correction (MAIAC) aerosol optical depth (AOD), Identifying Violations Affecting Neighborhoods (IVAN) low-cost PM2.5 measurements, and 500 m resolution sediment supply fields alongside routine ground PM observations identified an area of high AOD/PM during WSW events spanning the northwestern valley, encompassing the Brawley/Westmorland through Niland area. This area shows up most clearly once the average PM10 at northern valley routine sites during WSW events exceeds 100 μg/m³. The area is consistent with high soil sediment supply in the northwestern valley and upwind desert, suggesting local sources are primarily responsible. On the basis of this study, MAIAC AOD appears able to identify localized high PM areas during windblown dust events provided the PM levels are high enough. The use of the IVAN data in this study illustrates how a citizen science effort to collect more spatially refined air quality concentration data can help pinpoint episodic pollution patterns and possible sources important for PM exposure and adverse health effects.

    Level-oriented universal visual representation environment

    We propose a three-dimensional graphics engine targeted at simultaneously visualizing multiple data sets and simulations in progress using a number of different visualization methods. The user can navigate between different views in the way one would traverse a museum: by switching focus from one object to another or zooming out to include several objects at the same time. Related visualizations are vertically organized into levels or floors, further enhancing the museum metaphor. Additional information and means of manipulating the visualized data or simulations are provided for the user in the form of a two-dimensional on-screen overlay and also through various input devices, not only the mouse or keyboard. L.O.U.V.R.E. proved to be a very efficient and useful tool when dealing with experiments on robotics simulations. This paper presents such usage, and also indicates other possible applications. We find that it fills a gap as an intuitive solution encompassing graphing, simulation and user interface at the same time. Its applications go far beyond computer science research into such fields as biology or physics.