
    Contouring with uncertainty

    As stated by Johnson [Joh04], the visualization of uncertainty remains one of the major challenges for the visualization community. To meet this challenge, we need to understand and develop methods that allow us not only to consider uncertainty as an extra variable within the visualization process, but to treat it as an integral part. In this paper, we take contouring, one of the most widely used visualization techniques for two-dimensional data, and focus on extending the concept of contouring to uncertainty. We develop special techniques for the visualization of uncertain contours. We illustrate the work through application to a case study in oceanography.
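    To make the idea concrete, below is a minimal sketch of one common way to visualise contour uncertainty: model each grid value as Gaussian with a mean and standard deviation, and contour the probability that the field exceeds the chosen level. This is an illustrative assumption, not necessarily the construction developed in the paper; the data and names are hypothetical.

```python
# Sketch: probability-of-exceedance map for an uncertain contour level,
# assuming independent Gaussian uncertainty at every grid point.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

def exceedance_probability(mean, std, level):
    """P(value > level) at every grid point under a Gaussian model."""
    return norm.sf(level, loc=mean, scale=std)

# Hypothetical 2-D field with spatially varying uncertainty.
x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
mean = np.sin(3 * x) * np.cos(3 * y)
std = 0.05 + 0.2 * x                     # uncertainty grows towards the right

prob = exceedance_probability(mean, std, level=0.0)

fig, ax = plt.subplots()
ax.contourf(x, y, prob, levels=np.linspace(0, 1, 11), cmap="RdBu_r")
ax.contour(x, y, mean, levels=[0.0], colors="k")   # crisp mean contour for reference
ax.set_title("Probability that the field exceeds the contour level")
plt.show()
```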

    Spectral Mapping Reconstruction of Extended Sources

    Three-dimensional spectroscopy of extended sources is typically performed with dedicated integral field spectrographs. We describe a method of reconstructing full spectral cubes, with two spatial and one spectral dimension, from rastered spectral mapping observations employing a single slit in a traditional slit spectrograph. When the background and image characteristics are stable, as is often achieved in space, the use of traditional long slits for integral field spectroscopy can substantially reduce instrument complexity compared with dedicated integral field designs, without loss of mapping efficiency -- particularly compelling when a long-slit mode for follow-up of single unresolved sources is separately required. We detail a custom flux-conserving cube reconstruction algorithm, discuss issues of extended source flux calibration, and describe CUBISM, a tool which implements these methods for spectral maps obtained with the Spitzer Space Telescope's Infrared Spectrograph.
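    As an illustration of flux-conserving cube building in general (not of CUBISM's actual implementation), the sketch below distributes the flux of each axis-aligned slit sample onto output spaxels in proportion to geometric overlap; real pipelines must additionally handle slit rotation, optical distortion and bad-pixel masking. The names and the simplified geometry are assumptions.

```python
# Sketch: accumulate rastered slit samples onto a regular (wave, y, x) cube
# with overlap-area weights, so that total flux is conserved.
import numpy as np

def build_cube(samples, nx, ny, nwave):
    """samples: iterable of (x0, x1, y0, y1, spectrum), where the first four
    numbers give the sky footprint of one slit pixel in output-pixel units
    and `spectrum` is its 1-D spectrum of length nwave."""
    flux = np.zeros((nwave, ny, nx))
    weight = np.zeros((ny, nx))
    for x0, x1, y0, y1, spec in samples:
        area_total = (x1 - x0) * (y1 - y0)
        for j in range(int(np.floor(y0)), int(np.ceil(y1))):
            for i in range(int(np.floor(x0)), int(np.ceil(x1))):
                # fractional overlap of the sample footprint with spaxel (i, j)
                ov = (max(0.0, min(x1, i + 1) - max(x0, i)) *
                      max(0.0, min(y1, j + 1) - max(y0, j)))
                if ov <= 0 or not (0 <= i < nx and 0 <= j < ny):
                    continue
                w = ov / area_total          # weights for one sample sum to 1
                flux[:, j, i] += w * spec
                weight[j, i] += w
    good = weight > 0
    flux[:, good] /= weight[good]            # normalise to per-spaxel surface brightness
    return flux, weight
```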

    Structure from motion systems for architectural heritage. A survey of the internal loggia courtyard of Palazzo dei Capitani, Ascoli Piceno, Italy

    We present the results of a point-cloud-based survey derived from image-based techniques, in particular multi-image monoscopic digital photogrammetry systems and software, the so-called “structure-from-motion” technique. The aim is to evaluate the advantages and limitations of such procedures in architectural surveying, particularly in conditions that are “at the limit”. A particular case study was chosen: the courtyard of Palazzo dei Capitani del Popolo in Ascoli Piceno, Italy, which can be considered an ideal example due to its notable vertical, rather than horizontal, layout. In this context, by comparing and evaluating the different results, we present experimentation on this single case study with the aim of identifying the best workflow to realise a complex, articulated set of representations—using 3D modelling and 2D processing—necessary to correctly document the particular characteristics of such an architectural object.

    Review of the mathematical foundations of data fusion techniques in surface metrology

    The recent proliferation of engineered surfaces, including freeform and structured surfaces, is challenging current metrology techniques. Measurement using multiple sensors has been proposed to achieve enhanced benefits, mainly in terms of spatial frequency bandwidth, which a single sensor cannot provide. When using data from different sensors, a process of data fusion is required, and there is much active research in this area. In this paper, current data fusion methods and applications are reviewed, with a focus on the mathematical foundations of the subject. Common research questions in the fusion of surface metrology data are raised and potential fusion algorithms are discussed.
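    As a minimal example of the kind of mathematical machinery such reviews cover, the sketch below applies inverse-variance (weighted least-squares) fusion to two co-registered height maps. Registration, resampling and wavelength-dependent weighting, which dominate practical fusion pipelines, are omitted, and all names are illustrative.

```python
# Sketch: pointwise inverse-variance fusion of two co-registered surface maps.
import numpy as np

def fuse_inverse_variance(z1, var1, z2, var2):
    """Fuse two height maps with per-point (or scalar) variances."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    z_fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)   # fused variance is smaller than either input variance
    return z_fused, var_fused

# Hypothetical inputs: a noisier wide-bandwidth map and a quieter one.
z_optical = np.random.normal(0.0, 0.10, (64, 64))
z_stylus = np.random.normal(0.0, 0.01, (64, 64))
z_fused, v_fused = fuse_inverse_variance(z_optical, 0.10**2, z_stylus, 0.01**2)
```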

    Selection functions of large spectroscopic surveys

    Context. Large spectroscopic surveys open the way to exploring our Galaxy. In order to use the data from these surveys to understand the Galactic stellar population, we need to be sure that the stars contained in a survey are a representative subset of the underlying population. If the selection function is not taken into account, the results might reflect the properties of the selection function rather than those of the underlying stellar population. Aims. In this work, we introduce a method to estimate the selection function for a given spectroscopic survey. We apply this method to a large sample of public spectroscopic surveys. Methods. We apply a median-division binning algorithm to bin the observed stars in colour-magnitude space. This approach produces lower uncertainties and lower biases in the selection function estimate compared to the traditionally used 2D histograms. We run a set of simulations to verify the method and calibrate the one free parameter it contains. These simulations allow us to test the precision and accuracy of the method. Results. We produce and publish estimated values and uncertainties of selection functions for a large sample of public spectroscopic surveys. We publicly release the code used to produce the selection function estimates. Conclusions. The effect of the selection function on the distance modulus and metallicity distributions of stars in surveys is important for surveys with small and largely inhomogeneous spatial coverage. For surveys with contiguous spatial coverage, the effect of the selection function is almost negligible.
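    The basic quantity being estimated can be illustrated as follows: the fraction of stars from a photometric parent catalogue that made it into the spectroscopic survey, per colour-magnitude bin. The sketch below uses a plain 2D histogram, i.e. the baseline the paper improves on with median-division binning; all names and bin edges are assumptions.

```python
# Sketch: selection function as the ratio of spectroscopic to photometric
# star counts in colour-magnitude bins, with a crude binomial uncertainty.
import numpy as np

def selection_function(colour_phot, mag_phot, colour_spec, mag_spec,
                       colour_edges, mag_edges):
    n_parent, _, _ = np.histogram2d(colour_phot, mag_phot,
                                    bins=[colour_edges, mag_edges])
    n_spec, _, _ = np.histogram2d(colour_spec, mag_spec,
                                  bins=[colour_edges, mag_edges])
    with np.errstate(divide="ignore", invalid="ignore"):
        sf = np.where(n_parent > 0, n_spec / n_parent, np.nan)
        err = np.where(n_parent > 0, np.sqrt(sf * (1 - sf) / n_parent), np.nan)
    return sf, err
```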

    General Defocusing Particle Tracking: fundamentals and uncertainty assessment

    General Defocusing Particle Tracking (GDPT) is a single-camera, three-dimensional particle tracking method that determines particle depth positions from the defocusing patterns of the corresponding particle images. GDPT relies on a reference set of experimental particle images which is used to predict the depth position of measured particle images of similar shape. While several implementations of the method are possible, its accuracy is ultimately limited by intrinsic properties of the acquired data, such as the signal-to-noise ratio, the particle concentration, and the characteristics of the defocusing patterns. GDPT has been applied in different fields by different research groups; however, a deeper description and analysis of the method's fundamentals has hitherto not been available. In this work, we first identify the fundamental elements that characterize a GDPT measurement. Afterwards, we present a standardized framework based on synthetic images to assess the performance of GDPT implementations in terms of measurement uncertainty and the relative number of measured particles. Finally, we provide guidelines for assessing the uncertainty of experimental GDPT measurements, where true values are not accessible and additional image aberrations can lead to bias errors. The data were processed using DefocusTracker, an open-source GDPT software. The datasets were created using the synthetic image generator MicroSIG and have been shared in a freely accessible repository.
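    A minimal sketch of the core GDPT idea follows: compare a measured particle image against a reference stack of calibration images recorded at known depths and take the depth of the best match, optionally refined by interpolating the similarity curve. DefocusTracker implements a far more complete pipeline; the similarity measure and refinement below are illustrative assumptions.

```python
# Sketch: depth estimation by matching a particle image to a calibration stack.
import numpy as np

def normalised_cross_correlation(a, b):
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def estimate_depth(particle_img, reference_stack, reference_depths):
    """reference_stack: (n_ref, h, w) calibration images; reference_depths: (n_ref,)."""
    scores = np.array([normalised_cross_correlation(particle_img, ref)
                       for ref in reference_stack])
    k = int(np.argmax(scores))
    depth = reference_depths[k]
    # optional sub-step refinement with a parabola through the similarity peak,
    # assuming uniformly spaced reference depths
    if 0 < k < len(scores) - 1:
        denom = scores[k - 1] - 2.0 * scores[k] + scores[k + 1]
        if denom < 0:
            delta = 0.5 * (scores[k - 1] - scores[k + 1]) / denom
            depth = reference_depths[k] + delta * (reference_depths[k + 1] - reference_depths[k])
    return depth, scores[k]
```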

    On the effect of random errors in gridded bathymetric compilations

    We address the problem of compiling bathymetric data sets with heterogeneous coverage and a range of data measurement accuracies. To generate a regularly spaced grid, we are obliged to interpolate sparse data; our objective here is to augment this product with an estimate of confidence in the interpolated bathymetry based on our knowledge of the random error component in the bathymetric source data. Using a direct simulation Monte Carlo method, we utilize data from the International Bathymetric Chart of the Arctic Ocean database to develop a suitable methodology for assessing the standard deviations of depths in the interpolated grid. Our assessments of random errors in each data set are heuristic but realistic and are based on available metadata from the data providers. We show that a confidence grid can be built using this method and that this product can be used to assess the reliability of the final compilation. The methodology developed here is applied to bathymetric data but is equally applicable to other interpolated data sets, such as gravity and magnetic data.
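    A minimal sketch of the direct simulation Monte Carlo idea reads as follows: perturb each sounding by its assumed random error, re-grid every realisation, and take the per-node standard deviation as the confidence estimate. SciPy's griddata stands in here for the compilation's own gridding scheme, and all names are illustrative.

```python
# Sketch: Monte Carlo confidence grid for interpolated bathymetry.
import numpy as np
from scipy.interpolate import griddata

def confidence_grid(xy, depth, depth_sigma, grid_x, grid_y,
                    n_realisations=100, seed=None):
    """xy: (n, 2) sounding positions; depth, depth_sigma: (n,) values and
    assumed random-error standard deviations; grid_x, grid_y: output meshgrids."""
    rng = np.random.default_rng(seed)
    stack = []
    for _ in range(n_realisations):
        perturbed = depth + rng.normal(0.0, depth_sigma)      # one error realisation
        stack.append(griddata(xy, perturbed, (grid_x, grid_y), method="linear"))
    stack = np.array(stack)
    # mean gridded depth and its node-by-node spread (the confidence grid)
    return stack.mean(axis=0), stack.std(axis=0)
```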

    Complexity plots

    In this paper, we present a novel visualization technique for assisting in the observation and analysis of algorithmic complexity. In comparison with conventional line graphs, this new technique is not sensitive to the units of measurement, allowing multivariate data series of different physical qualities (e.g., time, space and energy) to be juxtaposed conveniently and consistently. It supports multivariate visualization as well as uncertainty visualization. It enables users to focus on categorizing algorithms by complexity class, while reducing the visual impact caused by constants and algorithmic components that are insignificant to complexity analysis. It provides an effective means for observing the algorithmic complexity of programs with a mixture of algorithms and black-box software through visualization. Through two case studies, we demonstrate the effectiveness of complexity plots in complexity analysis in research, education and application.

    A Bayesian Heteroscedastic GLM with Application to fMRI Data with Motion Spikes

    We propose a voxel-wise general linear model with autoregressive noise and heteroscedastic noise innovations (GLMH) for analyzing functional magnetic resonance imaging (fMRI) data. The model is analyzed from a Bayesian perspective and has the benefit of automatically down-weighting time points close to motion spikes in a data-driven manner. We develop a highly efficient Markov chain Monte Carlo (MCMC) algorithm that allows for Bayesian variable selection among the regressors to model both the mean (i.e., the design matrix) and the variance. This makes it possible to include a broad range of explanatory variables in both the mean and variance (e.g., time trends, activation stimuli, head motion parameters and their temporal derivatives), and to compute the posterior probability of inclusion from the MCMC output. Variable selection is also applied to the lags in the autoregressive noise process, making it possible to infer the lag order from the data simultaneously with all other model parameters. We use both simulated data and real fMRI data from OpenfMRI to illustrate the importance of proper modeling of heteroscedasticity in fMRI data analysis. Our results show that the GLMH tends to detect more brain activity than its homoscedastic counterpart by allowing the variance to change over time depending on the degree of head motion.
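    To convey the key mechanism without the Bayesian machinery, the sketch below fits a simple heteroscedastic GLM by iteratively re-weighted least squares, letting the noise variance depend on its own regressors (e.g. a head-motion magnitude trace) so that time points near motion spikes are down-weighted. This is a crude frequentist stand-in for the paper's MCMC-based GLMH with AR noise and variable selection; all names are assumptions.

```python
# Sketch: GLM with a log-linear variance model, fitted by alternating
# weighted least squares for the mean and a regression for the variance.
import numpy as np

def heteroscedastic_glm(y, X, Z, n_iter=20):
    """y: (T,) voxel time series; X: (T, p) mean regressors; Z: (T, q) variance regressors."""
    gamma = np.zeros(Z.shape[1])
    for _ in range(n_iter):
        var = np.exp(Z @ gamma)                  # time-varying noise variance
        w = 1.0 / var                            # high-variance time points get low weight
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)   # weighted least squares for the mean
        resid = y - X @ beta
        # crude update of the variance model from log squared residuals
        gamma, *_ = np.linalg.lstsq(Z, np.log(resid**2 + 1e-12), rcond=None)
    return beta, gamma
```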

    Fast Monte Carlo Simulation for Patient-specific CT/CBCT Imaging Dose Calculation

    Recently, X-ray imaging dose from computed tomography (CT) or cone beam CT (CBCT) scans has become a serious concern. Patient-specific imaging dose calculation has been proposed for the purpose of dose management. While Monte Carlo (MC) dose calculation can be quite accurate for this purpose, it suffers from low computational efficiency. In response to this problem, we have developed a MC dose calculation package, gCTD, on GPU architecture under the NVIDIA CUDA platform for fast and accurate estimation of the X-ray imaging dose received by a patient during a CT or CBCT scan. Techniques have been developed particularly for the GPU architecture to achieve high computational efficiency. Dose calculations using CBCT scanning geometry in a homogeneous water phantom and a heterogeneous Zubal head phantom have shown good agreement between gCTD and EGSnrc, indicating the accuracy of our code. In terms of efficiency, gCTD attains a speed-up of ~400 times in the homogeneous water phantom and ~76.6 times in the Zubal phantom compared to EGSnrc. As for absolute computation time, imaging dose calculation for the Zubal phantom can be accomplished in ~17 sec with an average relative standard deviation of 0.4%. Though our gCTD code has been developed and tested in the context of CBCT scans, with simple modification of geometry it can be used for assessing imaging dose in CT scans as well.
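    As a toy illustration of the Monte Carlo principle behind such dose engines (and emphatically not of gCTD's actual GPU implementation), the sketch below samples photon interaction depths in a homogeneous water phantom from an exponential free-path distribution and scores deposited energy per voxel; all numbers are placeholders.

```python
# Sketch: toy 1-D photon Monte Carlo with local energy deposition in water.
import numpy as np

def toy_photon_dose(n_photons, mu_per_cm, voxel_cm, n_voxels, energy_mev, seed=None):
    rng = np.random.default_rng(seed)
    dose = np.zeros(n_voxels)
    # sample the depth of first interaction for each photon along one axis
    depth = rng.exponential(1.0 / mu_per_cm, size=n_photons)
    idx = (depth / voxel_cm).astype(int)
    inside = idx < n_voxels
    # crude scoring: deposit the full photon energy in the interaction voxel
    np.add.at(dose, idx[inside], energy_mev)
    return dose / n_photons          # mean energy deposited per photon, per voxel

dose = toy_photon_dose(n_photons=1_000_000, mu_per_cm=0.2, voxel_cm=0.5,
                       n_voxels=40, energy_mev=0.06)
```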