
    A Gaussian approximation of the distributed computing process

    The authors propose a refinement of the stochastic model describing the dynamics of a Desktop Grid (DG) project with many hosts and many workunits to be performed, originally proposed by Morozov et al. in 2017. The target performance measure is the mean runtime of the project. To this end, the authors derive an asymptotic expression for the accumulated amount of work by means of limit theorems for superposed on-off sources, which lead to a Gaussian approximation. In more detail, depending on the distributions of the active and idle periods, Brownian or fractional Brownian processes are obtained. The authors present analytic results on the hitting time of the considered processes (including the case in which the overall amount of work is known only probabilistically) and show how the runtime tail distribution can be estimated by simulation. Taking advantage of the properties of Gaussian processes and the Conditional Monte Carlo (CMC) approach, the authors present a theoretical framework for evaluating the runtime tail distribution.
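
    As a rough illustration of the Brownian variant of this approach, the sketch below simulates the project runtime as the hitting time of a drifted Brownian motion approximating the accumulated work. This is a minimal sketch, not the authors' model: the host count, activity probability, workload, and especially the diffusion coefficient are invented for the example, and the fractional Brownian and CMC refinements are omitted.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative parameters (assumptions, not from the paper).
        N = 1000       # number of hosts
        p_on = 0.6     # stationary probability that a host is active
        rate = 1.0     # work completed per active host per unit time
        W = 50_000.0   # total amount of work in the project
        dt = 0.05      # Euler time step

        mu = N * p_on * rate                             # mean aggregate service rate
        sigma = rate * np.sqrt(N * p_on * (1.0 - p_on))  # crude diffusion coefficient

        def hitting_time(t_max=200.0):
            """First time the Gaussian (Brownian) approximation of the
            accumulated work reaches the total workload W."""
            t, work = 0.0, 0.0
            while t < t_max:
                work += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                t += dt
                if work >= W:
                    return t
            return np.inf  # not finished within the simulated horizon

        runtimes = np.array([hitting_time() for _ in range(500)])
        print("mean runtime:", runtimes.mean())              # close to W / mu
        print("0.99 quantile:", np.quantile(runtimes, 0.99))

    The paper's CMC framework targets exactly this kind of tail quantity, but exploits the properties of Gaussian processes rather than crude path simulation.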

    Doctor of Philosophy

    High-order finite element methods, using either the continuous or discontinuous Galerkin formulation, are becoming more popular in fields such as fluid mechanics, solid mechanics, and computational electromagnetics. While the use of these methods is increasingly common, there has not been a corresponding increase in the availability and use of visualization methods and software capable of displaying these volumes both accurately and interactively.

    A fundamental problem with the majority of existing visualization techniques is that they neither understand nor respect the structure of a high-order field, leading to visualization error. Visualizations of high-order fields are generally created by first approximating the field with low-order primitives and then generating the visualization using traditional methods based on linear interpolation. The approximation step introduces error into the visualization pipeline, requiring the user to balance the competing goals of image quality, interactivity, and resource consumption. In practice, visualizations performed this way are often either undersampled, leading to visualization error, or oversampled, leading to unnecessary computational effort and resource consumption. Without an understanding of the sources of error, the simulation scientist cannot determine whether artifacts in the image are due to visualization error, insufficient mesh resolution, or a failure in the underlying simulation. This uncertainty makes it difficult for scientists to make judgments based on the visualization, since attributing an artifact to visualization error when it actually reflects a more fundamental problem can lead to poor decisions.

    This dissertation presents new visualization algorithms that use the high-order data in its native state, exploiting the structure and mathematical properties of these fields to create accurate images interactively while avoiding the error introduced by low-order approximations. First, a new algorithm for cut-surfaces is presented, specifically the accurate depiction of colormaps and contour lines on arbitrarily complex cut-surfaces. Second, a mathematical analysis of the evaluation of the volume rendering integral through a high-order field is presented, together with an algorithm that uses this analysis to create accurate volume renderings. Finally, a new software system, the Element Visualizer (ElVis), is presented, which combines the ideas and algorithms created in this dissertation in a single software package that simulation scientists can use to create accurate visualizations. The system was developed and tested with the assistance of the ProjectX simulation team. The utility of the algorithms and visualization system is then demonstrated with examples from several high-order fluid flow simulations.
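
    As an illustration of the volume rendering point, the sketch below evaluates the emission-absorption integral along a ray by sampling the high-order field itself rather than a low-order re-approximation of it. This is a minimal sketch, not ElVis or the dissertation's algorithm: the Legendre basis, the transfer functions, and the midpoint-rule step count are assumptions made for the example.

        import numpy as np
        from numpy.polynomial import legendre

        def field(s, coeffs):
            """High-order scalar field along the ray: a Legendre polynomial
            on [0, 1], evaluated exactly instead of via linear resampling."""
            return legendre.legval(2.0 * s - 1.0, coeffs)

        def volume_render(coeffs, absorb, emit, n=256):
            """Midpoint-rule approximation of the volume rendering integral
            I = \int_0^1 emit(f(s)) exp(-\int_0^s absorb(f(u)) du) ds."""
            ds = 1.0 / n
            s = (np.arange(n) + 0.5) * ds
            f = field(s, coeffs)
            tau = np.cumsum(absorb(f)) * ds  # running optical depth
            return float(np.sum(emit(f) * np.exp(-tau) * ds))

        coeffs = np.array([0.3, 0.5, -0.2, 0.1])  # a made-up degree-3 field
        print(volume_render(coeffs,
                            absorb=lambda f: 2.0 * np.abs(f),
                            emit=lambda f: f ** 2))

    Because the field is polynomial along the ray, the integration error here can be controlled a priori, which is the kind of structure the dissertation's analysis exploits.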

    Optimal Rates for Estimation of Two-Dimensional Totally Positive Distributions

    We study minimax estimation of two-dimensional totally positive distributions. Such distributions pertain to pairs of strongly positively dependent random variables and appear frequently in statistics and probability. In particular, for distributions with $\beta$-Hölder smooth densities, where $\beta \in (0, 2)$, we observe polynomially faster minimax rates of estimation when, additionally, the total positivity condition is imposed. Moreover, we demonstrate fast algorithms to compute the proposed estimators and corroborate the theoretical rates of estimation by simulation studies.
    Comment: 41 pages, 6 figures; accepted for publication in the Electronic Journal of Statistics.
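
    For reference, total positivity here refers to the TP2 condition on the density $f$; a standard statement (the paper's exact formulation should be checked against the full text) is

        f(u \wedge v)\, f(u \vee v) \;\ge\; f(u)\, f(v) \qquad \text{for all } u, v \in \mathbb{R}^2,

    where $\wedge$ and $\vee$ denote the coordinatewise minimum and maximum. For $x_1 \le x_2$ and $y_1 \le y_2$ this reduces to $f(x_1, y_1)\, f(x_2, y_2) \ge f(x_1, y_2)\, f(x_2, y_1)$.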

    Early-Warning Monitoring Systems for Improved Drinking Water Resource Protection


    Real-time Visual Flow Algorithms for Robotic Applications

    Vision offers important sensor cues to modern robotic platforms. Applications such as control of aerial vehicles, visual servoing, simultaneous localization and mapping, navigation and, more recently, learning are examples where visual information is fundamental to accomplishing tasks. However, the use of computer vision algorithms carries the computational cost of extracting useful information from the stream of raw pixel data. The most sophisticated algorithms use complex mathematical formulations, leading typically to computationally expensive and, consequently, slow implementations. Even with modern computing resources, high-speed and high-resolution video feeds can only be used for basic image processing operations. For a vision algorithm to be integrated on a robotic system, the output of the algorithm should be provided in real time, that is, at least at the same frequency as the control logic of the robot. With robotic vehicles becoming more dynamic and ubiquitous, this places higher requirements on the vision processing pipeline.

    This thesis addresses the problem of estimating dense visual flow information in real time. The contributions of this work are threefold. First, it introduces a new filtering algorithm for the estimation of dense optical flow at frame rates as fast as 800 Hz for 640x480 image resolution. The algorithm follows an update-prediction architecture to estimate dense optical flow fields incrementally over time. A fundamental component of the algorithm is the modeling of the spatio-temporal evolution of the optical flow field by means of partial differential equations. Numerical predictors can implement such PDEs to propagate the current flow estimate forward in time. Experimental validation of the algorithm is provided using a high-speed ground-truth image dataset as well as real-life video data at 300 Hz.

    The second contribution is a new type of visual flow named structure flow. Mathematically, structure flow is the three-dimensional scene flow scaled by the inverse depth at each pixel in the image. Intuitively, it is the complete velocity field associated with image motion, including both optical flow and scale change, or apparent divergence, of the image. Analogously to optical flow, structure flow provides a robotic vehicle with perception of the motion of the environment as seen by the camera. However, structure flow encodes the full 3D image motion of the scene, whereas optical flow only encodes the component on the image plane. An algorithm to estimate structure flow from image and depth measurements is proposed, based on the same filtering idea used to estimate optical flow.

    The final contribution is the spherepix data structure for processing spherical images. This data structure is the numerical back-end used for the real-time implementation of the structure flow filter. It consists of a set of overlapping patches covering the surface of the sphere. Each individual patch approximately preserves properties such as orthogonality and equidistance of points, thus allowing efficient implementations of classical low-level 2D convolution-based image processing routines such as Gaussian filters and numerical derivatives. These algorithms are implemented on GPU hardware and can be integrated into future robotic embedded vision systems to provide fast visual information to robotic vehicles.
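
    As a toy illustration of the update-prediction architecture, the sketch below propagates a dense flow field with a simple upwind advection PDE and corrects it with incoming measurements using a constant gain. This is a minimal sketch, not the thesis' filter: the predictor, the gain, and the array shapes are assumptions for the example, and the 800 Hz GPU and spherepix machinery are omitted.

        import numpy as np

        def predict(flow, dt=1.0):
            """Propagate the flow estimate forward in time by advecting it
            along itself: dF/dt + (F . grad)F = 0, discretized with
            first-order upwind differences (dt must satisfy a CFL condition)."""
            fx, fy = flow[..., 0], flow[..., 1]
            out = flow.copy()
            for c in range(2):
                f = flow[..., c]
                dfdx = np.where(fx > 0, f - np.roll(f, 1, axis=1),
                                np.roll(f, -1, axis=1) - f)
                dfdy = np.where(fy > 0, f - np.roll(f, 1, axis=0),
                                np.roll(f, -1, axis=0) - f)
                out[..., c] = f - dt * (fx * dfdx + fy * dfdy)
            return out

        def update(flow_pred, flow_meas, gain=0.3):
            """Correct the prediction with a (possibly noisy) flow measurement."""
            return flow_pred + gain * (flow_meas - flow_pred)

        # Usage with a synthetic constant-flow measurement stream.
        flow = np.zeros((64, 64, 2), dtype=np.float32)
        meas = np.full((64, 64, 2), 0.5, dtype=np.float32)
        for _ in range(10):
            flow = update(predict(flow), meas)
        print(flow.mean(axis=(0, 1)))  # converges toward [0.5, 0.5]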