
    Accelerating model evaluations in uncertainty propagation on tensor grids using computational graph transformations

    Methods such as non-intrusive polynomial chaos (NIPC) and stochastic collocation are frequently used for uncertainty propagation problems. Particularly for low-dimensional problems, these methods often use a tensor-product grid for sampling the space of uncertain inputs. A limitation of this approach is that the number of sample points grows exponentially with the number of uncertain inputs. Current strategies to mitigate computational costs abandon the tensor structure of sampling points with the aim of reducing their overall count. In contrast, our investigation reveals that preserving the tensor structure of sample points can offer distinct advantages in specific scenarios. Notably, by manipulating the computational graph of the targeted model, it is feasible to eliminate redundant evaluations at the operation level, significantly reducing the model evaluation cost on tensor-grid inputs. This paper presents a pioneering method: Accelerated Model evaluations on Tensor grids using Computational graph transformations (AMTC). The core premise of AMTC lies in the strategic modification of the computational graph of the target model to algorithmically remove the repeated evaluations at the operation level. We implemented the AMTC method within the compiler of a new modeling language called the Computational System Design Language (CSDL). We demonstrate the effectiveness of AMTC by using it with the full-grid NIPC method to solve three low-dimensional uncertainty quantification (UQ) problems involving an analytical piston model, a multidisciplinary unmanned aerial vehicle design model, and a multi-point air taxi mission analysis model, respectively. For all of the test problems, AMTC reduces the model evaluation cost by between 50% and 90%, making the full-grid NIPC the most effective of the UQ methods implemented.
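The idea of removing operation-level redundancy on a tensor grid can be illustrated with a minimal sketch. The model `f`, the costly sub-operation `expensive_g`, and the evaluation routines below are hypothetical stand-ins, not the paper's CSDL implementation: if an operation in the computational graph depends only on one input axis, it need only be evaluated once per unique value along that axis rather than once per grid point.

```python
import numpy as np

def expensive_g(x):
    # Stand-in for a costly operation that depends only on x.
    return np.exp(x) + x**2

def naive_tensor_eval(xs, ys):
    """Evaluate f(x, y) = expensive_g(x) * sin(y) on the full tensor
    grid, calling expensive_g once per grid point: len(xs)*len(ys) calls."""
    out = np.empty((len(xs), len(ys)))
    calls = 0
    for i, x in enumerate(xs):
        for j, y in enumerate(ys):
            out[i, j] = expensive_g(x) * np.sin(y)
            calls += 1
    return out, calls

def tensor_structured_eval(xs, ys):
    """Exploit the tensor structure: expensive_g varies only along the
    x axis, so evaluate it once per unique x (len(xs) calls) and
    broadcast the result over the y axis."""
    gx = np.array([expensive_g(x) for x in xs])          # len(xs) calls
    out = gx[:, None] * np.sin(np.asarray(ys))[None, :]  # broadcast
    return out, len(xs)

xs = np.linspace(0.0, 1.0, 5)
ys = np.linspace(0.0, np.pi, 7)
naive, n_naive = naive_tensor_eval(xs, ys)
fast, n_fast = tensor_structured_eval(xs, ys)
assert np.allclose(naive, fast)
```

On this 5x7 grid, the costly operation is invoked 5 times instead of 35, an ~86% reduction, consistent in spirit with the 50%-90% savings the abstract reports; AMTC itself performs this kind of elimination automatically via compiler-level graph transformations rather than by hand.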