On the subdivision strategy in adaptive quadrature algorithms
Abstract: The subdivision procedure used in most available adaptive quadrature codes is a simple bisection of the chosen interval: the interval is divided into two equally sized parts. In this paper we present a subdivision strategy which gives three unequally sized parts. The subdivision points are found using only available information. The strategy has been implemented in the QUADPACK code DQAG and tested using the “performance profile” testing technique. We present test results showing a significant reduction in the number of function evaluations compared to the standard bisection procedure on most test families of integrands.
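For context, the bisection baseline this paper improves on looks roughly like the following adaptive Simpson sketch. This is a hedged illustration: the function name, interface, and Richardson-style error estimate are illustrative assumptions, not the DQAG implementation.

```python
# Hedged sketch of the standard bisection strategy in adaptive
# quadrature: the interval whose estimate is not accurate enough is
# split at its midpoint into two equally sized parts. The interface
# and the Simpson-based error estimate are illustrative, not DQAG.
def adaptive_quad(f, a, b, tol=1e-8):
    def simpson(a, b):
        m = 0.5 * (a + b)
        return (b - a) / 6.0 * (f(a) + 4.0 * f(m) + f(b))

    def recurse(a, b, whole, tol):
        m = 0.5 * (a + b)                      # bisection: split at midpoint
        left, right = simpson(a, m), simpson(m, b)
        err = (left + right - whole) / 15.0    # Richardson-style estimate
        if abs(err) <= tol:
            return left + right + err
        return recurse(a, m, left, tol / 2.0) + recurse(m, b, right, tol / 2.0)

    return recurse(a, b, simpson(a, b), tol)
```

The paper's contribution replaces the midpoint split above with three subdivision points chosen from information the routine already has.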
On Probabilistic Parallel Programs with Process Creation and Synchronisation
We initiate the study of probabilistic parallel programs with dynamic process
creation and synchronisation. To this end, we introduce probabilistic
split-join systems (pSJSs), a model for parallel programs, generalising both
probabilistic pushdown systems (a model for sequential probabilistic procedural
programs which is equivalent to recursive Markov chains) and stochastic
branching processes (a classical mathematical model with applications in
various areas such as biology, physics, and language processing). Our pSJS
model allows for a possibly recursive spawning of parallel processes; the
spawned processes can synchronise and return values. We study the basic
performance measures of pSJSs, especially the distribution and expectation of
space, work and time. Our results extend and improve previously known results
on the subsumed models. We also show how to do performance analysis in
practice, and present two case studies illustrating the modelling power of
pSJSs.

Comment: This is a technical report accompanying a TACAS'11 paper.
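The kind of performance measure studied here can be illustrated with a toy stochastic branching process, a deliberately simplified stand-in for the pSJS model; the offspring distribution and parameter values below are invented for illustration only.

```python
import random

# Toy branching process (a simplification, not the pSJS model itself):
# each process terminates or spawns two children with probability p.
# "Work" counts all processes ever created; "time" is the spawn-tree
# depth. For a subcritical process the expected work is 1 / (1 - 2p).
def simulate(p, rng, max_work=10**6):
    work, depth, frontier = 0, 0, 1
    while frontier and work < max_work:
        work += frontier
        # each live process independently spawns two children with prob. p
        frontier = sum(2 for _ in range(frontier) if rng.random() < p)
        depth += 1
    return work, depth

rng = random.Random(0)
runs = [simulate(0.4, rng) for _ in range(20000)]
mean_work = sum(w for w, _ in runs) / len(runs)
# subcritical case: expected work is 1 / (1 - 2p), i.e. 5.0 for p = 0.4
```

The paper computes such distributions and expectations analytically rather than by simulation, and for the richer model with synchronisation and return values.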
Increasing the Reliability of Adaptive Quadrature Using Explicit Interpolants
We present two new adaptive quadrature routines. Both routines differ from
previously published algorithms in many aspects, most significantly in how they
represent the integrand, how they treat non-numerical values of the integrand,
how they deal with improper divergent integrals and how they estimate the
integration error. The main focus of these improvements is to increase the
reliability of the algorithms without significantly impacting their efficiency.
Both algorithms are implemented in Matlab and tested using both the "families"
suggested by Lyness and Kaganove and the battery test used by Gander and
Gautschi and Kahaner. They are shown to be more reliable, albeit in some cases
less efficient, than other commonly-used adaptive integrators.

Comment: 32 pages, submitted to ACM Transactions on Mathematical Software.
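As a hedged illustration of one issue the routines address, non-numerical integrand values, a quadrature rule can guard each evaluation as sketched below. The zero-replacement policy shown is one common remedy, not necessarily the treatment the authors chose.

```python
import math

# Hedged sketch: replace NaN/Inf samples with zero so isolated
# non-numerical evaluations do not poison the quadrature sum.
# This policy is an assumption, not the paper's actual treatment.
def safe_eval(f, x):
    y = f(x)
    return y if math.isfinite(y) else 0.0

def composite_midpoint(f, a, b, n=20000):
    h = (b - a) / n
    return h * sum(safe_eval(f, a + (i + 0.5) * h) for i in range(n))

# integral of 4 / (1 + x^2) over [0, 1] is pi
approx = composite_midpoint(lambda x: 4.0 / (1.0 + x * x), 0.0, 1.0)
```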
Compositional Solution Space Quantification for Probabilistic Software Analysis
Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvements over previous approaches in both accuracy and analysis time.
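The core idea of pruning regions with interval reasoning and concentrating samples on the remainder can be sketched on a toy constraint. The constraint, grid size, and sample budget below are illustrative assumptions, not the paper's algorithm.

```python
import random

# Hedged sketch of stratified solution-space quantification: partition
# the bounded input box, classify each sub-box with simple interval
# arithmetic as fully satisfying, fully violating, or undecided for
# the constraint x^2 + y^2 <= r2, and spend Monte Carlo samples only
# on the undecided boxes. All parameters here are illustrative.
def interval_sq(lo, hi):
    # range of t * t for t in [lo, hi]
    hi_sq = max(lo * lo, hi * hi)
    lo_sq = 0.0 if lo <= 0.0 <= hi else min(lo * lo, hi * hi)
    return lo_sq, hi_sq

def estimate_prob(r2=0.25, n=8, samples=2000, seed=0):
    rng = random.Random(seed)
    h = 2.0 / n                        # domain is [-1, 1] x [-1, 1]
    box_frac = h * h / 4.0             # one box as a fraction of the domain
    prob = 0.0
    for i in range(n):
        for j in range(n):
            xlo, ylo = -1.0 + i * h, -1.0 + j * h
            sx_lo, sx_hi = interval_sq(xlo, xlo + h)
            sy_lo, sy_hi = interval_sq(ylo, ylo + h)
            if sx_hi + sy_hi <= r2:    # whole box satisfies the constraint
                prob += box_frac
            elif sx_lo + sy_lo > r2:   # whole box violates it: prune
                continue
            else:                      # undecided: sample inside the box
                hits = sum(1 for _ in range(samples)
                           if (xlo + rng.random() * h) ** 2
                           + (ylo + rng.random() * h) ** 2 <= r2)
                prob += box_frac * hits / samples
    return prob

p = estimate_prob()   # exact answer is pi * r2 / 4, about 0.19635
```

Because only boundary boxes are sampled, the estimator's variance is far lower than that of plain Monte Carlo over the whole domain for the same sample budget.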
Time-dependent reliability methodologies with saddlepoint approximation
Engineers routinely encounter time-dependent uncertainties, such as the random deterioration of material properties and time-variant loads; the reliability of engineering systems is therefore time-dependent. Given the possibly catastrophic consequences of a failure, it is crucial to predict the time-dependent reliability at the design stage. Although extensive research has been conducted on reliability analysis, estimating reliability accurately and efficiently remains challenging. The objective of this work is to develop accurate and efficient reliability methodologies for engineering design. The basic idea is the integration of traditional reliability methods with saddlepoint approximation (SPA), which can accurately approximate the tail distribution of a random variable. Four methods are proposed in this work. The first three deal with time-independent reliability, while the last estimates time-dependent reliability. The first method combines SPA with first-order approximation and achieves higher accuracy than the traditional first-order reliability method when bimodal distributions are involved. The second method further improves the accuracy of reliability estimation by integrating SPA with a second-order approximation. The third method extends the second into reliability-based design for higher accuracy, maintaining high efficiency through an efficient algorithm for searching for an equivalent reliability index. The fourth method uses sequential efficient global optimization to convert a time-dependent problem into a time-independent counterpart; the second method is then used to estimate the time-independent reliability after the conversion. The accuracy and effectiveness of these methods are demonstrated by both numerical examples and engineering applications. --Abstract, page iv
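To illustrate why SPA captures tail distributions so accurately, here is the classical Lugannani-Rice saddlepoint formula applied to a sum of exponential random variables, compared against the exact Gamma tail. This is a textbook example, not one of the four proposed methods.

```python
import math

# Textbook illustration of saddlepoint approximation (SPA): the
# Lugannani-Rice formula for the tail of a sum of n iid Exp(1)
# variables, whose cumulant generating function is K(t) = -n*log(1-t),
# compared with the exact Gamma(n, 1) tail.
def spa_tail(n, s):
    """Saddlepoint estimate of P(X1 + ... + Xn > s); requires s != n (the mean)."""
    t_hat = 1.0 - n / s                    # solves K'(t) = n / (1 - t) = s
    K = -n * math.log(1.0 - t_hat)         # CGF at the saddlepoint
    K2 = n / (1.0 - t_hat) ** 2            # K''(t_hat)
    w = math.copysign(math.sqrt(2.0 * (t_hat * s - K)), t_hat)
    u = t_hat * math.sqrt(K2)
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    phi = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    return 1.0 - Phi(w) + phi(w) * (1.0 / u - 1.0 / w)

def exact_tail(n, s):
    # survival function of Gamma(n, 1) for integer n
    return math.exp(-s) * sum(s ** k / math.factorial(k) for k in range(n))
```

For n = 5 and s = 10 the saddlepoint value agrees with the exact tail (about 0.0293) to better than 10^-4, which is the kind of tail accuracy the proposed methods exploit.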