An LTL proof system for runtime verification
We propose a local proof system for LTL formalising deductions within the constraints of Runtime Verification (RV), and show how such a system can be used as a basis for the construction of online runtime monitors. Novel soundness and completeness results are proven for this system. We also prove decidability and incrementality properties for a monitoring algorithm constructed from it. Finally, we relate its expressivity to existing symbolic analysis techniques used in RV.
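The online-monitoring idea behind the abstract can be illustrated independently of the paper's proof system. The sketch below is a hypothetical, generic three-valued monitor (verdicts "true", "false", "inconclusive") for the simple LTL formulas G p (always p) and F p (eventually p) over a growing finite trace; it is not the deduction system the paper constructs.

```python
# Hypothetical sketch (not the paper's proof system): incremental
# three-valued monitors for "G p" and "F p" over a finite trace prefix.

class AlwaysMonitor:
    """Monitor for G p: verdict 'false' as soon as p fails, else 'inconclusive'."""
    def __init__(self):
        self.verdict = "inconclusive"

    def step(self, p_holds):
        if self.verdict == "inconclusive" and not p_holds:
            self.verdict = "false"   # a violation of G p is irrevocable
        return self.verdict

class EventuallyMonitor:
    """Monitor for F p: verdict 'true' as soon as p holds, else 'inconclusive'."""
    def __init__(self):
        self.verdict = "inconclusive"

    def step(self, p_holds):
        if self.verdict == "inconclusive" and p_holds:
            self.verdict = "true"    # satisfaction of F p is irrevocable
        return self.verdict

m = AlwaysMonitor()
verdicts = [m.step(p) for p in [True, True, False, True]]
# verdicts == ["inconclusive", "inconclusive", "false", "false"]
```

Each `step` is O(1), which is the incrementality property one wants from an online monitor: the verdict is updated per event without re-reading the trace.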
Average case analysis of marking algorithms
The Lindstrom marking algorithm uses bounded workspace. Its time complexity is O(n^2) in all cases, but it has been assumed that its average-case time complexity is O(n lg n). It is proven that the average-case time complexity is Θ(n^2). Similarly, the average size of the Wegbreit bit stack is shown to be Θ(n).
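Lindstrom's algorithm belongs to the family of pointer-reversal marking schemes that descend into a graph by temporarily reversing the links they traverse, so only O(1) extra space is needed. The sketch below is the generic Deutsch-Schorr-Waite scheme with an explicit tag bit per node, not Lindstrom's tag-free variant; it is included only to illustrate bounded-workspace marking.

```python
# Hedged sketch: Deutsch-Schorr-Waite pointer-reversal marking (the family
# Lindstrom's algorithm refines). Marks a binary graph using O(1) extra
# space by reversing the links it descends through, then restoring them.

class Node:
    def __init__(self):
        self.left = None
        self.right = None
        self.marked = False
        self.tag = 0   # which child link is currently reversed (0=left, 1=right)

def dsw_mark(root):
    prev, cur = None, root
    while cur is not None:
        cur.marked = True
        if cur.left is not None and not cur.left.marked:
            # descend left, reversing the link to remember the way back
            cur.tag, nxt = 0, cur.left
            cur.left = prev
            prev, cur = cur, nxt
        elif cur.right is not None and not cur.right.marked:
            # descend right, reversing that link instead
            cur.tag, nxt = 1, cur.right
            cur.right = prev
            prev, cur = cur, nxt
        else:
            # retreat: restore the parent's reversed link and move up
            if prev is None:
                break
            if prev.tag == 0:
                nxt, prev.left = prev.left, cur
            else:
                nxt, prev.right = prev.right, cur
            cur, prev = prev, nxt

# small graph with sharing: root -> a, b and a -> b
root, a, b = Node(), Node(), Node()
root.left, root.right, a.right = a, b, b
dsw_mark(root)
```

After the traversal every reachable node is marked and all reversed links have been restored, which is exactly the invariant the average-case analysis of such algorithms has to account for.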
Discrete Fourier analysis of multigrid algorithms
The main topic of this report is a detailed discussion of the discrete Fourier multilevel analysis of multigrid algorithms. First, a brief overview of multigrid methods is given for discretizations of both linear and nonlinear partial differential equations. Special attention is given to the hp-Multigrid as Smoother algorithm, a new algorithm suitable for higher order accurate discontinuous Galerkin discretizations of advection-dominated flows. In order to analyze the performance of the multigrid algorithms, the error transformation operators for several linear multigrid algorithms are derived. The operator norm and spectral radius of the multigrid error transformation are then computed using discrete Fourier analysis. First, the main operations in the discrete Fourier analysis are defined, including the aliasing of modes. Next, the Fourier symbols of the multigrid operators are computed and used to obtain the Fourier symbol of the multigrid error transformation operator. In the multilevel analysis, two- and three-level h-multigrid is considered, for both uniformly and semi-coarsened meshes, as is the hp-Multigrid as Smoother algorithm for three polynomial levels and three uniformly and semi-coarsened meshes. The report concludes with a discussion of the multigrid operator norm and spectral radius. In the appendix some useful auxiliary results are summarized.
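The core computation in this kind of analysis can be illustrated on the textbook case rather than the DG discretizations of the report: for damped Jacobi applied to the 1D Poisson stencil (-1, 2, -1), the Fourier symbol of the error transformation is S(θ) = 1 - ω(1 - cos θ), and the smoothing factor is its maximum modulus over the high frequencies |θ| ∈ [π/2, π]. This is a standard result, not taken from the report.

```python
# Illustrative local Fourier analysis for damped Jacobi on 1D Poisson
# (textbook example, not the report's DG discretization).
import math

def jacobi_symbol(theta, omega):
    # Fourier symbol of the damped-Jacobi error operator for stencil (-1, 2, -1)
    return 1.0 - omega * (1.0 - math.cos(theta))

def smoothing_factor(omega, samples=10001):
    # maximum modulus of the symbol over the high-frequency range [pi/2, pi]
    thetas = [math.pi / 2 + i * (math.pi / 2) / (samples - 1) for i in range(samples)]
    return max(abs(jacobi_symbol(t, omega)) for t in thetas)

mu = smoothing_factor(2.0 / 3.0)
# classic result: smoothing factor 1/3 at the optimal damping omega = 2/3
```

The same recipe, symbol of each operator followed by a maximum over the relevant frequency range, is what the report carries out for the full two- and three-level error transformation operators.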
Performance Analysis of Cone Detection Algorithms
Many algorithms have been proposed to help clinicians evaluate cone density
and spacing, as these may be related to the onset of retinal diseases. However,
there has been no rigorous comparison of the performance of these algorithms.
In addition, the performance of such algorithms is typically determined by
comparison with human observers. Here we propose a technique to simulate
realistic images of the cone mosaic. We use the simulated images to test the
performance of two popular cone detection algorithms and we introduce an
algorithm which is used by astronomers to detect stars in astronomical images.
We use Free-response Receiver Operating Characteristic (FROC) curves to evaluate and
compare the performance of the three algorithms. This allows us to optimize the
performance of each algorithm. We observe that performance is significantly
enhanced by up-sampling the images. We investigate the effect of noise and
image quality on cone mosaic parameters estimated using the different
algorithms, finding that the estimated regularity is the most sensitive
parameter.
This paper was published in JOSA A and is made available as an electronic
reprint with the permission of OSA. The paper can be found at the following URL
on the OSA website: http://www.opticsinfobase.org/abstract.cfm?msid=224577.
Systematic or multiple reproduction or distribution to multiple locations via
electronic or other means is prohibited and is subject to penalties under law.
Comment: 13 pages, 7 figures, 2 tables
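A single operating point on an FROC curve comes from matching detections to ground-truth positions within a tolerance radius and reporting the true-positive fraction against the false-positive count. The sketch below is a hypothetical greedy nearest-match version of that bookkeeping; the coordinates and radius are illustrative, not taken from the paper.

```python
# Hypothetical sketch of one FROC operating point: greedily match each
# detection to the nearest unmatched ground-truth cone within `radius`.
import math

def froc_point(truth, detections, radius):
    unmatched = list(truth)
    tp = 0
    for d in detections:
        best = None
        for t in unmatched:
            dist = math.hypot(d[0] - t[0], d[1] - t[1])
            if dist <= radius and (best is None or dist < best[0]):
                best = (dist, t)
        if best is not None:
            unmatched.remove(best[1])   # each true cone matches at most once
            tp += 1
    fp = len(detections) - tp
    sensitivity = tp / len(truth) if truth else 0.0
    return sensitivity, fp

truth = [(10, 10), (20, 20), (30, 30)]   # illustrative cone centres
dets = [(10, 11), (21, 20), (50, 50)]    # two hits, one spurious detection
sens, fp = froc_point(truth, dets, radius=2.0)
# sens == 2/3, fp == 1
```

Sweeping the detector's confidence threshold and recomputing this pair traces out the full FROC curve used to compare algorithms.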
A complexity analysis of statistical learning algorithms
We apply information-based complexity analysis to support vector machine
(SVM) algorithms, with the goal of a comprehensive continuous algorithmic
analysis of such algorithms. This involves complexity measures in which some
higher order operations (e.g., certain optimizations) are considered primitive
for the purposes of measuring complexity. We consider classes of information
operators and algorithms made up of scaled families, and investigate the
utility of scaling the complexities to minimize error. We look at the division
of statistical learning into information and algorithmic components, at the
complexities of each, and at applications to SVM and more general machine
learning algorithms. We give applications to SVM algorithms graded into linear
and higher-order components, and give an example in biomedical informatics.
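For concreteness, the linear component of an SVM can be sketched as subgradient descent on the regularized hinge loss (a Pegasos-style scheme). This is a generic illustration of a linear SVM in plain Python, not the information-based complexity framework the abstract describes; the toy data and parameters are assumptions.

```python
# Illustrative sketch only: a tiny linear SVM trained by subgradient
# descent on lam/2*||w||^2 + hinge loss (Pegasos-style step sizes).
import random

def train_linear_svm(data, lam=0.01, epochs=200, seed=0):
    rng = random.Random(seed)
    w, b, t = [0.0, 0.0], 0.0, 0
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            t += 1
            eta = 1.0 / (lam * t)             # decreasing step size
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            w = [wi * (1 - eta * lam) for wi in w]   # shrink from the regularizer
            if margin < 1:                    # hinge-loss subgradient is active
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# linearly separable toy data: class +1 roughly above the line x1 + x2 = 0
data = [((1.0, 1.0), 1), ((2.0, 0.5), 1), ((-1.0, -1.0), -1), ((-0.5, -2.0), -1)]
w, b = train_linear_svm(list(data))
```

In the abstract's terms, the hinge-loss minimization here is the kind of higher-order operation that an information-based complexity analysis may treat as a primitive.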