Calculating principal eigen-functions of non-negative integral kernels: particle approximations and applications
Often in applications such as rare events estimation or optimal control it is
required that one calculates the principal eigen-function and eigen-value of a
non-negative integral kernel. Except in the finite-dimensional case, usually
neither the principal eigen-function nor the eigen-value can be computed
exactly. In this paper, we develop numerical approximations for these
quantities. We show how a generic interacting particle algorithm can be used to
deliver numerical approximations of the eigen-quantities and the associated
so-called "twisted" Markov kernel as well as how these approximations are
relevant to the aforementioned applications. In addition, we study a collection
of random integral operators underlying the algorithm, address some of their
mean and path-wise properties, and obtain error estimates. Finally,
numerical examples are provided in the context of importance sampling for
computing tail probabilities of Markov chains and computing value functions for
a class of stochastic optimal control problems.

Comment: 38 pages, 4 figures, 1 table; to appear in Mathematics of Operations Research
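The eigen-quantities the abstract refers to can be illustrated with a far simpler scheme than the paper's interacting particle algorithm: discretise the kernel on a grid and run power iteration. The Gaussian kernel, grid, and function names below are illustrative assumptions, not taken from the paper; this is a minimal sketch of what "principal eigen-function and eigen-value of a non-negative kernel" means, not the authors' method.

```python
import numpy as np

def principal_eigenpair(K, n_iter=500, tol=1e-12):
    """Power iteration on a non-negative matrix K (a discretised kernel).

    Returns an approximation of the principal eigenvalue and a unit-norm
    principal eigenvector; convergence holds when K has a spectral gap.
    """
    h = np.ones(K.shape[0])      # positive initial guess
    lam = 0.0
    for _ in range(n_iter):
        v = K @ h
        lam_new = np.linalg.norm(v)
        h = v / lam_new          # renormalise each step
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam_new, h

# Illustrative kernel K(x, y) = exp(-(x - y)^2) on a grid over [0, 1],
# with the grid spacing acting as a quadrature weight.
x = np.linspace(0.0, 1.0, 200)
K = np.exp(-(x[:, None] - x[None, :]) ** 2) * (x[1] - x[0])
lam, h = principal_eigenpair(K)
```

For a strictly positive kernel the Perron-Frobenius theorem guarantees that the principal eigenvalue is real and positive and the eigenfunction can be taken positive, which is why the positive starting vector converges to it. The particle method of the paper targets the same pair without forming the full discretised matrix.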
Robust nonparametric detection of objects in noisy images
We propose a novel statistical hypothesis testing method for detection of
objects in noisy images. The method uses results from percolation theory and
random graph theory. We present an algorithm that can detect objects of
unknown shapes in the presence of nonparametric noise of unknown level and of
unknown distribution. No boundary shape constraints are imposed on the object,
only a weak bulk condition for the object's interior is required. The algorithm
has linear complexity and exponential accuracy and is appropriate for real-time
systems. In this paper, we develop further the mathematical formalism of our
method and explore important connections to the mathematical theory of
percolation and statistical physics. We prove results on consistency and
algorithmic complexity of our testing procedure. In addition, we address not
only an asymptotic behavior of the method, but also a finite sample performance
of our test.

Comment: This paper initially appeared in 2010 as EURANDOM Report 2010-049. Link to the abstract at EURANDOM repository: http://www.eurandom.tue.nl/reports/2010/049-abstract.pdf Link to the paper at EURANDOM repository: http://www.eurandom.tue.nl/reports/2010/049-report.pd
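The percolation idea behind the test can be sketched in a few lines: threshold the noisy image and look for an unusually large connected cluster of above-threshold pixels. Under pure subcritical noise, large clusters are exponentially unlikely, so a big cluster signals an object. This is a hedged illustration of the general principle only; the noise level, image size, and decision rule below are assumptions, not the authors' exact test statistic or threshold.

```python
from collections import deque
import numpy as np

def largest_cluster(binary):
    """Size of the largest 4-connected cluster of True pixels (BFS flood fill)."""
    seen = np.zeros_like(binary, dtype=bool)
    rows, cols = binary.shape
    best = 0
    for i in range(rows):
        for j in range(cols):
            if binary[i, j] and not seen[i, j]:
                size, queue = 0, deque([(i, j)])
                seen[i, j] = True
                while queue:
                    r, c = queue.popleft()
                    size += 1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < rows and 0 <= cc < cols
                                and binary[rr, cc] and not seen[rr, cc]):
                            seen[rr, cc] = True
                            queue.append((rr, cc))
                best = max(best, size)
    return best

rng = np.random.default_rng(0)
noise = rng.random((64, 64)) < 0.3      # subcritical site percolation (p_c ~ 0.593)
image = noise.copy()
image[20:30, 20:30] = True              # embedded 10x10 "object"
```

The BFS runs in time linear in the number of pixels, which matches the linear-complexity claim in the abstract; the exponential accuracy comes from the exponential decay of cluster sizes in the subcritical regime.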
Calculating and understanding the value of any type of match evidence when there are potential testing errors
It is well known that Bayes' theorem (with likelihood ratios) can be used to calculate the impact of evidence, such as a "match" of some feature of a person. Typically the feature of interest is the DNA profile, but the method applies in principle to any feature of a person or object, including not just DNA, fingerprints, or footprints, but also more basic features such as skin colour, height, hair colour or even name. Notwithstanding concerns about the extensiveness of databases of such features, a serious challenge to the use of Bayes in such legal contexts is that its standard formulaic representations are not readily understandable to non-statisticians. Attempts to get round this problem usually involve representations based around some variation of an event tree. While this approach works well in explaining the most trivial instance of Bayes' theorem (involving a single hypothesis and a single piece of evidence) it does not scale up to realistic situations. In particular, even with a single piece of match evidence, if we wish to incorporate the possibility that there are potential errors (both false positives and false negatives) introduced at any stage in the investigative process, matters become very complex. As a result we have observed expert witnesses (in different areas of speciality) routinely ignore the possibility of errors when presenting their evidence. To counter this, we produce what we believe is the first full probabilistic solution of the simple case of generic match evidence incorporating both classes of testing errors. Unfortunately, the resultant event tree solution is too complex for intuitive comprehension. And, crucially, the event tree also fails to represent the causal information that underpins the argument. In contrast, we also present a simple-to-construct graphical Bayesian Network (BN) solution that automatically performs the calculations and may also be intuitively simpler to understand.
Although there have been multiple previous applications of BNs for analysing forensic evidence, including very detailed models for the DNA matching problem, these models have not widely penetrated the expert witness community. Nor have they addressed the basic generic match problem incorporating the two types of testing error. Hence we believe our basic BN solution provides an important mechanism for convincing experts, and eventually the legal community, that it is possible to rigorously analyse and communicate the full impact of match evidence on a case, in the presence of possible errors.
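The kind of calculation a BN would automate here can be written out directly for the single-feature case. The sketch below conditions a reported match on both a coincidental true match and a false-positive report; every numeric value (prior, random match probability, error rates) is an illustrative assumption, not a figure from the paper, and the function name is invented for this example.

```python
def posterior_source(prior, match_prob, fpr, fnr):
    """P(suspect is the source | a match is reported), with testing errors.

    prior      : prior probability the suspect is the source
    match_prob : random match probability (an unrelated person shares the feature)
    fpr        : P(match reported | features actually differ)   -- false positive
    fnr        : P(no match reported | features are the same)   -- false negative
    """
    # If the suspect is the source, the features truly match and the
    # test must not miss it.
    p_match_given_source = 1.0 - fnr
    # If not the source: either a coincidental true match that is detected,
    # or a genuine non-match wrongly reported as a match.
    p_match_given_not = match_prob * (1.0 - fnr) + (1.0 - match_prob) * fpr
    num = p_match_given_source * prior
    return num / (num + p_match_given_not * (1.0 - prior))

# Illustrative values only: a tiny random match probability is swamped by
# even a modest false-positive rate, which caps the likelihood ratio.
p = posterior_source(prior=1e-4, match_prob=1e-6, fpr=1e-3, fnr=0.01)
```

With these assumed numbers the likelihood ratio is limited to roughly 1/fpr rather than 1/match_prob, which is precisely the effect the abstract says expert witnesses routinely ignore when testing errors are left out.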
Sensitivity analysis of the reactor safety study
Originally presented as the first author's thesis (M.S.) in the M.I.T. Dept. of Nuclear Engineering, 1979.

The Reactor Safety Study (RSS), or WASH-1400, developed a
methodology for estimating the public risk from light water nuclear
reactors. In order to give further insights into this study,
a sensitivity analysis has been performed to determine the
significant contributors to risk for both the PWR and BWR.
The sensitivity to variation of the point values of the failure
probabilities reported in the RSS was determined for the
safety systems identified therein, as well as for many of the
generic classes from which individual failures contributed to
system failures. Increasing as well as decreasing point values
were considered. An analysis of the sensitivity to increasing
uncertainty in system failure probabilities was also performed.
The sensitivity parameters chosen were release category
probabilities, core melt probability, and the risk parameters of
early fatalities, latent cancers and total property damage.
The latter three are adequate for describing all public risks
identified in the RSS. The results indicate reductions of
public risk by less than a factor of two for factor reductions
in system or generic failure probabilities as high as one hundred.
There also appears to be more benefit in monitoring the most
sensitive systems to verify adherence to RSS failure rates
than in backfitting present reactors. The sensitivity analysis
results do indicate, however, possible benefits in reducing
human error rates.

Final report for a research project sponsored by Northeast Utilities Service Company and Yankee Atomic Electric Company under the M.I.T. Energy Laboratory Electric Utility Program.
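The headline finding, that a factor-of-one-hundred improvement in one failure probability can buy much less than a factor of two in overall risk, follows from simple fault-tree arithmetic whenever other contributors dominate. The toy model below is an illustrative assumption invented for this note, not one of the RSS fault trees or its actual probability values.

```python
def core_melt_prob(p_sys_a, p_sys_b, p_sys_c):
    """Toy model: core melt occurs if any of three independent systems fails."""
    return 1.0 - (1.0 - p_sys_a) * (1.0 - p_sys_b) * (1.0 - p_sys_c)

# Assumed (illustrative) per-demand failure probabilities.
base = core_melt_prob(1e-3, 5e-4, 2e-4)

# Reduce the SMALLEST contributor by a factor of 100: the other two
# systems still dominate, so overall risk barely moves.
improved = core_melt_prob(1e-3, 5e-4, 2e-4 / 100.0)
risk_reduction_factor = base / improved
```

Because the top event probability is approximately the sum of the individual contributions, improving a minor contributor, however drastically, only removes that contributor's small share of the total, which is the mechanism behind the abstract's "less than a factor of two" result.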