Towards Machine Wald
The past century has seen a steady increase in the need to estimate and
predict complex systems and to make (possibly critical) decisions with
limited information. Although computers have made the numerical
evaluation of sophisticated statistical models possible, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} the way \emph{humans}
do when faced with uncertainty is challenging in several major ways:
(1) Finding optimal statistical models has yet to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios along with the space of
relevant information, assumptions, and/or beliefs, tend to be
infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite.
To address these challenges, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification, and Information-Based Complexity.
Comment: 37 pages
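The optimality notion invoked here can, in Wald's decision-theoretic sense, be sketched as a minimax problem over the admissible scenarios (a schematic formulation for orientation, not the paper's precise statement):

```latex
% Worst-case (minimax) optimality of an estimator \theta over an
% admissible set \mathcal{A} of scenarios/measures \mu, under a loss L:
\theta^{\star} \in \operatorname*{arg\,min}_{\theta \in \Theta}\;
\sup_{\mu \in \mathcal{A}} \; \mathbb{E}_{\mu}\!\left[ L(\theta, \mu) \right]
```

The difficulty described in points (1) and (2) is that both $\Theta$ and $\mathcal{A}$ are typically infinite-dimensional, so this optimization must somehow be reduced to finite computation.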
Robust exact differentiators with predefined convergence time
The problem of exactly differentiating a signal with bounded second
derivative is considered. A class of differentiators is proposed that
converge to the derivative of such a signal within a fixed, i.e., finite and
uniformly bounded, convergence time. A tuning procedure is derived that allows
one to assign an arbitrary, predefined upper bound on this convergence time. It is
furthermore shown that this bound can be made arbitrarily tight by appropriate
tuning. The usefulness of the procedure is demonstrated by applying it to the
well-known uniform robust exact differentiator, which the considered class of
differentiators includes as a special case.
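As a minimal illustration of the special case mentioned above, the uniform robust exact differentiator can be simulated with a simple Euler scheme. The gains and step size below are illustrative assumptions, not the tuning procedure of the paper:

```python
import numpy as np

def ured_estimate(f, t_end=5.0, dt=1e-4, k1=2.0, k2=2.0, mu=1.0):
    """Euler simulation of the uniform robust exact differentiator,
        z0' = -k1*phi1(e) + z1,   z1' = -k2*phi2(e),   e = z0 - f(t),
    with phi1(e) = |e|^(1/2) sgn(e) + mu*|e|^(3/2) sgn(e) and
         phi2(e) = (1/2) sgn(e) + 2*mu*e + (3/2)*mu^2*|e|^2 sgn(e).
    Returns the time grid and the derivative estimate z1(t)."""
    n = int(t_end / dt)
    ts = np.arange(n) * dt
    z0, z1 = 0.0, 0.0
    z1_hist = np.empty(n)
    for i, t in enumerate(ts):
        e = z0 - f(t)
        s, a = np.sign(e), abs(e)
        phi1 = a**0.5 * s + mu * a**1.5 * s
        phi2 = 0.5 * s + 2.0 * mu * e + 1.5 * mu**2 * a**2 * s
        z0 += dt * (-k1 * phi1 + z1)   # state tracking the signal
        z1 += dt * (-k2 * phi2)        # state tracking the derivative
        z1_hist[i] = z1
    return ts, z1_hist

# Differentiate f(t) = sin(t); the true derivative is cos(t).
ts, z1 = ured_estimate(np.sin)
err = np.abs(z1 - np.cos(ts))
print(err[ts > 4.0].max())  # small once the differentiator has converged
```

The extra higher-degree terms (weighted by `mu`) are what accelerate convergence from far-away initial conditions; setting `mu = 0` recovers the classical super-twisting differentiator.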
On the existence of a solution to a spectral estimation problem \emph{\`a la} Byrnes-Georgiou-Lindquist
A parametric spectral estimation problem in the style of Byrnes, Georgiou,
and Lindquist was posed in \cite{FPZ-10}, but the existence of a solution was
only proved in a special case. Based on their results, we show that a solution
indeed exists given an arbitrary matrix-valued prior density. The main tool in
our proof is topological degree theory.
Comment: 6 pages, two-column draft; accepted for publication in IEEE-TA
A New Distribution-Free Concept for Representing, Comparing, and Propagating Uncertainty in Dynamical Systems with Kernel Probabilistic Programming
This work presents the concept of kernel mean embedding and kernel
probabilistic programming in the context of stochastic systems. We propose
formulations to represent, compare, and propagate uncertainties for fairly
general stochastic dynamics in a distribution-free manner. The new tools enjoy
sound theory rooted in functional analysis and wide applicability as
demonstrated in distinct numerical examples. This concept suggests a new mode
of thinking about the statistical nature of uncertainty in dynamical systems.
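A minimal sketch of the kernel mean embedding idea (a generic Gaussian-kernel example, not the paper's formulations): two distributions can be compared in a distribution-free way through the RKHS distance between their embeddings, estimated from samples as the maximum mean discrepancy (MMD):

```python
import numpy as np

def mmd2(x, y, bandwidth=1.0):
    """Biased empirical estimate of the squared maximum mean discrepancy
    between the distributions of samples x and y (arrays of shape (n, d)),
    i.e. the squared RKHS distance between their kernel mean embeddings,
    using a Gaussian kernel k(a, b) = exp(-||a - b||^2 / (2 h^2))."""
    def gram(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * bandwidth**2))
    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 2))
y = rng.normal(0.0, 1.0, size=(500, 2))  # same distribution as x
z = rng.normal(2.0, 1.0, size=(500, 2))  # mean-shifted distribution

print(mmd2(x, y))  # near zero: the embeddings nearly coincide
print(mmd2(x, z))  # clearly positive: the embeddings are far apart
```

Because the embedding is a point in a function space, propagating samples through a stochastic dynamical map and re-embedding them gives a distribution-free representation of the propagated uncertainty, which is the mode of use the abstract describes.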