A sequential sampling strategy for extreme event statistics in nonlinear dynamical systems
We develop a method for the evaluation of extreme event statistics associated
with nonlinear dynamical systems, using a small number of samples. From an
initial dataset of design points, we formulate a sequential strategy that
provides the 'next-best' data point (set of parameters) which, when evaluated,
results in improved estimates of the probability density function (pdf) for a
scalar quantity of interest. The approach utilizes Gaussian process regression
to perform Bayesian inference on the parameter-to-observation map describing
the quantity of interest. We then approximate the desired pdf along with
uncertainty bounds utilizing the posterior distribution of the inferred map.
The 'next-best' design point is sequentially determined through an optimization
procedure that selects the point in parameter space that maximally reduces
uncertainty between the estimated bounds of the pdf prediction. Since the
optimization process utilizes only information from the inferred map, it has
minimal computational cost. Moreover, the special form of the metric emphasizes
the tails of the pdf. The method is practical for systems where the
dimensionality of the parameter space is of moderate size, i.e., of order O(10). We
apply the method to estimate the extreme event statistics for a very
high-dimensional system with millions of degrees of freedom: an offshore
platform subjected to three-dimensional irregular waves. It is demonstrated
that the developed approach can accurately determine the extreme event
statistics using a limited number of samples.
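As a rough illustration of this kind of sequential strategy, the sketch below implements a GP-based acquisition loop in Python. The acquisition rule (predictive standard deviation weighted by how far the predicted output lies from the bulk of the observed data) is a simple stand-in for the paper's tail-weighted pdf-uncertainty metric, and `expensive_model` is a hypothetical placeholder for the true parameter-to-observation map.

```python
# Minimal sketch of a GP-based sequential sampling loop (Python /
# scikit-learn). The acquisition rule below is a hypothetical stand-in
# for the paper's tail-weighted pdf-uncertainty metric.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(theta):
    # Placeholder for the true parameter-to-observation map, e.g. one
    # full simulation of the dynamical system.
    return np.sin(3.0 * theta[0]) + 0.5 * theta[1] ** 2

def fit_gp(X, y):
    return GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True).fit(X, y)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(8, 2))        # initial design points
y = np.array([expensive_model(x) for x in X])

for _ in range(20):                            # sequential acquisition loop
    gp = fit_gp(X, y)
    cand = rng.uniform(-1.0, 1.0, size=(2000, 2))
    mu, sd = gp.predict(cand, return_std=True)
    # Tail emphasis (assumption): upweight candidates whose predicted
    # output lies far from the bulk of the observed values.
    w = 1.0 + np.abs(mu - np.median(y)) / (np.std(y) + 1e-12)
    x_next = cand[np.argmax(w * sd)]           # 'next-best' design point
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_model(x_next))

# Approximate the pdf of the quantity of interest from the surrogate.
gp = fit_gp(X, y)
out = gp.predict(rng.uniform(-1.0, 1.0, size=(100_000, 2)))
pdf, edges = np.histogram(out, bins=80, density=True)
```

Note that each pass through the loop calls the true map only once; the thousands of candidate evaluations touch only the cheap surrogate, which is what keeps the acquisition step inexpensive.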
Synthesis of Minimal Error Control Software
Software implementations of controllers for physical systems are at the core
of many embedded systems. The design of controllers uses the theory of
dynamical systems to construct a mathematical control law that ensures that the
controlled system has certain properties, such as asymptotic convergence to an
equilibrium point, while optimizing some performance criteria. However, owing
to quantization errors arising from the use of fixed-point arithmetic, the
implementation of this control law can only guarantee practical stability:
under the actions of the implementation, the trajectories of the controlled
system converge to a bounded set around the equilibrium point, and the size of
the bounded set is proportional to the error in the implementation. The problem
of verifying whether a controller implementation achieves practical stability
for a given bounded set has been studied before. In this paper, we change the
emphasis from verification to automatic synthesis. With synthesis, the need
for formal verification can be considerably reduced, thereby reducing both the
design time and the design cost of embedded control software.
We give a methodology and a tool to synthesize embedded control software that
is Pareto optimal w.r.t. both performance criteria and practical stability
regions. Our technique is a combination of static analysis to estimate
quantization errors for specific controller implementations and stochastic
local search over the space of possible controllers using particle swarm
optimization. The effectiveness of our technique is illustrated using examples
of various standard control systems: in most examples, we obtain controllers
whose performance is close to that of LQR-LQG designs but whose implementation
errors, and hence regions of practical stability, are several times smaller.
Comment: 18 pages, 2 figures
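A minimal sketch of the stochastic-search half of such a pipeline is shown below: particle swarm optimization over controller gains, with the static analysis of fixed-point quantization errors abstracted into a simple rounding-error proxy, and the Pareto search scalarized into one weighted fitness. All names and constants (`performance_cost`, `quantization_error`, the nominal gains) are illustrative assumptions, not the paper's actual tool.

```python
# Minimal PSO sketch for controller-gain synthesis (Python). Both cost
# functions are illustrative assumptions: the performance cost stands in
# for an LQR-LQG-style measure, and the quantization-error proxy stands
# in for the paper's static analysis of fixed-point implementations.
import numpy as np

def performance_cost(K):
    # Toy closed-loop performance measure (distance from a nominal gain).
    return float(np.sum((K - np.array([1.2, -0.7])) ** 2))

def quantization_error(K, frac_bits=8):
    # Worst-case rounding error of the gains stored in fixed point with
    # `frac_bits` fractional bits (proxy for static error analysis).
    step = 2.0 ** (-frac_bits)
    return float(np.sum(np.abs(K - np.round(K / step) * step)))

def fitness(K):
    # Scalarized objective; the paper's Pareto-optimal search would
    # instead keep the two criteria separate.
    return performance_cost(K) + 10.0 * quantization_error(K)

rng = np.random.default_rng(1)
n, dim = 30, 2
pos = rng.uniform(-3.0, 3.0, size=(n, dim))    # particle positions = gains
vel = np.zeros((n, dim))
pbest = pos.copy()                             # per-particle best positions
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]              # swarm-wide best position

for _ in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print("synthesized gains:", gbest)
```

Because the quantization penalty pulls the gains toward exactly representable fixed-point values, the swarm trades a little nominal performance for a smaller implementation error, which is the tradeoff the synthesis approach exploits.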
Statistical and Computational Tradeoff in Genetic Algorithm-Based Estimation
When a Genetic Algorithm (GA), or a stochastic algorithm in general, is
employed in a statistical problem, the result is affected both by variability
due to sampling, which reflects the fact that only a sample is observed, and by
variability due to the stochastic elements of the algorithm. This topic fits
naturally into the framework of the statistical and computational tradeoff, a
question that is crucial in modern problems, where statisticians must carefully
balance the statistical and the computational parts of the analysis while
taking account of resource or time constraints. In the present work we analyze
estimation problems tackled by GAs, for which the variability of the estimates
can be decomposed into these two sources, under constraints expressed as cost
functions related to both data acquisition and the runtime of the algorithm.
Simulation studies are presented to discuss the statistical and computational
tradeoff question.
Comment: 17 pages, 5 figures
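A minimal sketch of this decomposition, on a deliberately toy problem, is shown below: a GA-style estimator is rerun with many seeds on each of many samples, and the law of total variance splits the overall estimator variance into a sampling component and an algorithmic component. The hill-climbing `ga_estimate` is a hypothetical stand-in for a real GA.

```python
# Minimal sketch of the two-source variance decomposition (Python).
# `ga_estimate` is a toy stand-in for a GA-based estimator: a
# random-mutation hill climb for a location parameter.
import numpy as np

def ga_estimate(sample, seed, iters=200):
    r = np.random.default_rng(seed)
    theta, target = 0.0, sample.mean()
    for _ in range(iters):
        cand = theta + r.normal(0.0, 0.5)      # random mutation
        if abs(cand - target) < abs(theta - target):
            theta = cand                       # accept if fitter
    return theta

rng = np.random.default_rng(2)
n_samples, n_runs = 50, 20
est = np.empty((n_samples, n_runs))
for i in range(n_samples):                     # variability due to sampling
    sample = rng.normal(loc=1.0, scale=2.0, size=100)
    for j in range(n_runs):                    # variability due to the GA
        est[i, j] = ga_estimate(sample, seed=1000 * i + j)

# Law of total variance: total = sampling part + algorithmic part.
sampling_var = est.mean(axis=1).var()          # variance of per-sample means
algorithmic_var = est.var(axis=1).mean()       # mean of within-sample variances
print(sampling_var, algorithmic_var, est.var())
```

With NumPy's default ddof=0, the two components sum exactly to the total variance, so shifting budget between sample size and GA runtime shows up directly as a shift between the two printed terms.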