Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization
In this paper, we study a class of stochastic optimization problems, referred
to as \emph{Conditional Stochastic Optimization} (CSO), of the form
$\min_{x \in \mathcal{X}} \mathbb{E}_{\xi} f_\xi\big(\mathbb{E}_{\eta|\xi}[g_\eta(x,\xi)]\big)$,
which finds a wide spectrum of applications including portfolio selection,
reinforcement learning, robust learning, and causal inference. Assuming
availability of samples from the distribution $\mathbb{P}(\xi)$ and samples
from the conditional distribution $\mathbb{P}(\eta|\xi)$, we establish the
sample complexity of the sample average approximation (SAA) for CSO under a
variety of structural assumptions, such as Lipschitz continuity, smoothness,
and error bound conditions. We show that the total sample complexity improves
from $\mathcal{O}(d/\epsilon^4)$ to $\mathcal{O}(d/\epsilon^3)$ when the outer
function is smooth, and further to $\mathcal{O}(1/\epsilon^2)$ when the
empirical function satisfies the quadratic growth condition. We also establish
the sample complexity of a modified SAA when $\xi$ and $\eta$ are independent.
Several numerical experiments further support our theoretical findings.
Keywords: stochastic optimization, sample average approximation, large
deviations theory
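The nested SAA estimator described in this abstract can be sketched as follows. This is a minimal illustration on a toy problem of my own choosing (squared-loss outer function, linear inner map), not the paper's experiments; the function names and the instance are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def saa_objective(x, xis, etas_per_xi, f, g):
    """Nested sample average for CSO:
    (1/n) * sum_i f( (1/m) * sum_j g(eta_ij, x, xi_i), xi_i )."""
    vals = []
    for xi, etas in zip(xis, etas_per_xi):
        # inner average estimates E_{eta|xi}[ g_eta(x, xi) ]
        inner = np.mean([g(eta, x, xi) for eta in etas])
        # outer function f_xi applied to the inner estimate
        vals.append(f(inner, xi))
    return float(np.mean(vals))

# Toy instance (illustrative): f_xi(u) = u^2, g_eta(x, xi) = x - xi + eta
f = lambda u, xi: u ** 2
g = lambda eta, x, xi: x - xi + eta
xis = rng.normal(size=20)                                # samples of xi
etas_per_xi = [xi + rng.normal(size=50) for xi in xis]   # eta | xi ~ N(xi, 1)
value = saa_objective(0.5, xis, etas_per_xi, f, g)
```

Note that each outer sample $\xi_i$ carries its own batch of conditional samples $\eta_{ij}$; the total sample complexity results quoted above count both levels.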
Numerical Approximation of Stationary Distribution for SPDEs
In this paper, we show that the exponential integrator scheme, in both its
spatial and temporal discretization, for a class of stochastic partial
differential equations admits a unique stationary distribution whenever the
stepsize is sufficiently small, and we reveal that the weak limit of the law of
the exponential integrator scheme is in fact the stationary distribution of the
stochastic partial differential equation considered.
Stability of stochastic impulsive differential equations: integrating the cyber and the physical of stochastic systems
According to Newton's second law of motion, we humans describe a dynamical
system with a differential equation, which is naturally discretized into a
difference equation whenever a computer is used. The differential equation is
the physical model in human brains and the difference equation the cyber model
in computers for the dynamical system. The physical model refers to the
dynamical system itself (particularly, a human-designed system) in the physical
world and the cyber model symbolises it in the cyber counterpart. This paper
formulates a hybrid model with impulsive differential equations for the
dynamical system, which integrates its physical model in real world/human
brains and its cyber counterpart in computers. The presented results establish
a theoretic foundation for the scientific study of control and communication in
the animal/human and the machine (Norbert Wiener) in the era of the rise of the
machines, as well as a systems science for cyber-physical systems (CPS).
Strong convergence rates for backward Euler–Maruyama method for non-linear dissipative-type stochastic differential equations with super-linear diffusion coefficients
In this work, we generalize the current theory of strong convergence rates for the backward Euler–Maruyama (BEM) scheme for highly non-linear stochastic differential equations, which appear in both mathematical finance and bio-mathematics. More precisely, we show that under a dissipative condition on the drift coefficient and a super-linear growth condition on the diffusion coefficient, the BEM scheme converges with strong order one-half. This type of convergence gives theoretical foundations for efficient variance reduction techniques for Monte Carlo simulations. We support our theoretical results with relevant examples, such as stochastic population models and stochastic volatility models.
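The drift-implicit scheme described in this abstract can be sketched on a model SDE. The equation below is an illustrative choice of mine, not necessarily one from the paper: $dX = (X - X^3)\,dt + |X|^{3/2}\,dW$, which has a dissipative cubic drift and a super-linearly growing diffusion coefficient. Each backward Euler step solves the implicit equation $y = x + \Delta t\,(y - y^3) + |x|^{3/2}\,\Delta W$ for $y$ by Newton iteration.

```python
import numpy as np

rng = np.random.default_rng(2)

def bem_step(x, dt, dw):
    """One backward (drift-implicit) Euler-Maruyama step."""
    rhs = x + np.abs(x) ** 1.5 * dw        # explicit diffusion contribution
    y = x                                  # Newton solve for the implicit drift
    for _ in range(50):
        F = y - dt * (y - y ** 3) - rhs    # residual of the implicit equation
        dF = 1.0 - dt + 3.0 * dt * y ** 2  # derivative; > 0 whenever dt < 1
        y -= F / dF
    return y

dt, T = 1e-3, 1.0
x = 1.0
for _ in range(int(T / dt)):
    x = bem_step(x, dt, rng.normal(scale=np.sqrt(dt)))
```

The implicit treatment of the drift is what allows a stepsize-uniform moment bound despite the super-linear coefficients; an explicit Euler–Maruyama step on the same equation can diverge.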