Techniques for the Fast Simulation of Models of Highly Dependable Systems
With the ever-increasing complexity and requirements of highly dependable systems, their evaluation during design and operation is becoming more crucial. Realistic models of such systems are often not amenable to analysis using conventional analytic or numerical methods. Therefore, analysts and designers turn to simulation to evaluate these models. However, accurate estimation of dependability measures of these models requires that the simulation frequently observes system failures, which are rare events in highly dependable systems. This renders ordinary simulation impractical for evaluating such systems. To overcome this problem, simulation techniques based on importance sampling have been developed, and are very effective in certain settings. When importance sampling works well, simulation run lengths can be reduced by several orders of magnitude when estimating transient as well as steady-state dependability measures. This paper reviews some of the importance-sampling techniques that have been developed in recent years to estimate dependability measures efficiently in Markov and non-Markov models of highly dependable systems.
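As a rough, self-contained illustration of the variance-reduction idea surveyed above (not of the specific failure-biasing schemes for Markovian dependability models), the sketch below estimates a toy rare-event probability by sampling from a shifted distribution and reweighting with the likelihood ratio; the Gaussian example and all constants are illustrative assumptions.

```python
import numpy as np

# Toy rare-event estimation: p = P(Z > 4) for Z ~ N(0, 1),
# true value ~ 3.17e-5. Failure biasing for Markovian dependability
# models follows the same pattern: sample from a tilted law, then
# reweight each sample by the likelihood ratio.
rng = np.random.default_rng(0)
n, threshold = 100_000, 4.0

# Naive Monte Carlo: almost no samples hit the rare event.
p_naive = np.mean(rng.standard_normal(n) > threshold)

# Importance sampling: shift the sampling mean onto the threshold
# and reweight by the density ratio N(0,1)/N(threshold,1).
x = rng.normal(loc=threshold, size=n)
log_w = -threshold * x + 0.5 * threshold**2
w = np.where(x > threshold, np.exp(log_w), 0.0)
print(f"naive: {p_naive:.1e}  IS: {w.mean():.3e} +/- {w.std(ddof=1)/np.sqrt(n):.1e}")
```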
Optimization and sensitivity analysis of computer simulation models by the score function method
Experimental Design; Simulation; Optimization; Queueing Theory
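The score function (likelihood ratio) method named in this title estimates gradients of a simulated performance measure by multiplying the simulation output by the derivative of the log-density of the input. A minimal sketch, with an exponential input distribution chosen purely for illustration so the exact gradient is available for comparison:

```python
import numpy as np

# Score function (likelihood ratio) gradient estimator:
#   d/dtheta E_theta[f(X)] = E_theta[ f(X) * d/dtheta log p_theta(X) ].
# Here X ~ Exp(rate=theta) and f(x) = x^2, so E[f] = 2/theta^2 and the
# exact gradient is -4/theta^3 (both chosen so the answer is checkable).
rng = np.random.default_rng(1)
theta, n = 1.5, 500_000

x = rng.exponential(scale=1.0 / theta, size=n)
score = 1.0 / theta - x            # d/dtheta log(theta * exp(-theta * x))
grad_est = np.mean(x**2 * score)
print(f"score-function estimate: {grad_est:.4f}, exact: {-4 / theta**3:.4f}")
```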
A Bayesian approach for statistical–physical bulk parameterization of rain microphysics. Part II: Idealized Markov chain Monte Carlo experiments
Observationally informed development of a new framework for bulk rain microphysics, the Bayesian Observationally Constrained Statistical–Physical Scheme (BOSS; described in Part I of this study), is demonstrated. This scheme's development is motivated by large uncertainties in cloud and weather simulations associated with approximations and assumptions in existing microphysics schemes. Here, a proof-of-concept study is presented using a Markov chain Monte Carlo sampling algorithm with BOSS to probabilistically estimate microphysical process rates and parameters directly from a set of synthetically generated rain observations. The framework utilized is an idealized steady-state one-dimensional column rainshaft model with specified column-top rain properties and a fixed thermodynamical profile. Different configurations of BOSS (flexibility being a key feature of this approach) are constrained via synthetic observations generated from a traditional three-moment bulk microphysics scheme. The ability to retrieve correct parameter values when the true parameter values are known is illustrated. For cases when there is no set of true parameter values, the accuracy of configurations of BOSS that have different levels of complexity is compared. It is found that addition of the sixth moment as a prognostic variable improves prediction of the third moment (proportional to bulk rain mass) and rain rate. In contrast, increasing process rate formulation complexity by adding more power terms has little benefit, a result that is explained using further-idealized experiments. BOSS rainshaft simulations constrained by bulk rain observations are shown to estimate the true process rates well, with the additional benefit of rigorously quantified uncertainty in these estimates.
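For readers unfamiliar with the sampling machinery, a minimal random-walk Metropolis sketch of the estimate-parameters-from-synthetic-observations workflow follows; the one-parameter Gaussian setup, prior, and step size are illustrative assumptions standing in for the far richer BOSS rainshaft configuration.

```python
import numpy as np

# Hypothetical one-parameter stand-in for the BOSS setup: synthetic
# observations are generated from a known "true" parameter, and a
# random-walk Metropolis chain samples its posterior.
rng = np.random.default_rng(2)
true_theta = np.log(2.5)                          # assumed process-rate parameter
obs = true_theta + 0.2 * rng.standard_normal(50)  # synthetic observations

def log_post(theta):
    # Gaussian likelihood (sigma = 0.2) plus a weak Gaussian prior.
    return -0.5 * np.sum((obs - theta)**2) / 0.2**2 - 0.5 * theta**2 / 10.0

theta, lp, samples = 0.0, log_post(0.0), []
for _ in range(20_000):
    prop = theta + 0.1 * rng.standard_normal()    # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5_000:])                  # discard burn-in
print(f"true {true_theta:.3f}, posterior {post.mean():.3f} +/- {post.std():.3f}")
```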
Maximum likelihood estimation by Monte Carlo simulation: Toward data-driven stochastic modeling
We propose a gradient-based simulated maximum likelihood method to estimate unknown parameters in a stochastic model without assuming that the likelihood function of the observations is available in closed form. A key element is to develop Monte Carlo-based estimators for the density and its derivatives of the output process, using only knowledge of the dynamics of the model. We present the theory of these estimators and demonstrate how our approach can handle various types of model structures. We also support our findings and illustrate the merits of our approach with numerical results.
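A simplified sketch of the simulated-likelihood idea: when the output density has no closed form, approximate it from simulated draws and maximize the resulting log-likelihood. The paper develops dedicated Monte Carlo estimators for the density and its derivatives; the kernel-density, grid-search version below is only an assumed stand-in that conveys the workflow.

```python
import numpy as np

# Assumed stand-in model: output = theta + noise, with the output density
# treated as unknown. The simulated likelihood replaces the true density
# with a kernel density estimate built from model simulations.
rng = np.random.default_rng(3)

def simulate(theta, n):
    return theta + rng.standard_normal(n)

data = simulate(1.7, 200)              # observations; true theta = 1.7

def sim_loglik(theta, n_sim=5_000, h=0.2):
    draws = simulate(theta, n_sim)
    # Gaussian-kernel density estimate evaluated at each observation.
    z = (data[:, None] - draws[None, :]) / h
    dens = np.exp(-0.5 * z**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))
    return np.log(dens + 1e-300).sum()

# Crude grid search; fresh draws per grid point add Monte Carlo noise,
# which the paper's density/derivative estimators are built to control.
grid = np.linspace(0.5, 3.0, 51)
theta_hat = grid[np.argmax([sim_loglik(t) for t in grid])]
print(f"simulated-MLE estimate: {theta_hat:.2f}")
```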
A relative entropy rate method for path space sensitivity analysis of stationary complex stochastic dynamics
We propose a new sensitivity analysis methodology for complex stochastic dynamics based on the Relative Entropy Rate. The method becomes computationally feasible in the stationary regime of the process and involves the calculation of suitable path-space observables for the Relative Entropy Rate and the corresponding Fisher Information Matrix. The stationary regime is crucial for stochastic dynamics and here allows us to address the sensitivity analysis of complex systems, including processes with complex landscapes that exhibit metastability, non-reversible systems from a statistical mechanics perspective, and high-dimensional, spatially distributed models. All these systems typically exhibit non-Gaussian stationary probability distributions, while in the high-dimensional case, histograms are impossible to construct directly. Our proposed methods bypass these challenges by relying on the direct Monte Carlo simulation of rigorously derived observables for the Relative Entropy Rate and Fisher Information in path space, rather than on the stationary probability distribution itself. We demonstrate the capabilities of the proposed methodology by focusing on two classes of problems: (a) Langevin particle systems with either reversible (gradient) or non-reversible (non-gradient) forcing, highlighting the ability of the method to carry out sensitivity analysis in non-equilibrium systems; and (b) spatially extended Kinetic Monte Carlo models, showing that the method can handle high-dimensional problems.
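A minimal sketch of the path-space idea, assuming a small discrete-time Markov chain in place of the Langevin and Kinetic Monte Carlo systems treated in the paper: the Relative Entropy Rate between a nominal and a perturbed kernel is estimated by averaging the log transition-probability ratio along one long trajectory, never forming the stationary distribution.

```python
import numpy as np

# Three-state chain with an assumed, smoothly parameterized kernel.
rng = np.random.default_rng(4)

def kernel(theta):
    # Rows sum to one for theta in (0, 0.9).
    return np.array([[1 - theta, theta / 2, theta / 2],
                     [0.3,       0.4,       0.3      ],
                     [theta,     0.1,       0.9 - theta]])

theta, eps, T = 0.2, 0.01, 100_000
P, Q = kernel(theta), kernel(theta + eps)

x, acc = 0, 0.0
for _ in range(T):                      # one long stationary trajectory
    y = rng.choice(3, p=P[x])
    acc += np.log(P[x, y] / Q[x, y])    # pathwise RER observable
    x = y

# RER ~ 0.5 * eps^2 * FIM to leading order in the perturbation.
print(f"estimated relative entropy rate: {acc / T:.3e}")
```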
Progressive construction of a parametric reduced-order model for PDE-constrained optimization
An adaptive approach to using reduced-order models as surrogates in PDE-constrained optimization is introduced that breaks the traditional offline-online framework of model order reduction. A sequence of optimization problems constrained by a given Reduced-Order Model (ROM) is defined with the goal of converging to the solution of a given PDE-constrained optimization problem. For each reduced optimization problem, the constraining ROM is trained by sampling the High-Dimensional Model (HDM) at the solutions of some of the previous problems in the sequence. The reduced optimization problems are equipped with a nonlinear trust region based on a residual error indicator to keep the optimization trajectory in a region of the parameter space where the ROM is accurate. A technique for incorporating sensitivities into a Reduced-Order Basis (ROB) is also presented, along with a methodology for computing sensitivities of the reduced-order model that minimize the distance to the corresponding HDM sensitivities in a suitable norm. The proposed reduced optimization framework is applied to subsonic aerodynamic shape optimization and shown to reduce the number of queries to the HDM by a factor of 4-5, compared to solving the optimization problem using only the HDM, with errors in the optimal solution far below 0.1%.
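As a sketch of the projection-based ROM machinery (on an assumed linear parametric system, not the paper's aerodynamic HDM), the snippet below builds a reduced-order basis from HDM snapshots via SVD, solves the Galerkin-projected problem, and evaluates a residual error indicator of the kind that would drive the trust-region radius.

```python
import numpy as np

# Assumed linear parametric "HDM": (A0 + mu*A1) u = b with n = 200.
rng = np.random.default_rng(5)
n = 200
A0 = np.eye(n) + 0.1 * rng.standard_normal((n, n))
A1 = 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

def hdm_solve(mu):
    return np.linalg.solve(A0 + mu * A1, b)

# Snapshots at sampled parameters give the reduced-order basis V.
snapshots = np.column_stack([hdm_solve(mu) for mu in (0.0, 0.5, 1.0)])
V, _, _ = np.linalg.svd(snapshots, full_matrices=False)

def rom_solve(mu):
    A = A0 + mu * A1
    u = V @ np.linalg.solve(V.T @ A @ V, V.T @ b)   # Galerkin projection
    return u, np.linalg.norm(A @ u - b)             # residual error indicator

for mu in (0.25, 2.0):
    u, res = rom_solve(mu)
    print(f"mu={mu}: indicator {res:.2e}, true error "
          f"{np.linalg.norm(u - hdm_solve(mu)):.2e}")
```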
Pathwise Sensitivity Analysis in Transient Regimes
The instantaneous relative entropy (IRE) and the corresponding instantaneous Fisher information matrix (IFIM) for transient stochastic processes are presented in this paper. These novel tools for sensitivity analysis of stochastic models extend the well-known relative entropy rate (RER) and the corresponding Fisher information matrix (FIM) that apply to stationary processes. Three cases are studied here: discrete-time Markov chains, continuous-time Markov chains, and stochastic differential equations. A biological reaction network is presented as a numerical demonstration example.
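A transient counterpart of the stationary RER sketch given earlier: averaging the same log transition-ratio observable over an ensemble of independent paths at each time step yields an instantaneous relative entropy profile that relaxes toward the stationary value. The two-state chain and perturbation below are illustrative assumptions.

```python
import numpy as np

# Two-state chain with an assumed parameterized kernel; all paths start
# in state 0, so the ensemble is genuinely transient at early steps.
rng = np.random.default_rng(6)

def kernel(theta):
    return np.array([[1 - theta, theta],
                     [0.5,       0.5  ]])

theta, eps = 0.3, 0.02
P, Q = kernel(theta), kernel(theta + eps)

n_paths, n_steps = 50_000, 30
x = np.zeros(n_paths, dtype=int)
ire = []
for _ in range(n_steps):
    y = (rng.uniform(size=n_paths) < P[x, 1]).astype(int)  # next states
    ire.append(np.mean(np.log(P[x, y] / Q[x, y])))         # IRE observable
    x = y

print("IRE profile, first 5 steps:", np.round(ire[:5], 6))
```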
- …