    Derivatives of the Stochastic Growth Rate

    We consider stochastic matrix models for populations driven by random environments which form a Markov chain. The top Lyapunov exponent a, which describes the long-term growth rate, depends smoothly on the demographic parameters (represented as matrix entries) and on the parameters that define the stochastic matrix of the driving Markov chain. The derivatives of a -- the "stochastic elasticities" -- with respect to changes in the demographic parameters were derived by Tuljapurkar (1990). These results are here extended to a formula for the derivatives with respect to changes in the Markov chain driving the environments. We supplement these formulas with rigorous bounds on computational estimation errors, and with rigorous derivations of both the new and the old formulas.
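
    As a rough illustration of the quantity whose derivatives are studied, the sketch below estimates the top Lyapunov exponent a by Monte Carlo, as the average log growth of a renormalized population vector along one simulated environment path. The two projection matrices, the transition matrix P, and all rates are invented for illustration; this is not the paper's method, which gives exact derivative formulas with error bounds.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical 2x2 demographic projection matrices, one per environment state.
        A = [np.array([[0.5, 1.2],
                       [0.3, 0.8]]),
             np.array([[0.2, 0.6],
                       [0.1, 0.4]])]

        # Hypothetical transition matrix of the environment-driving Markov chain.
        P = np.array([[0.9, 0.1],
                      [0.2, 0.8]])

        def lyapunov_estimate(T=200_000):
            """a = lim (1/T) sum_t log ||A(e_t) v_t||, with v renormalized each step."""
            state, v, log_growth = 0, np.ones(2) / 2.0, 0.0
            for _ in range(T):
                v = A[state] @ v
                r = np.linalg.norm(v)
                log_growth += np.log(r)
                v /= r                                # renormalize to avoid overflow
                state = rng.choice(2, p=P[state])     # step the driving Markov chain
            return log_growth / T

        print("estimated long-term growth rate a:", lyapunov_estimate())

    Derivatives of a with respect to entries of P could then be approximated by finite differences of such estimates, though these estimates carry exactly the kind of computational error the paper bounds rigorously.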

    Granular synthesis for display of time-varying probability densities

    We present a method for displaying time-varying probabilistic information to users using an asynchronous granular synthesis technique. We extend the basic synthesis technique to include distributions over waveform source, spatial position, pitch, and time inside waveforms. To enhance the synthesis in interactive contexts, we "quicken" the display by integrating predictions of user behaviour into the sonification. This includes summing the derivatives of the distribution during exploration of static densities, and using Monte Carlo sampling to predict future user states in nonlinear dynamic systems. These techniques can be used to improve user performance in continuous control systems and in the interactive exploration of high-dimensional spaces. The technique provides feedback on users' potential goals and their progress toward achieving them; modulating the feedback with quickening can help shape users' actions toward achieving these goals. We have applied these techniques to a simple nonlinear control problem as well as to the sonification of on-line probabilistic gesture recognition. We are applying these displays to mobile, gestural interfaces, where visual display is often impractical. The granular synthesis approach is theoretically elegant and easily applied in contexts where dynamic probabilistic displays are required.
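
    A minimal sketch of the asynchronous granular idea, with invented grain parameters (the authors' actual synthesis engine and parameter mappings are not reproduced here): grains are scattered at random onsets, each with a pitch drawn from a time-varying Gaussian density, so the audible spread of the grain cloud directly displays the uncertainty at each moment.

        import numpy as np

        SR = 44_100  # sample rate in Hz

        def grain(freq, dur_s):
            """A single Hann-windowed sine grain."""
            t = np.arange(int(dur_s * SR)) / SR
            return np.hanning(t.size) * np.sin(2 * np.pi * freq * t)

        def sonify(mean_pitch, std_pitch, seconds=2.0, grains_per_s=200, seed=0):
            """Asynchronous granular display of a time-varying Gaussian pitch density."""
            rng = np.random.default_rng(seed)
            out = np.zeros(int(seconds * SR))
            for _ in range(int(seconds * grains_per_s)):
                onset = rng.uniform(0.0, seconds - 0.05)            # asynchronous onsets
                f = rng.normal(mean_pitch(onset), std_pitch(onset)) # pitch ~ density at onset
                g = grain(max(f, 20.0), 0.04)
                i = int(onset * SR)
                out[i:i + g.size] += g
            return out / np.max(np.abs(out))

        # Invented density: the mean glides upward while the spread shrinks,
        # so the cloud audibly "focuses" as the displayed uncertainty collapses.
        audio = sonify(mean_pitch=lambda t: 300.0 + 200.0 * t,
                       std_pitch=lambda t: 80.0 * (1.0 - 0.4 * t))

    Played back, the cloud starts wide and noisy and narrows into a rising tone as the hypothetical density's standard deviation shrinks.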

    Price Calibration of Basket Default Swaps: Evidence from the Japanese Market

    The aim of this paper is the price calibration of basket default swaps from Japanese market data. The value of these instruments depends on a number of factors, including the credit ratings of the obligors in the basket, recovery rates, default intensities, basket size, and the correlation of obligors in the basket. A fundamental part of the pricing framework is the estimation of the instantaneous default probabilities for each obligor. Because default probabilities depend on the credit quality of the obligor considered, well-calibrated credit curves are a main ingredient for constructing default times. The calibration of credit curves takes into account internal information on credit migrations and default history. We refer to the Japan Credit Rating Agency to obtain the rating transition matrix and cumulative default rates. Default risk is often treated as a rare event, and many studies have shown that the relevant distributions have fatter tails than the normal distribution captures. Consequently, the choice of copula and the choice of procedures for rare-event simulation govern the pricing of basket credit derivatives. Joshi and Kainth (2004) introduced an importance sampling technique for rare events that forces a predetermined number of defaults to occur on each path. We consider both the Gaussian copula and the Student-t copula and study their impact on basket credit derivative prices. We also present an application of the Canonical Maximum Likelihood method (CML) for calibrating the Student-t copula to Japanese market data.

    Keywords: basket default swaps, credit curve, Monte Carlo method, Gaussian copula, Student-t copula, Japanese market data, CML, importance sampling
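
    A minimal sketch of the Gaussian copula step, with invented flat hazard rates standing in for the calibrated credit curves (the paper builds these from Japan Credit Rating Agency transition and default data): correlated normals are mapped through the normal CDF to uniforms and inverted through each obligor's survival curve, giving joint default times from which first-to-default quantities follow.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        # Invented flat hazard rates; in the paper these come from calibrated credit curves.
        hazard = np.array([0.010, 0.020, 0.015, 0.030, 0.025])
        n = hazard.size

        # Single-factor Gaussian copula correlation structure (invented rho).
        rho = 0.3
        Sigma = rho * np.ones((n, n)) + (1.0 - rho) * np.eye(n)
        L = np.linalg.cholesky(Sigma)

        def sample_default_times(n_paths=100_000):
            """Correlate normals, map to uniforms, invert each survival curve:
            P(tau > t) = exp(-hazard * t)  =>  tau = -log(1 - u) / hazard."""
            z = rng.standard_normal((n_paths, n)) @ L.T
            u = norm.cdf(z)
            return -np.log1p(-u) / hazard

        tau = sample_default_times()
        horizon = 5.0
        print("P(first default within 5y):", np.mean(tau.min(axis=1) <= horizon))

    Swapping in a Student-t copula amounts to drawing multivariate-t variates and applying the t CDF instead; this thickens the joint tails and changes how probability mass is allocated across multiple simultaneous defaults, which is precisely the pricing impact the paper studies.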

    Techniques for the Fast Simulation of Models of Highly Dependable Systems

    With the ever-increasing complexity and requirements of highly dependable systems, their evaluation during design and operation is becoming more crucial. Realistic models of such systems are often not amenable to analysis using conventional analytic or numerical methods. Therefore, analysts and designers turn to simulation to evaluate these models. However, accurate estimation of the dependability measures of these models requires that the simulation frequently observe system failures, which are rare events in highly dependable systems. This renders ordinary simulation impractical for evaluating such systems. To overcome this problem, simulation techniques based on importance sampling have been developed, and they are very effective in certain settings. When importance sampling works well, simulation run lengths can be reduced by several orders of magnitude when estimating transient as well as steady-state dependability measures. This paper reviews some of the importance sampling techniques that have been developed in recent years to estimate dependability measures efficiently in Markov and non-Markov models of highly dependable systems.
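
    A minimal sketch of one such technique, simple failure biasing on an invented three-component repairable system (the paper surveys far more refined variants): failures are deliberately over-sampled on the embedded discrete-time chain, and each path carries the likelihood ratio that keeps the estimator unbiased.

        import numpy as np

        rng = np.random.default_rng(0)
        lam, mu, n_comp = 1e-4, 1.0, 3    # invented failure/repair rates, 3 components

        def p_fail(k):
            """Embedded-chain probability that, with k components down, the next
            event is another failure rather than a repair completion."""
            return (n_comp - k) * lam / ((n_comp - k) * lam + k * mu)

        def one_path(bias=0.5):
            """Run one regenerative cycle from the first failure until either all
            components are down (system failure) or all are repaired. Failure
            biasing: sample failures with probability `bias` and accumulate the
            likelihood ratio w so the estimator stays unbiased."""
            k, w = 1, 1.0
            while 0 < k < n_comp:
                p = p_fail(k)
                if rng.random() < bias:               # biased failure step
                    w *= p / bias
                    k += 1
                else:                                 # biased repair step
                    w *= (1.0 - p) / (1.0 - bias)
                    k -= 1
            return w if k == n_comp else 0.0

        est = np.mean([one_path() for _ in range(100_000)])
        print("P(system failure before full repair):", est)

    With lam = 1e-4 the target probability is on the order of 1e-8, far too small for 100,000 ordinary simulation runs to observe even once; under failure biasing roughly a third of the paths hit the rare event, and the likelihood ratio weights restore the correct magnitude.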

    Monte Carlo Implementation of Gaussian Process Models for Bayesian Regression and Classification

    Gaussian processes are a natural way of defining prior distributions over functions of one or more input variables. In a simple nonparametric regression problem, where such a function gives the mean of a Gaussian distribution for an observed response, a Gaussian process model can easily be implemented using matrix computations that are feasible for datasets of up to about a thousand cases. Hyperparameters that define the covariance function of the Gaussian process can be sampled using Markov chain methods. Regression models where the noise has a t distribution, and logistic or probit models for classification, can likewise be implemented by additionally sampling the latent values underlying the observations. Software is now available that implements these methods using covariance functions with hierarchical parameterizations. Models defined in this way can discover high-level properties of the data, such as which inputs are relevant to predicting the response.
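
    A minimal sketch of the hyperparameter sampling idea, with toy data, a squared-exponential covariance, and a plain random-walk Metropolis step under a flat prior on the log hyperparameters (not the hierarchical parameterizations of the released software): the marginal likelihood is evaluated with the matrix computations mentioned above, and the Markov chain explores the covariance hyperparameters.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy data: noisy observations of a smooth function (invented for illustration).
        X = np.linspace(0.0, 1.0, 30)
        y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(30)

        def log_marginal(log_ell, log_sf, noise_sd=0.1):
            """GP log marginal likelihood (up to a constant) via Cholesky factorization."""
            ell, sf = np.exp(log_ell), np.exp(log_sf)
            d = X[:, None] - X[None, :]
            K = sf**2 * np.exp(-0.5 * (d / ell) ** 2) + noise_sd**2 * np.eye(len(X))
            L = np.linalg.cholesky(K)
            alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
            return -0.5 * y @ alpha - np.log(np.diag(L)).sum()

        # Random-walk Metropolis over (log lengthscale, log signal sd), flat prior.
        theta = np.zeros(2)
        lp = log_marginal(*theta)
        samples = []
        for _ in range(5_000):
            prop = theta + 0.2 * rng.standard_normal(2)
            lp_prop = log_marginal(*prop)
            if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
                theta, lp = prop, lp_prop
            samples.append(theta[0])

        print("posterior geometric-mean lengthscale:", np.exp(np.mean(samples)))

    The same scheme extends to t-distributed noise or classification by inserting a sampling step for the latent function values between hyperparameter updates, which is the setting the paper develops.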

    Efficient estimation of blocking probabilities in non-stationary loss networks

    This paper considers the estimation of blocking probabilities in a nonstationary loss network. Invoking the so-called MOL (Modified Offered Load) approximation, the problem is transformed into one requiring the solution of blocking probabilities in a sequence of stationary loss networks with time-varying loads. Monte Carlo simulation is used to estimate the blocking probabilities, and to increase the efficiency of the simulation we develop a likelihood ratio method that enables samples drawn at one time point to be reused at later time points. This reduces the need to draw new samples independently at each new time point, giving substantial savings in the computational effort of evaluating time-dependent blocking probabilities. The accuracy of the method is analyzed using Taylor series approximations of the variance, which show that the accuracy depends directly on the rate of change of the actual load. Finally, three practical applications of the method are provided, along with numerical examples demonstrating its efficiency.
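
    A minimal sketch of the likelihood ratio idea for a single Erlang link, with an invented load profile and capacity (the paper treats general loss networks under the MOL approximation): occupancy samples drawn once at a base load are re-weighted to the load at any later time point, so no fresh sampling is needed; as the load drifts away from the base load the weights become more variable, consistent with the accuracy analysis above.

        import numpy as np

        rng = np.random.default_rng(0)
        C = 20                       # circuits on the link (invented capacity)

        def mol_load(t):
            """Hypothetical time-varying offered load fed in by the MOL approximation."""
            return 12.0 + 4.0 * np.sin(2 * np.pi * t)

        # Draw ONE sample set at the base load; reuse it at every later time point.
        rho0 = mol_load(0.0)
        N = rng.poisson(rho0, size=200_000)

        def blocking(t):
            """Erlang blocking at time t from the same samples via likelihood ratios:
            w = (rho_t / rho0)^N * exp(rho0 - rho_t) re-weights Poisson(rho0) draws
            into Poisson(rho_t) draws; blocking = P(N = C | N <= C)."""
            rho = mol_load(t)
            w = np.exp(N * np.log(rho / rho0) + (rho0 - rho))
            return np.sum(w * (N == C)) / np.sum(w * (N <= C))

        for t in (0.0, 0.25, 0.5):
            print(f"t = {t:.2f}   load = {mol_load(t):5.2f}   blocking = {blocking(t):.4f}")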