
    An Extended Structural Credit Risk Model (forthcoming in the Icfai Journal of Financial Risk Management; all copyrights rest with the Icfai University Press)

    This paper presents an extended structural credit risk model that provides closed-form solutions for fixed- and floating-coupon bonds and credit default swaps. The model is "extended" in the following sense. It allows the default-free term structure to be driven by a multi-factor Gaussian model, rather than by a single-factor one. Expected default occurs when a latent diffusion process first hits the default barrier, but the diffusion process is not the value of the firm's assets. Default can be "expected" or "unexpected". Liquidity risk is correlated with credit risk, and it is not necessary to disentangle the risk of unexpected default from liquidity risk. A tractable and accurate recovery assumption is proposed.
    Keywords: structural credit risk model, Vasicek model, Gaussian term structure model, bond pricing, credit default swap pricing, unexpected default, liquidity risk.
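    The barrier-hitting mechanism described above can be illustrated with a small Monte Carlo first-passage simulation (a minimal sketch with hypothetical parameter values, not the paper's closed-form solution):

```python
import numpy as np

def first_passage_default_prob(x0=1.0, barrier=0.0, mu=-0.02, sigma=0.3,
                               horizon=5.0, n_steps=500, n_paths=20000, seed=0):
    """Estimate P(default before `horizon`) for a latent diffusion
    dX = mu dt + sigma dW that defaults when X first hits `barrier`."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    x = np.full(n_paths, x0)
    defaulted = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        defaulted |= (x <= barrier)  # absorb paths that have crossed the barrier
    return defaulted.mean()

p = first_passage_default_prob()
print(f"Estimated default probability over the horizon: {p:.3f}")
```

    The starting level, drift, and volatility here are made up for illustration; the paper's contribution is precisely that such probabilities are available in closed form rather than by simulation.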

    Multidimensional Membership Mixture Models

    We present the multidimensional membership mixture (M3) models, in which every dimension of the membership represents an independent mixture model and each data point is generated jointly from the mixture components selected along each dimension. This is helpful when the data has a certain shared structure. For example, three unique means and three unique variances can effectively form a Gaussian mixture model with nine components, while requiring only six parameters to fully describe it. In this paper, we present three instantiations of M3 models (together with the learning and inference algorithms): infinite, finite, and hybrid, depending on whether the number of mixtures is fixed or not. They are built upon Dirichlet process mixture models, latent Dirichlet allocation, and a combination of the two, respectively. We then consider two applications: topic modeling and learning 3D object arrangements. Our experiments show that our M3 models achieve better performance using fewer topics than many classic topic models. We also observe that topics from the different dimensions of M3 models are meaningful and orthogonal to each other.
    Comment: 9 pages, 7 figures
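    The parameter-sharing example in the abstract can be made concrete: three candidate means and three candidate standard deviations (six parameters in total) generate a nine-component Gaussian mixture via their Cartesian product (a minimal sketch with made-up values, not the paper's learning algorithm):

```python
import itertools
import numpy as np

# 3 means and 3 standard deviations: 6 parameters in total.
means = [-2.0, 0.0, 3.0]
stds = [0.5, 1.0, 2.0]

# Their Cartesian product defines a 9-component Gaussian mixture.
components = list(itertools.product(means, stds))
print(f"{len(components)} components from {len(means) + len(stds)} parameters")

# Generating a point: select a mean and a std independently, then draw.
rng = np.random.default_rng(42)
mu = rng.choice(means)
sigma = rng.choice(stds)
x = rng.normal(mu, sigma)
print(f"component (mu={mu}, sigma={sigma}) -> sample {x:.2f}")
```

    A conventional nine-component mixture would instead need nine separate (mean, variance) pairs, which is the saving the shared structure buys.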

    Essays in Dynamic Duration and Count Modelling

    In this dissertation we study the dynamic and static probabilistic structure of the distribution of equity transaction times on financial markets. We propose dynamic, non-linear, non-Gaussian state space models to investigate both the structure of the associated inter-trade durations and the properties of the number of transactions over a mesh of fixed length. The economic motivation of the study lies in the relationship between the properties of transaction times and those of the time-varying volatility of equity returns and of market liquidity measures such as bid-ask spreads. We use high-frequency data extracted from the Trade and Quotes database to recover transaction time-stamps recorded down to the second or millisecond time scale, depending on the sample of analysis. We focus our attention on a randomly selected sub-sample of the S&P100 index constituents traded on U.S. financial markets. Starting from the work of Chen et al. (2013), we propose a dynamic duration model that is able to capture the salient features of the empirical distribution of inter-trade durations for the most recent samples, namely, over-dispersion, long memory, transaction clustering and simultaneous trading. We employ this model to study the structural change in the properties of the transaction process by assessing its ability to fit the data and its forecasting accuracy over a long span of time (1993-2013). As an alternative tool for the analysis of the transaction times process, and motivated by the necessity of reducing the computational burden induced by the appearance of data-sets of unprecedented size, we propose a dynamic, long-memory model for the number of transactions over a mesh of fixed length, based on the Markov Switching Multifractal model proposed by Calvet and Fisher (2008). We perform goodness-of-fit and forecasting accuracy comparisons against competing models and find that the proposed model provides a superior performance.
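    The over-dispersion mentioned above has a simple diagnostic: for exponential (Poisson-process) inter-trade durations the squared coefficient of variation equals one, so values above one signal over-dispersion. A minimal sketch on synthetic, hypothetical data (not the dissertation's model or data set):

```python
import numpy as np

def dispersion_ratio(durations):
    """Squared coefficient of variation: equals 1 for exponential
    inter-trade durations, exceeds 1 under over-dispersion."""
    d = np.asarray(durations, dtype=float)
    return d.var() / d.mean() ** 2

# Synthetic clustered durations: a time-varying trading intensity
# (gamma-distributed rates) mixed with exponential waiting times.
rng = np.random.default_rng(1)
rates = rng.gamma(shape=3.0, scale=1.0, size=10000)
durations = rng.exponential(1.0 / rates)
print(f"dispersion ratio: {dispersion_ratio(durations):.2f}")  # > 1
```

    The gamma mixing here is only a stand-in for the clustering mechanism; the dissertation's state space models capture long memory and simultaneous trading as well.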

    Inference for spatial processes using imperfect data from measurements and numerical simulations

    This is the final version of the article, available from arXiv.org via the link in this record.
    We present a framework for inference for spatial processes whose actual values are imperfectly represented by data. Environmental processes represented as spatial fields, either at fixed time points or aggregated over fixed time periods, are studied. Data from both measurements and simulations performed by complex computer models are used to infer actual values of the spatial fields. Methods from geostatistics and statistical emulation are used to explicitly capture discrepancies between a spatial field's actual and simulated values. A geostatistical model captures spatial discrepancy: the difference in spatial structure between simulated and actual values. An emulator represents the intensity discrepancy: the bias in simulated values of given intensity. Measurement error is also represented. Gaussian process priors represent each source of error, which gives an analytical expression for the posterior distribution of the actual spatial field. Actual footprints for 50 European windstorms, which represent maximum wind gust speeds on a grid over a 72-hour period, are derived from wind gust speed measurements taken at stations across Europe and output simulated from a downscaled version of the Met Office Unified Model. The derived footprints have realistic spatial structure, and gust speeds closer to the measurements than originally simulated.
    We thank Phil Sansom for helpful discussion. We thank the Willis Research Network for supporting this work, the Met Office for providing the windstorm measurement data, and Julia Roberts for help with data provision.
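    The Gaussian-process update at the heart of such a framework can be sketched in one dimension: conditioning a GP prior on noisy observations yields the posterior mean in closed form. This is a minimal illustration with made-up kernel parameters and gust-speed-like values, not the paper's geostatistical or emulator setup:

```python
import numpy as np

def gp_posterior_mean(x_train, y_train, x_test, length=1.0, sigma_f=2.0, sigma_n=0.1):
    """Posterior mean of a zero-mean GP with a squared-exponential kernel,
    conditioned on noisy observations y = f(x) + N(0, sigma_n^2)."""
    def kernel(a, b):
        d = a[:, None] - b[None, :]
        return sigma_f**2 * np.exp(-0.5 * (d / length)**2)

    K = kernel(x_train, x_train) + sigma_n**2 * np.eye(len(x_train))
    K_s = kernel(x_test, x_train)
    return K_s @ np.linalg.solve(K, y_train)

# Hypothetical station observations along a 1-D transect (values made up):
x_obs = np.array([0.0, 1.0, 2.5, 4.0])
y_obs = np.array([20.0, 25.0, 22.0, 18.0])
x_grid = np.linspace(0.0, 4.0, 9)

# Centre the data, condition the GP, then add the mean back.
field = gp_posterior_mean(x_obs, y_obs - y_obs.mean(), x_grid) + y_obs.mean()
print(np.round(field, 1))
```

    The paper combines several such GP components (spatial discrepancy, intensity discrepancy, measurement error), each adding its own covariance term to the posterior.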
