
    Identification and estimation of continuous time dynamic systems with exogenous variables using panel data

    This paper deals with the identification and maximum likelihood estimation of the parameters of a stochastic differential equation from discrete-time sampling. The score function and maximum likelihood equations are derived explicitly. The stochastic differential equation system is extended to allow for random effects and the analysis of panel data. In addition, we investigate the identifiability of the continuous-time parameters, in particular the impact of the inclusion of exogenous variables.
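
    The abstract gives no formulas, but the estimation idea can be illustrated for a scalar linear system: exact discrete-time sampling turns the SDE into a Gaussian autoregression whose likelihood can be maximized directly in the continuous-time parameters. A minimal sketch under illustrative assumptions (scalar state, constant exogenous input u; none of the names come from the paper):

```python
import numpy as np
from scipy.optimize import minimize

# Exact discretization of the scalar linear SDE
#   dx(t) = (a x(t) + b u) dt + g dW(t),
# sampled at interval dt:
#   x_{k+1} = e^{a dt} x_k + (b u / a)(e^{a dt} - 1) + eps_k,
#   eps_k ~ N(0, g^2 (e^{2 a dt} - 1) / (2 a)).

def neg_loglik(theta, x, u, dt):
    a, b, log_g = theta
    g = np.exp(log_g)                    # keep the diffusion coefficient positive
    phi = np.exp(a * dt)                 # discrete autoregression coefficient
    mean = phi * x[:-1] + (b * u / a) * (phi - 1.0)
    var = g ** 2 * (np.exp(2.0 * a * dt) - 1.0) / (2.0 * a)
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2.0 * np.pi * var) + resid ** 2 / var)

# Simulate a path and recover (a, b, g) by maximum likelihood.
rng = np.random.default_rng(0)
a0, b0, g0, u, dt, n = -0.5, 1.0, 0.3, 1.0, 0.25, 2000
phi = np.exp(a0 * dt)
sd = np.sqrt(g0 ** 2 * (np.exp(2.0 * a0 * dt) - 1.0) / (2.0 * a0))
x = np.empty(n)
x[0] = 0.0
for k in range(n - 1):
    x[k + 1] = phi * x[k] + (b0 * u / a0) * (phi - 1.0) + sd * rng.standard_normal()

fit = minimize(neg_loglik, x0=np.array([-0.1, 0.5, np.log(0.1)]), args=(x, u, dt))
print(fit.x[0], fit.x[1], np.exp(fit.x[2]))   # estimates of a, b, g
```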

    Forecasting Credit Portfolio Risk

    The main challenge in forecasting credit default risk in loan portfolios lies in forecasting the default probabilities and the default correlations. We derive a Merton-style threshold-value model for the default probability which treats the asset value of a firm as unknown and uses a factor model instead. In addition, we demonstrate how default correlations can be easily modeled. The empirical analysis is based on a large data set of German firms provided by Deutsche Bundesbank. We find that the inclusion of variables which are correlated with the business cycle improves the forecasts of default probabilities. Asset and default correlations depend on the factors used to model default probabilities. The better the point-in-time calibration of the estimated default probabilities, the smaller the estimated correlations. Thus, correlations and default probabilities should always be estimated simultaneously.
    Keywords: asset correlation, bank regulation, Basel II, credit risk, default correlation, default probability, logit model, probit model
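
    The threshold-value mechanism described here is commonly formalized as a one-factor model: firm i defaults when sqrt(rho)*Z + sqrt(1-rho)*eps_i falls below Phi^{-1}(pd), which makes default correlation a function of the asset correlation rho. A hedged sketch (parameter values are illustrative, not Bundesbank estimates):

```python
import numpy as np
from scipy.stats import norm

def conditional_pd(pd, rho, z):
    """PD of a firm given a realization z of the systematic factor:
    default occurs when sqrt(rho) z + sqrt(1-rho) eps < Phi^{-1}(pd)."""
    return norm.cdf((norm.ppf(pd) - np.sqrt(rho) * z) / np.sqrt(1.0 - rho))

def simulate_default_rates(pd, rho, n_firms=10_000, n_years=5_000, seed=1):
    """Yearly portfolio default rates; their dispersion reflects default correlation."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_years)            # one factor draw per period
    # conditional on z, defaults are independent Bernoulli events
    defaults = rng.binomial(n_firms, conditional_pd(pd, rho, z))
    return defaults / n_firms

rates = simulate_default_rates(pd=0.02, rho=0.10)
print(rates.mean(), rates.std())   # mean near pd; std grows with rho
```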

    Credit Risk Factor Modeling and the Basel II IRB Approach

    Default probabilities (PDs) and correlations play a crucial role in the New Basel Capital Accord, and they are an important constituent of commercial credit risk models. Yet the modeling and estimation of PDs and correlations are still under active discussion. We show how the Basel II one-factor model, which is used to calibrate risk weights, can be extended to a model for estimating PDs and correlations. The important advantage of this model is that it uses actual information about the current point of the credit cycle. Thus, uncertainties about the parameters which are needed for Value-at-Risk calculations in portfolio models may be substantially reduced. First empirical evidence for the appropriateness of the models and underlying risk factors is given with S&P data.
    Keywords: Credit Risk, Credit Ratings, Probability of Default, Bank Regulation
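
    For reference, the Basel II one-factor model mentioned here underlies the supervisory risk-weight formula. A sketch of the corporate-exposure version (maturity adjustment omitted; this reproduces the well-known regulatory formula, not the paper's extension):

```python
import numpy as np
from scipy.stats import norm

def basel_corporate_rho(pd):
    """Basel II supervisory asset correlation for corporate exposures."""
    w = (1.0 - np.exp(-50.0 * pd)) / (1.0 - np.exp(-50.0))
    return 0.12 * w + 0.24 * (1.0 - w)

def irb_capital(pd, lgd, rho=None, q=0.999):
    """Unexpected-loss capital per unit of exposure: the conditional PD at
    the q-quantile of the single systematic factor, minus the expected
    loss (maturity adjustment omitted)."""
    rho = basel_corporate_rho(pd) if rho is None else rho
    pd_cond = norm.cdf((norm.ppf(pd) + np.sqrt(rho) * norm.ppf(q))
                       / np.sqrt(1.0 - rho))
    return lgd * (pd_cond - pd)

print(irb_capital(pd=0.01, lgd=0.45))   # capital charge for a 1% PD exposure
```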

    A Combined GEE/Buckley-James Method for Estimating an Accelerated Failure Time Model of Multivariate Failure Times

    The present paper deals with the estimation of a frailty model of multivariate failure times. The failure times are modeled by an accelerated failure time model including observed covariates and an unobservable frailty component. The frailty is assumed to be random and differs across elementary units, but is constant across the spells of a unit or group. We develop an estimator of the regression parameters that combines the GEE approach (Liang and Zeger, 1986) with the Buckley-James estimator for censored data. This estimator is robust against violations of the correlation structure and of the distributional assumptions. Simulation studies are conducted in order to study the empirical performance of the estimator. Finally, the methods are applied to data on repeated occurrences of malignant ventricular arrhythmias in patients with implanted defibrillators.
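
    The Buckley-James building block can be sketched as follows: censored responses are replaced by their conditional expectation under a Kaplan-Meier estimate of the residual distribution, and least squares is refitted until convergence. This sketch uses an independence working structure; the paper's contribution is to combine the imputation with GEE so that within-cluster correlation is exploited:

```python
import numpy as np

def buckley_james(y, d, X, n_iter=30, tol=1e-6):
    """Buckley-James estimator for the right-censored linear model
    log T_i = x_i' beta + e_i.  y = min(log T, log C), d = 1 if the
    event is observed, X includes an intercept column."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    n = len(y)
    for _ in range(n_iter):
        e = y - X @ beta
        order = np.argsort(e)
        e_s, d_s = e[order], d[order].copy()
        d_s[-1] = 1                               # treat the largest residual as uncensored
        at_risk = n - np.arange(n)
        surv = np.cumprod(1.0 - d_s / at_risk)    # KM survival of the residuals
        mass = np.concatenate(([1.0], surv[:-1])) - surv          # KM jumps
        tail = np.cumsum((mass * e_s)[::-1])[::-1] - mass * e_s   # sum over e_j > e_i
        cond = np.divide(tail, surv, out=np.zeros_like(tail), where=surv > 1e-12)
        e_imp = np.where(d_s == 1, e_s, cond)     # impute censored residuals
        y_star = np.empty(n)
        y_star[order] = (X @ beta)[order] + e_imp
        beta_new = np.linalg.lstsq(X, y_star, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```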

    Estimation of multivariate probit models: A mixed generalized estimating/pseudo-score equations approach and some finite sample results

    In the present paper a mixed approach is proposed for the simultaneous estimation of regression and correlation structure parameters in multivariate probit models, using generalized estimating equations for the former and pseudo-score equations for the latter. The finite sample properties of the corresponding estimators are compared, via a Monte Carlo experiment, to estimators proposed by Qu, Williams, Beck and Medendorp (1992) and Qu, Piedmonte and Williams (1994), which use generalized estimating equations for both sets of parameters. As a 'reference' estimator for an equicorrelation model, the maximum likelihood (ML) estimator of the random effects probit model is calculated. The results show the mixed approach to be the most robust in the sense that the number of datasets for which the corresponding estimates converged was largest relative to the other two approaches. Furthermore, the mixed approach led to the most efficient non-ML estimators, and to very efficient estimators for regression and correlation structure parameters relative to the ML estimator if individual covariance matrices were used.
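
    Standard GEE estimation of a marginal probit model with an exchangeable (equicorrelation) working structure, the baseline that the paper's mixed approach refines, can be run with statsmodels. A sketch assuming a recent statsmodels version (the moment-based correlation estimate shown here is not the paper's pseudo-score variant):

```python
import numpy as np
import statsmodels.api as sm

# Toy clustered binary data: y_ij = 1{0.5 + x_ij + u_i + eps_ij > 0},
# where the shared effect u_i induces an equicorrelation structure.
rng = np.random.default_rng(0)
n_groups, t = 500, 4
x = rng.standard_normal(n_groups * t)
groups = np.repeat(np.arange(n_groups), t)
u = np.repeat(rng.standard_normal(n_groups), t)
y = (0.5 + x + u + rng.standard_normal(n_groups * t) > 0).astype(int)

# Marginal probit with exchangeable working correlation.  Note that the
# population-averaged coefficients are attenuated by 1/sqrt(1 + var(u))
# relative to the latent ones (here by 1/sqrt(2)).
X = sm.add_constant(x)
model = sm.GEE(y, X, groups=groups,
               family=sm.families.Binomial(link=sm.families.links.Probit()),
               cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.params)                 # approx. (0.35, 0.71)
print(model.cov_struct.dep_params)   # estimated working correlation
```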

    Regression Models with Correlated Binary Response Variables: A Comparison of Different Methods in Finite Samples

    The present paper compares the performance of different estimation methods for regression models with correlated binary responses. Throughout, we consider probit models where an underlying latent continuous random variable crosses a threshold. The error variables in the unobservable latent model are assumed to be normally distributed. The estimation procedures considered are (1) marginal maximum likelihood estimation using Gauss-Hermite quadrature, (2) generalized estimating equations (GEE) techniques with an extension to estimate tetrachoric correlations in a second step, and (3) the MECOSA approach proposed by Schepers, Arminger and Küsters (1991) using hierarchical mean and covariance structure models. We present the results of a simulation study designed to evaluate the small sample properties of the different estimators and to compare the procedures with respect to technical aspects as well as the bias and mean squared error of the estimators. The results show that the calculation of the ML estimator requires the most computing time, followed by the MECOSA estimator. For small and moderate sample sizes the calculation of the MECOSA estimator is problematic because of convergence problems as well as a tendency to underestimate the variances. In large samples with moderate or high correlations of the errors in the latent model, the MECOSA estimators are not as efficient as the ML or GEE estimators. The higher the 'true' value of an equicorrelation structure in the latent model and the larger the sample size, the greater the efficiency gain of the ML estimator compared to the GEE and MECOSA estimators. Using the GEE approach, the ML estimates of tetrachoric correlations calculated in a second step are less biased than under the MECOSA approach.
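
    Procedure (1), marginal maximum likelihood via Gauss-Hermite quadrature, reduces to approximating a one-dimensional integral per cluster. A minimal sketch for the random-effects probit model (notation is ours, not the paper's):

```python
import numpy as np
from scipy.stats import norm

def re_probit_loglik(beta, sigma, y, X, groups, n_nodes=20):
    """Marginal log-likelihood of the random-effects probit model
    P(y_ij = 1 | a_i) = Phi(x_ij' beta + sigma a_i),  a_i ~ N(0, 1),
    with the a_i-integral approximated by Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    a = np.sqrt(2.0) * nodes          # change of variables for the N(0,1) density
    w = weights / np.sqrt(np.pi)
    sign = np.where(y == 1, 1.0, -1.0)
    # P(y_ij | a_k) for every observation j and quadrature node k
    p = norm.cdf(sign[:, None] * ((X @ beta)[:, None] + sigma * a[None, :]))
    loglik = 0.0
    for g in np.unique(groups):
        lik_g = np.prod(p[groups == g], axis=0)   # product over the cluster
        loglik += np.log(lik_g @ w)               # quadrature over a_i
    return loglik
```

    The estimator is then obtained by maximizing this function over (beta, log sigma), e.g. with scipy.optimize.minimize.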

    A Multi-State Multi-Episode Model in Discrete Time for the Analysis of Clinical Trials Accounting for Unobserved Heterogeneity

    The continuous-time approaches that dominate the theory of duration analysis assume that no ties exist. In practice, however, this assumption is problematic, so discrete-time approaches are better suited to the data situation. In a medical context, the survival time of patients is usually the main focus. Beyond this event, however, other mutually exclusive events/states (competing risks) across several episodes, such as the patient's health status over the course of the disease, may be of interest. The present paper introduces a discrete-time parametric approach for analyzing competing risks in multi-episode models. Following the theory of generalized linear models, a multinomial logit model is used to model the hazard rate. Unobserved heterogeneity remaining beyond the observed covariates is assumed to be normally distributed. Maximum likelihood estimation is carried out with the Newton-Raphson method; the required integral approximation uses Gauss-Hermite quadrature. The proposed method is applied to data on 476 patients from a brain tumor study.
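
    The multinomial logit hazard and the resulting likelihood contribution of a single spell can be sketched as follows (the Gauss-Hermite integration over the normally distributed heterogeneity term is omitted here; it works exactly as in the random-effects probit sketch above):

```python
import numpy as np

def multinomial_hazard(x, betas):
    """Discrete-time competing-risks hazards: with one coefficient vector
    per target state r (remaining in the current state is the reference),
    h_r(t | x) = exp(x' b_r) / (1 + sum_s exp(x' b_s))."""
    eta = np.array([x @ b for b in betas])
    m = max(eta.max(), 0.0)                    # stabilize the exponentials
    num = np.exp(eta - m)
    return num / (np.exp(-m) + num.sum())

def spell_loglik(X_spell, betas, event):
    """Log-likelihood contribution of one spell observed over T intervals:
    survive intervals 1..T-1, then move to state `event` in interval T
    (event = 0 encodes censoring).  X_spell has one covariate row per
    interval, so time-varying covariates are allowed."""
    ll = 0.0
    T = len(X_spell)
    for t, x in enumerate(X_spell):
        h = multinomial_hazard(x, betas)
        if t == T - 1 and event > 0:
            ll += np.log(h[event - 1])         # observed transition
        else:
            ll += np.log(1.0 - h.sum())        # survival through the interval
    return ll
```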

    Semiparametric EM-estimation of censored linear regression models for durations

    This paper investigates the sensitivity of maximum quasi-likelihood estimators of the covariate effects in duration models in the presence of misspecification due to neglected heterogeneity or a misspecified hazard function. We consider linear models for r(T), where T is the duration and r is a known, strictly increasing function. This class of models is also referred to as location-scale models. In the absence of censoring, Gould and Lawless (1988) have shown that maximum likelihood estimators of the regression parameters are consistent and asymptotically normally distributed under the assumption that the location-scale structure of the model is of the correct form. In the presence of censoring, however, model misspecification leads to inconsistent estimates of the regression coefficients for most of the censoring mechanisms that are widely used in practice. We propose a semiparametric EM-estimator, following ideas of Ritov (1990) and Buckley and James (1979). This estimator is robust against misspecification and is highly recommended if censoring is heavy and specification errors are likely. We present the results of simulation experiments illustrating the performance of the proposed estimator.
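
    The model class and a Buckley-James-type EM iteration consistent with the abstract can be written compactly (the notation below is ours; F-hat denotes the Kaplan-Meier estimate of the residual distribution):

```latex
% Location-scale model with right censoring: r(T_i) = x_i'\beta + \sigma\varepsilon_i,
% observed data Y_i = \min\{r(T_i), r(C_i)\}, \delta_i = 1\{T_i \le C_i\}.
\begin{align*}
\text{E-step:}\quad
  Y_i^{*} &= \delta_i\,Y_i + (1-\delta_i)\Bigl(x_i'\hat\beta
            + \frac{\int_{\hat e_i}^{\infty} e\,\mathrm{d}\hat F(e)}
                   {1-\hat F(\hat e_i)}\Bigr),
  \qquad \hat e_i = Y_i - x_i'\hat\beta, \\[2pt]
\text{M-step:}\quad
  \hat\beta &\leftarrow \arg\min_{\beta}\ \sum_i \bigl(Y_i^{*}-x_i'\beta\bigr)^{2}.
\end{align*}
```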

    Systematic risk of CDOs and CDO arbitrage

    “Arbitrage CDOs” recorded explosive growth during the years before the outbreak of the financial crisis. In the present paper we discuss potential sources of such arbitrage opportunities, in particular arbitrage gains due to mispricing. For this purpose we examine the risk profiles of Collateralized Debt Obligations (CDOs) in some detail. The analyses reveal significant differences between the risk profiles of CDO tranches and corporate bonds, in particular the considerably increased sensitivity of tranches to systematic risk. This has far-reaching consequences for risk management, pricing and regulatory capital requirements.

    A simple analytical valuation model based on the CAPM and the single-factor Merton model is used in order to keep the framework simple. The conditional expected loss curve (EL profile) is then studied in some detail. In the next step, the asset correlation associated with a CDO tranche is estimated by treating the structured instrument as a single-name credit instrument (i.e., a loan equivalent). While tractable, the loan-equivalent approach requires appropriate parameterization to achieve a reasonable approximation of the tranche's risk profile. We consider the tranche as a “virtual” borrower or bond for which a single-factor model holds, and the correlation parameter is calculated via non-linear optimization. This “bond representation” makes it possible to approximate the risk profile (expressed by the EL profile) using a single-factor model and to express the dependence on the systematic risk factor via the corresponding asset correlation. It turns out that the resulting asset correlation is many times higher than that of straight bonds. The Merton-type valuation model for the corresponding bond representations is then applied to the valuation of the CDO tranches.

    Using a sample CDO portfolio, we describe some opportunities for “CDO arbitrage” under the assumption that investors are guided solely by the tranches’ ratings and ignore the increased systematic risk when pricing. Finally, we discuss how tranches with high systematic risk can be generated and how CDO arrangers can exploit this to their advantage. It comes as no surprise that precisely these types of structures featured in many of the CDOs issued prior to the outbreak of the financial crisis.
    Keywords: Collateralized debt obligations (CDO), arbitrage CDOs, credit rating, expected loss profile, bond representation, systematic risk of CDO tranches, CDO pricing
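
    The “bond representation” step can be sketched numerically: compute the tranche's conditional EL profile under a large homogeneous pool, then fit the asset correlation of a single loan-equivalent by non-linear optimization so that its conditional PD curve matches the profile. All parameter values below are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def pool_loss(z, pd, rho, lgd=1.0):
    """Conditional loss of a large homogeneous pool given the factor z."""
    return lgd * norm.cdf((norm.ppf(pd) - np.sqrt(rho) * z) / np.sqrt(1.0 - rho))

def tranche_el_profile(z, pd, rho, attach, detach):
    """Conditional expected loss (EL profile) of an [attach, detach) tranche."""
    loss = pool_loss(z, pd, rho)
    return np.clip(loss - attach, 0.0, detach - attach) / (detach - attach)

def bond_representation_rho(pd, rho, attach, detach, n_grid=2001):
    """Fit the asset correlation of a 'virtual' single-name bond so that its
    one-factor conditional PD curve matches the tranche's EL profile."""
    z = np.linspace(-6.0, 6.0, n_grid)
    w = norm.pdf(z)
    w /= w.sum()                                   # factor distribution weights
    target = tranche_el_profile(z, pd, rho, attach, detach)
    pd_tr = target @ w                             # unconditional tranche EL

    def sse(r):
        fit = norm.cdf((norm.ppf(pd_tr) - np.sqrt(r) * z) / np.sqrt(1.0 - r))
        return ((fit - target) ** 2) @ w

    return minimize_scalar(sse, bounds=(1e-4, 0.999), method="bounded").x

# A mezzanine tranche concentrates systematic risk: its fitted correlation
# typically far exceeds the 15% asset correlation of the underlying pool.
print(bond_representation_rho(pd=0.02, rho=0.15, attach=0.03, detach=0.07))
```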

    Incorporating prediction and estimation risk in point-in-time credit portfolio models

    In this paper we analyze the effect of prediction and estimation risk on the loss distribution, risk measures and economic capital. When the variables that determine the probability of default and the loss distribution are not available at the time a forecast is made, they must themselves be predicted, and this prediction is prone to error. Likewise, the model parameters for the probability of default or the asset correlation are unknown and usually have to be estimated from historical data. Incorporating prediction and estimation risk generally leads to broader loss distributions and therefore to higher values of risk measures such as Value at Risk or Expected Shortfall. The level of economic capital required may be strongly underestimated if prediction and estimation risk are ignored.
    Keywords: probability of default, PD, credit risk, default correlation, asset correlation, point in time, value at risk, estimation risk
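
    The effect can be illustrated with a small Monte Carlo: drawing PD and asset correlation from their estimation distribution in each scenario widens the loss distribution and raises the VaR. A sketch with illustrative numbers (the normal estimation errors are an assumption, not the paper's setup):

```python
import numpy as np
from scipy.stats import norm

def default_rate_var(pd, rho, n_obligors=1000, n_sims=200_000, q=0.999,
                     pd_se=0.0, rho_se=0.0, seed=0):
    """Simulated default-rate VaR in a one-factor model.  With positive
    pd_se / rho_se, each scenario first draws the parameters from a normal
    estimation distribution, so estimation risk widens the loss distribution."""
    rng = np.random.default_rng(seed)
    pd_s = np.clip(pd + pd_se * rng.standard_normal(n_sims), 1e-5, 0.5)
    rho_s = np.clip(rho + rho_se * rng.standard_normal(n_sims), 1e-4, 0.9)
    z = rng.standard_normal(n_sims)
    p_cond = norm.cdf((norm.ppf(pd_s) - np.sqrt(rho_s) * z) / np.sqrt(1.0 - rho_s))
    defaults = rng.binomial(n_obligors, p_cond)
    return np.quantile(defaults / n_obligors, q)

print(default_rate_var(0.02, 0.10))                              # parameters known
print(default_rate_var(0.02, 0.10, pd_se=0.005, rho_se=0.03))    # with estimation risk
```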