
    Consumer finance: challenges for operational research

    Consumer finance has become one of the most important areas of banking, both because of the amount of money being lent and the impact of such credit on the global economy, and because of the realisation that the credit crunch of 2008 was partly due to incorrect modelling of the risks in such lending. This paper reviews the development of credit scoring, the standard way of assessing risk in consumer finance, and what is meant by a credit score. It then outlines ten challenges for Operational Research to support modelling in consumer finance. Some of these involve developing more robust risk assessment systems, while others extend the use of such modelling to the current objectives of lenders and the new decisions they have to make in consumer finance.
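    As background to what is meant by a credit score: a score is conventionally a linear rescaling of the log-odds produced by a classifier such as logistic regression. Below is a minimal sketch with invented data and illustrative calibration constants (600 points at 30:1 odds, 20 points to double the odds); it is not the scoring methodology of any particular lender.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a lender's application data: three continuous
# characteristics and a binary good/bad repayment outcome (all invented).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = (X @ np.array([0.8, -0.5, 0.3]) + rng.normal(size=1000) > 0).astype(int)

model = LogisticRegression().fit(X, y)
log_odds = model.decision_function(X)  # log-odds of the "good" outcome

# Conventional points scaling: base_score points at base_odds, and pdo
# points double the odds. These calibration constants are illustrative.
pdo, base_score, base_odds = 20.0, 600.0, 30.0
factor = pdo / np.log(2)
offset = base_score - factor * np.log(base_odds)
scores = offset + factor * log_odds
print(np.round(scores[:5]))
```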

    Modelling loss given default of corporate bonds and bank loans

    Loss given default (LGD) modelling has become increasingly important for banks as they are required to comply with the Basel Accords in their internal computations of economic capital, and banks and financial institutions are encouraged to develop separate models for different types of products. In this thesis we apply and improve several algorithms, including support vector machine (SVM) techniques and mixed effects models, to predict LGD for both corporate bonds and retail loans.

    SVM techniques are known to be powerful for classification problems and have been applied successfully to credit scoring and rating. We improve support vector regression (SVR) models by modifying them to account for the heterogeneity of bond seniorities, which increases the predictive accuracy of LGD. The proposed improved versions of SVR significantly outperform other methods at the aggregated level and demonstrate significantly better predictive ability than the other statistical models at the segmented level. To further investigate the impact of unobservable firm heterogeneity on the modelling of corporate bond recovery rates, a mixed effects model is considered; we find that an obligor-varying linear factor model offers significant improvements in explaining the variation in recovery rates, with a remarkably high intra-class correlation. This emphasises that an obligor-varying random effect term effectively captures the unobservable firm-level information shared by instruments of the same issuer.

    Finally, we incorporate the SVM techniques into a two-stage modelling framework to predict recovery rates of credit cards. The two-stage model with a support vector machine classifier proves advantageous on an out-of-time sample compared with other methods, suggesting that an SVM model is preferable to a logistic regression at the classification stage. Based on the empirical evidence, the choice of regression model is less influential for the prediction of recovery rates than the choice of classification method in the first stage of two-stage models.

    The risk-weighted assets of financial institutions are determined by estimates of LGD together with PD and EAD, so a robust and accurate LGD model affects banks' business decisions, including the setting of credit risk strategies and the pricing of credit products. The regulatory capital determined by the expected and unexpected losses is also important to financial market stability and should be carefully examined by regulators. In summary, this research highlights the importance of LGD models and provides a new perspective for practitioners and regulators on managing credit risk quantitatively.
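    To make the two-stage idea for credit-card recovery rates concrete, here is a minimal sketch using scikit-learn on synthetic data; the thesis's data and exact model specification are not reproduced. An SVM classifier first flags zero-recovery accounts, and a support vector regression fitted on the remaining accounts predicts their recovery rates.

```python
import numpy as np
from sklearn.svm import SVC, SVR

# Synthetic stand-in for defaulted credit-card accounts: features X and
# recovery rates in [0, 1] with a point mass at zero, as in real LGD data.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 4))
latent = X @ np.array([0.9, -0.6, 0.4, 0.2]) + rng.normal(scale=0.5, size=2000)
recovery = np.clip(0.5 + 0.3 * latent, 0.0, 1.0)

# Stage 1: an SVM classifier separates zero-recovery accounts from the rest.
zero = (recovery == 0.0).astype(int)
clf = SVC(kernel="rbf").fit(X, zero)

# Stage 2: support vector regression fitted on positive recoveries only.
pos = recovery > 0.0
reg = SVR(kernel="rbf").fit(X[pos], recovery[pos])

# Combine: predicted zeros stay at zero, the rest take the SVR estimate.
pred = np.where(clf.predict(X) == 1, 0.0, np.clip(reg.predict(X), 0.0, 1.0))
print(pred[:5].round(3))
```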

    A model proposal for IFRS 16 IBR adjustment based on bond market pricing

    The Incremental Borrowing Rate (IBR) is generally used by companies for discounting future lease payments and calculating the value of the lease assets and liabilities under IFRS 16. According to this standard, the leased asset must be considered as collateral, and the yield used should therefore reflect an adequate loss given default (LGD), which may vary with the estimated recovery rate of the asset (machinery, real estate, vehicles, etc.). There is little accounting and finance literature analysing how a standard IBR should be adjusted to reflect the expected LGD of the underlying asset in line with IFRS principles. In this context, we propose a model that uses quoted bond information as the basis for adjusting the standard "unsecured" IBR. The model replicates the change in a bond's yield when its LGD changes (usually due to a change in seniority level). We demonstrate empirically that the model works using data from real bond quotations (97 outstanding bonds quoted on several secondary markets, including New York, Vienna, Frankfurt and London). The empirical analysis covers two periods: pre-COVID-19 and post-COVID-19.
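    The paper's calibration is not reproduced here, but the direction of the proposed adjustment can be illustrated with the common first-order approximation that a credit spread is roughly PD × LGD: rescaling the unsecured spread by the ratio of the two LGDs gives a collateral-adjusted IBR. All figures below are assumed.

```python
# Rough illustration of an LGD-based IBR adjustment (all inputs assumed,
# not taken from the paper): under the common approximation
#     credit spread ~ PD * LGD,
# a change in LGD rescales the spread over the risk-free rate.
risk_free = 0.030          # assumed risk-free yield
ibr_unsecured = 0.055      # company's unsecured incremental borrowing rate
lgd_unsecured = 0.60       # assumed LGD of senior unsecured debt
lgd_lease = 0.25           # assumed LGD given the leased asset as collateral

spread_unsecured = ibr_unsecured - risk_free
spread_lease = spread_unsecured * (lgd_lease / lgd_unsecured)
ibr_adjusted = risk_free + spread_lease
print(f"adjusted IBR: {ibr_adjusted:.4f}")   # 0.030 + 0.025 * 0.4167 = 0.0404
```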

    Basel II compliant credit risk modelling: model development for imbalanced credit scoring data sets, loss given default (LGD) and exposure at default (EAD)

    The purpose of this thesis is to determine, and to better inform industry practitioners about, the most appropriate classification and regression techniques for modelling the three key credit risk components of the Basel II minimum capital requirement: probability of default (PD), loss given default (LGD), and exposure at default (EAD). The Basel II Accord regulates risk and capital management requirements to ensure that a bank holds enough capital proportional to the risk of its lending practices. Under the advanced internal ratings-based (IRB) approach, Basel II allows banks to develop their own empirical models, based on historical data, for each of PD, LGD and EAD.

    First, this thesis identifies the issue of imbalanced credit scoring data sets, a special case of PD modelling where the number of defaulting observations is much lower than the number of observations that do not default, and analyses the suitability of various classification techniques. Alongside traditional classification techniques, it also explores gradient boosting, least squares support vector machines and random forests. The second part of the thesis focuses on the prediction of LGD, which measures the economic loss, expressed as a percentage of the exposure, in case of default; various state-of-the-art regression techniques for modelling LGD are considered. The final part investigates models for predicting EAD. For off-balance-sheet items (for example, credit cards), calculating the EAD requires the committed but unused loan amount multiplied by a credit conversion factor (CCF). Ordinary least squares (OLS), logistic and cumulative logistic regression models are analysed, as well as an OLS model with Beta transformation, with the main aim of finding the most robust and comprehensible model for predicting the CCF. A direct estimation of EAD using an OLS model is also analysed. All the models built and presented in this thesis have been applied to real-life data sets from major global banking institutions.
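    As a small worked example of the CCF-based EAD calculation described above (all figures invented): in the standard formulation, the exposure at default of a revolving facility is the drawn balance plus the CCF times the undrawn commitment.

```python
import numpy as np

# Illustrative CCF-based EAD for three revolving facilities (numbers assumed):
# EAD = current drawn amount + CCF * undrawn commitment.
limit = np.array([10_000.0, 5_000.0, 8_000.0])   # committed credit limits
drawn = np.array([4_000.0, 4_500.0, 1_000.0])    # current balances
ccf = np.array([0.45, 0.80, 0.30])               # model-estimated CCFs

ead = drawn + ccf * (limit - drawn)
print(ead)   # [6700. 4900. 3100.]
```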

    Time matters: How default resolution times impact final loss rates

    Using access to a unique bank loss database, we find a positive dependence between the default resolution times (DRTs) of defaulted bank loan contracts and their final loan loss rates (losses given default, LGDs). Because of this interconnection, LGD predictions made at the time of default and during resolution are subject to censoring. Pure (standard) LGD models cannot capture the effects of censoring; accordingly, their LGD predictions may be biased and underestimate the loss rates of defaulted loans. In this paper, we develop a Bayesian hierarchical modelling framework for DRTs and LGDs. In contrast to previous approaches, we derive final DRT estimates for loans still in default, which enables consistent LGD predictions conditional on the time in default; adequate unconditional LGD predictions can also be derived. The proposed method is applicable to duration processes in general in which the final outcome depends on the duration of the process and is affected by censoring. In this way, we avoid biased parameter estimates and ensure adequate predictions.
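    The paper's Bayesian hierarchical framework is not reproduced here; the sketch below illustrates only the underlying censoring point in a deliberately simplified form: a maximum-likelihood fit of a log-normal resolution-time distribution in which unresolved defaults contribute their survival probability instead of being discarded. All data are simulated.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated defaults: true resolution times (years) and finite observation
# windows; unresolved loans at the end of a window are right-censored.
rng = np.random.default_rng(2)
true_t = rng.lognormal(mean=0.5, sigma=0.8, size=500)
observe_for = rng.uniform(0.5, 5.0, size=500)
t = np.minimum(true_t, observe_for)
resolved = true_t <= observe_for          # censoring indicator

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    z = (np.log(t) - mu) / sigma
    # Resolved loans contribute the log-normal density,
    # censored loans contribute the survival probability P(T > t).
    ll_event = norm.logpdf(z[resolved]) - np.log(sigma * t[resolved])
    ll_cens = norm.logsf(z[~resolved])
    return -(ll_event.sum() + ll_cens.sum())

fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")  # true values: 0.5, 0.8
```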

    Simulation-based optimisation of the timing of loan recovery across different portfolios

    A novel procedure is presented for the objective comparison and evaluation of a bank's decision rules for optimising the timing of loan recovery. The procedure is based on finding a delinquency threshold at which the financial loss of a loan portfolio (or a segment thereof) is minimised. It is an expert system that incorporates the time value of money, costs, and the fundamental trade-off between accumulating arrears and forsaking future interest revenue. Moreover, the procedure can be used with delinquency measures other than payments in arrears, thereby allowing an indirect comparison of these measures. We demonstrate the system across a range of credit risk scenarios and portfolio compositions. The computational results show that threshold optima can exist across all reasonable values of both the payment probability (default risk) and the loss rate (loan collateral). In addition, the procedure reacts positively to portfolios afflicted by either systematic defaults (such as during an economic downturn) or episodic delinquency (i.e., cycles of curing and re-defaulting). In optimising a portfolio's recovery decision, our procedure can better inform the quantitative aspects of a bank's collection policy than arbitrary discretion alone.
    Comment: Accepted by the journal "Expert Systems with Applications". 25 pages (including appendix), 9 figures.
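    A minimal sketch of the kind of threshold search the procedure performs, under an invented repayment process (with persistent delinquency) and a simplified cost structure; the paper's actual expert system, delinquency measures and parameters are not reproduced.

```python
import numpy as np

# Toy threshold search: simulate monthly repayments, foreclose once a loan's
# count of missed payments reaches threshold d, and measure the portfolio's
# NPV shortfall against the contractual cash flows for each candidate d.
rng = np.random.default_rng(3)
n_loans, n_months, payment = 5_000, 60, 100.0
rate, loss_rate = 0.01, 0.40            # monthly discount rate, loss severity
disc = (1.0 + rate) ** -np.arange(1, n_months + 1)
contractual = (payment * disc).sum()    # NPV if every payment is made

def simulate_paid():
    paid = np.empty((n_loans, n_months), dtype=bool)
    prev = np.ones(n_loans, dtype=bool)            # start in good standing
    for m in range(n_months):
        p = np.where(prev, 0.95, 0.50)             # delinquency persists
        prev = rng.random(n_loans) < p
        paid[:, m] = prev
    return paid

def mean_loss(d):
    paid = simulate_paid()
    missed = np.cumsum(~paid, axis=1)
    hit = missed >= d
    stop = np.where(hit.any(axis=1), hit.argmax(axis=1), n_months)
    losses = np.empty(n_loans)
    for i, s in enumerate(stop):
        received = (payment * disc[:s] * paid[i, :s]).sum()
        if s < n_months:                           # foreclose and recover
            received += (1.0 - loss_rate) * payment * (n_months - s) * disc[s]
        losses[i] = contractual - received
    return losses.mean()

thresholds = np.arange(1, 13)
losses = [mean_loss(d) for d in thresholds]
for d, l in zip(thresholds, losses):
    print(f"d={d:2d}: mean NPV loss {l:8.1f}")
print("loss-minimising threshold:", thresholds[int(np.argmin(losses))])
```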

    Essays on Improvements to the Regulatory Capital Framework for Credit Risk

    This cumulative thesis consists of three essays: 1) Does the Finalised Basel III Accord Treat Leasing Exposures Adequately? Evidence from a European Leasing Dataset; 2) Is the Regulatory Downturn LGD Adequate? Performance Analysis and Alternative Methods; and 3) How a Credit Run Affects Asset Correlation and Financial Stability. All three address issues with the current credit risk framework of the Basel III Accord.

    The first essay points out the excessiveness of the Basel regulatory capital requirement for leasing exposures. Based on a dataset of 2.4 million leasing contracts pooled by Leaseurope, the unexpected loss of a leasing portfolio is estimated via Monte Carlo simulation. The main messages of this analysis are: 1) the current regulatory capital requirement can reach five to eight times the portfolio's unexpected loss, and 2) even if the current capital requirement were reduced by 30%, it would remain neutral in the sense that institutions would have no incentive, from the capital requirement perspective alone, to favour offering leases over secured loans.

    The second essay investigates the mismatch between the downturn definition in the downturn LGD guidelines and the downturn definition in the conditional PD formula of the Internal Ratings-Based Approach. Based on an 18-year default dataset pooled by GCD, we confirm via Monte Carlo simulation that the downturn LGD based on the guidelines does not attain the minimum survival probability of 99.9% traditionally required in the Internal Ratings-Based Approach. A latent-based downturn LGD is offered as an alternative, which suggests that at least one solution to the downturn LGD issue exists.

    The third essay raises a fundamental problem inherent in the foundation of the Internal Ratings-Based Approach by inspecting the constant asset correlation assumption of the Asymptotic Single Risk Factor model. Since its introduction with the Basel II Accord, the asset correlation has never been updated. The argument is that the financial crisis should have played a role and may have shifted the asset correlation temporarily; in particular, run-like behaviour by borrowers shortly before the financial crisis may have caused a higher concentration in a specific asset class and thereby increased the asset correlation. Relaxing the assumption is, however, not an easy task from a technical perspective. With a slight model adjustment, the change in the asset correlation can be observed; this analysis is supported by the 18-year default dataset pooled by GCD. The results of these essays aim at a long-term improvement of the credit risk framework within the regulatory capital requirement, to ensure stability in the financial ecosystem.
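    As a sketch of the Monte Carlo comparison performed in the first essay between a portfolio's simulated unexpected loss and the regulatory requirement, here under the standard one-factor (ASRF/Vasicek) model with assumed parameters rather than the leasing dataset:

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo sketch of a portfolio's loss distribution under the one-factor
# (ASRF/Vasicek) model, alongside the Basel IRB capital formula. PD, LGD and
# asset correlation rho are assumed; the Leaseurope/GCD calibrations are not.
rng = np.random.default_rng(4)
pd_, lgd, rho = 0.02, 0.45, 0.12
n_obligors, n_scenarios = 10_000, 200_000

# Conditional on the systematic factor z, defaults are independent, so each
# scenario's default count can be drawn from a binomial distribution.
z = rng.standard_normal(n_scenarios)
cond_pd = norm.cdf((norm.ppf(pd_) - np.sqrt(rho) * z) / np.sqrt(1 - rho))
defaults = rng.binomial(n_obligors, cond_pd)
loss = lgd * defaults / n_obligors            # portfolio loss rate

el = loss.mean()
var999 = np.quantile(loss, 0.999)
print(f"simulated unexpected loss (99.9% VaR - EL): {var999 - el:.4f}")

# Basel IRB capital for the same parameters (maturity adjustment omitted):
k = lgd * (norm.cdf((norm.ppf(pd_) + np.sqrt(rho) * norm.ppf(0.999))
                    / np.sqrt(1 - rho)) - pd_)
print(f"IRB capital requirement K:                  {k:.4f}")
```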