
    On Reduced Form Intensity-based Model with Trigger Events

    Corporate defaults may be triggered by major market news or events, such as financial crises or the collapse of major banks or financial institutions. With a view to developing a more realistic model for credit risk analysis, we introduce a new type of reduced-form intensity-based model that can incorporate the impacts of both observable "trigger" events and the economic environment on corporate defaults. The key idea of the model is to augment a Cox process with trigger events. Both single-default and multiple-default cases are considered in this paper. In the former case, a simple expression for the distribution of the default time is obtained. Applications of the proposed model to pricing defaultable bonds and multi-name Credit Default Swaps (CDSs) are provided.
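
    A minimal simulation sketch of the key idea, in Python: a Cox-style default intensity that jumps when an exogenous trigger event arrives and then decays back toward its baseline. All parameter values, and the exponential-decay form of the trigger impact, are illustrative assumptions rather than the paper's specification.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_default_time(T=10.0, dt=0.01,
                                  base_intensity=0.02,  # assumed baseline hazard
                                  trigger_rate=0.10,    # assumed arrival rate of trigger events
                                  trigger_jump=0.05,    # assumed intensity jump per trigger
                                  decay=0.5):           # assumed decay rate of a trigger's impact
            """Default time = first jump of a Cox process whose intensity is a
            baseline plus exponentially decaying jumps at trigger events."""
            t, intensity, cum_hazard = 0.0, base_intensity, 0.0
            threshold = rng.exponential(1.0)  # default once integrated hazard exceeds an Exp(1) draw
            while t < T:
                if rng.random() < trigger_rate * dt:  # a trigger event arrives in (t, t+dt]
                    intensity += trigger_jump
                cum_hazard += intensity * dt
                if cum_hazard >= threshold:
                    return t
                intensity = base_intensity + (intensity - base_intensity) * np.exp(-decay * dt)
                t += dt
            return np.inf  # no default before the horizon

        times = [simulate_default_time() for _ in range(2000)]
        print("P(default within 10y) ~", np.mean([s < np.inf for s in times]))

    Monte Carlo estimates of the default-time distribution like this one are the building block such a model would use when pricing defaultable bonds and multi-name CDSs.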

    The relationship between default and economic cycles for retail portfolios across countries

    In this paper, we collect consumer delinquency data from several economic shocks to study the creation of stress-testing models. We leverage the dual-time dynamics modeling technique to better isolate macroeconomic impacts whenever vintage-level performance data is available. The stress-testing models follow a framework, described here, that focuses on consumer-centric macroeconomic variables so that the models are as robust as possible when predicting the impacts of future shocks.
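
    A toy Python sketch of the dual-time idea: vintage-level default rates indexed by both age (months on book) and calendar time are decomposed into a maturation curve, an exogenous macro component, and a vintage-quality component. The synthetic data and the plain least-squares fit are assumptions for illustration; the published technique is more elaborate, and age/calendar/vintage effects are only identified up to a linear trend.

        import numpy as np

        rng = np.random.default_rng(1)
        n_vint, n_age = 24, 36
        n_cal = n_vint + n_age  # calendar index = vintage index + age index

        # Synthetic default rates built from the three components we then recover.
        age_eff = 0.05 * np.exp(-((np.arange(n_age) - 12) ** 2) / 100)  # maturation curve
        cal_eff = 0.01 * np.sin(np.arange(n_cal) / 6)                   # macro environment
        vin_eff = rng.normal(0, 0.005, n_vint)                          # vintage quality
        rates = np.array([[age_eff[a] + cal_eff[v + a] + vin_eff[v]
                           for a in range(n_age)] for v in range(n_vint)])

        # Dummy-variable design: one column per age, calendar period, and vintage.
        X = np.zeros((n_vint * n_age, n_age + n_cal + n_vint))
        k = 0
        for v in range(n_vint):
            for a in range(n_age):
                X[k, a] = 1
                X[k, n_age + v + a] = 1
                X[k, n_age + n_cal + v] = 1
                k += 1
        # lstsq returns the minimum-norm solution of this rank-deficient system.
        coef, *_ = np.linalg.lstsq(X, rates.ravel(), rcond=None)
        print("recovered maturation effects (first 5 ages):", np.round(coef[:5], 4))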

    Asset liability management using stochastic programming

    This chapter sets out to explain an important financial planning model called asset liability management (ALM); in particular, it discusses why, in practice, optimal planning models are used. The ability to build an integrated approach that combines liability models with asset allocation decisions has proved desirable and more efficient, in that it can lead to better ALM decisions. The role of uncertainty and the quantification of risk in these planning models are also considered.
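
    A minimal sketch of an ALM-style stochastic program in Python, assuming one period, two asset classes, three return scenarios, and a linear penalty on any shortfall against the liability; the numbers and the linear-programming formulation via SciPy are illustrative, not the chapter's model.

        import numpy as np
        from scipy.optimize import linprog

        p = np.array([0.3, 0.4, 0.3])        # scenario probabilities (assumed)
        r_eq = np.array([1.25, 1.06, 0.85])  # equity gross returns per scenario (assumed)
        r_bd = np.array([1.03, 1.03, 1.04])  # bond gross returns per scenario (assumed)
        W0, L, penalty = 100.0, 95.0, 10.0   # initial wealth, liability, shortfall penalty
        S = len(p)

        # Variables: [x_equity, x_bond, shortfall_1, ..., shortfall_S].
        # Objective: minimize -E[terminal wealth] + penalty * E[shortfall].
        c = np.concatenate([[-(p @ r_eq), -(p @ r_bd)], penalty * p])

        # In each scenario s: wealth_s + shortfall_s >= L.
        A_ub = np.zeros((S, 2 + S))
        A_ub[:, 0], A_ub[:, 1] = -r_eq, -r_bd
        A_ub[np.arange(S), 2 + np.arange(S)] = -1.0
        b_ub = -L * np.ones(S)

        A_eq = np.zeros((1, 2 + S)); A_eq[0, :2] = 1.0  # fully invest initial wealth

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[W0],
                      bounds=[(0, None)] * (2 + S))
        print("equity:", round(res.x[0], 1), "bonds:", round(res.x[1], 1),
              "scenario shortfalls:", np.round(res.x[2:], 2))

    Raising the shortfall penalty shifts the allocation toward bonds, which is exactly the liability-aware behavior an integrated ALM model is meant to capture.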

    Predicting time to graduation at a large enrollment American university

    The time it takes a student to graduate with a university degree is influenced by a variety of factors, such as their background, their academic performance at university, and their integration into the social communities of the university they attend. Different universities have different populations, student services, instruction styles, and degree programs; however, they all collect institutional data. This study presents data for 160,933 students attending a large American research university. The data includes performance, enrollment, demographic, and preparation features. Discrete-time hazard models for time-to-graduation are presented in the context of Tinto's Theory of Drop Out. Additionally, a novel machine learning method, gradient boosted trees, is applied and compared to the typical maximum likelihood method. We demonstrate that enrollment factors (such as changing a major) lead to greater increases in predictive performance for when a student graduates than do performance factors (such as grades) or preparation factors (such as high school GPA).
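
    A small Python sketch of the comparison on synthetic person-period data: each student contributes one row per term until graduation, a logistic regression plays the role of the classical maximum-likelihood discrete-time hazard model, and scikit-learn's gradient boosted trees are the machine-learning alternative. Every coefficient and feature below is invented for illustration, not taken from the study's data.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        rows, labels = [], []
        for student in range(3000):
            gpa = rng.normal(3.0, 0.5)            # performance feature (assumed)
            changed_major = rng.random() < 0.3    # enrollment feature (assumed)
            for term in range(8, 16):
                # Assumed hazard: rises with term and GPA, falls if the major changed.
                logit = -9 + 0.6 * term + 0.8 * gpa - 0.7 * changed_major
                graduated = rng.random() < 1 / (1 + np.exp(-logit))
                rows.append([term, gpa, changed_major])
                labels.append(graduated)
                if graduated:
                    break  # student exits the risk set after graduating

        X, y = np.array(rows, dtype=float), np.array(labels)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        mle = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # discrete-time hazard via MLE
        gbt = GradientBoostingClassifier().fit(X_tr, y_tr)       # boosted-tree hazard
        print("logit AUC:", round(roc_auc_score(y_te, mle.predict_proba(X_te)[:, 1]), 3))
        print("GBT AUC:  ", round(roc_auc_score(y_te, gbt.predict_proba(X_te)[:, 1]), 3))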

    On the predictability of emerging market sovereign credit spreads

    This paper examines the quarter-ahead out-of-sample predictability of credit spreads for Brazil, Mexico, the Philippines, and Turkey before and after the Lehman Brothers default. A model based on the country-specific credit spread curve factors predicts no better than the random walk and slope regression benchmarks. Model extensions with the global yield curve factors, and with both global and domestic uncertainty indicators, notably outperform both benchmarks post-Lehman. The finding that bond prices better reflect fundamental information after the Lehman Brothers failure indicates that this landmark of the recent global financial crisis had a wake-up-call effect on emerging market bond investors.
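
    A Python sketch of the forecast-evaluation design on simulated data: expanding-window regressions of next quarter's spread on its own lag and a factor, scored against the random-walk "no change" benchmark by out-of-sample root mean squared error. The data-generating process and the single generic factor are assumptions standing in for the paper's credit spread curve, global yield curve, and uncertainty factors.

        import numpy as np

        rng = np.random.default_rng(3)
        T = 80  # quarters
        factor = np.cumsum(rng.normal(0, 0.3, T))  # a persistent predictor (assumed)
        spread = np.empty(T)
        spread[0] = 300.0  # basis points
        for t in range(1, T):
            spread[t] = 0.9 * spread[t - 1] + 8.0 * factor[t - 1] + rng.normal(0, 10)

        errs_model, errs_rw = [], []
        for t in range(40, T - 1):  # expanding estimation window, one-step-ahead forecasts
            X = np.column_stack([np.ones(t - 1), spread[:t - 1], factor[:t - 1]])
            beta, *_ = np.linalg.lstsq(X, spread[1:t], rcond=None)
            pred = beta @ np.array([1.0, spread[t], factor[t]])
            errs_model.append(spread[t + 1] - pred)
            errs_rw.append(spread[t + 1] - spread[t])  # random walk: forecast no change

        rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
        print(f"factor model RMSE: {rmse(errs_model):.1f}, random walk RMSE: {rmse(errs_rw):.1f}")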

    Estimating the historical and future probabilities of large terrorist events

    Quantities with right-skewed distributions are ubiquitous in complex social systems, including political conflict, economics, and social networks, and these systems sometimes produce extremely large events. For instance, the 9/11 terrorist events produced nearly 3000 fatalities, nearly six times more than the next largest event. But was this enormous loss of life statistically unlikely given modern terrorism's historical record? Accurately estimating the probability of such an event is complicated by the large fluctuations in the empirical distribution's upper tail. We present a generic statistical algorithm for making such estimates, which combines semi-parametric models of tail behavior and a nonparametric bootstrap. Applied to a global database of terrorist events, we estimate the worldwide historical probability of observing at least one 9/11-sized or larger event since 1968 to be 11-35%. These results are robust to conditioning on global variations in economic development, domestic versus international events, the type of weapon used, and a truncated history that stops at 1998. We then use this procedure to make a data-driven statistical forecast of at least one similar event over the next decade. Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org), http://dx.doi.org/10.1214/12-AOAS614.
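
    A compact Python sketch of the estimation strategy: fit a power-law model to the upper tail above a cutoff, convert the fit into the probability of at least one 9/11-sized event in a sample of the historical size, and bootstrap to quantify uncertainty. The synthetic severity data, the fixed cutoff, and the continuous-tail formulas are simplifying assumptions; the paper chooses the cutoff by goodness of fit and considers several tail models.

        import numpy as np

        rng = np.random.default_rng(4)
        data = np.round(rng.pareto(2.0, 5000) * 5 + 1)  # synthetic heavy-tailed event severities
        x911 = 2996   # approximate 9/11 death toll, the event size of interest
        xmin = 10     # assumed tail cutoff

        def prob_at_least_one(sample):
            """P(at least one event >= x911) under a power-law tail fit to `sample`."""
            tail = sample[sample >= xmin]
            alpha = 1 + len(tail) / np.sum(np.log(tail / xmin))  # continuous MLE exponent
            p_tail = len(tail) / len(sample)                     # P(an event reaches the tail)
            p_big = p_tail * (x911 / xmin) ** (1 - alpha)        # P(a single event >= x911)
            return 1 - (1 - p_big) ** len(sample)                # over all observed events

        # Nonparametric bootstrap: resample events, refit, recompute the probability.
        boot = [prob_at_least_one(rng.choice(data, len(data))) for _ in range(500)]
        print("P(at least one 9/11-sized event), [5%, 50%, 95%]:",
              np.round(np.percentile(boot, [5, 50, 95]), 4))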