    Generalized Extreme Value Regression for Binary Rare Events Data: an Application to Credit Defaults

    The most widely used regression model for a binary dependent variable is the logistic regression model. When the dependent variable represents a rare event, however, the logistic regression model exhibits significant drawbacks. To overcome these drawbacks we propose the Generalized Extreme Value (GEV) regression model. In particular, in a Generalized Linear Model (GLM) with a binary dependent variable we suggest the quantile function of the GEV distribution as the link function, so attention is focused on the tail of the response curve for values close to one. Estimation is by maximum likelihood. The model accommodates skewness and generalizes GLMs with the log-log link function. In credit risk analysis a pivotal topic is the estimation of default probabilities. Since defaults are rare events, we apply GEV regression to empirical data on Italian Small and Medium Enterprises (SMEs) to model their default probabilities.
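    As an illustration of the modelling idea (not the authors' implementation), the minimal Python sketch below fits a binary regression with a GEV response curve by maximum likelihood; the response form P(Y=1|x) = exp(-(1 + xi*x'beta)^(-1/xi)), the toy covariates and the optimiser settings are assumptions made only for this sketch.

```python
# Minimal sketch: binary regression with a GEV response curve, fitted by
# maximum likelihood. Assumes P(Y=1 | x) = exp(-(1 + xi*eta)^(-1/xi)) with
# eta = X @ beta, which reduces to the log-log link exp(-exp(-eta)) as xi -> 0.
import numpy as np
from scipy.optimize import minimize

def gev_response(eta, xi, eps=1e-10):
    """Map the linear predictor eta to a probability via the GEV CDF."""
    if abs(xi) < 1e-8:                       # log-log limit
        p = np.exp(-np.exp(-eta))
    else:
        t = np.maximum(1.0 + xi * eta, eps)  # support constraint 1 + xi*eta > 0
        p = np.exp(-t ** (-1.0 / xi))
    return np.clip(p, eps, 1.0 - eps)

def neg_log_lik(params, X, y):
    """Bernoulli negative log-likelihood; params = (beta..., xi)."""
    beta, xi = params[:-1], params[-1]
    p = gev_response(X @ beta, xi)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Toy data standing in for SME accounting ratios (hypothetical, illustration only)
rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
true_p = gev_response(X @ np.array([-2.0, 0.8, -0.5]), xi=0.25)
y = rng.binomial(1, true_p)

start = np.zeros(X.shape[1] + 1)
fit = minimize(neg_log_lik, start, args=(X, y), method="Nelder-Mead",
               options={"maxiter": 20000, "maxfev": 20000})
print("estimated (beta, xi):", np.round(fit.x, 3))
```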

    Bankruptcy Prediction of Small and Medium Enterprises Using a Flexible Binary Generalized Extreme Value Model

    We introduce a binary regression accounting-based model for bankruptcy prediction of small and medium enterprises (SMEs). The main advantage of the model lies in its predictive performance in identifying defaulted SMEs. Another advantage, which is especially relevant for banks, is that the relationship between the accounting characteristics of SMEs and the response is not assumed a priori (e.g., linear, quadratic or cubic) and can be determined from the data. The proposed approach uses the quantile function of the generalized extreme value distribution as the link function, together with smooth functions of the accounting characteristics to model covariate effects flexibly. Therefore, the usual assumptions in scoring models of a symmetric link function and linear or pre-specified covariate-response relationships are relaxed. Out-of-sample and out-of-time validation on Italian data shows that our proposal outperforms the commonly used (logistic) scoring model for different default horizons.
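    The distinctive ingredient here is the smooth, data-driven covariate effect. The sketch below (again illustrative, not the paper's estimator) combines a simple truncated-power spline basis with the same GEV response curve; the knot placement and the absence of a roughness penalty are simplifications.

```python
# Sketch of the "flexible" part: a smooth covariate effect via a spline basis,
# combined with a GEV response curve and fitted by maximum likelihood.
import numpy as np
from scipy.optimize import minimize

def gev_response(eta, xi, eps=1e-10):
    if abs(xi) < 1e-8:
        p = np.exp(-np.exp(-eta))
    else:
        p = np.exp(-np.maximum(1.0 + xi * eta, eps) ** (-1.0 / xi))
    return np.clip(p, eps, 1.0 - eps)

def spline_basis(x, knots):
    """Truncated-power cubic spline basis: x, x^2, x^3, (x - k)_+^3."""
    cols = [x, x**2, x**3] + [np.maximum(x - k, 0.0) ** 3 for k in knots]
    return np.column_stack(cols)

def neg_log_lik(params, B, y):
    beta, xi = params[:-1], params[-1]
    p = gev_response(B @ beta, xi)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Toy data: one accounting ratio with a non-linear effect on default probability
rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, size=3000)
true_p = gev_response(-2.0 + np.sin(1.5 * x), xi=0.2)   # unknown smooth effect
y = rng.binomial(1, true_p)

knots = np.quantile(x, [0.25, 0.5, 0.75])
B = np.column_stack([np.ones_like(x), spline_basis(x, knots)])
fit = minimize(neg_log_lik, np.zeros(B.shape[1] + 1), args=(B, y),
               method="Nelder-Mead", options={"maxiter": 50000, "maxfev": 50000})
print("estimated xi:", round(float(fit.x[-1]), 3))
```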

    Crafting multicultural science education with preservice teachers through service-learning

    Many science educators, in the US and elsewhere, support the idea that all students should have fair and equal opportunities to become scientifically literate through authentic, real problem-based science education. However, this challenge requires teachers to find ways to help all students feel comfortable with, and connected to, science. Despite the general consensus around the ideal of science for all, science teacher education programmes have had little or no impact on pre-service teachers' philosophies of teaching and learning, especially as it relates to serving under-served populations in science. In this paper, I explore community service-learning as one way of addressing the multicultural dimension of pre-service education with the following three questions: In what ways does involving pre-service science teachers in community service-learning influence their views on multicultural science education, in theory and practice? What qualities of community service-learning make multicultural science education a realistic objective? How might service-learning be used to push our collective understanding of what an inclusive and liberatory multicultural science teaching practice could be? I explore these questions and propose further areas of research by using a case study involving service-learning from my own teaching-research with pre-service students.

    Citizen(s') Science. A Response to The Future of Citizen Science

    Citizen science is fundamentally about participation within and for communities. Attempts to merge citizen science with schooling must call not only for a democratization of schooling and science but also for the democratization of the ways in which science is taken up by, with, and for citizen participants. Using this stance, along with critical studies of place, I build on the criticisms of citizen science outlined in The Future of Citizen Science to argue for the centrality of place. Using a case of urban youths working toward transparency and cross-cultural dialogue regarding energy production in their community, I complicate the proposed immersion model to suggest a further reconstruction of citizen science in ways that account for youths' deep and critical connections to the geohistorical and sociocultural dimensions of place.

    A new approach to measure systemic risk: A bivariate copula model for dependent censored data

    We propose a novel approach based on the Marshall-Olkin (MO) copula to estimate the impact of systematic and idiosyncratic components on cross-border systemic risk. To use the data on non-failed banks in the suggested method, we treat the time to bank failure as a censored variable. We therefore propose a pseudo-maximum likelihood estimation procedure for the MO copula with a Type I censored sample. We derive the log-likelihood function, the copula parameter estimator and the bootstrap confidence intervals. Empirical data on the banking systems of three European countries (Germany, Italy and the UK) show that the proposed censored model can accurately estimate the systematic component of cross-border systemic risk.
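    For intuition only, the sketch below simulates Marshall-Olkin failure times, recovers a one-parameter MO copula by inverting Kendall's tau on the fully observed pairs, and attaches a bootstrap confidence interval. This is a stand-in estimator, not the paper's pseudo-maximum likelihood procedure for Type I censored samples.

```python
# Illustrative sketch: symmetric Marshall-Olkin (MO) copula with the dependence
# parameter recovered by inverting Kendall's tau, plus a bootstrap CI.
import numpy as np
from scipy.stats import kendalltau

def mo_copula(u, v, theta):
    """Symmetric MO copula C(u, v) = min(u^(1-theta) * v, u * v^(1-theta))."""
    return np.minimum(u ** (1.0 - theta) * v, u * v ** (1.0 - theta))

def theta_from_tau(tau):
    """For the symmetric MO copula, Kendall's tau = theta / (2 - theta)."""
    return 2.0 * tau / (1.0 + tau)

def simulate_mo_times(n, lam1, lam2, lam12, rng):
    """Marshall-Olkin bivariate exponential failure times (common-shock model)."""
    z1 = rng.exponential(1.0 / lam1, n)
    z2 = rng.exponential(1.0 / lam2, n)
    z12 = rng.exponential(1.0 / lam12, n)
    return np.minimum(z1, z12), np.minimum(z2, z12)

rng = np.random.default_rng(2)
t1, t2 = simulate_mo_times(3000, lam1=1.0, lam2=1.0, lam12=0.5, rng=rng)

# Type I censoring at a fixed horizon c (as for non-failed banks); this sketch
# simply keeps the pairs where both failure times are observed before c.
c = 2.0
keep = (t1 < c) & (t2 < c)
x, y = t1[keep], t2[keep]

tau_hat = kendalltau(x, y)[0]
theta_hat = theta_from_tau(tau_hat)

# Nonparametric bootstrap confidence interval for theta
boot = []
for _ in range(200):
    idx = rng.integers(0, len(x), len(x))
    boot.append(theta_from_tau(kendalltau(x[idx], y[idx])[0]))
print("theta_hat:", round(theta_hat, 3),
      "95% bootstrap CI:", np.round(np.percentile(boot, [2.5, 97.5]), 3))
print("fitted C(0.9, 0.9):", round(float(mo_copula(0.9, 0.9, theta_hat)), 3))
```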

    The effectiveness of TARP-CPP on the US banking industry: A new copula-based approach

    Following the 2008 financial crisis, regulatory authorities and governments provided distressed banks with equity infusions in order to strengthen national banking systems. However, the effectiveness of these interventions for financial stability has not been extensively researched in the literature. In order to understand the effectiveness of these bailouts for the solvency of banks, this paper proposes a new model: the Longitudinal Binary Generalised Extreme Value (LOBGEV) model. Differing from existing models, the LOBGEV model allows us to analyse the temporal structure of the probability of failure for banks, both for those that received a bailout and for those that did not. In particular, it combines the flexibility of the D-vine copula with the accuracy of the generalised extreme value model in estimating the probability of bank failure and of banks receiving approval for capital injection. We apply this new model to the US banking system from 2008 to 2013 in order to investigate how, and to what extent, the Troubled Asset Relief Program (TARP) Capital Purchase Program (CPP) reduced the probability of failure of commercial banks. We specifically identify a set of macroeconomic and bank-specific factors that affect the probability of bank failure for TARP-CPP recipients and for banks that did not receive capital under TARP-CPP. Our results suggest that TARP-CPP provided only short-term relief for US commercial banks.
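    To show the kind of object a D-vine contributes, here is a generic three-period pair-copula density sketch with Clayton pair copulas on uniform margins; it is not the LOBGEV likelihood (which combines the vine with GEV-type margins for binary failure indicators), and the parameter values are arbitrary.

```python
# Generic D-vine building block: density of (U1, U2, U3) built from two
# adjacent Clayton pair copulas plus one conditional pair copula.
import numpy as np

def clayton_density(u, v, theta):
    """Clayton copula density c(u, v; theta), theta > 0."""
    s = u ** (-theta) + v ** (-theta) - 1.0
    return (1.0 + theta) * (u * v) ** (-theta - 1.0) * s ** (-1.0 / theta - 2.0)

def clayton_h(u, v, theta):
    """Conditional CDF h(u | v) = dC(u, v)/dv for the Clayton copula."""
    s = u ** (-theta) + v ** (-theta) - 1.0
    return v ** (-theta - 1.0) * s ** (-1.0 / theta - 1.0)

def dvine3_density(u1, u2, u3, theta12, theta23, theta13_2):
    """D-vine density: adjacent pairs in tree 1, one conditional pair in tree 2."""
    tree1 = clayton_density(u1, u2, theta12) * clayton_density(u2, u3, theta23)
    # Pseudo-observations for the second tree: conditional CDFs given U2
    a = clayton_h(u1, u2, theta12)
    b = clayton_h(u3, u2, theta23)
    return tree1 * clayton_density(a, b, theta13_2)

# Example: density of three consecutive (uniform-scale) solvency scores
print(dvine3_density(0.2, 0.3, 0.25, theta12=1.5, theta23=1.2, theta13_2=0.5))
```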