45 research outputs found

    The artist-curator as active citizen: curatorial research, institutions and community space

    The institutions that we attend, that are provided for us within our communities, or that we self-build and that are community-led create frameworks through which we establish cultural meanings. Where such cultural infrastructure is missing from a community, the artist-curatorial researcher can employ new institutional strategies of public dialogue and civic action to explore the nature of, and possibilities for, implementing new frameworks that hold significance for the communities that use them. The inquiry has explored the wider discourses of new institutionalism to develop understandings of how such methodologies can operate outside established art-world frameworks to develop contemporary art exhibitions and counter-public space within a semi-rural community, thus reaching new audiences through a critical curatorial practice more often located in a city-based sphere of contemporary arts. An important feature of this work is its concern to explore how such a critical agenda, one that attempts to challenge various identified boundaries between artists, audiences and institutions, can come about at a local level. This discussion centres primarily on projects and initiatives developed in the Northwich area of Cheshire West and Chester, and involves specific dialogue with council officials, artists and communities of that locality. As such, this is to be seen as a case study with specific characteristics which may not be universal, but which nonetheless constitutes substantial insights for further research of this kind.

    A Three Species Model to Simulate Application of Hyperbaric Oxygen Therapy to Chronic Wounds

    Chronic wounds are a significant socioeconomic problem for governments worldwide. Approximately 15% of people who suffer from diabetes will experience a lower-limb ulcer at some stage of their lives, and 24% of these wounds will ultimately result in amputation of the lower limb. Hyperbaric Oxygen Therapy (HBOT) has been shown to aid the healing of chronic wounds; however, the causal reasons for the improved healing remain unclear, and hence current HBOT protocols remain empirical. Here we develop a three-species mathematical model of wound healing that is used to simulate the application of hyperbaric oxygen therapy in the treatment of wounds. Based on our modelling, we predict that intermittent HBOT will assist chronic wound healing while normobaric oxygen is ineffective in treating such wounds. Furthermore, treatment should continue until healing is complete, and HBOT will not stimulate healing under all circumstances, leading us to conclude that finding the right protocol for an individual patient is crucial if HBOT is to be effective. We provide constraints, depending on the model parameters, for the range of HBOT protocols that will stimulate healing. More specifically, we predict that patients with a poor arterial supply of oxygen, high consumption of oxygen by the wound tissue, chronically hypoxic wounds, and/or a dysfunctional endothelial cell response to oxygen are at risk of nonresponsiveness to HBOT. This work helps to highlight which patients are most likely to respond well to HBOT (for example, those with a good arterial supply), and thus has the potential to improve both the success rate and the cost-effectiveness of this therapy.
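The qualitative prediction above (intermittent hyperbaric oxygen helps, constant normobaric supply does not) can be illustrated with a toy simulation. This is a minimal sketch, not the paper's actual three-species model: it simply assumes healing proceeds only while tissue oxygen lies within a stimulatory window, so brief high-oxygen sessions can trigger healing that a low constant supply cannot. The `simulate` function and all parameter values are hypothetical.

```python
# Toy simulation (not the paper's three-species model): healing is assumed
# to proceed only while tissue oxygen lies in a stimulatory window, so an
# intermittent hyperbaric schedule can trigger healing that a constant
# normobaric supply cannot. All parameter values are hypothetical.

def simulate(days, hbot, dt=0.01, base_o2=0.3, hbot_o2=1.5,
             relax=10.0, lo=0.5, hi=2.0, rate=0.5):
    """Return the final wound fraction w in [0, 1] (0 = fully healed)."""
    w, o2 = 1.0, base_o2
    for i in range(int(days / dt)):
        t = i * dt
        # one 90-minute HBOT session at the start of each day, if treating
        in_session = hbot and (t % 1.0) < 0.0625
        supply = hbot_o2 if in_session else base_o2
        o2 += dt * relax * (supply - o2)      # oxygen relaxes toward supply
        if lo <= o2 <= hi:                    # oxygen in stimulatory window
            w -= dt * rate * w                # wound closes
    return max(w, 0.0)

untreated = simulate(50, hbot=False)   # oxygen stays below the window
treated = simulate(50, hbot=True)      # intermittent sessions drive healing
```

In the same spirit as the abstract, making the baseline supply too low (or consumption too high) keeps oxygen outside the window even during sessions, reproducing a non-responder.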

    A review of mathematical models for the formation of vascular networks

    Two major mechanisms are involved in the formation of blood vasculature: vasculogenesis and angiogenesis. The former term describes the formation of a capillary-like network from either a dispersed or a monolayered population of endothelial cells, reproducible also in vitro by specific experimental assays. The latter term describes the sprouting of new vessels from an existing capillary or post-capillary venule. Similar mechanisms are also involved in the formation of the lymphatic system through a process generally called lymphangiogenesis. A number of mathematical approaches have been used to analyse these phenomena. In this article, we review the different types of models, with special emphasis on their ability to reproduce different biological systems and to predict measurable quantities which describe the overall processes. Finally, we highlight the advantages specific to each of the different modelling approaches. The research that led to the present paper was partially supported by a grant of the group GNFM of INdA

    Modelling exposure at default without using conversion factors

    "November 10, 2015" -- title page. Bibliography: pages 75-77. Contents: 1. Executive summary -- 2. Background to credit risk -- 3. Literature review -- 4. Statistical modelling -- Appendices -- Bibliography.
    Banks accredited by their regulator to use the Advanced Internal Ratings Based (A-IRB) approach are required to provide their own estimates for calculating their minimum credit capital; these estimates rely on statistical and analytical models to predict Probability of Default (PD), Loss Given Default (LGD) and Exposure at Default (EAD). This thesis focusses on estimating EAD for banks granting revolving loans to large corporates and leverages the Global Credit Data (GCD) database. It briefly discusses why risk management, particularly credit risk management, is important for banks, and surveys the existing EAD modelling literature, which to date has received less attention than PD and LGD modelling. Our proposed methodology models both the loan balance at default (EAD) and changes in the loan limit at default as random variables, capturing their joint dynamics via a two-stage model: the first stage estimates the probability that limits decrease, while the second stage estimates EAD conditional on changing limits. To the best of our knowledge, our approach is the first to estimate EAD and changes in loan limit directly for large corporate revolving facilities using the GCD database. Our model suggests that the key drivers of EAD include limit, balance, utilisation, risk rating and time to maturity. We also find evidence that banks actively manage limits in the lead-up to default, and that these changes in limits have substantial effects on realised EAD. Mode of access: World Wide Web. 1 online resource (77 pages).
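The two-stage structure described above can be sketched on synthetic data. This is an illustrative toy, not the thesis's fitted GCD model: stage 1 estimates the probability that a facility's limit is cut before default, stage 2 estimates EAD (as a fraction of the original limit) conditional on whether the limit moved, and the two are combined into a prediction. All field names, drivers and parameter values here are invented.

```python
import random
random.seed(0)

# Hedged sketch of a two-stage EAD model (illustrative only, not the
# thesis's model). Synthetic defaulted facilities are tuples of
# (utilisation, limit_cut, ead_ratio); all parameters are hypothetical.
facilities = []
for _ in range(5000):
    util = random.random()                           # drawn share of limit
    limit_cut = random.random() < 0.1 + 0.5 * util   # banks cut risky lines
    ead_ratio = (0.4 if limit_cut else 0.7) + 0.3 * util + random.gauss(0, 0.05)
    facilities.append((util, limit_cut, ead_ratio))

# stage 1: P(limit cut), estimated separately for high/low utilisation
high = [f for f in facilities if f[0] >= 0.5]
low = [f for f in facilities if f[0] < 0.5]
p_cut_high = sum(f[1] for f in high) / len(high)
p_cut_low = sum(f[1] for f in low) / len(low)

# stage 2: mean EAD ratio conditional on whether the limit was cut
cut = [f[2] for f in facilities if f[1]]
uncut = [f[2] for f in facilities if not f[1]]
ead_cut = sum(cut) / len(cut)
ead_uncut = sum(uncut) / len(uncut)

# combined two-stage prediction for a new high-utilisation facility
expected_ead = p_cut_high * ead_cut + (1 - p_cut_high) * ead_uncut
```

The sketch mirrors the thesis's finding in spirit: limit cuts are more likely for riskier, more heavily drawn facilities, and those cuts materially lower realised EAD.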

    Lifetime probability of default: Cox model with time-dependent covariates using maximum likelihood

    Empirical thesis. Bibliography: pages 65-68. Contents: 1. Introduction -- 2. Background to survival analysis -- 3. Literature review -- 4. Maximum likelihood estimation for Cox model with time-varying covariates -- 5. Results -- 6. Conclusion and discussion -- 7. Appendix -- Supplementary material -- References.
    Credit-granting institutions need to estimate the probability of default, the chance that a customer fails to make repayments as promised (BIS (2006), IASB (2014)). The Cox model with time-varying covariates (Cox (1972), Crowley and Hu (1977)) is often applied because it offers substantial benefits beyond classification approaches (such as logistic regression) while achieving similar accuracy (Lessmann et al. (2015), Bellotti and Crook (2009)). However, partial likelihood estimation of this model has two shortcomings that remain unaddressed in the literature: (1) the baseline hazard is not estimated, so calculating probabilities requires a further estimation step; and (2) a covariance matrix covering both the regression parameters and the baseline hazard is not produced. We address these by developing a maximum likelihood method that jointly estimates the regression coefficients and the baseline hazard, using constrained optimisation to ensure the baseline hazard's non-negativity. We show in a simulation that our technique is more accurate in moderately sized samples, and when applied to real home loan data it produces a smoother estimate of the baseline hazard than the Breslow (1972) estimator. Our model could be used to predict lifetime probability of default, as required under the International Financial Reporting Standard (IFRS) 9 accounting standard. Mode of access: World Wide Web. 1 online resource ([vii], 68 pages).
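The joint-estimation idea can be sketched in a deliberately simplified setting: one fixed covariate, a piecewise-constant baseline hazard, and a grid search over the regression coefficient in place of constrained optimisation. For fixed beta, the likelihood-maximising hazard level on each interval has a closed form that is non-negative by construction, so profiling it out leaves a one-dimensional search. This is not the thesis's implementation, and all data and parameter values are synthetic.

```python
import math
import random
random.seed(1)

# Simplified joint MLE sketch: exponential event times with hazard
# base_h * exp(true_beta * x), administratively censored at t = 5.
true_beta, base_h = 1.0, 0.5
cuts = [0.0, 0.5, 1.0, 2.0, 5.0]              # interval edges for the baseline
data = []                                     # (time, event observed, covariate)
for _ in range(2000):
    x = random.random()
    t = random.expovariate(base_h * math.exp(true_beta * x))
    data.append((min(t, 5.0), t < 5.0, x))

def interval_stats(beta):
    """Per-interval weighted time at risk and event counts."""
    K = len(cuts) - 1
    expo, events = [0.0] * K, [0] * K
    for t, d, x in data:
        w = math.exp(beta * x)
        for k in range(K):
            a, b = cuts[k], cuts[k + 1]
            if t > a:
                expo[k] += w * (min(t, b) - a)
                if d and t <= b:
                    events[k] += 1
    return expo, events

def profile_loglik(beta):
    # For fixed beta the hazard level on interval k that maximises the
    # likelihood is events_k / expo_k (non-negative by construction);
    # substituting it back gives the profile log-likelihood in beta.
    expo, events = interval_stats(beta)
    ll = sum(beta * x for t, d, x in data if d)
    for n, e in zip(events, expo):
        if n > 0:
            ll += n * (math.log(n / e) - 1)
    return ll

# coarse grid search over beta in [0, 2]; then recover the baseline jointly
beta_hat = max((b / 100 for b in range(0, 201)), key=profile_loglik)
expo, events = interval_stats(beta_hat)
h0_hat = [n / e for n, e in zip(events, expo)]
```

Replacing the grid search with a constrained optimiser over all parameters at once, as the thesis does, additionally yields a joint variance-covariance matrix.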

    Survival analysis: applications to credit risk default modelling

    Thesis by publication. Includes bibliographic references. Contents: 1. Introduction -- 2. Literature Review -- 3. PAPER 1: On Maximum Likelihood Estimation of the Semi-Parametric Cox Model with Time-Varying Covariates -- 4. PAPER 2: On Maximum Likelihood Estimation of Competing Risks using the Cause-Specific Semi-Parametric Cox Model with Time-Varying Covariates - an Application to Credit Risk -- 5. PAPER 3: Maximum Likelihood Estimation of the Mixture Cure Semi-Parametric Cox Model - an Application to Credit Risk -- 6. Conclusion and Discussion.
    Credit-granting institutions lend money to customers, some of whom may fail to make contractual repayments (namely principal, interest and fees), thereby defaulting on their obligations. Firms employ quantitative credit risk management techniques to estimate and appropriately control their credit risk, ensuring the firm's risk profile remains within its risk appetite and thus contributing to a safely run firm and to the stability of the wider economy. Quantitative credit risk management techniques are used to estimate Probability of Default (PD), Exposure at Default (EAD) and Loss Given Default (LGD). These are inputs for calculating expected loss (EL) (for loan-loss provisions required under international accounting standards (IASB (2014), FASB (2016))) as well as unexpected loss (UL) (required by institutions granted regulatory approval under the Basel Accords (BIS, 2006) to use the Advanced Internal Ratings Based (A-IRB) Approach for minimum credit capital). This thesis focuses on applying survival analysis to quantifying the risk of credit default for PD estimation. Institutions already use their own internal data and leverage analytical techniques to quantify the risk of credit default, so the refinements in this thesis could further assist firms in controlling their credit risk profiles. To be granted regulatory and audit approval, quantitative credit risk models need to have intuitive drivers and functional form.
    Regression approaches are therefore regularly adopted; while logistic regression is common (Baesens et al. (2003), Lessmann et al. (2015)), survival models achieve comparable accuracy to logistic regression but provide additional benefits, such as incorporating censored data and producing estimates over multiple time horizons (Bellotti and Crook (2009), Stepanova and Thomas (2002), Tong et al. (2012)). Survival analysis describes studies where subjects are followed in anticipation that they encounter an event of interest. Originating with Edmund Halley's life table of human mortality (1693) and its extension by Daniel Bernoulli (1760), which demonstrated the increase in human survival if the competing risk of smallpox were eliminated as a cause of death, survival analysis spans applications across multiple disciplines, such as biomedical science, industrial life testing (Kalbfleisch and Prentice, 2002) and finance (Lessmann et al., 2015). Regression techniques and the method of partial likelihood were introduced by David Cox (1972, 1975) and remain prominent (Hosmer et al., 2008). The model has since been extended, notably by Crowley and Hu (1977) to cater for time-varying covariates, and by (for example) Sy and Taylor (2000) to cater for mixture-cure models. Across three chapters, via two published papers and one manuscript prepared for publication, this thesis explores computational enhancements to the application of survival analysis, competing risk analysis and mixture-cure analysis to estimating the risk of credit default. These enhancements are: (1) joint estimation of the regression coefficients and the baseline hazard using constrained maximum likelihood, where the constraint ensures the latter's non-negativity; (2) calculation of an asymptotic variance-covariance matrix that allows inferences to be drawn for the regression estimates; and (3) improved accuracy of parameters in certain settings, as demonstrated via simulation.
    Applied to credit risk modelling, the methods in this thesis provide regression parameters comparably accurate to those obtained using partial likelihood, with the added benefit of also returning an estimate of the baseline hazard with relatively low variability, along with asymptotic variance estimates for the baseline hazard and all regression parameters. This further information allows clearer resolution of the shape and statistical significance of the underlying baseline hazard for the risk of credit default. The survival analysis and competing risk approaches in this thesis include time-varying covariates, providing the additional flexibility of incorporating covariates whose values change over time. Mode of access: Internet. 1 online resource (xiv, 190 pages) : illustrations.
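The mixture-cure idea behind PAPER 3 can be illustrated with a toy example: if a fraction of loans never defaults ("cured"), the Kaplan-Meier survival curve plateaus near that cured fraction under long follow-up, which motivates modelling the cured group separately from the survival of the susceptible group. This is not the thesis's model; the parameters and the single-censoring-time setup are hypothetical.

```python
import random
random.seed(2)

# Toy mixture-cure illustration (hypothetical parameters): a fraction
# pi_cure of loans never defaults; the rest default with an exponential
# hazard. All censoring is administrative, at the end of follow-up.
pi_cure, hazard, follow_up = 0.6, 1.0, 8.0
obs = []
for _ in range(4000):
    cured = random.random() < pi_cure
    t = float('inf') if cured else random.expovariate(hazard)
    obs.append((min(t, follow_up), t < follow_up))  # (time, default observed)

# Kaplan-Meier survival at the end of follow-up; with only administrative
# censoring after the last event, the product telescopes to the
# event-free fraction, which plateaus near pi_cure
surv, n_at_risk = 1.0, len(obs)
for t in sorted(t for t, d in obs if d):
    surv *= 1 - 1 / n_at_risk
    n_at_risk -= 1
plateau = surv
```

A mixture-cure Cox model then parameterises both the cure probability and the susceptible group's hazard as functions of covariates, rather than reading the plateau off the curve.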