Optimization Heuristics for Determining Internal Rating Grading Scales
Basel II imposes regulatory capital on banks related to the default risk of their credit portfolio. Banks using an internal rating approach compute the regulatory capital from pooled probabilities of default. These pooled probabilities can be calculated by clustering credit borrowers into different buckets and computing the mean PD for each bucket. The clustering problem can become very complex when Basel II regulations and real-world constraints are taken into account. Search heuristics have already shown remarkable performance in tackling this complex problem. A Threshold Accepting algorithm is proposed that exploits the inherent discrete nature of the clustering problem. This algorithm is found to outperform alternative methodologies already proposed in the literature, such as standard k-means and Differential Evolution. Besides considering several clustering objectives for a given number of buckets, we extend the analysis further by introducing new methods to determine the optimal number of buckets in which to cluster banks' clients.
Keywords: credit risk, probability of default, clustering, Threshold Accepting, Differential Evolution
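The bucketing problem the abstract describes lends itself to a compact illustration. Below is a minimal sketch of Threshold Accepting for grouping borrowers by PD, assuming a within-bucket sum-of-squared-deviations objective and a boundary-shift neighbourhood; these choices, the threshold schedule, and all names are illustrative assumptions, and the Basel II constraints the paper handles are omitted.

```python
import random

def objective(pds, boundaries):
    """Within-bucket sum of squared deviations from the bucket mean PD.

    pds must be sorted ascending; boundaries are strictly increasing
    cut indices into pds, so buckets are contiguous PD ranges.
    """
    total, cuts = 0.0, [0] + boundaries + [len(pds)]
    for lo, hi in zip(cuts, cuts[1:]):
        bucket = pds[lo:hi]
        mean = sum(bucket) / len(bucket)
        total += sum((p - mean) ** 2 for p in bucket)
    return total

def threshold_accepting(pds, n_buckets, rounds=10, steps=2000):
    """Sketch of Threshold Accepting; assumes n_buckets >= 2 and
    len(pds) well above n_buckets."""
    pds = sorted(pds)
    # Start from (roughly) equally sized buckets.
    bounds = [i * len(pds) // n_buckets for i in range(1, n_buckets)]
    f = objective(pds, bounds)
    # Decreasing threshold sequence, scaled off the initial objective.
    thresholds = [f * 0.1 * (1 - r / rounds) for r in range(rounds)]
    for tau in thresholds:
        for _ in range(steps):
            cand = bounds[:]                 # neighbour: shift one cut by 1
            i = random.randrange(len(cand))
            cand[i] += random.choice((-1, 1))
            lo = cand[i - 1] + 1 if i > 0 else 1
            hi = cand[i + 1] - 1 if i < len(cand) - 1 else len(pds) - 1
            if not lo <= cand[i] <= hi:
                continue                     # keep buckets non-empty, ordered
            f_cand = objective(pds, cand)
            if f_cand - f < tau:             # accept if not much worse
                bounds, f = cand, f_cand
    return bounds, f
```

A call such as `threshold_accepting(pd_list, n_buckets=7)` returns candidate bucket boundaries and the objective value. The deterministic accept-if-not-much-worse rule is what distinguishes Threshold Accepting from simulated annealing's probabilistic acceptance, and it is well suited to the discrete boundary moves used here.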
High-Performance Integrated Window and Façade Solutions for California
The researchers developed a new generation of high-performance façade systems and supporting design and management tools to support industry in meeting California's greenhouse gas reduction targets, reducing energy consumption, and enabling an adaptable response that minimizes real-time demands on the electricity grid. The project resulted in five outcomes: (1) The research team developed an R-5, 1-inch thick, triple-pane insulating glass unit with a novel low-conductance aluminum frame. This technology can help significantly reduce residential cooling and heating loads, particularly during the evening. (2) The team developed a prototype of a window-integrated local ventilation and energy recovery device that provides clean, dry fresh air through the façade with minimal energy requirements. (3) A daylight-redirecting louver system was prototyped to redirect sunlight 15–40 feet from the window. Simulations estimated that lighting energy use could be reduced by 35–54 percent without glare. (4) A control system incorporating physics-based equations and a mathematical solver was prototyped and field tested to demonstrate feasibility. Simulations estimated that total electricity costs could be reduced by 9–28 percent on sunny summer days through adaptive control of operable shading and daylighting components and the thermostat, compared to state-of-the-art automatic façade controls in commercial building perimeter zones. (5) Supporting models and tools needed by industry for technology R&D and market transformation activities were validated. Attaining California's clean energy goals requires making a fundamental shift from today's ad-hoc assemblages of static components to turnkey, intelligent, responsive, integrated building façade systems. These systems offered significant reductions in energy use, peak demand, and operating cost in California.
Software engineering: Testing real-time embedded systems using timed automata based approaches
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Real-time Embedded Systems (RTESs) have an increasing role in controlling the society infrastructures that we use on a day-to-day basis. RTES behaviour is based not only on the interactions it might have with its surrounding environment, but also on the timing requirements it induces. As a result, ensuring that an RTES behaves correctly is non-trivial, especially after adding time as a new dimension to the complexity of the testing process. This research addresses the problem of testing RTESs from Timed Automata (TA) specifications as follows. First, a new Priority-based Approach (PA) for testing RTESs modelled formally as UPPAAL timed automata (a TA variant) is introduced. Test cases generated according to a proposed timed adequacy criterion (clock region coverage) are divided into three sets of priorities, namely boundary, out-boundary and in-boundary. The tester can decide which set is most appropriate for a System Under Test (SUT) according to the system type and the time and budget allotted for testing. Second, PA is validated in comparison with four well-known TA-based timed testing approaches using Specification Mutation Analysis (SMA). To enable the validation, a set of timed and functional mutation operators based on TA is introduced. Three case studies are used to run the SMA. The effectiveness of the timed testing approaches is determined and contrasted according to the mutation score, which shows that PA achieves a high mutation adequacy score compared with the others. Third, to enhance the applicability of PA, a new testing tool (GeTeX) that deploys PA is introduced. In its current version, GeTeX supports Controller Area Network (CAN) applications. GeTeX is validated by developing a prototype for that purpose. Using GeTeX, PA is also empirically validated in comparison with some TA testing approaches using a complete industrial-strength test bed. The assessment is based on fault coverage, structural coverage, the length of generated test cases and a proposed assessment factor, and its results confirmed the superiority of PA over the other test approaches. The overall assessment factor showed that the structural and fault coverage scores of PA, relative to the length of its tests, were better than those of the others, proving the applicability of PA. Finally, an Analytical Hierarchy Process (AHP) decision-making framework for PA is developed. The framework provides testers with a systematic approach by which they can prioritise the available PA test sets that best fulfil their testing requirements. The AHP framework is based on data collected heuristically from the test bed and data collected by interviewing testing experts. The framework is then validated using two testing scenarios. The decision outcomes of the AHP framework were significantly correlated with those of testing experts, demonstrating the soundness and validity of the framework. This study is funded by Damascus University, Syria.
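Since the SMA comparison ranks the testing approaches by mutation score, a minimal sketch of the conventional score computation may be useful; the function name and the treatment of equivalent mutants are illustrative assumptions, not details taken from the thesis or GeTeX.

```python
def mutation_score(killed: int, total: int, equivalent: int = 0) -> float:
    """Fraction of non-equivalent mutants killed by a test set.

    Equivalent mutants cannot be distinguished from the original
    specification by any test, so they are excluded from the pool.
    """
    pool = total - equivalent
    return killed / pool if pool else 0.0

# e.g. a test set killing 86 of 100 mutants, 4 of them equivalent:
# mutation_score(86, 100, 4) -> 86/96, roughly 0.9
```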
The development of a program analysis environment for Ada
A unit-level Ada software module testing system, called Query Utility Environment for Software Testing of Ada (QUEST/Ada), is described. The project calls for the design and development of a prototype system. QUEST/Ada design began with a definition of the overall system structure and a description of component dependencies. The project team was divided into three groups to resolve the preliminary designs of the parser/scanner, the test data generator, and the test coverage analyzer. The Phase 1 report is a working document from which the system documentation will evolve. It provides history, a guide to report sections, a literature review, the definition of the system structure and high-level interfaces, descriptions of the prototype scope and the three major components, and the plan for the remainder of the project. The appendices include specifications, statistics, two papers derived from the current research, a preliminary users' manual, and the proposal and work plan for Phase 2.
Space station advanced automation
In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The usage of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. Using AA technology to augment system management functions requires a development model consisting of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBSs are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well-designed and documented KBS software.
Aerodynamic parameter identification for an unmanned aerial vehicle
A dissertation submitted to the Faculty of Engineering and the Built Environment, School of Mechanical, Industrial and Aeronautical Engineering, University of the Witwatersrand, in fulfilment of the requirements for the degree of Master of Science in Engineering. Johannesburg, May 2016.
The present work describes the practical implementation of systems identification techniques in the development of a linear aerodynamic model for a small low-cost UAV equipped with basic navigational and inertial measurement systems. The assessment of the applicability of the techniques was based on determining whether adequate aerodynamic models could be developed to aid in the reduction of wind tunnel testing when characterising new UAVs. The identification process consisted of postulating a model structure, flight test manoeuvre design, data reconstruction, aerodynamic parameter estimation, and model validation. The estimators used for the post-flight identification were the output error maximum likelihood method and an iterated extended Kalman filter with a global smoother. The SIDPAC and FVSysID systems identification toolboxes were utilised and modified where appropriate. The instrumentation system on board the UAV consisted of three-axis accelerometers and gyroscopes, a three-axis vector magnetometer and GPS tracking, while data was logged at 25 Hz. The angle of attack and angle of sideslip were not measured directly and were estimated using tailored data reconstruction methods. Adequate time domain lateral model correlation with flight data was achieved for the cruise flight condition. Adequacy was assessed against Theil's inequality coefficients and Theil's covariance. It was found that the simplified estimation algorithms based on the linearized equations of motion yielded the most promising model matches. Due to the high correlation between the pitch damping derivatives, the longitudinal analysis did not yield valid model parameter estimates. Even though the accuracy of the resulting models was below initial expectations, the detailed data compatibility analysis provided valuable insight into estimator limitations, instrumentation requirements and test procedures for systems identification on low-cost UAVs.
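Model adequacy here is judged with Theil's inequality coefficient, so a minimal sketch of its standard definition may help; the adequacy threshold mentioned in the comment is a common rule of thumb from the systems identification literature, not a figure taken from the dissertation.

```python
import numpy as np

def theil_u(measured: np.ndarray, simulated: np.ndarray) -> float:
    """Theil's inequality coefficient, bounded in [0, 1].

    0 indicates a perfect match between measured and simulated
    responses; values below roughly 0.25-0.3 are often treated as an
    adequate time-domain match in the sysid literature (rule of thumb).
    """
    rmse = np.sqrt(np.mean((measured - simulated) ** 2))
    denom = np.sqrt(np.mean(measured ** 2)) + np.sqrt(np.mean(simulated ** 2))
    return float(rmse / denom) if denom > 0 else 0.0
```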
Threshold Accepting for Credit Risk Assessment and Validation
According to the latest Basel framework of Banking Supervision, financial institutions should internally assign their borrowers into a number of homogeneous groups. Each group is assigned a probability of default which distinguishes it from other groups. This study aims at determining the optimal number and size of groups that allow for statistical ex post validation of the efficiency of the credit risk assignment system. Our credit risk assignment approach is based on Threshold Accepting, a local search optimization technique, which has recently performed reliably in credit risk clustering, especially when considering several realistic constraints. Using a relatively large real-world retail credit portfolio, we propose a new technique to validate ex post the precision of the grading system.
Keywords: credit risk assignment, Threshold Accepting, statistical validation
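For context on what statistical ex post validation of a grade can look like, the sketch below implements the standard one-sided binomial test of observed defaults against an assigned PD. This is a conventional baseline, not the new validation technique the paper proposes, and all names are illustrative.

```python
from math import comb

def binomial_pvalue(n_obligors: int, n_defaults: int, pd_assigned: float) -> float:
    """P(X >= n_defaults) for X ~ Binomial(n_obligors, pd_assigned).

    A small p-value suggests the grade's assigned PD underestimates
    the realized default rate.
    """
    return sum(
        comb(n_obligors, k) * pd_assigned**k * (1 - pd_assigned) ** (n_obligors - k)
        for k in range(n_defaults, n_obligors + 1)
    )

# e.g. a grade with 500 obligors, assigned PD 2%, and 18 observed
# defaults (10 expected): binomial_pvalue(500, 18, 0.02)
```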