
    Cost sensitive meta-learning

    Classification is one of the primary tasks of data mining: it assigns a class label to unseen examples using a model learned from a training dataset. Most established classifiers are designed to minimize the error rate, but in practice data mining involves costs, such as the cost of obtaining the data and the cost of making an error. Hence the following question arises: among all the available classification algorithms, and considering a specific type of data and cost, which is the best algorithm for my problem? It is well known in the machine learning community that no single algorithm performs best across all domains. This observation motivates the development of an “algorithm selector”, which automates the process of choosing between different algorithms for a specific application domain. This research therefore develops a new meta-learning system for recommending cost-sensitive classification methods. The system is based on the idea of applying machine learning to discover knowledge about the performance of different data mining algorithms. It includes components that repeatedly apply different classification methods to data sets and measure their performance. The characteristics of the data sets, combined with the algorithm and its performance, provide the training examples. A decision tree algorithm is applied to these training examples to induce knowledge that can then be used to recommend algorithms for new data sets; active learning is then used to automatically choose the most informative data set to enter the learning process. This thesis contributes to both meta-learning and cost-sensitive learning by developing a new meta-learning approach for recommending cost-sensitive methods.
Although meta-learning is not new, accelerating the learning process remains an open problem, and the thesis develops a novel active learning strategy based on clustering that gives the learner the ability to choose which data to learn from and, accordingly, speeds up the meta-learning process. Both the meta-learning system and the use of active learning are implemented in the WEKA system and evaluated by applying them to different datasets and comparing the results with existing studies in the literature. The results show that the meta-learning system developed produces better results than METAL, a well-known meta-learning system, and that the use of clustering and active learning has a positive effect on accelerating the meta-learning process: all tested datasets show a 75% reduction in prediction error rate.
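The meta-learning pipeline this abstract describes — dataset characteristics as meta-features, per-dataset evaluation of candidate algorithms, and a decision tree as the meta-level learner — can be sketched as follows. This is a minimal illustration, not the thesis's WEKA implementation: the candidate algorithms, the particular meta-features, and all function names are assumptions chosen for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Candidate base-level algorithms (illustrative choice).
CANDIDATES = {
    "tree": DecisionTreeClassifier(random_state=0),
    "knn": KNeighborsClassifier(),
    "nb": GaussianNB(),
}

def meta_features(X, y):
    """Simple dataset characteristics: size, dimensionality, class entropy."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    entropy = -(p * np.log2(p)).sum()
    return [X.shape[0], X.shape[1], len(counts), entropy]

def build_meta_examples(datasets):
    """One training example per dataset: meta-features -> best algorithm."""
    feats, labels = [], []
    for X, y in datasets:
        # Repeatedly apply each candidate method and measure its performance.
        scores = {name: cross_val_score(clf, X, y, cv=3).mean()
                  for name, clf in CANDIDATES.items()}
        feats.append(meta_features(X, y))
        labels.append(max(scores, key=scores.get))  # best algorithm = label
    return np.array(feats), labels

# Build meta-examples from several (here synthetic) data sets ...
datasets = [make_classification(n_samples=n, n_features=f, random_state=i)
            for i, (n, f) in enumerate([(100, 5), (200, 10), (150, 20), (300, 8)])]
Xm, ym = build_meta_examples(datasets)

# ... and induce the meta-knowledge with a decision tree.
meta_model = DecisionTreeClassifier(random_state=0).fit(Xm, ym)

# Recommend an algorithm for a previously unseen data set.
new_X, new_y = make_classification(n_samples=120, n_features=12, random_state=99)
recommendation = meta_model.predict([meta_features(new_X, new_y)])[0]
```

The clustering-based active learning step would sit between data-set collection and `build_meta_examples`, selecting which data set enters the loop next rather than processing all of them.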

    The Multispectral Imaging Science Working Group. Volume 3: Appendices

    The status and technology requirements for using multispectral sensor imagery in geographic, hydrologic, and geologic applications are examined. Critical issues in image and information science are identified.

    NEGOTIATION-BASED RISK MANAGEMENT FOR PPP-BOT INFRASTRUCTURE PROJECTS

    Ph.D. (Doctor of Philosophy)

    1995-1996 Louisiana Tech University Catalog

    The Louisiana Tech University Catalog includes announcements and course descriptions for courses offered at Louisiana Tech University for the academic year 1995-1996.
    https://digitalcommons.latech.edu/university-catalogs/1019/thumbnail.jp

    The Role of Systemic Risk, Regulation and Efficiency within the Banking Competition and Financial Stability Relationship

    This thesis provides empirical evidence on the banking competition-stability nexus from the Basel jurisdictions, with a main focus on the United States (US) banking sector from 2000 to 2015. The relationship is assessed through three papers in journal-article format, each exploring different theoretical concepts. The first paper is a systematic literature review of 4,859 abstracts that identifies the different types of systemic risk measures and the challenges regulators face in addressing systemic risk. The review identified 56 measures of systemic risk developed post-2000, which were critically appraised to inform academics and regulators of the models' vulnerabilities; a number of the measures were also calculated using US bank data. The findings of this paper suggest that the majority of these measures focus on the risk of individual financial institutions rather than the stability of the entire system, directly reflecting current regulations, which aim to ensure the soundness of individual institutions. As macro-prudential regulation evolves, policy-makers face the issues of understanding contagion and how such regulation should be implemented. The second paper is an empirical analysis of banking cost efficiency with a threefold aim: first, to conduct an empirical literature review of banking sector efficiency over the last two decades, identifying the banking risk and regulatory variables used to assess efficiency; second, to apply Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) to measure efficiency within the Basel jurisdictions' banks; and third, to investigate the determinants of cost efficiency in US banks by employing System Generalised Method of Moments (GMM) regression analysis on panel data. This paper found that econometric measures of efficiency yielded more statistically significant GMM regression models than accounting-based measures.
It also found that credit and liquidity risks are negatively associated with efficiency, and that regulations designed to mitigate these risks have a negative impact on efficiency. The final paper combines the literature and calculations from papers one and two to examine the role of risk, regulation and efficiency within the banking competition and financial stability relationship. Using GMM regression, this paper found a neutral view of the competition-stability nexus within the US banking sector, in which competition-fragility and concentration-fragility effects co-exist, and it identified a unique polynomial competition-fragility relationship. Interestingly, using the Composite Index of Systemic Stress (CISS) as the measure of systemic risk altered the competition-stability relationship, revealing a concave relationship. This suggests that the competition-stability nexus within one country can differ between the microeconomic (financial stability) and macroeconomic (systemic risk) levels. With regard to risk, credit, leverage, diversification and liquidity risks were found to be negatively associated with financial stability. While the increased capital requirements proposed by Basel III enhanced stability, the Net Stable Funding Ratio (NSFR) was unexpectedly found to hinder stability, a caution to regulators as the ratio is currently being implemented. The findings of this thesis motivate further academic research on liquidity and systemic risk, which would help practitioners and policy-makers better understand banking competition and financial stability.
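Of the efficiency methods the second paper names, DEA is the most self-contained to illustrate. The sketch below solves a minimal input-oriented CCR DEA model as a linear program, scoring each decision-making unit (bank) against the efficient frontier; the function name and the one-input/one-output toy data are assumptions for illustration, not the thesis's bank-level specification.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA efficiency scores.

    X: inputs, shape (m_inputs, n_dmus); Y: outputs, shape (s_outputs, n_dmus).
    For each DMU k, solve: min theta s.t. X @ lam <= theta * x_k,
    Y @ lam >= y_k, lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for k in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n].
        c = np.zeros(n + 1)
        c[0] = 1.0  # minimise theta (radial input contraction)
        # Input constraints: X @ lam - theta * x_k <= 0
        A_in = np.hstack([-X[:, [k]], X])
        b_in = np.zeros(m)
        # Output constraints: -Y @ lam <= -y_k  (i.e. Y @ lam >= y_k)
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        b_out = -Y[:, k]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        scores.append(res.fun)
    return np.array(scores)

# Toy example: two banks, one input (e.g. operating cost), one output (e.g. loans).
X = np.array([[2.0, 4.0]])
Y = np.array([[2.0, 2.0]])
scores = dea_ccr_input(X, Y)
```

Each bank receives a score in (0, 1]: a score of 1 means it lies on the efficient frontier, while a score below 1 gives the proportional input reduction needed to reach it.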