
    The probability of default in internal ratings based (IRB) models in Basel II: an application of the rough sets methodology

    The new Capital Accord of June 2004 (Basel II) opens the way for and encourages credit entities to implement their own models for measuring financial risks. In this paper we focus on internal ratings based (IRB) models for the assessment of credit risk, and specifically on the approach to one of their components: the probability of default (PD). The traditional methods used to model credit risk, such as discriminant analysis and logit and probit models, rest on a series of statistical restrictions; the rough sets methodology is presented as an alternative to these classical statistical methods that avoids their limitations. We apply the rough sets methodology to a database of 106 companies applying for credit, with the aim of obtaining the ratios that best discriminate between healthy and failed companies, together with a set of decision rules that help detect potentially defaulting operations, as a first step in modelling the probability of default. Finally, we compare the results obtained with those of classical discriminant analysis and conclude that, in our case, the rough sets methodology yields better classification results.
    Funding: Junta de Andalucía P06-SEJ-0153
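
    As a rough illustration of the rule-induction step described above, the sketch below groups companies by discretized financial ratios and keeps only the groups that map consistently to a single outcome, in the spirit of rough set lower approximations. The ratio names, cut points and data are invented for the example and are not taken from the paper.

        # Hypothetical illustration: inducing "if ratio pattern then healthy/failed"
        # decision rules from a discretized information table. Toy data only.
        from collections import defaultdict

        # Each row: (discretized financial ratios, class label)
        companies = [
            ({"liquidity": "low",  "leverage": "high", "roa": "neg"}, "failed"),
            ({"liquidity": "low",  "leverage": "high", "roa": "neg"}, "failed"),
            ({"liquidity": "high", "leverage": "low",  "roa": "pos"}, "healthy"),
            ({"liquidity": "high", "leverage": "high", "roa": "pos"}, "healthy"),
            ({"liquidity": "low",  "leverage": "low",  "roa": "pos"}, "healthy"),
        ]

        def induce_rules(rows, attributes):
            """Group objects by their values on `attributes`; a group whose members
            all share one label yields a certain decision rule (lower approximation)."""
            groups = defaultdict(list)
            for values, label in rows:
                groups[tuple(values[a] for a in attributes)].append(label)
            rules = []
            for key, labels in groups.items():
                if len(set(labels)) == 1:   # consistent class -> certain rule
                    cond = " AND ".join(f"{a}={v}" for a, v in zip(attributes, key))
                    rules.append(f"IF {cond} THEN {labels[0]}")
            return rules

        for rule in induce_rules(companies, ["liquidity", "leverage"]):
            print(rule)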

    Adaptive quick reduct for feature drift detection

    Data streams are ubiquitous, driven by the proliferation of low-cost mobile devices, sensors, wireless networks and the Internet of Things. While it is well known that complex phenomena are not stationary and exhibit concept drift when observed for a sufficiently long time, relatively few studies have addressed the related problem of feature drift. In this paper, a variation of the QuickReduct algorithm suited to processing data streams is proposed and tested: it builds an evolving reduct that dynamically selects the relevant features in the stream, removing redundant features and adding newly relevant ones as soon as they become so. Tests on five publicly available datasets with an artificially injected drift confirm the effectiveness of the proposed method.
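
    For readers unfamiliar with the underlying selector, the following is a minimal batch-mode sketch of the classic QuickReduct greedy loop driven by the rough set dependency degree. It is meant only to show what kind of reduct the streaming variant keeps up to date, not the authors' adaptive algorithm; the toy data and feature names are assumptions.

        # Classic (batch) QuickReduct sketch based on the dependency degree gamma.
        from collections import defaultdict

        def dependency(rows, labels, features):
            """gamma(B): fraction of objects whose B-indiscernibility class is
            consistent with a single decision label (size of the positive region)."""
            if not features:
                return 0.0
            classes = defaultdict(list)
            for row, y in zip(rows, labels):
                classes[tuple(row[f] for f in features)].append(y)
            consistent = sum(len(ys) for ys in classes.values() if len(set(ys)) == 1)
            return consistent / len(rows)

        def quick_reduct(rows, labels, all_features):
            reduct, best = [], 0.0
            full = dependency(rows, labels, all_features)
            while best < full:
                # greedily add the feature giving the largest gain in gamma
                feature, gamma = max(
                    ((f, dependency(rows, labels, reduct + [f]))
                     for f in all_features if f not in reduct),
                    key=lambda fg: fg[1])
                if gamma <= best:
                    break
                reduct, best = reduct + [feature], gamma
            return reduct

        rows = [{"a": 0, "b": 1, "c": 0}, {"a": 0, "b": 0, "c": 0},
                {"a": 1, "b": 1, "c": 1}, {"a": 1, "b": 0, "c": 1}]
        labels = [0, 0, 1, 1]
        print(quick_reduct(rows, labels, ["a", "b", "c"]))   # -> ['a']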

    Optimized superpixel and AdaBoost classifier for human thermal face recognition

    Infrared spectrum-based human recognition systems offer straightforward and robust solutions that perform well under uncontrolled illumination. In this paper, a human thermal face recognition model is proposed. The model consists of four main steps. First, the grey wolf optimization algorithm is used to find optimal superpixel parameters for the quick-shift segmentation method. Then, the segmentation-based fractal texture analysis (SFTA) algorithm is used to extract features, and rough set-based methods are used to select the most discriminative ones. Finally, the AdaBoost classifier is employed for the classification process. To evaluate the proposed approach, thermal images from the Terravic Facial Infrared dataset were used. The experimental results showed that (1) the proposed approach achieved reasonable segmentation results for both indoor and outdoor thermal images, (2) classification accuracy on the segmented images was better than on the non-segmented ones, and (3) the entropy-based feature selection method obtained the best classification accuracy. Overall, the classification accuracy of the proposed model reached 99%, which is around 5% better than some of the related work.
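
    A hedged sketch of the overall pipeline shape (superpixel segmentation, texture-style features, feature selection, AdaBoost) follows. It substitutes scikit-image's quickshift with fixed parameters for the grey-wolf-tuned segmentation, simple per-segment statistics for SFTA, and a variance filter for the rough set selector, so it only mirrors the structure of the method, not its components; images and labels are random placeholders.

        # Pipeline-shape sketch with standard substitutes; not the paper's method.
        import numpy as np
        from skimage.segmentation import quickshift
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.feature_selection import VarianceThreshold

        rng = np.random.default_rng(0)

        def segment_features(image, kernel_size=3, max_dist=6):
            """Superpixel-segment a single-channel image and return a fixed-length
            summary of the per-segment mean intensities."""
            rgb = np.dstack([image] * 3)           # quickshift expects 3 channels
            segments = quickshift(rgb, kernel_size=kernel_size, max_dist=max_dist)
            means = np.array([image[segments == s].mean() for s in np.unique(segments)])
            return (np.percentile(means, [10, 25, 50, 75, 90]).tolist()
                    + [means.std(), float(len(means))])

        # stand-in "thermal" images and identity labels
        images = rng.random((16, 32, 32))
        labels = rng.integers(0, 2, 16)

        X = np.array([segment_features(img) for img in images])
        X = VarianceThreshold().fit_transform(X)   # drop constant features
        clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, labels)
        print("training accuracy:", clf.score(X, labels))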

    A multi-label attribute reduction algorithm based on neighborhood rough sets

    In multi-label learning, attribute reduction is a key technique for tackling the curse of dimensionality in multi-label data. To address the high cost of computing the positive region in neighborhood rough set attribute reduction, and the fact that the labels in multi-label data differ in strength, a multi-label attribute reduction algorithm based on neighborhood rough sets is proposed. The algorithm first weights each label by the difference between a sample's average distance to samples of a different class and its average distance to samples of the same class, computed over the full attribute space. Second, it partitions the sample space using a rounding function and introduces a new fast method for computing the positive region of the multi-label neighborhood rough set. Finally, attribute reduction is performed with a forward greedy search algorithm to obtain a new attribute ranking. Experiments compare five multi-label datasets on four evaluation criteria, and the analysis of the results demonstrates the effectiveness of the proposed algorithm.
    Funding: National Youth Science Foundation of China (N61603173); Natural Science Foundation of Fujian Province (2018J01422); Open Project of the Key Laboratory of Oceanographic Big Data Mining and Application of Zhejiang Province (OBDMA201603)
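
    The label-weighting step can be made concrete with a small sketch: for each label, average over samples the gap between the mean distance to samples carrying a different label value and the mean distance to samples carrying the same value, measured in the full attribute space. The data below are toy values, not the paper's, and the final normalization is an assumption.

        # Toy illustration of distance-gap label weighting for multi-label data.
        import numpy as np

        X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])   # samples x features
        Y = np.array([[1, 0], [1, 0], [0, 1], [0, 1]])                    # samples x labels

        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)        # pairwise distances

        def label_weights(D, Y):
            n, q = Y.shape
            w = np.zeros(q)
            for k in range(q):
                gaps = []
                for i in range(n):
                    same = (Y[:, k] == Y[i, k]) & (np.arange(n) != i)
                    diff = Y[:, k] != Y[i, k]
                    if same.any() and diff.any():
                        # larger gap -> label k separates sample i more clearly
                        gaps.append(D[i, diff].mean() - D[i, same].mean())
                w[k] = np.mean(gaps) if gaps else 0.0
            return w / w.sum() if w.sum() > 0 else w

        print(label_weights(D, Y))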

    Neutrosophic rule-based prediction system for toxicity effects assessment of biotransformed hepatic drugs

    Measuring toxicity is an important step in drug development. However, the current experimental methods used to estimate drug toxicity are expensive and require high computational effort, so they are not suitable for large-scale evaluation of drug toxicity. As a consequence, there is a high demand for computational models that can predict drug toxicity risks. In this paper, we used a dataset consisting of 553 drugs that are biotransformed in the liver.

    Herbal Remedies for Combating Irradiation: a Green Antiirradiation Approach

    Plants play important roles in human life, not only as suppliers of oxygen but also as a fundamental resource for sustaining the human race. Plants also play a major role in our nutrition by converting energy from the sun during photosynthesis. In addition, plants have been used extensively in traditional medicine since time immemorial. The biomedical literature indicates that many natural herbs have been investigated for their efficacy against lethal irradiation, and pharmacological studies by various groups of investigators have shown that natural herbs possess significant radioprotective activity. In view of the immense medicinal importance of natural product-based radioprotective agents, this review compiles the currently available information on radioprotective agents from medicinal plants and herbs, with particular attention to evaluation methods and mechanisms of action. We emphasize ethnomedicinal uses, botany, phytochemistry, mechanisms of action and toxicology, and we also describe modern techniques for evaluating herbal samples as radioprotective agents. The use of herbal remedies to combat lethal irradiation is a green anti-irradiation approach for the betterment of human beings without high cost, side effects or toxicity.

    Combining rough and fuzzy sets for feature selection


    The value of Superfund cleanups: evidence from U.S. Environmental Protection Agency decisions

    Under the Superfund law, the U.S. Environmental Protection Agency (EPA) is responsible for inspecting hazardous waste sites and for putting those with the most serious contamination problems on a national priorities list. The EPA then oversees the cleanup of these sites, suing potentially responsible parties for the costs of cleanup when possible, and funding the cleanup of "orphaned" sites out of the Superfund, money raised by taxing chemical and petroleum products. The Superfund program is controversial. Cleanups are costly, and it is unclear whether the benefits of cleanup, especially the relative benefits of more permanent cleanup, are worth the costs. At many sites, imminent danger of exposure to contaminants can be removed at low cost. What raises the cost of cleanup is the decision to clean up the site for future generations - to incinerate contaminated soil, for example, or to pump and treat an aquifer for 30 years. To shed light on this debate, the authors infer the EPA's willingness to pay (or have others pay) for more permanent cleanups at Superfund sites. They do so by analyzing cleanup decisions for contaminated soils at 110 Superfund sites. They find that, other things being equal, the EPA was more likely to choose less expensive cleanup options. But, holding costs constant, the EPA was more likely to select more permanent options, such as incinerating the soil instead of capping it or putting it in a landfill. The EPA was willing to pay at least twice as much for onsite incineration of contaminated soil as it was for capping the soil. Has the EPA chosen more permanent Superfund cleanups in areas where residents are predominantly white and have high incomes? The authors find no evidence that the percentage of minority residents near a site influences the choice of cleanup selected. But offsite treatment was more likely at sites with higher incomes.
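
    The abstract does not spell out the econometric specification, but the willingness-to-pay comparison can be read as a coefficient ratio in a discrete-choice model of cleanup selection. The sketch below fits a plain logit on simulated site-remedy data (variable names, sample size and numbers are all invented) and recovers the implied willingness to pay for a permanent remedy as -beta_permanent / beta_cost; it is a generic illustration, not the authors' model.

        # Hypothetical illustration only: WTP as a ratio of logit coefficients.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        cost = rng.uniform(1, 20, n)          # cleanup cost, toy scale
        permanent = rng.integers(0, 2, n)     # 1 = permanent remedy (e.g. incineration)

        # latent utility: the agency dislikes cost and values permanence
        utility = -0.3 * cost + 2.0 * permanent + rng.logistic(size=n)
        chosen = (utility > 0).astype(int)    # 1 = remedy selected

        X = sm.add_constant(np.column_stack([cost, permanent]))
        fit = sm.Logit(chosen, X).fit(disp=0)
        b_const, b_cost, b_perm = fit.params

        # implied willingness to pay for a permanent remedy, in cost units
        print("WTP for permanence:", -b_perm / b_cost)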