2,718 research outputs found

    Software defect prediction using maximal information coefficient and fast correlation-based filter feature selection

    Software quality assurance aims to ensure that the applications developed are failure free. Many modern systems are intricate due to the complexity of their information processes. Software fault prediction is an important quality assurance activity: a mechanism that correctly predicts the defect proneness of modules and classifies them accordingly saves resources, time and developers’ effort. In this study, a model that selects relevant features for use in defect prediction was proposed. A review of the literature revealed that process metrics, which are based on historic source code changes over time, are better predictors of defects in versioned systems. These metrics are extracted from the source-code module and include, for example, the number of additions and deletions in the source code, the number of distinct committers and the number of modified lines. In this research, defect prediction was conducted on open source software (OSS) software product lines (SPL), hence process metrics were chosen. Data sets used in defect prediction may contain non-significant and redundant attributes that can reduce the accuracy of machine-learning algorithms. To improve the prediction accuracy of classification models, only features that are significant to the defect prediction process are utilised. In machine learning, feature selection techniques are applied to identify the relevant data. Feature selection is a pre-processing step that helps to reduce the dimensionality of the data. Feature selection techniques include information-theoretic methods based on the entropy concept. This study evaluated the efficiency of these feature selection techniques and found that software defect prediction using significant attributes improves prediction accuracy.
    A novel MICFastCR model was developed, which uses the Maximal Information Coefficient (MIC) to select significant attributes and the Fast Correlation Based Filter (FCBF) to eliminate redundant ones. Machine learning algorithms were then run to predict software defects. MICFastCR achieved the highest prediction accuracy as reported by various performance measures.
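    The FCBF redundancy-elimination step described above can be sketched as follows. This is a minimal illustration that uses symmetrical uncertainty (an entropy-based measure) for both relevance and redundancy; the thesis's MICFastCR model computes relevance with MIC, which is omitted here for brevity, and the process-metric data below are invented.

    ```python
    # FCBF-style filtering sketch using symmetrical uncertainty (SU).
    from collections import Counter
    from math import log2

    def entropy(xs):
        n = len(xs)
        return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

    def su(xs, ys):
        """Symmetrical uncertainty: 2 * IG(X; Y) / (H(X) + H(Y))."""
        hx, hy = entropy(xs), entropy(ys)
        if hx + hy == 0:
            return 0.0
        n = len(xs)
        # conditional entropy H(X | Y)
        hxy = sum((cy / n) * entropy([x for x, yy in zip(xs, ys) if yy == y])
                  for y, cy in Counter(ys).items())
        return 2.0 * (hx - hxy) / (hx + hy)

    def fcbf(features, target, threshold=0.0):
        """Keep features relevant to the target, dropping any whose SU with an
        already-selected feature is at least its SU with the target (redundancy)."""
        ranked = sorted(((name, su(col, target)) for name, col in features.items()),
                        key=lambda t: t[1], reverse=True)
        selected = []
        for name, rel in ranked:
            if rel <= threshold:
                continue
            if all(su(features[name], features[s]) < rel for s in selected):
                selected.append(name)
        return selected

    # Toy process metrics: 'added' duplicates 'churn', so FCBF drops it.
    target = [1, 1, 0, 0, 1, 0, 1, 0]
    features = {
        "churn":      [3, 3, 1, 1, 3, 1, 3, 1],
        "added":      [3, 3, 1, 1, 3, 1, 3, 1],   # identical to churn -> redundant
        "committers": [2, 1, 2, 1, 2, 1, 1, 2],   # uninformative about target
    }
    print(fcbf(features, target))
    ```

    On this toy data, the redundant copy and the irrelevant feature are both eliminated, leaving a single predictive attribute.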

    Customer requirements based ERP customization using AHP technique

    Purpose– Customization is a difficult task for many organizations implementing enterprise resource planning (ERP) systems. The purpose of this paper is to develop a new framework based on customers’ requirements to examine the ERP customization choices for the enterprise. The analytical hierarchy process (AHP) technique has been applied complementarily with this framework to prioritize ERP customization choices.
    Design/methodology/approach– Based on the empirical literature, the paper proposes an ERP customization framework anchored on the customer's requirements. A case study research method was used to evaluate the applicability of the framework in a real-life setting. In a case study with 15 practitioners working on the vendor's and the client's sides of an ERP implementation, the paper applied the framework jointly with the AHP technique to prioritize the feasible customization choices for ERP implementation.
    Findings– The paper demonstrates the applicability of the framework in identifying the various feasible choices for the client organization to consider when deciding to customize its selected ERP product.
    Research limitations/implications– Further case studies need to be carried out in various contexts to acquire knowledge about the generalizability of the observations. This will also contribute to refining the proposed ERP customization framework.
    Practical implications– Very few literature sources suggest methods for exploring and evaluating customization options in ERP projects from a requirements engineering perspective. The proposed framework helps practitioners and consultants anchor customization decisions on the customer's requirements and use a well-established prioritization technique, AHP, to identify the feasible customization choices for the implementing enterprise.
    Originality/value– No previously published research provides an approach to prioritizing customization choices for ERP anchored on the customer's requirements.
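    The AHP step works by eliciting pairwise comparisons between options on Saaty's 1-9 scale and deriving a priority vector from the comparison matrix. A minimal sketch using the row geometric-mean approximation (a common stand-in for the principal-eigenvector method); the three customization choices and the judgments below are illustrative, not taken from the paper's case study.

    ```python
    # AHP priority sketch via the row geometric-mean approximation.
    from math import prod

    def ahp_priorities(matrix):
        """Return normalized priorities from a pairwise-comparison matrix."""
        n = len(matrix)
        gmeans = [prod(row) ** (1.0 / n) for row in matrix]
        total = sum(gmeans)
        return [g / total for g in gmeans]

    # Pairwise judgments over three hypothetical customization choices
    # (Saaty's 1-9 scale; matrix[i][j] = importance of choice i over choice j).
    m = [
        [1,     3,   5],
        [1 / 3, 1,   2],
        [1 / 5, 1 / 2, 1],
    ]
    w = ahp_priorities(m)
    print([round(x, 3) for x in w])
    ```

    The resulting weights sum to one and preserve the stated preference ordering, giving the implementing enterprise a ranked list of customization choices.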

    A Hierarchical, Fuzzy Inference Approach to Data Filtration and Feature Prioritization in the Connected Manufacturing Enterprise

    In the current big data landscape, the technology and capacity to capture and store data have outpaced the corresponding capability to analyze and interpret it. This has naturally led to the development of elegant and powerful algorithms for data mining, machine learning, and artificial intelligence to harness the potential of the big data environment. A competing reality, however, is that there are limits to how, and to what extent, human beings can process complex information. The convergence of these realities creates a tension between the technical sophistication or elegance of a solution and its transparency or interpretability for the human data scientist or decision maker. This dissertation, contextualized in the connected manufacturing enterprise, presents an original Fuzzy Approach to Feature Reduction and Prioritization (FAFRAP) designed to assist the data scientist in filtering and prioritizing data for inclusion in supervised machine learning models. A set of sequential filters reduces the initial set of independent variables, and a fuzzy inference system outputs a crisp numeric value associated with each feature to rank-order and prioritize it for inclusion in model training. Additionally, the fuzzy inference system outputs a descriptive label to assist in interpreting the feature’s usefulness with respect to the problem of interest. Model testing is performed using three publicly available datasets from an online machine learning data repository and later applied to a case study in electronic assembly manufacture. Consistency of model results is experimentally verified using Fisher’s Exact Test, and results of filtered models are compared to results obtained with the unfiltered sets of features using a proposed novel metric, the performance-size ratio (PSR).
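    A fuzzy inference system that maps feature statistics to a crisp score plus a descriptive label, as described above, can be sketched in a few lines. This is a Sugeno-style illustration with invented membership functions, rules, and label thresholds, not the dissertation's actual FAFRAP system.

    ```python
    # Sugeno-style fuzzy scorer: crisp priority plus a descriptive label.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def priority(relevance, redundancy):
        """Map a feature's relevance/redundancy (both in [0, 1]) to a score and label."""
        low_rel  = tri(relevance,  -0.5, 0.0, 0.5)
        high_rel = tri(relevance,   0.5, 1.0, 1.5)
        low_red  = tri(redundancy, -0.5, 0.0, 0.5)
        high_red = tri(redundancy,  0.5, 1.0, 1.5)
        # Rules: (firing strength, consequent score)
        rules = [
            (min(high_rel, low_red),  1.0),  # relevant and unique -> keep
            (min(high_rel, high_red), 0.5),  # relevant but redundant -> maybe
            (min(low_rel,  low_red),  0.3),
            (min(low_rel,  high_red), 0.0),  # irrelevant and redundant -> drop
        ]
        total = sum(s for s, _ in rules)
        score = sum(s * v for s, v in rules) / total if total else 0.0
        label = "include" if score >= 0.6 else "review" if score >= 0.3 else "exclude"
        return score, label

    print(priority(0.9, 0.1))   # strongly relevant, barely redundant
    print(priority(0.1, 0.9))   # barely relevant, strongly redundant
    ```

    The crisp score supports rank-ordering features for model training, while the label gives the interpretable description the dissertation argues human decision makers need.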

    Induction Motors

    AC motors play a major role in modern industrial applications. Squirrel-cage induction motors (SCIMs) are probably the most frequently used when compared to other AC motors because of their low cost, ruggedness, and low maintenance. The material presented in this book is organized into four sections, covering the applications and structural properties of induction motors (IMs), fault detection and diagnostics, control strategies, and the more recently developed topology based on multiphase (more than three phases) induction motors. This material should be of specific interest to engineers and researchers who are engaged in the modeling, design, and implementation of control algorithms applied to induction motors and, more generally, to readers broadly interested in nonlinear control, health condition monitoring, and fault diagnosis.

    ROBUST DETECTION OF CORONARY HEART DISEASE USING MACHINE LEARNING ALGORITHMS

    Predicting whether someone will develop heart disease is one of the most difficult tasks in medicine. In the contemporary age, heart disease is responsible for roughly one death per minute. Processing the vast amounts of data generated in healthcare is an important application for data science. Because predicting cardiac disease is a difficult undertaking, there is a pressing need to automate the prediction process, minimize the associated risks and provide patients with timely warning. Chapter one of this thesis highlights the importance of this problem and identifies the need to augment current technological efforts to produce a relatively more accurate system that facilitates timely decisions. Chapter one also reviews the current literature on the theories and systems developed and assessed in this direction. This thesis makes use of the heart disease dataset available in the UCI machine learning repository. Using a variety of data mining strategies, such as Naive Bayes, Decision Tree, Support Vector Machine (SVM), K-Nearest Neighbor (K-NN), and Random Forest, the work reported in this thesis estimates the likelihood that a patient will develop heart disease and categorizes the patient's degree of risk. The performance of the chosen classifiers is tested on a feature space chosen with the help of a feature selection algorithm. The models were trained and tested on the Cleveland heart disease dataset. To assess the usefulness and strength of each model, several performance metrics are utilized, including sensitivity, accuracy, AUC, specificity, ROC curve and F1-score. This research then conducts a comparative analysis of the performance of the several machine learning algorithms.
    The results of the experiment demonstrate that the Random Forest and Support Vector Machine algorithms achieved the best accuracy (94.50% and 91.73% respectively) on the selected feature space when compared to the other machine learning methods employed. These two classifiers thus turned out to be promising for heart disease prediction. The computational complexity of each classifier was also investigated. Based on the computational complexity and the comparative experimental results, a robust heart disease prediction system is proposed for an embedded platform, in which the benefits of multiple classifiers are accumulated: heart disease is detected with higher confidence only when several of the classifiers detect it. Finally, the experimental results are summarized and possible future strategies for enhancing this effort are discussed.
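    To make one of the compared algorithms concrete, here is a minimal K-Nearest Neighbor classifier of the kind the thesis evaluates. The two-feature samples below merely stand in for Cleveland-style attribute vectors; they are invented, not real patient data.

    ```python
    # Minimal k-NN sketch: majority vote among the k closest training points.
    from collections import Counter
    from math import dist

    def knn_predict(train_X, train_y, x, k=3):
        """Predict the label of x by majority vote over its k nearest neighbours."""
        neighbours = sorted(zip(train_X, train_y), key=lambda p: dist(p[0], x))[:k]
        return Counter(label for _, label in neighbours).most_common(1)[0][0]

    # Toy data: (age-like, cholesterol-like) pairs, label 1 = disease.
    train_X = [(40, 180), (42, 190), (45, 200), (60, 260), (63, 270), (65, 280)]
    train_y = [0, 0, 0, 1, 1, 1]
    print(knn_predict(train_X, train_y, (44, 195)))   # near the healthy cluster
    print(knn_predict(train_X, train_y, (62, 265)))   # near the disease cluster
    ```

    In practice the features would first be scaled and reduced by the feature selection step, since distance-based classifiers like k-NN are sensitive to feature magnitude.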

    CBR and MBR techniques: review for an application in the emergencies domain

    The purpose of this document is to provide an in-depth analysis of current reasoning engine practice and of the integration strategies for Case-Based Reasoning (CBR) and Model-Based Reasoning (MBR) that will be used in the design and development of the RIMSAT system. RIMSAT (Remote Intelligent Management Support and Training) is a European Commission funded project designed to: (a) provide an innovative, 'intelligent', knowledge-based solution aimed at improving the quality of critical decisions; and (b) enhance the competencies and responsiveness of individuals and organisations involved in highly complex, safety critical incidents, irrespective of their location. In other words, RIMSAT aims to design and implement a decision support system that applies Case-Based Reasoning and Model-Based Reasoning technology to the management of emergency situations. This document is part of a deliverable for the RIMSAT project and, although written in close contact with the requirements of the project, it provides an overview broad enough to serve as a state of the art in integration strategies between CBR and MBR technologies.
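    The retrieval step at the heart of CBR matches a new problem against stored cases by weighted similarity. The sketch below is a generic illustration of that step, not RIMSAT's actual engine; the emergency attributes, weights, and cases are all invented.

    ```python
    # Toy CBR retrieval: find the stored case most similar to the query.
    def similarity(case, query, weights):
        """Weighted fraction of matching attributes, in [0, 1]."""
        total = sum(weights.values())
        score = sum(w for a, w in weights.items() if case.get(a) == query.get(a))
        return score / total

    def retrieve(case_base, query, weights):
        """Return the stored case with the highest similarity to the query."""
        return max(case_base, key=lambda c: similarity(c, query, weights))

    # Invented emergency-domain cases; each carries the response plan that worked.
    weights = {"incident": 3, "scale": 2, "location": 1}
    case_base = [
        {"incident": "fire",  "scale": "large", "location": "port", "plan": "evacuate"},
        {"incident": "spill", "scale": "small", "location": "port", "plan": "contain"},
    ]
    query = {"incident": "spill", "scale": "small", "location": "urban"}
    best = retrieve(case_base, query, weights)
    print(best["plan"])
    ```

    An MBR component would complement this lookup by reasoning from a model of the domain when no sufficiently similar case exists, which is precisely the integration question the document surveys.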

    Advanced Approaches Applied to Materials Development and Design Predictions

    This thematic issue on advanced simulation tools applied to materials development and design predictions gathers selected extended papers related to power generation systems, presented at the XIX International Colloquium on Mechanical Fatigue of Metals (ICMFM XIX), organized at the University of Porto, Portugal, in 2018. This issue explores the limits of the current generation of materials, which are continually being pushed by hostile operating environments, whether in the aerospace, nuclear, or petrochemical industries, or in the design of gas turbines, where efficient energy production and transformation demands increased temperatures and pressures. Thus, advanced methods and applications for theoretical, numerical, and experimental contributions that address failure mechanism modeling and the simulation of materials are covered. As the Guest Editors, we would like to thank all the authors who submitted papers to this Special Issue. All the papers published were peer-reviewed by experts in the field, whose comments helped to improve the quality of the edition. We would also like to thank the Editorial Board of Materials for their assistance in managing this Special Issue.

    Condition Assessment of Concrete Bridge Decks Using Ground and Airborne Infrared Thermography

    Applications of nondestructive testing (NDT) technologies have shown promise in assessing the condition of existing concrete bridges. Infrared thermography (IRT) has gradually gained wider acceptance as an NDT and evaluation tool in the civil engineering field. The high capability of IRT in detecting subsurface delamination, the commercial availability of infrared cameras, lower cost compared with other technologies, speed of data collection, and remote sensing are some of the expected benefits of applying this technique in bridge deck inspection practices. The research conducted in this thesis aims at developing a rational condition assessment system for concrete bridge decks based on IRT technology, and at automating its analysis process, in order to add this invaluable technique to the bridge inspector’s tool box. Ground penetrating radar (GPR) has also been widely recognized as an NDT technique capable of evaluating the potential of active corrosion. Therefore, integrating IRT and GPR results in this research provides more precise assessments of bridge deck conditions. In addition, the research aims to establish a unique link between NDT technologies and inspector findings by developing a novel bridge deck condition rating index (BDCI). The proposed procedure captures the integrated results of the IRT and GPR techniques, along with visual inspection judgements, thus overcoming the inherent scientific uncertainties of this process. Finally, the research aims to explore the potential application of unmanned aerial vehicle (UAV) infrared thermography for detecting hidden defects in concrete bridge decks. The NDT work in this thesis was conducted on full-scale deteriorated reinforced concrete bridge decks located in Montreal, Quebec and London, Ontario. The proposed models have been validated through various case studies.
IRT, either from the ground or by utilizing a UAV with high-resolution thermal infrared imagery, was found to be an appropriate technology for inspecting and precisely detecting subsurface anomalies in concrete bridge decks. The proposed analysis produced thermal mosaic maps from the individual IR images. The k-means clustering classification technique was utilized to segment the mosaics and identify objective thresholds and, hence, to delineate different categories of delamination severity across entire bridge decks. The proposed integration of NDT technologies and visual inspection results provided a more reliable BDCI. The information needed to identify the parameters affecting the integration process was gathered from bridge engineers with extensive experience and intuition. The analysis process utilized fuzzy set theory to account for uncertainty and imprecision in the measurements of bridge deck defects detected by IRT and GPR testing, along with bridge inspector observations. The developed system and models should stimulate wider acceptance of IRT as a rapid, systematic and cost-effective evaluation technique for detecting bridge deck delaminations. The proposed combination of IRT and GPR results should expand their correlative use in bridge deck inspection. Integrating the proposed BDCI procedure with existing bridge management systems can provide a detailed and timely picture of bridge health, thus helping transportation agencies to identify critical deficiencies at various service life stages. Consequently, this can yield sizeable reductions in bridge inspection costs, support effective allocation of limited maintenance and repair funds, and promote the safety, mobility, longevity, and reliability of our highway transportation assets.
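    The k-means segmentation idea described above, clustering pixel temperatures so that cluster boundaries act as objective thresholds between sound and delaminated areas, can be sketched in one dimension as follows. The temperature readings are invented; a real mosaic would supply millions of pixels.

    ```python
    # 1-D k-means (Lloyd's algorithm) over pixel temperatures.
    def kmeans_1d(values, k=2, iters=50):
        """Return cluster centers and a cluster label per value."""
        # Seed centers with evenly spaced samples from the input.
        centers = sorted(values[:: max(1, len(values) // k)][:k])
        for _ in range(iters):
            clusters = [[] for _ in centers]
            for v in values:
                i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
                clusters[i].append(v)
            centers = [sum(c) / len(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        labels = [min(range(len(centers)), key=lambda j: abs(v - centers[j]))
                  for v in values]
        return centers, labels

    # Deck surface temperatures (°C): delaminated areas trap heat and read warmer.
    temps = [21.0, 21.4, 20.8, 21.1, 26.5, 27.0, 26.2, 21.2, 26.8]
    centers, labels = kmeans_1d(temps, k=2)
    print(centers, labels)
    ```

    The midpoint between the two resulting centers serves as the objective delamination threshold; with more clusters, the same procedure delineates several severity categories.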

    Recent Advances and Applications of Machine Learning in Metal Forming Processes

    Machine learning (ML) technologies are emerging in Mechanical Engineering, driven by the increasing availability of datasets, coupled with the exponential growth in computer performance. In fact, there has been a growing interest in evaluating the capabilities of ML algorithms to approach topics related to metal forming processes, such as: classification, detection and prediction of forming defects; material parameter identification; material modelling; process classification and selection; and process design and optimization. The purpose of this Special Issue is to disseminate state-of-the-art ML applications in metal forming processes, covering 10 papers on the above-mentioned and related topics.