69 research outputs found

    Identification of wood defect using pattern recognition technique

    This study proposed a classification model for timber defect classification based on an artificial neural network (ANN). The research also focuses on determining the appropriate parameters for the neural network model to optimize defect identification performance, such as the number of hidden layer nodes and the number of epochs. The neural network's performance is compared with other standard classifiers, such as Naïve Bayes, K-Nearest Neighbours, and J48 Decision Tree, to find significant differences across multiple timber species. The classifiers' performance is measured using the F-measure because the timber species dataset is imbalanced. The experimental results show that the proposed neural network classification model outperforms the other standard classifiers in detecting many types of defects across multiple timber species, with an F-measure of 84.01%. This research demonstrates that an ANN can accurately classify defects across multiple species while defining appropriate parameters (hidden layer nodes and epochs) for the neural network model to optimize defect identification performance.
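
    The tuning knobs named in this abstract are the hidden-layer node count and the epoch count, scored by the F-measure. Below is a minimal sketch of that kind of search, assuming scikit-learn and synthetic stand-in data; the study's timber-defect features and exact settings are not available here.

```python
# Hypothetical sketch: grid over hidden-layer sizes and epoch counts,
# scored with the F-measure (weighted for class imbalance).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import f1_score

# Stand-in data; the study's timber-defect features are not available here.
X, y = make_classification(n_samples=600, n_features=20, n_classes=4,
                           n_informative=10, weights=[0.5, 0.3, 0.15, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

best = None
for nodes in (10, 20, 40):           # candidate hidden-layer node counts
    for epochs in (200, 500, 1000):  # candidate training epochs
        clf = MLPClassifier(hidden_layer_sizes=(nodes,), max_iter=epochs,
                            random_state=0).fit(X_tr, y_tr)
        f1 = f1_score(y_te, clf.predict(X_te), average='weighted')
        if best is None or f1 > best[0]:
            best = (f1, nodes, epochs)

print("best weighted F-measure %.4f with %d nodes, %d epochs" % best)
```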

    An improved bees algorithm local search mechanism for numerical dataset

    The Bees Algorithm (BA), a heuristic optimization procedure, is one of the fundamental search techniques based on the food foraging activities of bees. The algorithm performs a kind of exploitative neighbourhood search combined with random explorative search. However, the main issue with the BA is that it requires long computational time as well as numerous computational processes to obtain a good solution, especially for more complicated problems, and it does not guarantee an optimum solution, mainly because of a lack of accuracy. To address this issue, the local search in the BA was investigated using the simple swap, 2-Opt and 3-Opt neighbourhoods proposed as the Massudi methods for Bees Algorithm Feature Selection (BAFS). In this study, an extended 4-Opt search neighbourhood is proposed. The proposal was implemented and its performance comprehensively compared and analysed with respect to accuracy and time. Furthermore, the feature selection algorithm was implemented and tested using popular datasets from the UCI Machine Learning Repository. The experimental results confirmed that the proposed extension of the search neighbourhood, including the 4-Opt approach, provides better accuracy within a suitable time compared to the Massudi methods.
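
    The 2-Opt/3-Opt/4-Opt neighbourhoods operate on a candidate feature subset. The sketch below shows a hypothetical k-swap move over a binary feature mask; it is purely illustrative and not the Massudi/BAFS implementation, and all names are assumptions.

```python
# Illustrative sketch (not the Massudi/BAFS implementation): a k-swap
# neighbourhood move on a binary feature mask, as used in local search.
import random

def k_opt_swap(mask, k=2, rng=random):
    """Return a neighbour of `mask` by swapping k selected features
    with k unselected ones (a 2-Opt/3-Opt/4-Opt-style exchange)."""
    selected = [i for i, bit in enumerate(mask) if bit]
    unselected = [i for i, bit in enumerate(mask) if not bit]
    k = min(k, len(selected), len(unselected))
    neighbour = list(mask)
    for i, j in zip(rng.sample(selected, k), rng.sample(unselected, k)):
        neighbour[i], neighbour[j] = 0, 1   # drop one feature, add another
    return neighbour

mask = [1, 0, 1, 1, 0, 0, 1, 0]             # example feature subset
print(k_opt_swap(mask, k=2))
```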

    Evaluation of texture feature based on basic local binary pattern for wood defect classification

    Wood defect detection has been studied extensively in recent years to detect defects on the wood surface and assist manufacturers in obtaining clear wood for producing high-quality products, since defects on the wood reduce its quality. This research proposes an effective feature extraction technique, the local binary pattern (LBP), combined with a common classifier, the Support Vector Machine (SVM). Our goal is to classify the natural defects on the wood surface. First, preprocessing was applied to convert the RGB images into grayscale images. Then, the LBP feature extraction technique was applied with eight neighbours (P=8) and several radius (R) values. After that, the SVM classifier was applied for classification and the performance of the proposed technique was measured. The experimental results show that the average accuracy achieved is 65% on the balanced dataset with P=8 and R=1, indicating that the proposed technique works moderately well for classifying wood defects. This study will consequently contribute to the overall wood defect detection framework, which generally benefits the automated inspection of wood defects.
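
    A minimal sketch of the pipeline described above (LBP with P=8 and R=1, histogram features, SVM), assuming scikit-image and scikit-learn; the image data and labels below are placeholders, not the study's dataset.

```python
# Minimal LBP -> histogram -> SVM sketch; placeholder data, not the study's.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

P, R = 8, 1                                   # neighbours and radius from the abstract

def lbp_histogram(gray_image):
    """Basic LBP codes (P=8, R=1) summarised as a normalised histogram."""
    codes = local_binary_pattern(gray_image, P, R, method='default')
    hist, _ = np.histogram(codes, bins=2 ** P, range=(0, 2 ** P), density=True)
    return hist

# Placeholder data: random 64x64 "grayscale patches" with binary defect labels.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(40, 64, 64)).astype(np.uint8)
labels = rng.integers(0, 2, size=40)

features = np.array([lbp_histogram(img) for img in images])
clf = SVC(kernel='rbf').fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```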

    Enhanced grey wolf optimisation algorithm for feature selection in anomaly detection

    Anomaly detection deals with the identification of items in a dataset that do not conform to an expected pattern. The performance of the mechanisms used for anomaly detection depends heavily on the group of features used. Not all features in a dataset can be used in the classification process, since some features may lead to low classifier performance. Feature selection (FS) is a mechanism that reduces the dimension of high-dimensional datasets by deleting irrelevant features. The Modified Binary Grey Wolf Optimiser (MBGWO) is a modern metaheuristic algorithm that has been used successfully for FS in anomaly detection. However, the MBGWO has several issues in finding a good quality solution. This study therefore proposes an enhanced binary grey wolf optimiser (EBGWO) algorithm for FS in anomaly detection to overcome these issues. The first modification enhances the initial population of the MBGWO using a heuristic-based Ant Colony Optimisation algorithm. The second modification develops a new position update mechanism using the Bat Algorithm movement. The third modification improves the controlled parameter of the MBGWO algorithm using indicators from the search process to refine the solution. The EBGWO algorithm was evaluated on NSL-KDD and six benchmark datasets from the University of California, Irvine (UCI) repository against ten benchmark metaheuristic algorithms. Experimental results on the NSL-KDD dataset show that the EBGWO algorithm is superior to the other benchmark optimisation algorithms in terms of the number of selected features and classification accuracy. Moreover, experiments on the six UCI datasets showed that the EBGWO algorithm is superior to the benchmark algorithms in terms of classification accuracy and second best for the number of selected features. The proposed EBGWO algorithm can be used for FS in anomaly detection tasks involving datasets of any size from various application domains.
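
    For context, a plain binary grey wolf position update with a sigmoid transfer function looks roughly like the sketch below; it is a generic illustration of the mechanism the EBGWO modifies, not the EBGWO/MBGWO code itself, and the toy fitness is an assumption.

```python
# Generic sketch of one binary grey-wolf position update (sigmoid transfer);
# not the EBGWO/MBGWO implementation.
import numpy as np

def bgwo_step(wolves, fitness, a):
    """One iteration of a plain binary GWO over 0/1 feature masks (minimisation)."""
    order = np.argsort([fitness(w) for w in wolves])
    alpha, beta, delta = wolves[order[0]], wolves[order[1]], wolves[order[2]]
    new_pop = []
    for w in wolves:
        x = np.zeros_like(w, dtype=float)
        for leader in (alpha, beta, delta):
            A = a * (2 * np.random.rand(w.size) - 1)
            C = 2 * np.random.rand(w.size)
            D = np.abs(C * leader - w)
            x += (leader - A * D) / 3.0          # average pull of the three leaders
        prob = 1.0 / (1.0 + np.exp(-x))          # sigmoid transfer to [0, 1]
        new_pop.append((np.random.rand(w.size) < prob).astype(int))
    return new_pop

# Toy fitness: prefer fewer selected features (a stand-in for classifier error).
wolves = [np.random.randint(0, 2, 10) for _ in range(5)]
wolves = bgwo_step(wolves, fitness=lambda m: m.sum(), a=2.0)
print(wolves[0])
```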

    Intelligent classification algorithms in enhancing the performance of support vector machine

    Performing feature subset selection and tuning of support vector machine (SVM) parameters in parallel, with the aim of increasing classification accuracy, is a current research direction for SVM. Common methods for tuning SVM parameters discretize the continuous values of these parameters, which results in low classification performance. This paper presents two intelligent algorithms that hybridise ant colony optimization (ACO) and SVM for tuning SVM parameters and selecting the feature subset without having to discretize the continuous values. This is achieved by executing feature subset selection and SVM parameter tuning simultaneously. The algorithms are called ACOMV-SVM and IACOMV-SVM. The difference between the algorithms is the size of the solution archive: the archive size in ACOMV is fixed, while in IACOMV the solution archive grows as the optimization procedure progresses. Eight benchmark datasets from UCI were used in the experiments to validate the performance of the proposed algorithms. Experimental results obtained from the proposed algorithms are better than those of other approaches in terms of classification accuracy. The average classification accuracies for the proposed ACOMV-SVM and IACOMV-SVM algorithms are 97.28% and 97.91%, respectively. The work in this paper also contributes a new direction for ACO that can deal with mixed variables.
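
    The central point is that C and gamma are treated as continuous values rather than a discrete grid. The following simplified, ACOR-flavoured sketch illustrates that idea with cross-validated accuracy as the objective; it is not the ACOMV-SVM/IACOMV-SVM algorithms themselves, and the dataset and parameter ranges are assumptions.

```python
# Simplified sketch: refine continuous (log C, log gamma) pairs by Gaussian
# sampling around the current best, instead of searching a fixed grid.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset
rng = np.random.default_rng(0)

def accuracy(log_c, log_gamma):
    clf = make_pipeline(StandardScaler(),
                        SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma))
    return cross_val_score(clf, X, y, cv=3).mean()

# Small solution archive of (log C, log gamma) pairs.
archive = [rng.uniform([-2, -5], [3, 0]) for _ in range(5)]
for _ in range(10):
    scores = [accuracy(*s) for s in archive]
    best = archive[int(np.argmax(scores))]
    candidate = best + rng.normal(scale=0.3, size=2)   # continuous move, no grid
    worst = int(np.argmin(scores))
    if accuracy(*candidate) > scores[worst]:
        archive[worst] = candidate                     # keep the archive size fixed

print("best CV accuracy: %.3f" % max(accuracy(*s) for s in archive))
```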

    Analysis Of Texture Features For Wood Defect Classification

    Selecting important features for classifying wood defects remains a challenging issue in the automated visual inspection domain. This study addresses the extraction and analysis of features based on statistical texture in images of wood defects. A series of procedures, including feature extraction using the Grey Level Dependence Matrix (GLDM) and feature analysis, was executed in order to investigate the displacement and quantisation parameters that could significantly classify wood defects. Samples were taken from the Kembang Semangkuk (KSK), Meranti and Merbau wood species. Findings from visual analysis and classification accuracy measures suggest that the feature set with displacement parameter d=2 and quantisation level q=128 gives the highest classification accuracy. However, to achieve a lower computational cost, the feature set with quantisation level q=32 shows acceptable classification accuracy.
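
    As a rough illustration of the displacement (d) and quantisation (q) parameters, the sketch below uses scikit-image's grey-level co-occurrence matrix (GLCM), a close relative of the GLDM named in the abstract rather than the study's own feature extractor; the patch and statistics chosen are illustrative assumptions.

```python
# Related sketch using a GLCM (not the study's GLDM) to show how the
# displacement d and quantisation level q enter the computation.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def cooccurrence_features(gray_image, d=2, q=128):
    """Quantise to q grey levels, then extract co-occurrence statistics
    at displacement d, averaged over four directions."""
    quantised = (gray_image.astype(int) * q) // 256          # values in [0, q)
    glcm = graycomatrix(quantised.astype(np.uint8),
                        distances=[d],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=q, symmetric=True, normed=True)
    return {prop: graycoprops(glcm, prop).mean()
            for prop in ('contrast', 'homogeneity', 'energy', 'correlation')}

patch = np.random.default_rng(0).integers(0, 256, size=(64, 64)).astype(np.uint8)
print(cooccurrence_features(patch, d=2, q=128))
```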

    Improved Texture Feature Extraction and Selection Methods for Image Classification Applications

    Classification is an important process in image processing applications, and image texture is the preferred source of information in image classification, especially in the context of real-world applications. However, the output of a typical texture feature descriptor often does not represent a wide range of different texture characteristics. Many research studies have contributed different descriptors to improve the extraction of features from texture. Among the various descriptors, the Local Binary Patterns (LBP) descriptor produces powerful information from texture by a simple comparison between a central pixel and its neighbour pixels. In addition, to obtain sufficient information from texture, many research studies have proposed solutions based on combining complementary features. Although feature-level fusion produces satisfactory results for certain applications, it suffers from an inherent and well-known problem called "the curse of dimensionality". Feature selection deals with this problem effectively by reducing the feature dimensions and selecting only the relevant features. However, large feature spaces often make the process of seeking optimum features complicated. This research introduces improved feature extraction methods by adopting a new approach based on new texture descriptors called Local Zone Binary Patterns (LZBP) and Local Multiple Patterns (LMP), which are both based on the LBP descriptor. The produced feature descriptors are combined with other complementary features to yield a unified vector. Furthermore, the combined features are processed by a new hybrid selection approach based on the Artificial Bee Colony and Neighbourhood Rough Set (ABC-NRS) to efficiently reduce the dimensionality of the features resulting from the feature fusion stage. Comprehensive experimental testing and evaluation are carried out for different components of the proposed approach, and the novelty and limitations of the proposed approach are demonstrated. The results of the evaluation prove the ability of the LZBP and LMP texture descriptors to improve feature extraction compared to the conventional LBP descriptor. In addition, the use of the hybrid ABC-NRS selection method on the proposed combined features is shown to improve classification performance while achieving the shortest feature length. The overall proposed approach is demonstrated to provide improved texture-based image classification performance compared to previous methods using benchmarks based on outdoor scene images. These research contributions thus represent significant advances in the field of texture-based image classification.
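
    The comparison between a central pixel and its neighbours that underlies LBP can be written in a few lines; the sketch below shows only that basic operation, not the proposed LZBP or LMP descriptors, and the example patch values are arbitrary.

```python
# Minimal illustration of the plain LBP comparison described above
# (central pixel vs. its 8 neighbours); LZBP/LMP are not reproduced here.
import numpy as np

def lbp_code(patch_3x3):
    """8-bit LBP code of the centre pixel of a 3x3 patch."""
    centre = patch_3x3[1, 1]
    # Clockwise neighbour order starting at the top-left pixel.
    neighbours = [patch_3x3[0, 0], patch_3x3[0, 1], patch_3x3[0, 2],
                  patch_3x3[1, 2], patch_3x3[2, 2], patch_3x3[2, 1],
                  patch_3x3[2, 0], patch_3x3[1, 0]]
    bits = [int(n >= centre) for n in neighbours]     # threshold against the centre
    return sum(bit << i for i, bit in enumerate(bits))

patch = np.array([[ 90, 120, 200],
                  [ 60, 100, 180],
                  [ 30,  40, 150]])
print(lbp_code(patch))   # a value in [0, 255]
```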

    Hybrid ACO and SVM algorithm for pattern classification

    Ant Colony Optimization (ACO) is a metaheuristic algorithm that can be used to solve a variety of combinatorial optimization problems. A new direction for ACO is to optimize continuous and mixed (discrete and continuous) variables. The Support Vector Machine (SVM) is a pattern classification approach that originated from statistical approaches. However, SVM suffers from two main problems: feature subset selection and parameter tuning. Most approaches to tuning SVM parameters discretize the continuous values of the parameters, which has a negative effect on classification performance. This study presents four algorithms for tuning the SVM parameters and selecting the feature subset that improve SVM classification accuracy with a smaller feature subset. This is achieved by performing SVM parameter tuning and feature subset selection simultaneously. Hybrid algorithms between ACO and SVM techniques were proposed. The first two algorithms, ACOR-SVM and IACOR-SVM, tune the SVM parameters, while the second two algorithms, ACOMV-R-SVM and IACOMV-R-SVM, tune the SVM parameters and select the feature subset simultaneously. Ten benchmark datasets from the University of California, Irvine, were used in the experiments to validate the performance of the proposed algorithms. Experimental results obtained from the proposed algorithms are better than those of other approaches in terms of classification accuracy and the size of the feature subset. The average classification accuracies for the ACOR-SVM, IACOR-SVM, ACOMV-R and IACOMV-R algorithms are 94.73%, 95.86%, 97.37% and 98.1%, respectively. The average size of the feature subset is eight for the ACOR-SVM and IACOR-SVM algorithms and four for the ACOMV-R and IACOMV-R algorithms. This study contributes a new direction for ACO that can deal with continuous and mixed-variable problems.
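
    A sketch of the kind of mixed-variable candidate such hybrids evaluate, pairing continuous (C, gamma) values with a binary feature mask and scoring it by cross-validated accuracy; the dataset, ranges and helper names are assumptions, not the authors' implementation.

```python
# Sketch of a mixed-variable solution (continuous C, gamma + binary feature
# mask) and its fitness; illustrative only, not ACOMV-R-SVM/IACOMV-R-SVM.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)   # stand-in dataset
rng = np.random.default_rng(1)

def fitness(solution):
    """Cross-validated accuracy of an SVM using only the selected features."""
    log_c, log_gamma, mask = solution
    if not mask.any():                      # at least one feature must stay
        return 0.0
    clf = make_pipeline(StandardScaler(),
                        SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma))
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# One random mixed candidate: continuous parameters + discrete feature subset.
candidate = (rng.uniform(-2, 3),            # log10 C
             rng.uniform(-5, 0),            # log10 gamma
             rng.integers(0, 2, X.shape[1]).astype(bool))
print("fitness %.3f with %d features" % (fitness(candidate), candidate[2].sum()))
```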