
    New Learning Models for Generating Classification Rules Based on Rough Set Approach

    Data sets, static or dynamic, are very important and useful for presenting real-life features in different aspects of industry, medicine, economy, and others. Recently, different models have been used to generate knowledge from vague and uncertain data sets, such as induction decision trees, neural networks, fuzzy logic, genetic algorithms, rough set theory, and others. All of these models take a long time to learn on a huge and dynamic data set. Thus, the challenge is how to develop an efficient model that can decrease the learning time without affecting the quality of the generated classification rules. Huge information systems or data sets usually have some missing values due to unavailable data, which affect the quality of the generated classification rules. Missing values make it difficult to extract useful information from such data sets, so another challenge is how to solve the problem of missing data. Rough set theory is a mathematical tool for dealing with vagueness and uncertainty, and it is a useful approach for uncovering classificatory knowledge and building classification rules. The application of the theory as part of the learning models was therefore proposed in this thesis. Two different models for learning in data sets were proposed, based on two different reduction algorithms. The split-condition-merge-reduct algorithm (SCMR) consists of three modules: partitioning the data set vertically into subsets, applying the rough set concept of reduction to each subset, and merging the reducts of all subsets to form the best reduct. The enhanced split-condition-merge-reduct algorithm (ESCMR) runs the same three modules, followed by a fourth that applies rough set reduction again to the reduct generated by SCMR in order to produce the best reduct, which plays the same role as if all attributes in the subset existed. Classification rules were generated based on the best reduct. For the problem of missing data, a new approach was proposed based on data partitioning and the mode function. In this approach, the data set was partitioned horizontally into subsets such that all objects in each subset share a single classification value. The mode function was then applied to each subset that has missing values in order to find the most frequently occurring value of each attribute, and the missing values of that attribute were replaced by the mode value. The proposed approach for missing values produced better results than other approaches. The proposed learning models also generated classification rules faster than other methods, and the accuracy of the classification rules produced by the proposed models was high compared to other models.
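    The per-class mode imputation described in the abstract is straightforward to prototype. The following sketch, assuming a pandas DataFrame with one decision column and made-up attribute names (none of which come from the thesis), partitions the table horizontally by decision value and replaces each missing condition value with the mode of its attribute within that partition; it illustrates the idea rather than reproducing the thesis code.

```python
# Hedged sketch of the mode-based missing-value approach: partition the data
# horizontally by decision (class) value, then fill each attribute's missing
# entries with the most frequent value (mode) observed inside that partition.
# Column names ("outlook", "temp", "play") are illustrative only.
import pandas as pd

def fill_missing_by_class_mode(df: pd.DataFrame, decision_col: str) -> pd.DataFrame:
    """Replace NaNs in each condition attribute with the per-class mode."""
    filled_parts = []
    for _, subset in df.groupby(decision_col):          # horizontal partition
        subset = subset.copy()
        for col in subset.columns:
            if col == decision_col:
                continue
            mode = subset[col].mode(dropna=True)         # most frequent value
            if not mode.empty:
                subset[col] = subset[col].fillna(mode.iloc[0])
        filled_parts.append(subset)
    return pd.concat(filled_parts).sort_index()

# Tiny illustrative decision table with missing condition values.
data = pd.DataFrame({
    "outlook": ["sunny", None, "rainy", "sunny", "rainy"],
    "temp":    ["hot",  "hot",  None,   "mild",  "cool"],
    "play":    ["no",   "no",   "yes",  "yes",   "yes"],
})
print(fill_missing_by_class_mode(data, decision_col="play"))
```

    Filling within a class rather than over the whole table keeps the imputed values consistent with the classification value of each object, which is the point of the horizontal partitioning.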

    Efficient schemes on solving fractional integro-differential equations

    Fractional integro-differential equations (FIDEs) emerge in the modelling of various physical phenomena. In most cases, finding the exact analytical solution of a FIDE is difficult or impossible; hence, methods that produce highly accurate numerical solutions in an efficient way are often sought. This research has designed several methods for finding approximate solutions of FIDEs. The analytical expressions of the Genocchi polynomial operational matrix for the left-sided and right-sided Caputo derivatives, and of the kernel matrix, have been derived. Linear independence of the Genocchi polynomials has been proved by deriving an expression for the Genocchi polynomial Gram determinant. A Genocchi polynomial method with collocation has been introduced and applied to solving both linear FIDEs and systems of linear FIDEs, and the numerical results for linear FIDEs solved with Genocchi polynomials are compared with those of certain existing methods. The analytical expressions of the Bernoulli polynomial operational matrix of the right-sided Caputo fractional derivative and of the Bernoulli expansion coefficients for a two-variable function are derived. A linear FIDE with mixed left- and right-sided Caputo derivatives is then considered and solved by applying the Bernoulli polynomials with a spectral tau method. The numerical results obtained show that the proposed method achieves very high accuracy. The upper bounds for th…
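    For reference, the standard definitions the abstract builds on can be written out explicitly. The LaTeX block below records the left- and right-sided Caputo derivatives of order α (with n − 1 < α ≤ n) and the Genocchi polynomial generating function as they are commonly stated; the interval [a, b] and the order α are generic and not taken from the thesis.

```latex
% Left- and right-sided Caputo derivatives of order \alpha, n-1 < \alpha \le n,
% and the Genocchi polynomial generating function (standard definitions).
\begin{align}
  \bigl({}^{C}D^{\alpha}_{a+}f\bigr)(x) &= \frac{1}{\Gamma(n-\alpha)}
      \int_{a}^{x} (x-t)^{\,n-\alpha-1} f^{(n)}(t)\,dt,\\
  \bigl({}^{C}D^{\alpha}_{b-}f\bigr)(x) &= \frac{(-1)^{n}}{\Gamma(n-\alpha)}
      \int_{x}^{b} (t-x)^{\,n-\alpha-1} f^{(n)}(t)\,dt,\\
  \frac{2t\,e^{xt}}{e^{t}+1} &= \sum_{n=0}^{\infty} G_{n}(x)\,\frac{t^{n}}{n!},
      \qquad |t| < \pi .
\end{align}
```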

    A DISTANCE BASED INCREMENTAL FILTER-WRAPPER ALGORITHM FOR FINDING REDUCT IN INCOMPLETE DECISION TABLES

    The tolerance rough set model is an effective tool for attribute reduction in incomplete decision tables. In recent years, several incremental algorithms have been proposed to find reducts of dynamic incomplete decision tables in order to reduce computation time. However, they are classical filter algorithms, in which the classification accuracy of the decision table is computed only after the reduct has been obtained; consequently, the reducts produced by these algorithms are not optimal in terms of reduct cardinality and classification accuracy. In this paper, we propose the incremental filter-wrapper algorithm IDS_IFW_AO to find one reduct of an incomplete decision table in the case where multiple objects are added. The experimental results on some sample datasets show that the proposed filter-wrapper algorithm IDS_IFW_AO is more effective than the filter algorithm IARM-I [17] in terms of classification accuracy and reduct cardinality.
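    The tolerance rough set model referred to above rests on a tolerance relation in which a missing value matches anything. A minimal sketch of that relation, assuming objects are stored as dictionaries of condition-attribute values with None marking missing entries (all names are illustrative, not from the paper):

```python
# Minimal sketch of the tolerance relation used in tolerance rough set models
# for incomplete decision tables: two objects tolerate each other when, on
# every condition attribute, their values agree or at least one is missing.
# The tolerance class of an object is the set of all objects tolerating it.
from typing import Dict, Hashable, List, Optional

Object = Dict[str, Optional[Hashable]]  # attribute -> value, None = missing ("*")

def tolerant(x: Object, y: Object, attributes: List[str]) -> bool:
    return all(x[a] is None or y[a] is None or x[a] == y[a] for a in attributes)

def tolerance_class(i: int, table: List[Object], attributes: List[str]) -> set:
    return {j for j, y in enumerate(table) if tolerant(table[i], y, attributes)}

# Illustrative incomplete decision table (condition attributes only).
table = [
    {"a1": 1,    "a2": "x"},
    {"a1": None, "a2": "x"},
    {"a1": 2,    "a2": "y"},
]
print([tolerance_class(i, table, ["a1", "a2"]) for i in range(len(table))])
```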

    Geometric lattice structure of covering and its application to attribute reduction through matroids

    The reduction of covering decision systems is an important problem in data mining, and covering-based rough sets serve as an efficient technique for processing it. Geometric lattices have been widely used in many fields, especially in greedy algorithm design, which plays an important role in reduction problems. Therefore, it is meaningful to combine coverings with geometric lattices to solve such optimization problems. In this paper, we obtain geometric lattices from coverings through matroids and then apply them to attribute reduction. First, a geometric lattice structure of a covering is constructed through transversal matroids; its atoms are studied and used to describe the lattice. Second, since all the closed sets of a finite matroid form a geometric lattice, we propose a dependence space through matroids and study the attribute reduction issues of that space, which realizes the application of geometric lattices to attribute reduction. Furthermore, a special type of information system is taken as an example to illustrate the application. In a word, this work points out an interesting view, namely, using geometric lattices to study the attribute reduction issues of information systems.
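    The transversal matroid of a covering can be explored by brute force on small examples: a set of elements is independent exactly when it admits a system of distinct representatives among the blocks of the covering. The sketch below, with a made-up three-block covering, checks independence this way; it only illustrates the matroid construction, not the paper's geometric lattice machinery.

```python
# Brute-force sketch of the transversal matroid induced by a covering:
# X is independent iff its elements can be matched to pairwise distinct
# blocks of the covering that contain them (a system of distinct
# representatives). Feasible only for small ground sets.
from itertools import combinations, permutations

def independent(X, covering):
    """True if X has a system of distinct representatives in `covering`."""
    X = list(X)
    if len(X) > len(covering):
        return False
    # Try every injective assignment of elements of X to blocks.
    for blocks in permutations(range(len(covering)), len(X)):
        if all(X[k] in covering[blocks[k]] for k in range(len(X))):
            return True
    return False

# Illustrative covering of the ground set {1, 2, 3, 4}.
covering = [{1, 2}, {2, 3}, {3, 4}]
ground = {1, 2, 3, 4}
indep_sets = [set(X) for r in range(len(ground), -1, -1)
              for X in combinations(sorted(ground), r) if independent(X, covering)]
max_rank = len(indep_sets[0])
print("bases of the transversal matroid:",
      [I for I in indep_sets if len(I) == max_rank])
```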