2,333 research outputs found

    Subgroup Discovery: Real-World Applications

    Subgroup discovery is a data mining technique that extracts interesting rules with respect to a target variable. An important characteristic of this task is its combination of predictive and descriptive induction. This paper presents an overview of subgroup discovery, together with different real-world applications solved through evolutionary algorithms, which demonstrate the suitability and potential of this type of algorithm for the development of subgroup discovery methods
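    Rule quality in subgroup discovery is commonly scored with weighted relative accuracy (WRAcc), which trades a rule's coverage against the lift of the target share among the covered examples. A minimal sketch (the function name and the example counts are illustrative, not taken from the paper):

    ```python
    def wracc(n_cover, n_cover_pos, n_total, n_pos):
        """Weighted relative accuracy of a subgroup rule.

        n_cover:        examples covered by the rule's antecedent
        n_cover_pos:    covered examples that also have the target value
        n_total, n_pos: dataset size and overall number of target examples
        """
        if n_cover == 0:
            return 0.0
        coverage = n_cover / n_total
        lift = n_cover_pos / n_cover - n_pos / n_total
        return coverage * lift

    # A rule covering 20 of 100 examples, 15 positive, with 40 positives overall:
    q = wracc(20, 15, 100, 40)  # 0.2 * (0.75 - 0.40), close to 0.07
    ```

    Rules that cover many examples with a target share well above the global rate score highest, which is exactly the predictive-plus-descriptive trade-off the abstract mentions.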

    Qualitative and fuzzy analogue circuit design.


    A New Differential Evolution with self-terminating ability using fuzzy control and k-nearest neighbors

    A new Differential Evolution (DE) algorithm that incorporates fuzzy control and the k-nearest neighbors algorithm to determine its terminating condition is proposed. A technique called Iteration Windows is introduced to govern the number of iterations in each search stage. The size of the iteration window is controlled by a fuzzy controller, which uses information provided by the k-nearest neighbors system to analyze the population during the search. The controller keeps adjusting the iteration window until the end of the search process. A wavelet-based mutation process is embedded in the DE search to enhance its performance, and the F weight of DE is also controlled by the fuzzy controller to further speed up the search. A suite of benchmark test functions is employed to evaluate the proposed method, and it is shown empirically that it can terminate the search process within a reasonable number of iterations. © 2010 IEEE
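    The paper's fuzzy controller and k-NN population analysis are not reproduced here, but the underlying DE loop they steer can be sketched as a plain DE/rand/1/bin minimizer, with a fixed iteration budget standing in for the self-terminating mechanism (all names and parameter values are illustrative):

    ```python
    import random

    def de_rand_1_bin(f, bounds, pop_size=20, F=0.5, CR=0.9, max_iter=200, seed=1):
        """Minimal DE/rand/1/bin minimizer with a fixed iteration budget."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        fit = [f(x) for x in pop]
        for _ in range(max_iter):
            for i in range(pop_size):
                # Pick three distinct vectors, none equal to the target i.
                a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
                jrand = rng.randrange(dim)  # guarantees at least one mutated gene
                trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                         if (rng.random() < CR or j == jrand) else pop[i][j]
                         for j in range(dim)]
                trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
                f_trial = f(trial)
                if f_trial <= fit[i]:  # greedy one-to-one selection
                    pop[i], fit[i] = trial, f_trial
        best = min(range(pop_size), key=fit.__getitem__)
        return pop[best], fit[best]

    # Sphere benchmark: the global minimum is 0 at the origin.
    sphere = lambda x: sum(v * v for v in x)
    best_x, best_f = de_rand_1_bin(sphere, [(-5.0, 5.0)] * 3)
    ```

    In the paper, the fixed `max_iter` budget is replaced by fuzzy-controlled Iteration Windows, and `F` is adapted online rather than held constant.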

    On the Synthesis of fuzzy neural systems.

    by Chung, Fu Lai. Thesis (Ph.D.)--Chinese University of Hong Kong, 1995. Includes bibliographical references (leaves 166-174). Contents:
    Chapter 1. Introduction: integration of fuzzy systems and neural networks; objectives of the research (fuzzification of competitive learning algorithms, capacity analysis of FAM and FRNS models, structure and parameter identifications of FRNS); outline of the thesis
    Chapter 2. A Fuzzy System Primer: basic concepts of fuzzy sets; fuzzy set-theoretic operators; linguistic variables, fuzzy rules and fuzzy inference; basic structure of a fuzzy system (fuzzifier, fuzzy knowledge base, fuzzy inference engine, defuzzifier)
    Chapter 3. Categories of Fuzzy Neural Systems: fuzzification of neural networks (fuzzy membership, fuzzy operator and fuzzy arithmetic driven models); layered network implementations of Mamdani's, Takagi and Sugeno's, and fuzzy relation based fuzzy systems
    Chapter 4. Fuzzification of Competitive Learning Networks: crisp competitive learning (unsupervised competitive learning, learning vector quantization, frequency sensitive competitive learning) and their fuzzy counterparts; stability of fuzzy competitive learning; controlling its fuzziness; interpretations of the trained networks; simulation results
    Chapter 5. Capacity Analysis of Fuzzy Associative Memories: FAMs; storing multiple rules; a high capacity encoding scheme; memory capacity; rule modification; inference performance
    Chapter 6. Capacity Analysis of Fuzzy Relational Neural Systems: fuzzy relational equations and FRNS; solving a system of fuzzy relational equations; new solvable conditions for max-t and min-s equations; approximate resolution; system capacity; inference performance
    Chapter 7. Structure and Parameter Identifications of Fuzzy Relational Neural Systems: modelling nonlinear dynamic systems by fuzzy relational equations; a general FRNS identification algorithm; an evolutionary computation approach (guided evolutionary simulated annealing and the evolutionary identification, EVIDENT, algorithm); simulation results
    Chapter 8. Conclusions: summary of contributions; further investigations
    Appendix A: publication list of the candidate; bibliography
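    The fuzzy competitive learning studied in Chapter 4 replaces the winner-take-all update of crisp competitive learning with an update in which every prototype moves toward the input in proportion to a fuzzy membership. A minimal sketch, assuming FCM-style inverse-distance memberships (the update rule here is illustrative, not the thesis's exact algorithm):

    ```python
    import math

    def fuzzy_competitive_step(weights, x, lr=0.1, m=2.0):
        """One learning step in which every prototype, not just the winner,
        moves toward input x, weighted by a fuzzy membership (fuzzifier m)."""
        d = [max(math.dist(w, x), 1e-12) for w in weights]
        inv = [dk ** (-2.0 / (m - 1.0)) for dk in d]  # membership ∝ d^(-2/(m-1))
        s = sum(inv)
        u = [v / s for v in inv]  # memberships sum to 1 across prototypes
        for k, w in enumerate(weights):
            for j in range(len(w)):
                w[j] += lr * u[k] * (x[j] - w[j])
        return u

    weights = [[0.0, 0.0], [1.0, 1.0]]
    u = fuzzy_competitive_step(weights, (0.1, 0.0))  # prototype 0 is nearer, so u[0] > u[1]
    ```

    As the fuzzifier m approaches 1 the memberships harden toward 0/1 and the update degenerates into crisp winner-take-all learning, which is the fuzziness-control knob Chapter 4 analyzes.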

    An Approach to Pattern Recognition by Evolutionary Computation

    Evolutionary Computation (EC) is inspired by the natural phenomenon of evolution. It provides a quite general heuristic that exploits a few basic concepts: reproduction of individuals, variation phenomena that affect the likelihood of survival of individuals, and inheritance of parent features by offspring. EC has been widely used in recent years to effectively solve hard, nonlinear and very complex problems. Among others, EC-based algorithms have also been used to tackle classification problems. Classification is a process by which an object is attributed to one of a finite set of classes or, in other words, is recognized as belonging to a set of equal or similar entities identified by a label. Arguably, the central aspect of classification concerns the generation of prototypes to be used to recognize unknown patterns. The role of prototypes is to represent the patterns belonging to the different classes defined within a given problem. For most problems of practical interest, generating such prototypes is very hard, since a prototype must represent patterns of the same class that may be significantly dissimilar from each other, while also discriminating against patterns belonging to other classes. Moreover, a prototype should contain the minimum amount of information required to satisfy these requirements. The research presented in this thesis has led to the definition of an EC-based framework for prototype generation. The framework does not prescribe any particular kind of prototype: it can generate any kind of prototype once an encoding scheme has been defined. This generality can be exploited to develop many applications, and the framework has been employed to implement two specific applications for prototype generation. The developed applications have been tested on several data sets and the results compared with those obtained by other approaches previously presented in the literature
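    As a toy illustration of evolutionary prototype generation, the sketch below evolves labelled point prototypes for a nearest-prototype classifier with a (1+1)-style accept-if-no-worse loop. This is an assumed, minimal stand-in, not the thesis's framework or encoding:

    ```python
    import math
    import random

    def nearest_prototype(protos, x):
        """Label of the closest (label, vector) prototype to point x."""
        return min(protos, key=lambda p: math.dist(p[1], x))[0]

    def evolve_prototypes(data, protos, gens=200, sigma=0.3, seed=0):
        """(1+1)-style loop: mutate one prototype coordinate with Gaussian
        noise and keep the offspring if training accuracy does not drop."""
        rng = random.Random(seed)
        acc = lambda ps: sum(nearest_prototype(ps, x) == y for x, y in data) / len(data)
        best = acc(protos)
        for _ in range(gens):
            cand = [(lab, list(vec)) for lab, vec in protos]
            _, vec = cand[rng.randrange(len(cand))]
            vec[rng.randrange(len(vec))] += rng.gauss(0.0, sigma)
            a = acc(cand)
            if a >= best:
                protos, best = cand, a
        return protos, best

    # Two tiny clusters; the initial prototypes misclassify one point.
    data = [((0.0, 0.0), "a"), ((0.2, 0.1), "a"), ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
    protos, train_acc = evolve_prototypes(data, [("a", [1.0, 0.0]), ("b", [0.0, 1.0])])
    ```

    Swapping in a different prototype encoding only requires changing the mutation operator and the distance used by `nearest_prototype`, which mirrors the encoding-scheme generality the abstract claims for the framework.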

    Evolutionary program induction directed by logic grammars.

    by Wong Man Leung. Thesis (Ph.D.)--Chinese University of Hong Kong, 1995. Includes bibliographical references (leaves 227-236). Contents:
    Chapter 1. Introduction: automatic programming and program induction; motivation; contributions of the research; outline of the thesis
    Chapter 2. An Overview of Evolutionary Algorithms: genetic algorithms (the canonical GA, selection and recombination methods, inversion and reordering, implicit parallelism and the building block hypothesis, steady state and hybrid GAs); genetic programming (traditional GP, automatically defined functions, module acquisition, strongly typed GP); evolution strategies; evolutionary programming
    Chapter 3. Inductive Logic Programming: inductive concept learning; interactive and empirical ILP; techniques and methods of ILP
    Chapter 4. Genetic Logic Programming and Applications: representations and crossover of logic programs; the Genetic Logic Programming System (GLPS); applications (Winston's arch problem, the modified Quinlan network reachability problem, the factorial problem)
    Chapter 5. The logic grammars based genetic programming system (LOGENPRO): logic grammars; representations, crossover and mutation of programs; the evolution process of LOGENPRO
    Chapter 6. Applications of LOGENPRO: learning functional programs (S-expressions, the DOT PRODUCT problem, learning sub-functions using explicit knowledge); learning logic programs (Winston's arch, network reachability and factorial problems); learning programs in C
    Chapter 7. Knowledge Discovery in Databases: inducing decision trees as S-expressions (the credit screening problem); learning logic programs from imperfect data (the chess endgame problem; comparisons of LOGENPRO with FOIL, BEAM-FOIL and mFOIL1 through mFOIL5); learning programs in Fuzzy Prolog
    Chapter 8. An Adaptive Inductive Logic Programming System: a generic top-down ILP algorithm; inducing procedural search biases (evolution process, experimental setup, fitness calculation); experiments on the member, multiply and uncle predicates, including a noisy environment
    Chapter 9. Conclusion and Future Work: applying LOGENPRO to knowledge discovery from databases, recursive programs and engineering design; exploiting parallelism of evolutionary algorithms
    Appendix A; references
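    The core idea behind grammar-directed program induction is that every individual is a derivation of a formal grammar, so every generated program is syntactically valid by construction. This can be sketched with a toy context-free grammar (LOGENPRO itself uses richer logic grammars; the rules and names below are purely illustrative):

    ```python
    import random

    # A toy context-free grammar for arithmetic expressions over x.
    GRAMMAR = {
        "<expr>": [["(", "<expr>", "<op>", "<expr>", ")"], ["x"], ["1"]],
        "<op>": [["+"], ["*"]],
    }

    def derive(symbol, rng, depth=0, max_depth=4):
        """Randomly expand a grammar symbol into a string. Past max_depth the
        shortest production is forced, which guarantees termination."""
        if symbol not in GRAMMAR:
            return symbol  # terminal
        prods = GRAMMAR[symbol]
        prod = min(prods, key=len) if depth >= max_depth else rng.choice(prods)
        return "".join(derive(s, rng, depth + 1, max_depth) for s in prod)

    # Every derived individual is a well-formed expression by construction.
    expr = derive("<expr>", random.Random(3))
    ```

    Crossover and mutation in a grammar-guided system operate on derivation trees rather than raw strings, so offspring also stay within the language the grammar defines; that closure property is what lets LOGENPRO induce programs in several target languages from the same evolutionary engine.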