10 research outputs found

    Empirical learning aided by weak domain knowledge in the form of feature importance

    Standard hybrid learners that use domain knowledge require strong knowledge that is hard and expensive to acquire, whereas weaker domain knowledge can still provide the benefits of prior knowledge while remaining cost-effective. Weak knowledge in the form of feature relative importance (FRI) is presented and explained. Feature relative importance is a real-valued approximation of a feature's importance provided by experts. The advantage of using this knowledge is demonstrated by IANN, a modified multilayer neural network algorithm. IANN is a very simple modification of the standard neural network algorithm, yet it attains significant performance gains. Experimental results in the field of molecular biology show higher performance than other empirical learning algorithms, including standard backpropagation and support vector machines. IANN's performance is even comparable to that of KBANN, a theory refinement system that uses stronger domain knowledge. This shows that feature relative importance can significantly improve the performance of existing empirical learning algorithms with minimal effort.
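    The abstract does not spell out how IANN injects the expert scores, so the following is only a minimal sketch of one plausible reading: scale a network's initial first-layer weights by hypothetical FRI scores so that features the expert rates as important start out more influential. The function name, score values, and scaling scheme are all illustrative assumptions, not the published algorithm.

```python
import numpy as np

def init_first_layer(n_features, n_hidden, fri, rng=None):
    """Initialize first-layer weights scaled by expert feature relative
    importance (FRI) scores -- a hypothetical reading of the IANN idea,
    not the published algorithm."""
    rng = rng or np.random.default_rng(0)
    w = rng.normal(0.0, 0.1, size=(n_features, n_hidden))
    # Features the expert rates as more important start with
    # proportionally larger weight magnitudes.
    return w * np.asarray(fri).reshape(-1, 1)

fri = [1.0, 0.8, 0.1]            # hypothetical expert scores, one per feature
w = init_first_layer(3, 4, fri)  # rows for unimportant features start near zero
```

    Ordinary backpropagation can then proceed unchanged on these weights, which matches the abstract's claim that the modification to the standard algorithm is very simple.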

    A Generalized Method for Integrating Rule-based Knowledge into Inductive Methods Through Virtual Sample Creation

    Hybrid learning methods use theoretical knowledge of a domain and a set of classified examples to develop a method for classification. Methods that use domain knowledge have been shown to perform better than purely inductive learners. However, there is no general method for incorporating domain knowledge into all inductive learning algorithms, as hybrid methods are highly specialized for a particular algorithm. We present an algorithm that takes domain knowledge in the form of propositional rules, generates artificial examples from the rules, and also removes instances likely to be flawed. The enriched dataset can then be used by any learning algorithm. Experimental results from different scenarios demonstrate this method to be more effective than simple inductive learning.
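    The core idea of virtual sample creation can be sketched for the simplest case: a propositional rule over boolean features fixes the truth values of some features, and every completion of the remaining features yields one artificial positive example. The rule format and feature names below are illustrative assumptions, not taken from the paper.

```python
import itertools

def virtual_samples(rule, features):
    """Generate artificial boolean examples consistent with a propositional
    rule -- a simplified sketch of virtual sample creation. `rule` maps a
    subset of feature names to required truth values; every assignment to
    the unconstrained features yields one virtual example."""
    free = [f for f in features if f not in rule]
    for values in itertools.product([False, True], repeat=len(free)):
        sample = dict(rule)
        sample.update(zip(free, values))
        # Emit features in a stable order for downstream learners.
        yield {f: sample[f] for f in features}

# Hypothetical rule "sunny AND warm -> positive"; "windy" is unconstrained.
examples = list(virtual_samples({"sunny": True, "warm": True},
                                ["sunny", "warm", "windy"]))
```

    Appending such examples to the training set is what makes the approach learner-agnostic: any inductive algorithm can consume the enriched dataset without being modified itself.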

    INCREMENTAL LEARNING OF PROCEDURAL PLANNING KNOWLEDGE IN CHALLENGING ENVIRONMENTS

    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/75646/1/j.1467-8640.2005.00280.x.pd

    Expert systems in library and information services: the state of research, research problems, and examples of applications

    Purpose/thesis: The purpose of this paper is to answer the question of whether, and to what extent, expert systems are currently used in library and information services. Attention is drawn to the following elements of the information process: acquisition, storage and cataloging/description of information (search characteristics, search instruction), information retrieval (querying, natural language queries), and the transmission and implementation of information. Methods: An analysis of the literature acquired through queries in selected databases provided the basis for a discussion of research trends in the field of expert systems as one of the methodologies of knowledge management using methods of artificial intelligence: problem solving and representation of knowledge (knowledge generation, valuation and management, knowledge extraction and synthesis in the construction of expertise), "self-learning", and performance evaluation. In the final part of the paper an attempt was made to identify possible areas of expert system application in libraries and information centers. Results: The research helped to specify the number of bibliographic records containing information about expert systems recorded in selected databases. Expert systems vocabulary presented in an example thesaurus was discussed and the literature of the field was studied. Conclusions: The quantitative analysis of the literature showed a significant decrease in the number of publications on expert systems after 2010. Nevertheless, expert systems (both experimental and operational) are used in numerous fields. It is expected that these systems will be used in specialized libraries for the acquisition and codification of knowledge in selected areas of specialization.

    Efficiently Storing and Discovering Knowledge in Databases via Inductive Logic Programming Implemented Directly in Databases

    University of Minnesota M.S. thesis. July 2015. Major: Computer Science. Advisor: Richard Maclin. 1 computer file (PDF); viii, 79 pages. Inductive Logic Programming (ILP) uses inductive, statistical techniques to generate hypotheses that incorporate the given background knowledge and induce concepts covering most of the positive examples and few of the negative examples. ILP draws on techniques from both logic programming and machine learning. Research in this field has been evolving for several years, and many systems have been developed to solve ILP problems; most of these systems are written in Prolog and take their input in the form of text files or similar formats. This thesis proposes using a relational database to store background knowledge and positive and negative examples as database entities. This information is then manipulated directly using ILP techniques in the process of generating hypotheses. The database does the heavy lifting by efficiently handling and storing the very large number of intermediate rules generated while finding the required hypotheses. The proposed system is helpful for generating hypotheses from relational databases, and it also provides a mechanism to load data that exists in text files into a database. A sequential covering algorithm is used to find hypotheses that cover all of the positive examples and few or none of the negative examples. The proposed system was tested on real-world datasets, Mutagenesis and Chess Endgame, and the generated hypotheses and their accuracy are similar to the results of existing systems tested on the same datasets. The results are promising, encouraging future use of the system to discover knowledge in other datasets or relational databases.
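    The sequential covering loop the thesis relies on follows a standard cover-and-remove scheme: repeatedly pick the rule that covers the most remaining positive examples without covering negatives, remove the covered positives, and repeat. The sketch below is a deliberate simplification, assuming a fixed list of candidate rules rather than the database-backed rule search the thesis describes.

```python
def sequential_covering(candidate_rules, positives, negatives):
    """Greedy sequential covering: repeatedly choose the rule covering the
    most remaining positives and no negatives, until the positives are
    exhausted or no rule makes safe progress. A simplified sketch -- real
    ILP systems search a rule space instead of scoring a fixed list."""
    learned, remaining = [], set(positives)
    while remaining:
        # Score rules by positives covered, heavily penalizing negatives.
        best = max(candidate_rules,
                   key=lambda r: sum(1 for p in remaining if r(p))
                                 - sum(10 for n in negatives if r(n)))
        covered = {p for p in remaining if best(p)}
        if not covered or any(best(n) for n in negatives):
            break                       # no safe rule makes progress
        learned.append(best)
        remaining -= covered            # cover-and-remove step
    return learned

# Toy illustration: learn "even" from labeled integers.
positives, negatives = [2, 4, 6], [1, 3, 5]
rules = [lambda x: x % 2 == 0, lambda x: x > 0]
learned = sequential_covering(rules, positives, negatives)
```

    Moving the coverage counting into SQL aggregate queries is what lets the database, rather than the ILP engine, carry the large intermediate rule sets the thesis mentions.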

    Seventh Annual Workshop on Space Operations Applications and Research (SOAR 1993), volume 1

    This document contains papers presented at the Space Operations, Applications and Research (SOAR) Symposium hosted by NASA/Johnson Space Center (JSC) on August 3-5, 1993, and held at the JSC Gilruth Recreation Center. SOAR included NASA and USAF programmatic overviews, a plenary session, panel discussions, panel sessions, and exhibits. It invited technical papers in support of U.S. Army, U.S. Navy, Department of Energy, NASA, and USAF programs in the following areas: robotics and telepresence, automation and intelligent systems, human factors, life support, and space maintenance and servicing. SOAR was concerned with Government-sponsored research and development relevant to aerospace operations. More than 100 technical papers, 17 exhibits, a plenary session, several panel discussions, and several keynote speeches were included in SOAR '93.

    Evolutionary program induction directed by logic grammars.

    by Wong Man Leung. Thesis (Ph.D.)--Chinese University of Hong Kong, 1995. Includes bibliographical references (leaves 227-236). Contents:
    Chapter 1: Introduction -- automatic programming and program induction; motivation; contributions of the research; outline of the thesis
    Chapter 2: An Overview of Evolutionary Algorithms -- evolutionary algorithms; Genetic Algorithms (GAs): the canonical genetic algorithm (selection methods, recombination methods, inversion and reordering), implicit parallelism and the building block hypothesis, steady state genetic algorithms, hybrid algorithms; Genetic Programming (GP): introduction to the traditional GP, Automatic Defined Function (ADF), Module Acquisition (MA), Strongly Typed Genetic Programming (STGP); Evolution Strategies (ES); Evolutionary Programming (EP)
    Chapter 3: Inductive Logic Programming -- inductive concept learning; Inductive Logic Programming (ILP): interactive ILP, empirical ILP; techniques and methods of ILP
    Chapter 4: Genetic Logic Programming and Applications -- introduction; representations of logic programs; crossover of logic programs; Genetic Logic Programming System (GLPS); applications: Winston's arch problem, the modified Quinlan's network reachability problem, the factorial problem
    Chapter 5: The logic grammars based genetic programming system (LOGENPRO) -- logic grammars; representations of programs; crossover of programs; mutation of programs; the evolution process of LOGENPRO; discussion
    Chapter 6: Applications of LOGENPRO -- learning functional programs: learning S-expressions using LOGENPRO, the DOT PRODUCT problem, learning sub-functions using explicit knowledge; learning logic programs: learning logic programs using LOGENPRO, Winston's arch problem, the modified Quinlan's network reachability problem, the factorial problem, discussion; learning programs in C
    Chapter 7: Knowledge Discovery in Databases -- inducing decision trees using LOGENPRO: decision trees, representing decision trees as S-expressions, the credit screening problem, the experiment; learning logic programs from imperfect data: the chess endgame problem, the setup of experiments, comparisons of LOGENPRO with FOIL, BEAM-FOIL, and mFOIL1 through mFOIL5, discussion; learning programs in Fuzzy Prolog
    Chapter 8: An Adaptive Inductive Logic Programming System -- adaptive Inductive Logic Programming; a generic top-down ILP algorithm; inducing procedural search biases: the evolution process, the experimentation setup, fitness calculation; experimentation and evaluations: the member predicate, the member predicate in a noisy environment, the multiply predicate, the uncle predicate; discussion
    Chapter 9: Conclusion and Future Work -- conclusion; future work: applying LOGENPRO to discover knowledge from databases, learning recursive programs, applying LOGENPRO in engineering design, exploiting parallelism of evolutionary algorithms
    References; Appendix A

    A knowledge-intensive approach to learning relational concepts


    Winston, P.H., Binford, T.O., Katz, B., Lowry, M. (1983). Learning physical descriptions
