
Towards comprehensive foundations of computational intelligence

By Prof Wlodzislaw Duch


Although computational intelligence (CI) covers a vast variety of methods, it still lacks an integrative theory. Several proposals for CI foundations are discussed: computing and cognition as compression, meta-learning as search in the space of data models, (dis)similarity-based methods providing a framework for such meta-learning, and a more general approach based on chains of transformations. Many useful transformations that extract information from features are discussed. Heterogeneous adaptive systems are presented as a particular example of transformation-based systems, and the goal of learning is redefined to facilitate the creation of simpler data models. The need to understand data structures leads to techniques for logical and prototype-based rule extraction and to the generation of multiple alternative models, while the need to increase the predictive power of adaptive models leads to committees of competent models. Learning from partial observations is a natural extension towards reasoning based on perceptions, and an approach to intuitive solving of such problems is presented. Throughout the paper neurocognitive inspirations are frequently used; they are especially important in modeling higher cognitive functions. Promising directions, such as liquid and laminar computing, are identified, and many open problems are presented.
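The (dis)similarity-based framework mentioned in the abstract can be illustrated with a minimal sketch: classify a sample by the label of its nearest prototype. This is not the paper's implementation; the prototype positions and the choice of Euclidean distance here are illustrative assumptions, and in the meta-learning view they are exactly the kind of model-space choices that would be searched over.

```python
import math

def nearest_prototype(x, prototypes):
    """Return the label of the prototype closest to x.

    prototypes: list of (label, point) pairs; Euclidean distance is an
    illustrative choice of (dis)similarity measure.
    """
    best_label, best_dist = None, math.inf
    for label, p in prototypes:
        d = math.dist(x, p)  # Euclidean distance (Python 3.8+)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Hypothetical two-class example with one prototype per class.
prototypes = [("A", (0.0, 0.0)), ("B", (1.0, 1.0))]
print(nearest_prototype((0.2, 0.1), prototypes))  # → A
```

Replacing the distance function or optimizing the prototype set yields different members of the same family of models, which is what makes such methods a convenient substrate for meta-learning.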

Topics: Language, Machine Learning, Artificial Intelligence, Neural Nets
Publisher: Springer
Year: 2007
