Using Feature Weights to Improve Performance of Neural Networks

By Ridwan Al Iqbal


Different features have different degrees of relevance to a particular learning problem: some are barely relevant, while others are very important. Instead of selecting the most relevant features through feature selection, an algorithm can be given this knowledge of feature importance directly, based on expert opinion or prior learning. Learning can be faster and more accurate when learners take feature importance into account. Correlation aided Neural Networks (CANN), one such algorithm, is presented. CANN treats feature importance as the correlation coefficient between the target attribute and each feature, and it modifies a standard feed-forward neural network to fit both the correlation values and the training data. Empirical evaluation shows that CANN is faster and more accurate than the two-step approach of feature selection followed by a standard learning algorithm.

Topics: Artificial Intelligence, Machine Learning, Neural Nets
Year: 2011


