
Attribute weighted Naive Bayes classifier using a local optimization

By Sona Taheri, John Yearwood, Musa Mammadov and Sattar Seifollahi


The Naive Bayes classifier is a popular classification technique in data mining and machine learning, and it has proved effective on a wide variety of classification problems. However, its strong assumption that all attributes are conditionally independent given the class is often violated in real-world applications. Numerous methods have been proposed to improve the performance of the Naive Bayes classifier by relaxing the attribute independence assumption, but such relaxation can itself increase the expected error. An alternative is to assign weights to the attributes. In this paper, we propose a novel attribute-weighted Naive Bayes classifier that applies weights to the conditional probabilities. An objective function, based on the structure of the Naive Bayes classifier and the attribute weights, is formulated, and the optimal weights are determined by a local optimization method using the quasisecant method. In the proposed approach, the Naive Bayes classifier is taken as a starting point. We report the results of numerical experiments on several real-world binary classification data sets, which show the efficiency of the proposed method.
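The weighting scheme described above can be illustrated with a minimal sketch in Python. This is not the authors' implementation; it assumes categorical attributes, Laplace smoothing, and per-attribute weights applied as exponents on the conditional probabilities (i.e., the class score is log P(c) + Σ_j w_j · log P(x_j | c)), with weights all equal to 1 recovering the standard Naive Bayes classifier. The optimization of the weights themselves (via the quasisecant method in the paper) is omitted.

```python
import math
from collections import defaultdict


def train_nb(X, y, alpha=1.0):
    """Estimate class priors and per-attribute conditional probabilities
    with Laplace smoothing (alpha) from categorical data."""
    n = len(y)
    classes = sorted(set(y))
    priors = {c: y.count(c) / n for c in classes}
    d = len(X[0])
    # Observed value set per attribute, used for the smoothing denominator.
    values = [sorted({row[j] for row in X}) for j in range(d)]
    cond = {}  # (class, attribute index, value) -> P(value | class)
    for c in classes:
        rows = [x for x, yc in zip(X, y) if yc == c]
        for j in range(d):
            counts = defaultdict(int)
            for x in rows:
                counts[x[j]] += 1
            denom = len(rows) + alpha * len(values[j])
            for v in values[j]:
                cond[(c, j, v)] = (counts[v] + alpha) / denom
    return classes, priors, cond


def predict_weighted(x, classes, priors, cond, weights):
    """Score each class as log P(c) + sum_j w_j * log P(x_j | c),
    so the attribute weights act as exponents on the conditionals."""
    best, best_score = None, float("-inf")
    for c in classes:
        score = math.log(priors[c])
        for j, v in enumerate(x):
            score += weights[j] * math.log(cond[(c, j, v)])
        if score > best_score:
            best, best_score = c, score
    return best
```

In this formulation, a weight near zero effectively switches an attribute off, while weights below one dampen the influence of attributes whose independence assumption is badly violated; the paper's contribution is choosing these weights by minimizing an objective function with a local optimization method, starting from the unweighted classifier.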

Topics: 1702 Cognitive Science, 0801 Artificial Intelligence and Image Processing, Classification, Naive Bayes, Attribute weighting, Local optimization
Year: 2013
DOI identifier: 10.1007/s00521-012-1329-z
