
Data Fusion with Entropic Priors

By Francesco Palmieri and Domenico Ciuonzo


Abstract. In classification problems, lack of knowledge of the prior distribution may make the application of Bayes' rule inadequate. Uniform or arbitrary priors may provide classification answers that, even in simple examples, contradict our common sense about the problem. Entropic priors, obtained via the maximum entropy principle, seem to provide a much better answer and can be easily derived and applied to classification tasks when no more than the likelihood functions are available. In this paper we present an application example in which the use of entropic priors is compared to the results of applying Dempster-Shafer theory.
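As a rough illustration of the idea in the abstract: maximizing the joint entropy H(C, X) = H(C) + Σᵢ p(i) H(X|i) over the class prior yields p(i) ∝ exp(H(X|i)), i.e. classes whose likelihoods are more uncertain receive larger prior mass. The sketch below assumes this form of the entropic prior and discrete likelihoods; the function names and the toy data are illustrative, not from the paper.

```python
import math

def shannon_entropy(pmf):
    """Shannon entropy (in nats) of a discrete pmf."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

def entropic_priors(likelihoods):
    """Assumed entropic prior: p(i) proportional to exp(H(X|i)),
    computed from the class-conditional likelihoods alone."""
    weights = [math.exp(shannon_entropy(lik)) for lik in likelihoods]
    z = sum(weights)
    return [w / z for w in weights]

def posterior(likelihoods, x_index):
    """Bayes' rule using the entropic prior, for an observed value x."""
    priors = entropic_priors(likelihoods)
    joint = [pr * lik[x_index] for pr, lik in zip(priors, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

# Toy example: class 0 has a peaked (low-entropy) likelihood,
# class 1 a uniform (high-entropy) one over 4 outcomes.
likelihoods = [
    [0.97, 0.01, 0.01, 0.01],
    [0.25, 0.25, 0.25, 0.25],
]
priors = entropic_priors(likelihoods)
```

With these likelihoods the uniform class receives the larger prior, since its conditional entropy is higher; a uniform prior would instead treat both classes identically regardless of how informative their likelihoods are.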

Topics: Artificial Intelligence, Propagation of Belief, Data Fusion
Year: 2010
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX