Extending Maximum Entropy Techniques to Entropy Constraints

By Gang Xiang (Philips Healthcare) and Vladik Kreinovich


Abstract—In many practical situations, we have only partial information about the probabilities. In some cases, we have crisp (interval) bounds on the probabilities and/or on the related statistical characteristics. In other situations, we have fuzzy bounds, i.e., different interval bounds with different degrees of certainty. In a situation with uncertainty, we do not know the exact value of the desired characteristic. In such situations, it is desirable to find its worst possible value, its best possible value, and its “typical” value – the one corresponding to the “most probable” probability distribution. Usually, as such a “typical” distribution, we select the one with the largest value of the entropy. This works perfectly well in the usual case, when the information about the distribution consists of the values of moments and other characteristics. For example, if we only know the first and th
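To make the moment-constrained case concrete: a minimal sketch of the standard maximum-entropy construction, assuming a finite set of outcomes and a known first moment (mean). In this setting the maximum-entropy distribution is known to have the Gibbs form p_i ∝ exp(λ·x_i), and the multiplier λ can be found by bisection, since the resulting mean is monotone in λ. The function name and parameters below are illustrative, not from the paper.

```python
import math

def max_entropy_dist(xs, target_mean, tol=1e-10):
    """Maximum-entropy distribution over outcomes xs with a fixed mean.

    Among all p with sum(p) = 1 and sum(p_i * x_i) = target_mean,
    the entropy-maximizing one has the Gibbs form p_i ∝ exp(lam * x_i).
    We solve for lam by bisection: the induced mean is monotone in lam.
    """
    def mean_for(lam):
        ws = [math.exp(lam * x) for x in xs]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, xs)) / z

    lo, hi = -50.0, 50.0  # bracket for lam; widen if target is near min/max of xs
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    ws = [math.exp(lam * x) for x in xs]
    z = sum(ws)
    return [w / z for w in ws]

# Classic example: a die whose average roll is known to be 4.5.
p = max_entropy_dist([1, 2, 3, 4, 5, 6], 4.5)
```

With target mean 3.5 the same routine recovers the uniform distribution (λ = 0), which illustrates why this “typical” distribution is a natural default. The paper's point is that this convenient picture changes once the constraints involve the entropy itself.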

Year: 2011
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX
