
    Maximum Probability/Entropy translating of contiguous categorical observations into frequencies

    The Maximum Probability method is used to translate possibly contiguous and overlapping categorical observations into frequencies.
    Keywords: contiguous categorical observations, maximum probability, maximum entropy, inverse problem
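    A minimal sketch of the idea behind the abstract above: given observations that may be ambiguous between several categories, estimate the category frequencies that make the observed data most probable. The observation format (each observation as a set of admissible categories) and the EM-style fixed-point update are illustrative assumptions, not the paper's exact algorithm.

    ```python
    # Sketch only: overlapping observations are represented as sets of
    # admissible categories; frequencies are chosen to maximize the
    # probability of the observed sets via an EM-style iteration.
    from collections import Counter

    def max_probability_frequencies(observations, categories, n_iter=200):
        # Start from a uniform frequency estimate over all categories.
        p = {c: 1.0 / len(categories) for c in categories}
        for _ in range(n_iter):
            expected = Counter()
            for obs in observations:
                # Probability mass currently assigned to the admissible set.
                total = sum(p[c] for c in obs)
                for c in obs:
                    # Split the observation across its categories in
                    # proportion to the current estimate.
                    expected[c] += p[c] / total
            # Re-normalize expected counts into frequencies.
            n = sum(expected.values())
            p = {c: expected[c] / n for c in categories}
        return p

    # Example: the second observation is ambiguous between 'b' and 'c'.
    obs = [{"a"}, {"b", "c"}, {"c"}]
    print(max_probability_frequencies(obs, ["a", "b", "c"]))
    ```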

    Gibbs conditioning extended, Boltzmann conditioning introduced

    Conditional Equi-concentration of Types on I-projections (ICET) and the Extended Gibbs Conditioning Principle (EGCP) extend the Conditioned Weak Law of Large Numbers and the Gibbs Conditioning Principle to the case of a non-unique Relative Entropy Maximizing (REM) distribution (aka I-projection). ICET and EGCP give a probabilistic justification of REM under rather general conditions. mu-projection variants of the results are introduced; they provide a probabilistic justification of the Maximum Probability (MaxProb) method. The 'REM/MaxEnt or MaxProb?' question is discussed briefly, and the Jeffreys Conditioning Principle is mentioned.
    Comment: Three major changes: 1) The definition of proper I-projection has been changed. 2) An argument preceding Eq. (7) in the proof of ICET is now correctly stated. 3) The abstract was rewritten. To appear in the Proceedings of the MaxEnt 2004 workshop.
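    For readers unfamiliar with I-projections: the REM distribution mentioned above is the member of a constraint set closest to a prior q in relative entropy. The sketch below computes such a projection for a single linear moment constraint, using the standard exponential-tilting form; the prior, statistic, and target moment are made-up examples, not data from the paper.

    ```python
    # Minimal I-projection sketch, assuming one moment constraint
    # sum_i p_i * a_i = m. Under a linear constraint the projection has
    # the exponential-family form p_i = q_i * exp(lam * a_i) / Z(lam),
    # so only the multiplier lam needs to be found.
    import numpy as np
    from scipy.optimize import brentq

    def i_projection(q, a, m):
        q = np.asarray(q, dtype=float)
        a = np.asarray(a, dtype=float)

        def moment_gap(lam):
            w = q * np.exp(lam * a)
            p = w / w.sum()
            return p @ a - m

        # One-dimensional root find for the Lagrange multiplier.
        lam = brentq(moment_gap, -50.0, 50.0)
        w = q * np.exp(lam * a)
        return w / w.sum()

    # Example: tilt a uniform prior on {1,...,6} so that the mean is 4.5.
    q = np.full(6, 1 / 6)
    a = np.arange(1, 7)
    p = i_projection(q, a, 4.5)
    print(p, p @ a)
    ```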

    MiniMax entropy and maximum likelihood: Complementarity of tasks, identity of solutions
