Weighted Updating generalizes Bayesian updating, allowing for biased beliefs
by weighting the likelihood function and prior distribution with positive real
exponents. I provide a rigorous foundation for the model by showing that
transforming a distribution by exponential weighting (and normalizing)
systematically affects the information entropy of the resulting distribution.
For weights greater than one, the resulting distribution has lower information
entropy than the original distribution; for weights between zero and one, it
has higher entropy. The entropy of a distribution measures how informative a
decision maker treats the underlying observation(s) as being, so this result
suggests a natural interpretation of the weights. For example, a weight
greater than one on a likelihood function models an individual who treats the
associated observation(s) as more informative than a perfect Bayesian would.
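The entropy effect of the exponential-weighting transform can be illustrated numerically. The sketch below (an illustrative example, not drawn from the paper; the distribution and weights are arbitrary) raises a discrete distribution to a power and renormalizes, then compares Shannon entropies: a weight above one concentrates the distribution and lowers its entropy, while a weight between zero and one flattens it and raises its entropy.

```python
import math

def normalize(weights):
    """Rescale nonnegative values so they sum to one."""
    total = sum(weights)
    return [w / total for w in weights]

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def weighted(p, alpha):
    """Exponentiate each probability by alpha and renormalize,
    mirroring the weighting transform described in the text."""
    return normalize([x ** alpha for x in p])

# Arbitrary example distribution over four outcomes.
p = normalize([0.1, 0.2, 0.3, 0.4])

h = entropy(p)
h_sharp = entropy(weighted(p, 2.0))   # weight > 1: lower entropy
h_flat = entropy(weighted(p, 0.5))    # weight in (0, 1): higher entropy

print(h_sharp < h < h_flat)  # True
```

With a weight of one the transform leaves the distribution (and its entropy) unchanged, recovering standard Bayesian updating as the special case.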