Measurement Invariance, Entropy, and Probability
We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes grading into logarithmic at large magnitudes, leading to observations that often follow Student's probability distribution, which has a Gaussian shape for small fluctuations from the mean and a power-law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution. A gamma distribution has a power-law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform. This inversion connects the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way in which to change the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information, and probability.
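As a concrete check on the two scaling patterns described above, the following is a minimal numerical sketch (mine, not the paper's; the distribution parameters are arbitrary): it confirms that the Student's t density has a power-law tail while the gamma density is power-law at small magnitudes and exponential at large ones.

```python
# A minimal numerical sketch (not from the paper) illustrating the two tail
# behaviors the abstract describes: Student's t is Gaussian-like near the mean
# with power-law tails, while the gamma density is power-law near zero with an
# exponential tail. Distribution parameters here are arbitrary choices.
import numpy as np
from scipy import stats

t = stats.t(df=3)            # Student's t with 3 degrees of freedom
g = stats.gamma(a=0.5)       # gamma with shape parameter 0.5

# Power-law tail of Student's t: log-density is roughly linear in log(x),
# with slope approaching -(df + 1) for large x.
x_large = np.array([10.0, 100.0, 1000.0])
slopes = np.diff(t.logpdf(x_large)) / np.diff(np.log(x_large))
print("t tail slopes in log-log:", slopes)        # both close to -(3 + 1) = -4

# Exponential tail of the gamma: log-density is roughly linear in x itself,
# with slope -1/scale (here -1).
slopes = np.diff(g.logpdf(x_large)) / np.diff(x_large)
print("gamma tail slopes in semi-log:", slopes)   # both close to -1

# Power-law behavior of the gamma near zero: log-density ~ (a - 1) * log(x).
x_small = np.array([1e-6, 1e-4, 1e-2])
slopes = np.diff(g.logpdf(x_small)) / np.diff(np.log(x_small))
print("gamma small-x slopes in log-log:", slopes) # both close to a - 1 = -0.5
```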
A simple derivation and classification of common probability distributions based on information symmetry and measurement scale
Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated the common distributions, without any underlying conceptual basis for why those distributions take their particular forms or for what explains their familial relations. Pearson's system and its descendants remain the most popular systematic classification of probability distributions. Here, we unify the disparate forms of common distributions into a single system based on two meaningful and justifiable propositions. First, distributions follow maximum entropy subject to constraints, where maximum entropy is equivalent to minimum information. Second, different problems associate magnitude to information in different ways, an association we describe in terms of the relation between information invariance and measurement scale. Our framework relates the different continuous probability distributions through the variations in measurement scale that change each family of maximum entropy distributions into a distinct family.
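For readers unfamiliar with the machinery, here is a hedged LaTeX sketch of the maximum-entropy derivation the abstract builds on; the notation (p, T, lambda) and the particular linear-to-logarithmic scale shown are my illustration, not necessarily the paper's.

```latex
% A hedged sketch of the maximum-entropy machinery the abstract builds on;
% the notation (p, T, \lambda) is mine, not necessarily the paper's.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Maximizing entropy $-\int p(y)\ln p(y)\,dy$ subject to normalization and a
constraint on the mean of some function $T$ of the observable,
\[
  \int p(y)\,dy = 1, \qquad \int T(y)\,p(y)\,dy = \langle T \rangle,
\]
gives, via Lagrange multipliers, the exponential family
\[
  p(y) \propto e^{-\lambda T(y)}.
\]
The choice of $T$ encodes the measurement scale: $T(y)=y$ yields the
exponential distribution, $T(y)=y^2$ the Gaussian, and $T(y)=\ln y$ a power
law. A scale that is linear at small magnitudes and logarithmic at large
ones, e.g.\ $T(y)=\ln(1+y^2/k)$, yields
$p(y)\propto(1+y^2/k)^{-\lambda}$, the Student form described above.
\end{document}
```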
Nonstandard utilities for lexicographically decomposable orderings
Using a basic theorem from mathematical logic, I show that there are field extensions of R on which a class of orderings that do not admit any real-valued utility functions can be represented by uncountably large families of utility functions. These are the lexicographically decomposable orderings studied in Beardon et al. (2002a). A corollary to this result yields an uncountably large family of very simple utility functions for the lexicographic ordering of the real Cartesian plane. I generalise these results to the lexicographic ordering of R^n, for every n > 2, and to lexicographic products of lexicographically decomposable chains. I conclude by showing how almost all of these results may be obtained without any appeal to the Axiom of Choice.
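To illustrate the flavor of such "very simple utility functions", here is a sketch (my construction, not the paper's) of a hyperreal-valued utility u(x, y) = x + eps*y for the lexicographic plane, modeled with ordered pairs; the names Hyperreal and utility are hypothetical.

```python
# A minimal sketch (my illustration, not the paper's construction) of how a
# hyperreal-valued utility can represent the lexicographic order on R^2,
# which famously admits no real-valued utility. A hyperreal of the form
# a + b*eps (eps > 0 infinitesimal) is modeled as the pair (a, b); because
# eps is smaller than every positive real, comparing such hyperreals reduces
# to comparing the pairs lexicographically.
from dataclasses import dataclass

@dataclass(frozen=True)
class Hyperreal:
    std: float   # standard part a
    inf: float   # coefficient b of the infinitesimal eps

    def __gt__(self, other: "Hyperreal") -> bool:
        # a1 + b1*eps > a2 + b2*eps reduces to lexicographic comparison.
        return (self.std, self.inf) > (other.std, other.inf)

def utility(x: float, y: float) -> Hyperreal:
    """u(x, y) = x + eps * y, a hyperreal-valued utility."""
    return Hyperreal(std=x, inf=y)

# The utility ranks points exactly as the lexicographic order does:
assert utility(1.0, 0.0) > utility(0.0, 99.0)   # first coordinate dominates
assert utility(1.0, 2.0) > utility(1.0, 1.0)    # ties broken by second
```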
Divergent mathematical treatments in utility theory
In this paper I study how divergent mathematical treatments affect mathematical modelling, with a special focus on utility theory. In particular, I examine recent work on the ranking of information states and the discounting of future utilities, in order to show how, by replacing the standard analytical treatment of the models involved with one based on the framework of Nonstandard Analysis, diametrically opposite results are obtained. In both cases, the choice between the standard and nonstandard treatment amounts to a selection of set-theoretical parameters that cannot be made on purely empirical grounds. The analysis of this phenomenon gives rise to a simple logical account of the relativity of impossibility theorems in economic theory, which concludes the paper.
Quantum-like model of subjective expected utility
We present a very general quantum-like model of lottery selection based on representation of an agent's beliefs by pure quantum states. Subjective probabilities are mathematically realized in the framework of quantum probability (QP). Utility functions are borrowed from classical decision theory, but in the model they are represented by more than just their values. Heuristically, one can say that each value u_i = u(x_i) is surrounded by a cloud of information related to the event (A, x_i). An agent processes this information by using the rules of quantum information and QP. This process is very complex; it combines counterfactual reasoning for comparison between preferences for different outcomes of lotteries, which are in general complementary. These comparisons induce interference-type effects (constructive or destructive). The decision process is mathematically represented by a comparison operator, and the outcome is determined by the sign of the corresponding quadratic form on the belief state. This operational process can be decomposed into a few subprocesses, each of which can be formally treated as a comparison of subjective expected utilities and interference factors (the latter express, in particular, risks related to lottery selection). The main aim of this paper is to analyze the mathematical structure of these processes in the most general situation: representation of lotteries by noncommuting operators.
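The decision rule described above can be made concrete with a toy numpy sketch; all operators, the state, and the numbers below are illustrative assumptions of mine, not the paper's model.

```python
# An illustrative numpy sketch (all operators and numbers are my toy choices,
# not the paper's) of the decision rule described above: the agent's beliefs
# form a pure state |psi>, each lottery is scored by a Hermitian operator,
# and the choice is the sign of the quadratic form <psi| D |psi> for the
# comparison operator D = U_A - U_B.
import numpy as np

# Belief state: a normalized vector in a 2-dimensional real Hilbert space.
psi = np.array([0.8, 0.6])           # unit length: 0.64 + 0.36 = 1

# Toy Hermitian "utility" operators for two lotteries A and B. Off-diagonal
# terms are what generate interference: they have no classical analogue.
U_A = np.array([[2.0, 0.5],
                [0.5, 1.0]])
U_B = np.array([[1.5, -0.7],
                [-0.7, 1.8]])

D = U_A - U_B                        # comparison operator
q = psi @ D @ psi                    # quadratic form <psi|D|psi>
print("choose", "A" if q > 0 else "B", f"(quadratic form = {q:.3f})")

# The diagonal part alone compares plain expected utilities; the off-diagonal
# contribution is the interference term, and here it dominates the choice.
classical = psi @ np.diag(np.diag(D)) @ psi
print(f"classical part = {classical:.3f}, interference = {q - classical:.3f}")
```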
Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, and Their Foes from Berkeley to Russell and Beyond
Many historians of the calculus deny significant continuity between the infinitesimal calculus of the 17th century and 20th-century developments such as Robinson's theory. Robinson's hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. In spite of his Leibnizian sympathies, Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley's criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz's infinitesimals are fictions: not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by some syncategorematic paraphrase. We argue that Leibniz's defense of infinitesimals is more firmly grounded than Berkeley's criticism thereof. We show, moreover, that Leibniz's system for differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz's strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity.
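As a loose modern analogy (mine, not the authors') to the transcendental law of homogeneity, dual-number arithmetic discards eps**2 terms exactly the way Leibniz discarded higher-order infinitesimals, which is why it computes derivatives:

```python
# A loose modern analogy (mine, not the authors'): the transcendental law of
# homogeneity discards higher-order infinitesimals, e.g. d(xy) = x dy + y dx
# because the dx*dy term is negligible. Dual numbers, where eps**2 = 0 by
# construction, bake exactly that discard into the arithmetic, which is why
# they compute derivatives (forward-mode automatic differentiation).
from dataclasses import dataclass

@dataclass
class Dual:
    re: float   # assignable (standard) part
    du: float   # coefficient of the infinitesimal eps, with eps**2 = 0

    def __add__(self, other: "Dual") -> "Dual":
        return Dual(self.re + other.re, self.du + other.du)

    def __mul__(self, other: "Dual") -> "Dual":
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps + bd*eps**2;
        # the eps**2 term is dropped -- the law of homogeneity in action.
        return Dual(self.re * other.re,
                    self.re * other.du + self.du * other.re)

# Differentiate f(x) = x * x at x = 3 by feeding in x + eps:
x = Dual(3.0, 1.0)
print(x * x)   # Dual(re=9.0, du=6.0): value 9, derivative 6 = 2x
```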
The Algebraic versus the Topological Approach to Additive Representations
It is proved that, under a nontriviality assumption, an additive function on a Cartesian product of connected topological spaces is continuous whenever the preference relation represented by this function is continuous. The result is used to generalize a theorem of Debreu (1960, in Mathematical Methods in the Social Sciences, pp. 16–26, Stanford: Stanford Univ. Press) on additive representations, and to argue that the algebraic approach of KLST (Krantz, Luce, Suppes, and Tversky) to additive conjoint measurement is preferable to the more customary topological approach. Applications to the representation of strength-of-preference relations and to the characterization of subjective expected utility maximization are given.
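In my notation, the setting can be restated as the following hedged LaTeX sketch of an additive representation on a product of connected spaces.

```latex
% A hedged restatement (my notation, not necessarily the paper's) of the
% setting: an additive representation on a product of connected spaces.
\documentclass{article}
\usepackage{amsmath}
\usepackage{amssymb}
\begin{document}
A preference relation $\succeq$ on a product $X = X_1 \times \cdots \times X_n$
of connected topological spaces has an \emph{additive representation} if there
are functions $V_i : X_i \to \mathbb{R}$ with
\[
  x \succeq y \iff \sum_{i=1}^{n} V_i(x_i) \ge \sum_{i=1}^{n} V_i(y_i).
\]
The result described above says that, under a nontriviality assumption, if
$\succeq$ is continuous then each $V_i$ (and hence their sum) is automatically
continuous: no topological assumption on the $V_i$ needs to be imposed.
\end{document}
```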
Is there a conjunction fallacy in legal probabilistic decision making?
Classical probability theory (CPT) has long represented the rational standard for decision making in human cognition. Even though CPT has provided many descriptively excellent decision models, some empirical results remain persistently problematic for CPT accounts. The tension between the normative prescription of CPT and human behavior is particularly acute in cases where we have higher expectations for rational decisions. One such case concerns legal decision making by legal experts, such as attorneys and prosecutors and, even more so, judges. In the present research we explore one of the most influential CPT decision fallacies, the conjunction fallacy (CF), in a legal decision-making task involving the assessment of evidence that the same suspect had committed two separate crimes. The information for the two crimes was presented consecutively. After all information had been presented, each participant was asked to provide individual probability ratings for the two crimes in some cases and a conjunctive probability rating for both crimes in other cases. Overall, 360 probability ratings for guilt were collected from 120 participants, comprising 40 judges, 40 attorneys and prosecutors, and 40 individuals without legal education. Our results provide evidence for a double conjunction fallacy (in this case, a higher rated probability of committing both crimes than of committing either crime individually) in the group of individuals without legal education. These results are discussed in terms of their applied implications and in relation to a recent framework for understanding such results, quantum probability theory (QPT).
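To show the quantum-probability mechanism the abstract alludes to, here is a toy numpy sketch (my illustration, not the authors' model) in which noncommuting projectors let a sequentially judged conjunction outscore one of its directly judged conjuncts; reproducing the full double fallacy would need additional structure.

```python
# A toy numpy sketch (my illustration of the QPT framework the abstract cites,
# not the authors' model) of how noncommuting projectors can produce a
# conjunction fallacy: the sequential probability of "A and then B" can exceed
# the directly judged probability of B alone.
import numpy as np

def projector(theta: float) -> np.ndarray:
    """Rank-1 projector onto the ray at angle theta in a real 2-D space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])               # belief state
P_A = projector(np.pi / 4)               # event A: suspect committed crime 1
P_B = projector(np.pi / 2)               # event B: suspect committed crime 2

p_B = np.linalg.norm(P_B @ psi) ** 2                # judged alone: 0.0
p_A_then_B = np.linalg.norm(P_B @ P_A @ psi) ** 2   # judged as a sequence: 0.25

print(f"P(B) = {p_B:.2f}, P(A then B) = {p_A_then_B:.2f}")
# The conjunction outscores one of its conjuncts -- impossible classically,
# but allowed here because P_A and P_B do not commute.
```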