
    Measurement Invariance, Entropy, and Probability

    We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes grading into logarithmic at large magnitudes, leading to observations that often follow Student's probability distribution, which has a Gaussian shape for small fluctuations from the mean and a power-law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution. A gamma distribution has a power-law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform. This inversion connects the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way in which to change the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information, and probability.
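
    As a rough illustration of the two limiting shapes described above (standard parameterizations, not taken from the paper), consider the densities themselves. For Student's distribution with \nu degrees of freedom,

        p(y) \propto \left(1 + \frac{y^2}{\nu}\right)^{-(\nu+1)/2},

    which behaves like e^{-(\nu+1) y^2 / (2\nu)} for |y| \ll \sqrt{\nu} (a Gaussian core) and like |y|^{-(\nu+1)} for |y| \gg \sqrt{\nu} (a power-law tail). For a gamma distribution with shape k and rate \lambda,

        p(y) \propto y^{k-1} e^{-\lambda y},

    which is dominated by the power-law factor y^{k-1} at small magnitudes and by the exponential factor e^{-\lambda y} at large magnitudes.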

    Axiomatische Meßtheorie (Axiomatic Measurement Theory)


    A simple derivation and classification of common probability distributions based on information symmetry and measurement scale

    Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated the common distributions, without any underlying conceptual basis for why common distributions have particular forms or what explains their familial relations. Pearson's system and its descendants remain the most popular systematic classification of probability distributions. Here, we unify the disparate forms of common distributions into a single system based on two meaningful and justifiable propositions. First, distributions follow maximum entropy subject to constraints, where maximum entropy is equivalent to minimum information. Second, different problems associate magnitude with information in different ways, an association we describe in terms of the relation between information invariance and measurement scale. Our framework relates the different continuous probability distributions through the variations in measurement scale that change each family of maximum entropy distributions into a distinct family.
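
    To make the first proposition concrete (these are textbook maximum-entropy calculations, not the paper's own derivation), maximizing entropy subject to a constraint on the expectation of a function f(y) yields

        p(y) \propto e^{-\lambda f(y)},

    so the choice of constraint fixes the family: f(y) = y on y \ge 0 gives the exponential distribution, f(y) = y^2 gives the Gaussian, and joint constraints on \langle y \rangle and \langle \log y \rangle give the gamma family, p(y) \propto y^{-\lambda_2} e^{-\lambda_1 y}. The paper's second proposition concerns why a given problem calls for one constraint rather than another, which it traces to measurement scale.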

    Nonstandard utilities for lexicographically decomposable orderings

    Using a basic theorem from mathematical logic, I show that there are field extensions of R on which a class of orderings that do not admit any real-valued utility functions can be represented by uncountably large families of utility functions. These are the lexicographically decomposable orderings studied in Beardon et al. (2002a). A corollary to this result yields an uncountably large family of very simple utility functions for the lexicographic ordering of the real Cartesian plane. I generalise these results to the lexicographic ordering of R^n, for every n > 2, and to lexicographic products of lexicographically decomposable chains. I conclude by showing how almost all of these results may be obtained without any appeal to the Axiom of Choice.
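
    For a sense of how simple such nonstandard utilities can be (an illustrative sketch, not the paper's construction), recall that the lexicographic ordering of the plane ranks (x_1, y_1) above (x_2, y_2) whenever x_1 > x_2, or x_1 = x_2 and y_1 > y_2, and famously admits no real-valued utility representation. In a field extension of R containing a positive infinitesimal \varepsilon, the function

        u(x, y) = x + \varepsilon y

    represents it: a real difference in the first coordinate always outweighs the infinitesimal contribution of the second, while ties in the first coordinate are broken by the second. Varying the choice of infinitesimal already suggests how a large family of such representations can arise.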

    Divergent mathematical treatments in utility theory

    In this paper I study how divergent mathematical treatments affect mathematical modelling, with a special focus on utility theory. In particular, I examine recent work on the ranking of information states and the discounting of future utilities, in order to show how, by replacing the standard analytical treatment of the models involved with one based on the framework of Nonstandard Analysis, diametrically opposite results are obtained. In both cases, the choice between the standard and nonstandard treatment amounts to a selection of set-theoretical parameters that cannot be made on purely empirical grounds. The analysis of this phenomenon gives rise to a simple logical account of the relativity of impossibility theorems in economic theory, which concludes the paper.

    Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond

    Many historians of the calculus deny significant continuity between the infinitesimal calculus of the 17th century and 20th-century developments such as Robinson's theory. Robinson's hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. In spite of his Leibnizian sympathies, Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley's criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz's infinitesimals are fictions: not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by some syncategorematic paraphrase. We argue that Leibniz's defense of infinitesimals is more firmly grounded than Berkeley's criticism thereof. We show, moreover, that Leibniz's system for differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz's strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity.
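
    A standard illustration of the procedure at issue (a textbook reconstruction, not a quotation from Leibniz) is the product rule. Expanding the increment of a product of variable quantities gives

        d(xy) = (x + dx)(y + dy) - xy = x\,dy + y\,dx + dx\,dy,

    and the higher-order term dx\,dy is discarded as incomparably smaller than the first-order terms, yielding d(xy) = x\,dy + y\,dx; the transcendental law of homogeneity is what licenses exactly this kind of discarding when relating inassignable and assignable quantities.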

    The Algebraic versus the Topological Approach to Additive Representations

    It is proved that, under a nontriviality assumption, an additive function on a Cartesian product of connected topological spaces is continuous whenever the preference relation represented by this function is continuous. The result is used to generalize a theorem of Debreu (1960, Mathematical Methods in the Social Sciences, pp. 16–26, Stanford: Stanford Univ. Press) on additive representations and to argue that the algebraic approach of KLST (Krantz, Luce, Suppes, and Tversky) to additive conjoint measurement is preferable to the more customary topological approach. Applications to the representation of strength-of-preference relations and to the characterization of subjective expected utility maximization are given.
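
    In the notation standard for this literature (not necessarily the paper's own), an additive representation of a preference relation \succeq on a product X_1 \times \cdots \times X_n is a function

        V(x_1, \ldots, x_n) = \sum_{i=1}^{n} v_i(x_i) \quad \text{with} \quad x \succeq y \iff V(x) \ge V(y).

    The result described above then says that when each X_i is a connected topological space, the nontriviality assumption holds, and \succeq is continuous, the additive function V, and with it each coordinate function v_i, must itself be continuous; continuity of V need not be assumed at the outset.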