Generating statistical distributions without maximizing the entropy
We show here how to use pieces of thermodynamics' first law to generate
probability distributions for generalized ensembles when only level-population
changes are involved. Such microstate occupation modifications, if properly
constrained via first-law ingredients, can be associated not exclusively with
heat and thereby acquire a more general meaning.
Comment: 6 pages, no figures, Conference
A simple derivation and classification of common probability distributions based on information symmetry and measurement scale
Commonly observed patterns typically follow a few distinct families of
probability distributions. Over one hundred years ago, Karl Pearson provided a
systematic derivation and classification of the common continuous
distributions. His approach was phenomenological: a differential equation that
generated the common distributions, but offered no underlying conceptual basis
for why common distributions take their particular forms or what explains their
familial relations. Pearson's system and its descendants remain the most popular
systematic classification of probability distributions. Here, we unify the
disparate forms of common distributions into a single system based on two
meaningful and justifiable propositions. First, distributions follow maximum
entropy subject to constraints, where maximum entropy is equivalent to minimum
information. Second, different problems associate magnitude to information in
different ways, an association we describe in terms of the relation between
information invariance and measurement scale. Our framework relates the
different continuous probability distributions through the variations in
measurement scale that change each family of maximum entropy distributions into
a distinct family.
Comment: 17 pages, 0 figures
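As a standard worked instance of the first proposition (textbook material, not a result specific to this paper): maximizing entropy on x ≥ 0 subject to normalization and a fixed mean μ singles out the exponential family.

```latex
\max_{p}\; -\!\int_0^\infty p(x)\ln p(x)\,dx
\quad\text{subject to}\quad
\int_0^\infty p(x)\,dx = 1,
\qquad
\int_0^\infty x\,p(x)\,dx = \mu .
```

Stationarity of the Lagrangian gives \(\ln p(x) = \lambda_0 + \lambda_1 x\) (constants absorbed into \(\lambda_0\)), an exponential form; the two constraints fix \(\lambda_1 = -1/\mu\) and the normalizer, so \(p(x) = \mu^{-1} e^{-x/\mu}\). Changing the measurement scale, for example working in \(\ln x\) rather than \(x\), changes which family results, which is the role of the second proposition in the abstract.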
Entropy Concentration and the Empirical Coding Game
We give a characterization of Maximum Entropy/Minimum Relative Entropy
inference by providing two `strong entropy concentration' theorems. These
theorems unify and generalize Jaynes' `concentration phenomenon' and Van
Campenhout and Cover's `conditional limit theorem'. The theorems characterize
exactly in what sense a prior distribution Q conditioned on a given constraint,
and the distribution P, minimizing the relative entropy D(P ||Q) over all
distributions satisfying the constraint, are `close' to each other. We then
apply our theorems to establish the relationship between entropy concentration
and a game-theoretic characterization of Maximum Entropy Inference due to
Topsoe and others.
Comment: A somewhat modified version of this paper was published in Statistica Neerlandica 62(3), pages 374-392, 200
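A minimal discrete sketch of the minimization the theorems characterize (the grid, prior, and constraint below are hypothetical, not taken from the paper): the minimizer of D(P||Q) under a linear constraint E_P[f] = a is an exponential tilt of the prior, with P_i proportional to Q_i exp(lam f_i), and the multiplier found from the constraint.

```python
import numpy as np

# Hypothetical setup: a strictly positive prior Q on 20 states and a
# linear constraint E_P[f] = a on the updated distribution P.
rng = np.random.default_rng(0)
n = 20
Q = rng.random(n)
Q /= Q.sum()                      # arbitrary strictly positive prior
f = np.arange(n, dtype=float)     # constraint function values
a = 12.0                          # target expectation E_P[f]

def tilt(lam):
    """Exponential tilt of the prior: the form of the D(P||Q) minimizer."""
    w = Q * np.exp(lam * f)
    return w / w.sum()

# The tilted mean E_{tilt(lam)}[f] is increasing in lam, so bisect for
# the multiplier that satisfies the constraint.
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if tilt(mid) @ f < a:
        lo = mid
    else:
        hi = mid
P = tilt(0.5 * (lo + hi))
D = float(np.sum(P * np.log(P / Q)))   # D(P||Q) of the solution, in nats
```

In the concentration picture of the abstract, this P is the distribution near which the prior Q, conditioned on the constraint, concentrates.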
From Physics to Economics: An Econometric Example Using Maximum Relative Entropy
Econophysics is based on the premise that some ideas and methods from
physics can be applied to economic situations. We intend to show in this paper
how a physics concept such as entropy can be applied to an economic problem. In
so doing, we demonstrate how information in the form of observable data and
moment constraints is introduced into the method of Maximum relative Entropy
(MrE). A general example of updating with data and moments is shown. Two
specific econometric examples are solved in detail, which can then be used as
templates for real-world problems. A numerical example is compared to a large
deviation solution, which illustrates some of the advantages of the MrE method.
Comment: This paper has been accepted in Physica A. 19 pages, 3 figures
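A hypothetical numerical sketch of an MrE-style update that combines observed data with a moment constraint (the binomial model, grid, and numbers are illustrative, not the paper's econometric examples): the updated distribution takes the form prior × likelihood × exp(lam·theta), with the multiplier lam chosen so the posterior satisfies the moment constraint.

```python
import numpy as np

# Illustrative discretized parameter grid and uniform prior.
theta = np.linspace(0.01, 0.99, 99)
prior = np.full(theta.size, 1.0 / theta.size)

# Data part of the update: 7 successes in 10 trials (hypothetical),
# binomial likelihood up to a constant.
k, n = 7, 10
like = theta**k * (1.0 - theta)**(n - k)

def update(lam):
    """Joint update: prior times likelihood times exponential tilt."""
    w = prior * like * np.exp(lam * theta)
    return w / w.sum()

# Moment part of the update: require E[theta] = 0.6 under the posterior.
m = 0.6
lams = np.linspace(-50.0, 50.0, 20001)
means = np.array([update(l) @ theta for l in lams])  # increasing in lam
lam_star = np.interp(m, means, lams)                 # invert the monotone map
post = update(lam_star)
```

With lam = 0 this reduces to an ordinary Bayes update; the tilt is what lets the moment constraint enter alongside the data.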
Not Just Cyberwarfare
© Springer Science+Business Media Dordrecht 2015
Bringsjord and Licato provide a general meta-argument that cyberwarfare is so different from traditional kinetic warfare that no argument from analogy can allow the just war theory of Augustine and Aquinas (hereinafter called JWT) to be pulled over from traditional (modern) warfare to cyberwarfare. I believe that this meta-argument is sound and that it applies not just to cyberwarfare: in particular, on my reading of the meta-argument, argument from analogy has never been adequate to allow JWT to be applied to the kind of warfare that we are familiar with now.
An integrated remote sensing approach for identifying ecological range sites
A model approach for identifying ecological range sites was applied to high-elevation sagebrush-dominated rangelands on Parker Mountain, in south-central Utah. The approach utilizes map information derived from both high-altitude color infrared photography and LANDSAT digital data, integrated with soils, geological, and precipitation maps. Identification of the ecological range site for a given area requires an evaluation of all relevant environmental factors which combine to give that site the potential to produce characteristic types and amounts of vegetation. A table is presented which allows the user to determine the ecological range site based upon an integrated use of the maps which were prepared. The advantages of identifying ecological range sites through an integrated photo interpretation/LANDSAT analysis are discussed.
Entropic criterion for model selection
Model or variable selection is usually achieved by ranking models in
increasing order of preference. One such method applies the Kullback-Leibler
distance, or relative entropy, as a selection criterion. Yet this raises two
questions: why use this criterion, and are there any other criteria? Besides,
conventional approaches require a reference prior, which is usually difficult
to obtain. Following the logic of inductive inference proposed by Caticha, we
show relative entropy to be a unique criterion, one that requires no prior
information and can be applied to different fields. We examine this criterion
by considering a physical problem, simple fluids, and the results are
promising.
Comment: 10 pages. Accepted for publication in Physica A, 200
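A toy sketch of relative entropy as a selection criterion (the observed counts and candidate models are hypothetical, not the paper's simple-fluids example): candidate models are ranked by their relative entropy to the empirical distribution, and the model with the smallest value is preferred.

```python
import numpy as np

# Hypothetical observed counts over four categories.
counts = np.array([22, 18, 25, 35])
p_data = counts / counts.sum()

# Two illustrative candidate models of the same distribution.
models = {
    "uniform": np.full(4, 0.25),
    "skewed":  np.array([0.2, 0.2, 0.25, 0.35]),
}

def rel_entropy(p, q):
    """Relative entropy D(p||q) in nats; nonnegative, zero iff p == q."""
    return float(np.sum(p * np.log(p / q)))

# Rank models by increasing D(data || model); the first entry is preferred.
ranking = sorted(models, key=lambda name: rel_entropy(p_data, models[name]))
best = ranking[0]
```

This is only the ranking step; the point of the abstract is the justification for why this criterion, rather than some other distance, is the right one to rank with.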