Simulating Ability: Representing Skills in Games
Throughout the history of games, representing the abilities of the various
agents acting on behalf of the players has been a central concern. With
increasingly sophisticated games emerging, these simulations have become more
realistic, but the underlying mechanisms are still, to a large extent, of an ad
hoc nature. This paper proposes using a logistic model from psychometrics as a
unified mechanism for task resolution in simulation-oriented games.
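The abstract does not spell out the model, but a Rasch-style logistic rule is the standard psychometric choice, so a task-resolution mechanism along those lines can be sketched as follows; the function names and parameter scale are illustrative assumptions, not the paper's actual specification:

```python
import math
import random

def success_probability(skill: float, difficulty: float) -> float:
    """Rasch-style logistic rule: P(success) = 1 / (1 + exp(-(skill - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(skill - difficulty)))

def resolve_task(skill: float, difficulty: float, rng: random.Random) -> bool:
    """Resolve one task attempt by sampling against the logistic probability."""
    return rng.random() < success_probability(skill, difficulty)

# An agent whose skill equals the task difficulty succeeds half the time;
# each unit of skill above the difficulty raises the odds by a factor of e.
p_even = success_probability(skill=2.0, difficulty=2.0)   # 0.5
p_edge = success_probability(skill=3.0, difficulty=2.0)   # ~0.73
```

One appeal of such a rule is that skills and difficulties live on a single shared scale, which is what makes it usable as a unified resolution mechanism across different task types.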
Set Theory and its Place in the Foundations of Mathematics: a new look at an old question
This paper reviews the claims of several mainstream candidates to be the foundations of mathematics, including set theory. The review concludes that, at this level of mathematical knowledge, it would be very unreasonable to settle on any one of these foundations, and that the only reasonable choice is a pluralist one.
Phase transition in the Jarzynski estimator of free energy differences
The transition between a regime in which thermodynamic relations apply only
to ensembles of small systems coupled to a large environment and a regime in
which they can be used to characterize individual macroscopic systems is
analyzed in terms of the change in behavior of the Jarzynski estimator of
equilibrium free energy differences from nonequilibrium work measurements.
Given a fixed number of measurements, the Jarzynski estimator is unbiased for
sufficiently small systems. In these systems, the directionality of time is
poorly defined and configurations that dominate the empirical average, but
which are in fact typical of the reverse process, are sufficiently well
sampled. As the system size increases the arrow of time becomes better defined.
The dominant atypical fluctuations become rare and eventually cannot be sampled
with the limited resources that are available. Asymptotically, only typical
work values are measured. The Jarzynski estimator becomes maximally biased and
approaches the exponential of minus the average work, which is the result that
is expected from standard macroscopic thermodynamics. In the proper scaling
limit, this regime change can be described in terms of a phase transition in
variants of the random energy model (REM). This correspondence is explicitly
demonstrated in several examples of physical interest: near-equilibrium
processes in which the work distribution is Gaussian, the sudden compression of
an ideal gas and adiabatic quasi-static volume changes in a dilute real gas.
Comment: 29 pages, 5 figures, accepted for publication in Physical Review E (2012)
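For the Gaussian work distribution, one of the examples treated in the abstract, the estimator and its finite-sample behavior can be sketched numerically. The numbers below are illustrative, with energies in units of kT; for Gaussian work with mean mu and standard deviation sigma, the exact free-energy difference is mu - sigma^2/2:

```python
import math
import random

def jarzynski_estimate(work_values):
    """Jarzynski estimator: DeltaF = -ln( (1/N) * sum_i exp(-W_i) ), in units of kT."""
    n = len(work_values)
    return -math.log(sum(math.exp(-w) for w in work_values) / n)

rng = random.Random(42)
mu, sigma = 5.0, 3.0           # illustrative Gaussian work distribution
true_dF = mu - sigma**2 / 2.0  # exact result for Gaussian work: DeltaF = mu - sigma^2/2

# The empirical average is dominated by rare, atypical low-work trajectories.
# With few measurements those trajectories go unsampled and the estimate is
# biased toward the typical work; with many, it approaches true_dF.
small = [rng.gauss(mu, sigma) for _ in range(20)]
large = [rng.gauss(mu, sigma) for _ in range(200000)]
```

The estimate always lies between the smallest and largest observed work values, which is why, once the dominant fluctuations become too rare to sample, it saturates near the typical work rather than the true free-energy difference.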
The Ehrenfest urn revisited: Playing the game on a realistic fluid model
The Ehrenfest urn process, also known as the dogs and fleas model, is
realistically simulated by molecular dynamics of the Lennard-Jones fluid. The
key variable is Delta z, i.e. the absolute value of the difference between the
number of particles in one half of the simulation box and in the other half.
This is a pure-jump stochastic process induced, under coarse graining, by the
deterministic time evolution of the atomic coordinates. We discuss the Markov
hypothesis by analyzing the statistical properties of the jumps and of the
waiting times between jumps. In the limit of a vanishing integration time-step,
the distribution of waiting times becomes closer to an exponential and,
therefore, the continuous-time jump stochastic process is Markovian. The random
variable Delta z behaves as a Markov chain and, in the gas phase, the observed
transition probabilities follow the predictions of the Ehrenfest theory.
Comment: Accepted by Physical Review E on 4 May 200
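The coarse-grained chain that the molecular-dynamics simulation is compared against is the classical Ehrenfest urn process, which is easy to sketch directly; the particle count and step count below are illustrative choices, not the paper's simulation parameters:

```python
import random

def ehrenfest_step(n_left: int, n_total: int, rng: random.Random) -> int:
    """One Ehrenfest move: pick a particle uniformly; it jumps to the other half."""
    if rng.random() < n_left / n_total:
        return n_left - 1   # a particle from the left half jumps right
    return n_left + 1       # a particle from the right half jumps left

def simulate(n_total: int, steps: int, seed: int = 0):
    """Run the urn chain, recording Delta z = |n_left - n_right| after each jump."""
    rng = random.Random(seed)
    n_left = n_total  # start far from equilibrium: all particles on one side
    history = []
    for _ in range(steps):
        n_left = ehrenfest_step(n_left, n_total, rng)
        history.append(abs(n_left - (n_total - n_left)))
    return history

dz = simulate(n_total=100, steps=20000)
# The chain relaxes toward n_left ~ n_total/2, so Delta z stays small at late times.
```

Note that Delta z changes by exactly 2 at every jump and keeps the parity of the total particle number, a structural feature the coarse-grained molecular-dynamics process shares.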
A new foundational crisis in mathematics, is it really happening?
The article reconsiders the position of the foundations of mathematics after
the discovery of HoTT. The discussion that this discovery has generated in the
community of mathematicians, philosophers, and computer scientists might
indicate a new crisis in the foundations of mathematics. By examining the
mathematical facts behind HoTT and their relation to the existing foundations,
we conclude that the present situation does not constitute a crisis. We
reiterate a pluralist vision of the foundations of mathematics. The article
contains a short survey of the mathematical and historical background needed to
understand the main tenets of the foundational issues.
Comment: Final version
Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond
Many historians of the calculus deny significant continuity between
infinitesimal calculus of the 17th century and 20th century developments such
as Robinson's theory. Robinson's hyperreals, while providing a consistent
theory of infinitesimals, require the resources of modern logic; thus many
commentators are comfortable denying a historical continuity. A notable
exception is Robinson himself, whose identification with the Leibnizian
tradition inspired Lakatos, Laugwitz, and others to consider the history of the
infinitesimal in a more favorable light. In spite of his Leibnizian sympathies,
Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly
demonstrating the inconsistency of reasoning with historical infinitesimal
magnitudes. We argue that Robinson, among others, overestimates the force of
Berkeley's criticisms, by underestimating the mathematical and philosophical
resources available to Leibniz. Leibniz's infinitesimals are fictions, not
logical fictions, as Ishiguro proposed, but rather pure fictions, like
imaginaries, which are not eliminable by some syncategorematic paraphrase. We
argue that Leibniz's defense of infinitesimals is more firmly grounded than
Berkeley's criticism thereof. We show, moreover, that Leibniz's system for
differential calculus was free of logical fallacies. Our argument strengthens
the conception of modern infinitesimals as a development of Leibniz's strategy
of relating inassignable to assignable quantities by means of his
transcendental law of homogeneity.
Comment: 69 pages, 3 figures
Ibn Sīnā on Analysis: 1. Proof Search. Or: Abstract State Machines as a Tool for History of Logic
Henri Poincaré: The Status of Mechanical Explanations and the Foundations of Statistical Mechanics
The first goal of this paper is to show the evolution of Poincaré's opinion on the mechanistic reduction of the principles of thermodynamics, placing it in the context of the science of his time. The second is to present some of his work in 1890 on the foundations of statistical mechanics. He became interested first in thermodynamics and its relation with mechanics, drawing on the work of Helmholtz on monocyclic systems. After a period of skepticism concerning the kinetic theory, he read some of Maxwell's memoirs and contributed to the foundations of statistical mechanics. I also show that Poincaré's contributions to the foundations of statistical mechanics are closely linked to his work in celestial mechanics and to his interest in probability theory and its role in physics.
An impossibility theorem for paired comparisons
In several decision-making problems, alternatives should be ranked on the
basis of paired comparisons between them. We present an axiomatic approach for
the universal ranking problem with arbitrary preference intensities, incomplete
and multiple comparisons. In particular, two basic properties -- independence
of irrelevant matches and self-consistency -- are considered. It is revealed
that there exists no ranking method satisfying both requirements at the same
time. The impossibility result holds under various restrictions on the set of
ranking problems; however, it does not emerge in the case of round-robin
tournaments. An interesting and more general possibility result is obtained by
restricting the domain of independence of irrelevant matches through the
concept of macrovertex.
Comment: 18 pages, 4 figures
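To make the setting concrete, here is one simple ranking method on paired-comparison data, the row-sum (net-wins) rule. This is a hypothetical illustration of the kind of method the axioms apply to, not the paper's construction; the matrix encoding is an assumption:

```python
def score_ranking(results):
    """Rank alternatives by net wins (row sums of the paired-comparison matrix).

    results[i][j] is the aggregate outcome of comparisons between i and j:
    positive if i beat j on balance, negative if j did, 0 if they never met.
    The row-sum rule is one simple ranking method; the abstract shows that no
    method can satisfy independence of irrelevant matches and self-consistency
    simultaneously on the general domain.
    """
    n = len(results)
    scores = [sum(results[i]) for i in range(n)]
    # Order alternatives by descending score.
    return sorted(range(n), key=lambda i: -scores[i]), scores

# Three alternatives in a round-robin: 0 beat 1, 1 beat 2, 0 beat 2.
matrix = [
    [0, 1, 1],
    [-1, 0, 1],
    [-1, -1, 0],
]
order, scores = score_ranking(matrix)  # order [0, 1, 2], scores [2, 0, -2]
```

In a round-robin every pair meets the same number of times, which is exactly the restricted domain where, per the abstract, the impossibility does not arise.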