Multiplayer Cost Games with Simple Nash Equilibria
Multiplayer games with selfish agents naturally occur in the design of
distributed and embedded systems. As the goals of selfish agents are usually
neither equivalent nor antagonistic to each other, such games are non zero-sum
games. We show that a large class of these games, including games where the
individual objectives are mean-payoff, discounted-payoff, or quantitative
reachability, not only has a solution, but a simple one. We establish the
existence of Nash equilibria that are
composed of k memoryless strategies for each agent in a setting with k agents,
one main and k-1 minor strategies. The main strategy describes what happens
when all agents comply, whereas the minor strategies ensure that all other
agents immediately start to co-operate against the agent who first deviates
from the plan. This simplicity is important, as rational agents are an
idealisation. Realistically, agents have to decide on their moves with very
limited resources, and complicated strategies that require exponential--or even
non-elementary--implementations are not a realistic option. The
existence of simple strategies that we prove in this paper therefore holds a
promise of implementability.
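As a toy illustration of the profile described above (a sketch with assumed names, not the paper's formal construction): each agent follows one "main" memoryless strategy, and the only state the profile needs is the identity of the first deviator, which selects the "minor" punishment strategy aimed at that agent.

```python
K = 3  # number of agents (hypothetical example)

def agent_action(i, deviator):
    """Memoryless choice: the only state consulted is who, if anyone,
    deviated first. Before any deviation, agent i plays the main strategy;
    afterwards, every other agent plays the minor strategy targeting the
    deviator."""
    if deviator is None:
        return "comply"          # main strategy
    if i == deviator:
        return "deviate"         # the deviator keeps playing selfishly
    return f"punish_{deviator}"  # minor strategy aimed at the deviator

def run(rounds, deviation_round=None, deviating_agent=None):
    """Play a number of rounds, optionally injecting a deviation."""
    deviator = None
    history = []
    for t in range(rounds):
        if t == deviation_round:
            deviator = deviating_agent
        history.append([agent_action(i, deviator) for i in range(K)])
    return history

# While everyone complies, only the main strategy is ever used:
h = run(4)
assert all(a == "comply" for round_actions in h for a in round_actions)

# Once agent 1 deviates in round 2, agents 0 and 2 immediately and
# permanently co-operate against agent 1:
h = run(4, deviation_round=2, deviating_agent=1)
assert h[3] == ["punish_1", "deviate", "punish_1"]
```

The point of the sketch is the resource claim: each strategy is a constant-size lookup, so the whole profile is implementable with k small tables per agent.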
Late Quaternary coastal evolution and aeolian sedimentation in the tectonically-active southern Atacama Desert, Chile
Analyses of aeolianites and associated dune, surficial carbonate and marine terrace sediments from north-central Chile (27° 54′ S) yield a record of environmental change for the coastal southern Atacama Desert spanning at least the last glacial-interglacial cycle. Optically stimulated luminescence dating indicates phases of aeolian dune construction at around 130, 111–98, 77–69 and 41–28 ka. Thin-section and stable carbon and oxygen isotope analyses suggest a predominantly marine sediment source for the three oldest dune phases. Aeolianites appear to have accumulated mainly from tectonically-uplifted interglacial marine sediments that were deflated during windier and/or stormier intervals. Bedding orientations indicate that sand-transporting winds varied in direction from S-ESE during MIS 5e and WNW-ESE during MIS 5c-5a. Winds from the southeast quadrant are unusual today in this region of the Atacama, suggesting either major shifts in atmospheric circulation or topographic airflow modification. Thin-section evidence indicates that the aeolianites were cemented by two phases of vadose carbonate, tentatively linked to wetter periods around 70 and 45 ka. Tectonic uplift in the area has proceeded at an average rate of 305–542 mm kyr⁻¹. The study illustrates the complexity of understanding onshore-offshore sediment fluxes in the context of Late Quaternary sea-level fluctuations for an area undergoing rapid tectonic uplift.
Defining sustainable and precautionary harvest rates for data-limited short-lived stocks: a case study of sprat (Sprattus sprattus) in the English Channel
Empirical harvest control rules set catch advice based on observed indicators and are increasingly being used worldwide to manage fish stocks that lack formal assessments of stock and exploitation status. Within the International Council for the Exploration of the Sea, trend-based rules that adjust advice according to recent survey observations have been adopted; however, there is increasing evidence that such rules do not work well for short-lived pelagic species that can exhibit large inter-annual fluctuations in stock size. Constant harvest rates, removing a fixed proportion of the observed biomass index, have been proposed as a suitable strategy for managing short-lived species. Unknown survey catchability has, however, been a barrier to their application to these stocks in the past. We apply simulation testing to define a robust, sustainable constant harvest rate for a data-limited short-lived stock, using the English Channel sprat as a case study. By conditioning a management strategy evaluation framework on existing and borrowed life-history parameters and precautionary considerations, we show that a constant harvest rate outperforms trend-based catch rules, maximizing yields while reducing the risk of stock overexploitation, and conclude that an 8.6% constant harvest rate provides sufficiently precautionary catch advice for this stock.
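The two kinds of rule contrasted above can be sketched in a few lines. The 8.6% rate is the value the abstract concludes; the "2-over-3" rule is one common ICES-style trend-based rule, shown here only for contrast, and the biomass index values are hypothetical. The paper's actual management strategy evaluation is far richer than this sketch.

```python
HARVEST_RATE = 0.086  # constant harvest rate concluded in the study

def constant_harvest_rate_advice(biomass_index):
    """Catch advice = fixed proportion of the latest survey biomass index."""
    return HARVEST_RATE * biomass_index[-1]

def two_over_three_advice(biomass_index, previous_advice):
    """Trend-based rule: scale the previous advice by the mean of the last
    two index values over the mean of the three values before them."""
    recent = sum(biomass_index[-2:]) / 2
    older = sum(biomass_index[-5:-2]) / 3
    return previous_advice * (recent / older)

# Hypothetical survey biomass index (arbitrary units), most recent last:
index = [100, 120, 90, 110, 80]

# The constant rate reacts directly to the latest observation...
assert abs(constant_harvest_rate_advice(index) - 6.88) < 1e-6

# ...while the trend rule only nudges last year's advice by the recent trend,
# which is why it can lag badly for stocks with large annual fluctuations.
advice = two_over_three_advice(index, previous_advice=10.0)
assert advice < 10.0  # the recent downturn reduces, but does not reset, advice
```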
Percolation model for structural phase transitions in LiHIO mixed crystals
A percolation model is proposed to explain the structural phase transitions
found in LiHIO mixed crystals as a function of the
concentration parameter. The percolation thresholds are obtained from Monte
Carlo simulations on the specific lattices occupied by lithium atoms and
hydrogen bonds. The theoretical results strongly suggest that percolating
lithium vacancies and hydrogen bonds are indeed responsible for the solid
solution observed in the experimental range.
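For orientation, a Monte Carlo estimate of a percolation threshold can be sketched as follows. This is a generic square-lattice site-percolation illustration with assumed parameters, not the specific lithium-atom and hydrogen-bond lattices simulated in the paper.

```python
import random

def percolates(grid):
    """Flood-fill from the occupied sites in the top row; True if the
    cluster reaches the bottom row (site percolation, 4-neighbour
    connectivity)."""
    n = len(grid)
    seen = set()
    stack = [(0, j) for j in range(n) if grid[0][j]]
    while stack:
        i, j = stack.pop()
        if (i, j) in seen:
            continue
        seen.add((i, j))
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and grid[a][b] and (a, b) not in seen:
                stack.append((a, b))
    return False

def spanning_probability(n, p, trials=200, seed=0):
    """Monte Carlo estimate of the probability that a random n x n grid,
    with sites occupied independently with probability p, percolates."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

# The spanning probability jumps from near 0 to near 1 across the threshold
# (p_c is about 0.593 for square-lattice site percolation):
assert spanning_probability(20, 0.4) < 0.2
assert spanning_probability(20, 0.8) > 0.9
```

Scanning `spanning_probability` over a grid of `p` values and locating the jump is the standard way such thresholds are read off from simulations of this kind.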
Local structure of the set of steady-state solutions to the 2D incompressible Euler equations
It is well known that the incompressible Euler equations can be formulated in
a very geometric language. The geometric structures provide very valuable
insights into the properties of the solutions. Analogies with the
finite-dimensional model of geodesics on a Lie group with left-invariant metric
can be very instructive, but it is often difficult to prove analogues of
finite-dimensional results in the infinite-dimensional setting of Euler's
equations. In this paper we establish a result in this direction in the simple
case of steady-state solutions in two dimensions, under some non-degeneracy
assumptions. In particular, we establish, in a non-degenerate situation, a
local one-to-one correspondence between steady states and co-adjoint orbits.
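For orientation, the steady states in question can be written concisely in the standard stream-function formulation of 2D incompressible Euler flow (standard background, not the paper's precise non-degeneracy setup):

```latex
% 2D incompressible Euler, steady states in stream-function form:
% u = \nabla^{\perp}\psi = (-\partial_y \psi, \partial_x \psi),
% vorticity \omega = \Delta\psi.
\begin{aligned}
  u \cdot \nabla \omega &= 0
  && \text{(steady vorticity transport)} \\
  \nabla^{\perp}\psi \cdot \nabla \Delta\psi &= 0
  && \text{(same equation via } u = \nabla^{\perp}\psi \text{)} \\
  \Delta\psi &= F(\psi)
  && \text{(locally, where } \nabla\psi \neq 0 \text{)}
\end{aligned}
```

The last line is the non-degenerate picture: level sets of the stream function and of the vorticity coincide, which is the structure underlying a correspondence between steady states and co-adjoint orbits.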
Towards Machine Wald
The past century has seen a steady increase in the need to estimate and
predict complex systems and to make (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when
faced with uncertainty is challenging in several major ways: (1) Finding
optimal statistical models has yet to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios along with the space of
relevant information, assumptions, and/or beliefs, tend to be infinite
dimensional, whereas calculus on a computer is necessarily discrete and finite.
To this end, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification and Information-Based Complexity.
Intercalibration of the barrel electromagnetic calorimeter of the CMS experiment at start-up
Calibration of the relative response of the individual channels of the barrel electromagnetic calorimeter of the CMS detector was accomplished, before installation, with cosmic ray muons and test beams. One fourth of the calorimeter was exposed to a beam of high energy electrons and the relative calibration of the channels, the intercalibration, was found to be reproducible to a precision of about 0.3%. Additionally, data were collected with cosmic rays for the entire ECAL barrel during the commissioning phase. By comparing the intercalibration constants obtained with the electron beam data with those from the cosmic ray data, it is demonstrated that the latter provide an intercalibration precision of 1.5% over most of the barrel ECAL. The best intercalibration precision is expected to come from the analysis of events collected in situ during the LHC operation. Using data collected with both electrons and pion beams, several aspects of the intercalibration procedures based on electrons or neutral pions were investigated.
Detector Description and Performance for the First Coincidence Observations between LIGO and GEO
For 17 days in August and September 2002, the LIGO and GEO interferometer
gravitational wave detectors were operated in coincidence to produce their
first data for scientific analysis. Although the detectors were still far from
their design sensitivity levels, the data can be used to place better upper
limits on the flux of gravitational waves incident on the earth than previous
direct measurements. This paper describes the instruments and the data in some
detail, as a companion to analysis papers based on the first data.
Measurement of the B0-anti-B0-Oscillation Frequency with Inclusive Dilepton Events
The B0-anti-B0 oscillation frequency has been measured with a sample of
23 million B anti-B pairs collected with the BABAR detector at the PEP-II
asymmetric B Factory at SLAC. In this sample, we select events in which both B
mesons decay semileptonically and use the charge of the leptons to identify the
flavor of each B meson. A simultaneous fit to the decay time difference
distributions for opposite- and same-sign dilepton events yields the
oscillation frequency in inverse picoseconds.
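The role of same-sign dileptons can be sketched numerically: a neutral B meson can oscillate into its antiparticle before decaying, so two same-sign leptons tag a mixed event, and the time-integrated mixed fraction follows from the oscillation frequency and the B0 lifetime. The numerical values below are assumed, world-average-style inputs for illustration, not the paper's measurement.

```python
import math

DELTA_M = 0.51  # ps^-1, assumed B0 oscillation frequency (illustrative value)
TAU_B0 = 1.52   # ps, assumed B0 lifetime (illustrative value)

def mixed_probability_density(t):
    """Probability density for a B0 to have mixed by proper time t:
    exponential decay times the oscillating factor (1 - cos(dm * t)) / 2."""
    return math.exp(-t / TAU_B0) / TAU_B0 * (1 - math.cos(DELTA_M * t)) / 2

def integrated_mixing_fraction():
    """chi_d = x^2 / (2 (1 + x^2)) with x = dm * tau: the fraction of
    neutral B decays occurring after the flavour has flipped, which is
    what the same-sign dilepton rate measures."""
    x = DELTA_M * TAU_B0
    return x * x / (2 * (1 + x * x))

chi_d = integrated_mixing_fraction()
assert 0.15 < chi_d < 0.20  # roughly one in five neutral B decays is mixed

# Consistency check: numerically integrating the density reproduces chi_d.
approx = sum(mixed_probability_density(0.01 * i) for i in range(6000)) * 0.01
assert abs(approx - chi_d) < 0.01
```

Fitting the decay-time-difference distribution of mixed versus unmixed dilepton events, rather than just the integrated fraction, is what lets the oscillation frequency itself be extracted.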