An Adversarial Interpretation of Information-Theoretic Bounded Rationality
Recently, there has been a growing interest in modeling planning with
information constraints. Accordingly, an agent maximizes a regularized expected
utility known as the free energy, where the regularizer is given by the
information divergence from a prior to a posterior policy. While this approach
can be justified in various ways, including from statistical mechanics and
information theory, it is still unclear how it relates to decision-making
against adversarial environments. This connection has previously been suggested
in work relating the free energy to risk-sensitive control and to extensive
form games. Here, we show that a single-agent free energy optimization is
equivalent to a game between the agent and an imaginary adversary. The
adversary can, by paying an exponential penalty, generate costs that diminish
the decision maker's payoffs. It turns out that the optimal strategy of the
adversary consists in choosing costs so as to render the decision maker
indifferent among its choices, which is a defining property of a Nash
equilibrium, thus tightening the connection between free energy optimization
and game theory. Comment: 7 pages, 4 figures. Proceedings of AAAI-1
Prediction of infectious disease epidemics via weighted density ensembles
Accurate and reliable predictions of infectious disease dynamics can be
valuable to public health organizations that plan interventions to decrease or
prevent disease transmission. A great variety of models have been developed for
this task, using different model structures, covariates, and targets for
prediction. Experience has shown that the performance of these models varies;
some tend to do better or worse in different seasons or at different points
within a season. Ensemble methods combine multiple models to obtain a single
prediction that leverages the strengths of each model. We considered a range of
ensemble methods that each form a predictive density for a target of interest
as a weighted sum of the predictive densities from component models. In the
simplest case, equal weight is assigned to each component model; in the most
complex case, the weights vary with the region, prediction target, week of the
season when the predictions are made, a measure of component model uncertainty,
and recent observations of disease incidence. We applied these methods to
predict measures of influenza season timing and severity in the United States,
both at the national and regional levels, using three component models. We
trained the models on retrospective predictions from 14 seasons (1997/1998 -
2010/2011) and evaluated each model's prospective, out-of-sample performance in
the five subsequent influenza seasons. In this test phase, the ensemble methods
showed overall performance that was similar to the best of the component
models, but offered more consistent performance across seasons than the
component models. Ensemble methods offer the potential to deliver more reliable
predictions to public health decision makers. Comment: 20 pages, 6 figures
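The core ensemble construction described above, a predictive density formed as a weighted sum of component predictive densities, can be sketched as follows. This is an illustrative implementation, not the authors' code; the function name and the representation of densities as values on a common grid of target bins are assumptions:

```python
import numpy as np

def ensemble_density(component_densities, weights):
    """Combine component predictive densities into one ensemble density.

    component_densities: (n_models, n_bins) array; each row is a model's
        predictive density evaluated on a shared grid of target values.
    weights: (n_models,) nonnegative weights summing to 1. In the simplest
        scheme these are equal; richer schemes let them vary by region,
        target, week of season, and model uncertainty.
    """
    densities = np.asarray(component_densities, dtype=float)
    w = np.asarray(weights, dtype=float)
    assert np.all(w >= 0) and np.isclose(w.sum(), 1.0)
    # Weighted sum across models yields the ensemble's predictive density.
    return w @ densities

# Equal-weight case (the simplest ensemble mentioned in the abstract),
# with two toy component models over three target bins:
comps = np.array([[0.2, 0.5, 0.3],
                  [0.4, 0.4, 0.2]])
ew = ensemble_density(comps, [0.5, 0.5])
```

Because each component row integrates (sums) to 1 and the weights sum to 1, the resulting mixture is itself a valid density.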
Computational multi-depth single-photon imaging
We present an imaging framework that is able to accurately reconstruct multiple depths at individual pixels from single-photon observations. Our active imaging method models the single-photon detection statistics from multiple reflectors within a pixel, and it also exploits the fact that a multi-depth profile at each pixel can be expressed as a sparse signal. We interpret the multi-depth reconstruction problem as a sparse deconvolution problem using single-photon observations, create a convex problem through discretization and relaxation, and use a modified iterative shrinkage-thresholding algorithm to efficiently solve for the optimal multi-depth solution. We experimentally demonstrate that the proposed framework is able to accurately reconstruct the depth features of an object that is behind a partially-reflecting scatterer and 4 m away from the imager with root mean-square error of 11 cm, using only 19 signal photon detections per pixel in the presence of moderate background light. In terms of root mean-square error, this is a factor of 4.2 improvement over the conventional method of Gaussian-mixture fitting for multi-depth recovery.

This material is based upon work supported in part by a Samsung Scholarship, the US National Science Foundation under Grant No. 1422034, and the MIT Lincoln Laboratory Advanced Concepts Committee. We thank Dheera Venkatraman for his assistance with the experiments. (Samsung Scholarship; 1422034 - US National Science Foundation; MIT Lincoln Laboratory Advanced Concepts Committee)

Accepted manuscript
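The sparse-deconvolution approach described above (the paper uses a modified iterative shrinkage-thresholding algorithm) can be illustrated with plain ISTA on a toy problem. This is a sketch under assumed notation, not the authors' method: y ≈ Ax, where the columns of A are time-shifted copies of an assumed Gaussian pulse response and x is the sparse multi-depth profile at one pixel:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Plain ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)         # gradient of the quadratic term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy pixel: two reflectors (sparse spikes at bins 20 and 35) blurred by
# an assumed Gaussian pulse kernel; noiseless for simplicity.
n = 64
kernel = np.exp(-0.5 * (np.arange(-5, 6) / 1.5) ** 2)
A = np.zeros((n, n))
for j in range(n):
    for k, g in enumerate(kernel):
        i = j + k - 5
        if 0 <= i < n:
            A[i, j] = g
x_true = np.zeros(n)
x_true[20], x_true[35] = 1.0, 0.6
y = A @ x_true
x_hat = ista(A, y, lam=0.05)
```

With the two reflectors separated by more than the pulse width, the recovered profile concentrates at the true depth bins, which mirrors how the sparse multi-depth profile resolves a partially-reflecting scatterer in front of the object.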